<VideoMode N1,N2,N3,N4,N5 [options]>
<vm N1,N2,N3,N4,N5 [options]>
Parameter to set the DirectX video mode. The video mode text must contain the five numbers that TimeDX uses to describe any given video mode: N1 is the horizontal size, N2 is the optionally re-programmed vertical size, N3 is the real vertical DirectX video size that is re-programmed to N2's size, N4 is the color depth and N5 is the refresh rate. Desktop usage detects the current desktop settings and uses them; the video mode still has to have been timed with TimeDX, however, unless Auto mode is being used. If <VideoMode> is not provided then 640x480 (480) 8 bpp 0 Hz is used, in effect a default <vm 640,480,480,8,0> applies to all item files, unless DMDX is running in Auto mode in which case the desktop video mode is the default. As of version 5, 1024x768 16 bpp 60 Hz has become the default video mode when Auto mode isn't in use. There are only two valid values for options to date: the first is nvidia3D, which invokes the NVIDIA 3D shutter glasses technology, and the second is freesync, detailed in the FreeSync section below (both of which only work with the Direct3D renderer).
Seeing as re-programming the display's vertical size is rapidly becoming a thing of the past (it's not possible under Windows 2000 or XP), four-number variants are provided that obviate having to enter the vertical size twice -- so <vm 640,480,8,0> is the same as <vm 640,480,480,8,0>.
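The equivalence of the four- and five-number forms can be sketched with a small parser (hypothetical Python, purely illustrative -- parse_vm is not DMDX's actual code):

```python
def parse_vm(spec):
    """Parse a <vm> parameter string into a canonical five-number tuple.

    Illustrative sketch only -- not DMDX's actual parser.  The four-number
    variant omits N2 (the re-programmed vertical size), in which case it is
    taken to be the same as N3, the real DirectX vertical size.
    """
    numbers = [int(n) for n in spec.split(",")]
    if len(numbers) == 4:                 # <vm horizontal,vertical,bpp,Hz>
        h, v, bpp, hz = numbers
        return (h, v, v, bpp, hz)         # N2 == N3
    if len(numbers) == 5:                 # <vm N1,N2,N3,N4,N5>
        return tuple(numbers)
    raise ValueError("a <vm> needs 4 or 5 numbers: %r" % spec)
```

So parse_vm("640,480,8,0") and parse_vm("640,480,480,8,0") both yield (640, 480, 480, 8, 0).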
The refresh rate is always 0 Hz under Windows 9X and ME, which means the default refresh rate, and that's up to the video card manufacturer (or whatever it's set to in Display Properties). Under 2000 and XP video drivers have started providing the ability to specify a refresh rate; however, to maintain compatibility with old item files you can still use 0 Hz if you choose to. If 0 Hz is specified under Auto or EZ mode, where there is no synchronization with the raster, then 60 Hz is used regardless of whether the display is running at a different rate.
With the advent of adaptive sync flat panels (be it AMD's FreeSync or NVidia's G-Sync), presenting displays of any arbitrary duration is now possible, as long as that duration is as long as or longer than the minimum refresh interval of the display itself (typically these devices appear to have a native refresh rate of 144 Hz, which opens up displays down to 7 or so milliseconds; I have however recently seen 240 Hz devices offering 4 ms displays). For the uninitiated, there is no longer any fixed raster as such; instead retraces are generated whenever there's a change in the display, and if there isn't a change when a panel nears its maximum retention period the hardware (NVidia) or software (AMD) refreshes the display with a duplicate of whatever was last displayed. Note that you're probably not going to want to use the freesync option with digital video. Such flexibility doesn't come without some cost. First off, because DMDX is no longer deliberately pacing the display it's possible to request what is impossible even under the best of circumstances (there's code to detect this and issue a warning, but it's just that, a warning). Secondly, there's some question about what happens at the maximum end of the refresh interval, as the displays will at some point need refreshing when new frames are no longer being presented. TimeDX provides a FreeSync test to probe what happens, but I suspect worst case if you want display durations in the order of the maximum refresh interval you'll have to break your display into two or more frames, forcing a refresh when the display appears static. You could also probe the minimum refresh rate with the freesync tachistoscopic acid test mentioned below, but I suspect the buffering of the video hardware is likely to obscure any display errors that might occur.
This is because when the hardware (or software in AMD's case) forces a refresh of the display it's probably going to block DMDX presenting a new display till the refresh is completed (again, assuming a 144 Hz panel, that's about 7 milliseconds), and the typical minimum refresh rates I'm seeing are in the order of 20 to 50 Hz, so a good deal of care is needed for displays over 20 milliseconds. Another consideration is the display that occurs before the controlled one: if it's longer than the maximum retrace interval it will need to end with a short frame (a bit longer than the minimum retrace interval and below the maximum retrace interval), otherwise a refresh of the frame before the controlled one might block the controlled frame coming after it. Some monitors may not suffer from this (AMD's FreeSync 2 specs claim there is no minimum refresh rate); again, TimeDX's FreeSync test should be able to probe this. For instance, if you wanted, say, a 37 millisecond prime with a forward mask you might require an item like this on a 144 Hz monitor with a minimum refresh rate of 50 Hz:
+100 "mask" / <msfd 9> <noerase> / <msfd 14> "prime" / <msfd 14> <noerase> / <msfd 9> <noerase> / * "target" ;
Needless to say, you're going to want one of those displays with a 20 Hz minimum refresh rate, and assuming you can find one of these magical monitors with no minimum refresh rate at all that same item would be:
+100 "mask" / <msfd 37> "prime" / * "target" ;
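The arithmetic behind breaking a display into frames can be sketched in Python (a hypothetical helper, not part of DMDX; it just divides a duration into frames that each sit between the panel's minimum and maximum retrace intervals, and any split within those bounds -- the 14/14/9 split above included -- would do):

```python
import math

def split_duration(total_ms, min_ms, max_ms):
    """Split total_ms into the fewest frames whose durations all lie in
    [min_ms, max_ms].  Illustrative sketch only -- not DMDX code."""
    n = math.ceil(total_ms / max_ms)      # fewest frames that can cover it
    if total_ms < n * min_ms:
        raise ValueError("duration unreachable with these retrace bounds")
    base, rem = divmod(total_ms, n)       # spread the milliseconds evenly
    return [base + (1 if i < rem else 0) for i in range(n)]
```

With a 144 Hz panel (roughly a 7 ms minimum interval) and a 50 Hz minimum refresh rate (a 20 ms maximum interval), split_duration(37, 7, 20) needs two frames; with one of those magical no-minimum-refresh-rate monitors a single 37 ms frame suffices.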
So much for the hardware; now we have to consider what this means for DMDX. Basically the raster is no longer tracked and internally the refresh interval becomes 1 millisecond, so a display can be fired off any old time. However, because the display still actually has a minimum refresh interval and people might want to use it, keywords that specified durations in ticks in the past (<fd> and <delay> for instance) still do -- they specify duration in terms of the maximum refresh rate (typically 144 Hz). If you don't know what the minimum retrace interval is for your video mode use TimeDX's Refresh Rate test. Another reason to keep the ticks in there is that old item files would suddenly be specifying berserk intervals that no display currently made can meet. Instead the millisecond variants (<msfd> and <msdelay> for instance) can be used, and when freesync is active these millisecond intervals are left alone instead of being divided up into integer refresh intervals (indeed, the tick based specifications are all multiplied by the refresh interval internally).
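That tick handling amounts to a simple multiplication (a sketch assuming the typical 144 Hz maximum refresh rate mentioned above; ticks_to_ms is an illustrative name, not a DMDX function):

```python
def ticks_to_ms(ticks, max_refresh_hz=144.0):
    """Under freesync a tick-based duration like <fd 2> is multiplied by
    the minimum retrace interval, i.e. 1000 / maximum refresh rate.
    Illustrative arithmetic only -- not DMDX code."""
    return ticks * 1000.0 / max_refresh_hz
```

So on a 144 Hz panel <fd 2> comes out at about 13.9 ms, whereas <msfd 14> stays exactly 14 ms.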
So how are you going to verify that your setup and item file are actually presenting frames for the desired interval? Kilohertz cameras is how. DMDX will warn of obvious scheduling errors where one frame is scheduled to be displayed before the minimum retrace interval has expired, but because graphics chips buffer the hell out of display requests and DMDX is no longer throttled by the retrace interval, the only way to be really sure is to take a video of the screen with a camera that runs in the kilohertz range and watch the frames go by on the display. You can investigate this with the freesync tachistoscopic acid test I wrote and dial up all sorts of display durations quickly and easily to see how they perform. However you'll probably find, as I did when requesting silly display durations well below what was achievable on the hardware at hand, that it took requests of five sequential impossible frame durations before DMDX could see a display error occurring because Present() had finally started blocking. So even the weak Intel video chipset in my laptop was perfectly happy caching four or five requests before letting on that it couldn't in fact meet them. Realistically one can argue that a kilohertz camera has always been the only way to be sure that what you're requesting is being done, but in the past I'd tested displays with my test apparatus; these days that's no longer the case. Fortunately for you these devices are relatively cheap these days. Test mode 14 will tell you if DMDX running on your hardware is basically capable of controlling frame durations to the millisecond, but I'm betting almost all hardware is these days; if your millisecond callbacks are good, test mode 14 should be good too.
Test mode 15 gives you an idea of how rapidly you can shift displays, but it's basically just millisecond latencies (test mode 2) combined with Present() latencies (test mode 13) with a little bit of overhead, and unless you're presenting a series of 1 tick frames like the tachistoscopic acid test even quite pokey hardware these days will deal with 144 Hz as far as I can tell. The big question is what the display itself is doing, and that's downstream of what DMDX can sense, so the camera is pretty much the only way to be sure of what's happening.
Data gathered in freesync mode will generate files with "Freesync" in the video mode spew (assuming you're using auto mode):
! D3D Freesync Video Mode 1920,1080,24,59
Note that the freesync modifier probably has uses without an AMD FreeSync or NVidia G-Sync display where synchronizing DMDX's display to the raster isn't so critical, such as instances where people would use EZ mode; here they would no longer need to make an EZ shortcut, as just running DMDX with the freesync modifier will have much the same effect. On top of that, if the timing of the audio is paramount then using freesync will allow the scheduling of audio to the millisecond instead of the usual tick. Not that I recommend DMDX for pure audio work, but numbers of people over the years have done so and this marks a significant increase in accuracy there.