With the help of another DSP user who had a similar issue, I have solved the Y-shortening issue at speeds below 110 mm/s.
The DSP software allows you to set the distance for each step, which is how you calibrate your system to produce exactly 1" of output for a 1" vector. For my system, that is 12.67 micrometers (0.01267 mm).
When scanning a vector object, you can set the scan interval (the distance between scan lines). By default, this is 0.100 mm. Setting the scan interval to a whole multiple of the step size keeps the step size and the scan interval in sync. For my system, setting the scan interval to 0.038 mm (3 x the step size) makes it all work perfectly.
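The DSP software does this internally, so the snippet below is just a sketch of the arithmetic using my machine's numbers (the step size is specific to my setup, not a universal constant): it snaps a desired scan interval to the nearest whole multiple of the step size.

```python
STEP_MM = 0.01267  # mm per motor step, as measured on my machine

def snapped_interval(desired_mm, step_mm=STEP_MM):
    """Return the whole multiple of step_mm closest to desired_mm."""
    steps = round(desired_mm / step_mm)
    return steps * step_mm

# The default 0.100 mm interval is not a whole number of steps:
print(0.100 / STEP_MM)   # about 7.89 steps per scan line
# Three steps per line gives the 0.038 mm interval I settled on:
print(3 * STEP_MM)       # 0.03801 mm
```

Running 0.100 mm through `snapped_interval` would suggest 8 steps (about 0.1014 mm) as the nearest aligned interval; I went with 3 steps for a finer scan.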
While it was certainly losing steps, my guess is that with a scan interval of 0.100 mm, it was (about 20% of the time) asking the head to move a distance that was less than one full step, thus losing a step.
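To illustrate the guess, here is a toy model (not the actual controller firmware, and it doesn't reproduce my 20% estimate exactly): if each per-line advance were simply truncated to whole steps, a misaligned interval would come up short over many lines, while an aligned one would not.

```python
STEP_MM = 0.01267  # mm per motor step on my machine

def truncated_travel(interval_mm, n_lines, step_mm=STEP_MM):
    """Total Y travel if every per-line advance were rounded down to
    whole steps -- a guess at the failure mode, not verified."""
    steps_per_line = int(interval_mm / step_mm + 1e-9)  # epsilon guards float error
    return n_lines * steps_per_line * step_mm

# 1000 lines at the default 0.100 mm should cover 100 mm:
print(truncated_travel(0.100, 1000))    # ~88.7 mm -> noticeably short
# An aligned interval (3 steps = 0.03801 mm) covers what was asked:
print(truncated_travel(0.03801, 1000))  # 38.01 mm
```

However the firmware actually rounds, the point is the same: only an interval that is a whole number of steps is guaranteed to accumulate no error.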
None of this explains why it only happens at the slower speeds, but it does solve the problem. It may be that the software calculates scans differently at different speeds; I don't know.
When scanning bitmaps, the scan interval setting is disabled. However, setting the bitmap resolution (dpi) to a multiple of the mm-to-inch conversion factor (25.4 mm/inch), like 127, 254, or 508 dpi, works.
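The dpi rule is just unit arithmetic; a quick sketch of why those numbers are convenient:

```python
MM_PER_INCH = 25.4

def pixel_pitch_mm(dpi):
    """Distance between scan lines implied by a bitmap's resolution."""
    return MM_PER_INCH / dpi

# dpi values that are multiples of 25.4 give round pitches in mm:
for dpi in (127, 254, 508):
    print(dpi, "dpi ->", pixel_pitch_mm(dpi), "mm per line")
# 127 dpi -> 0.2 mm, 254 dpi -> 0.1 mm, 508 dpi -> 0.05 mm
```

Those pitches come out as round numbers of millimeters, which seems to be what keeps the bitmap scan happy on my machine.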
Like most problems, the answer appears to be obvious and easy, once you know what it is.
tim