William, it depends on what parameters the software designers let you change, and by how much.
Again, your assumptions are correct, but the ability to put them into practice may be limited (necessarily?) by the designers of the software encoder.
You picked Ligos as an example. From what I've read of the Ligos stuff, you get to select the search range for Motion Estimation as a number from 0 to 20. But I have no idea how much search distance each step actually represents.
From the Ligos docs:
Quality (Motion Estimation): Along with video data rate, the motion estimation setting in our applications has the greatest effect on the quality of the video output. This setting may appear as a number, or a quality rating such as "Slow" or "Highest Quality". The quality setting equates to how much of your processor resources are given to the Motion Estimation algorithm during encoding. Slow modes of motion estimation should generally provide better quality. The results can be simply defined as follows:
- "0" Low quality, "Very Fast" encoding, lowest processor utilization
- "5" Normal quality, "Fast" encoding, low processor utilization
- "10" High quality, "Normal" speed, default processor utilization
- "15" Very high quality, "Slow" speed, more processor utilization
- "20" Maximum quality, "Very Slow" speed, maximum processor utilization
ligos.com
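
To make the processor-utilization scale concrete: nobody outside Ligos documents what each step actually means, but if each step of that 0-20 knob widened the search window by some fixed amount, the work per macroblock would grow roughly with the square of the radius for an exhaustive search. A hypothetical sketch (the 2-pixels-per-step figure is invented purely for illustration, it is not from Ligos):

```python
# Hypothetical mapping of a 0-20 quality knob to a motion search
# radius -- the 2-pixels-per-step figure is invented for illustration.
PIXELS_PER_STEP = 2

def search_positions(quality: int) -> int:
    """Candidate positions an exhaustive (full) search would visit."""
    radius = quality * PIXELS_PER_STEP   # search +/- radius pixels
    side = 2 * radius + 1                # edge of the search window
    return side * side                   # positions tested per block

for q in (0, 5, 10, 15, 20):
    print(f"quality {q:2d}: ~{search_positions(q):6,d} positions per block")
```

Under that (made-up) mapping, quality 0 tests 1 position per block and quality 20 tests over 6,500, which would explain why the high end of the scale eats "maximum processor utilization".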
If you could tweak all the important parameters over their full range and let the software run for a couple of weeks on an hour or two of video, you could probably do better than hardware that is optimized for the most common scenarios.
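
That kind of brute-force sweep is easy to script. A minimal sketch, not the Ligos API: it drives ffmpeg's mpeg4 encoder (-me_range and the psnr filter are real ffmpeg options) and scores each encode against the source; the file names and bitrate are assumptions. Substitute whatever command line your encoder exposes.

```python
#!/usr/bin/env python3
"""Sweep an encoder's motion-estimation range and score each result."""
import re
import subprocess

SOURCE = "master.avi"   # hypothetical input clip
BITRATE = "2000k"       # fixed so only motion estimation varies

def encode(me_range: int, out: str) -> None:
    # Encode the source at a fixed bitrate with the given search range.
    subprocess.run(
        ["ffmpeg", "-y", "-i", SOURCE,
         "-c:v", "mpeg4", "-b:v", BITRATE,
         "-me_range", str(me_range), out],
        check=True, capture_output=True)

def psnr(out: str) -> float:
    # Compare the encode to the source; ffmpeg's psnr filter prints
    # a line like "... average:34.56 ..." on stderr.
    result = subprocess.run(
        ["ffmpeg", "-i", out, "-i", SOURCE,
         "-lavfi", "psnr", "-f", "null", "-"],
        capture_output=True, text=True)
    match = re.search(r"average:([\d.]+)", result.stderr)
    return float(match.group(1)) if match else float("nan")

for rng in (4, 8, 16, 32, 64):   # search ranges to try
    out = f"test_me{rng}.avi"
    encode(rng, out)
    print(f"me_range={rng:3d}  PSNR={psnr(out):.2f} dB")
```

Left running overnight (or for weeks, with more parameters in the loop), this tells you objectively whether the wider search was worth the extra encode time.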
But then, who wants a VCR that can't capture in real time? Or a news clip that's a week late? I guess it depends on the application. For DVD it could make sense, barring deadlines. In practice, using a software encoder on some VOD stuff I worked on, waiting days to find out whether the parameters I tweaked made the video look better hampered my ability to meet deadlines...