
In article , Lawrence Glickman wrote:

So to be fair, we do benchmarking using different algorithms, to give
us an idea of SYSTEM performance, not raw MIPS. But again, I am not
going to buy an elephant rifle to kill a mosquito. Maybe scientific
laboratories can profit from these high-end machines, but unless I get
into analog to digital interpolation, it is highly doubtful I am going
to benefit from having a cannon on board where a pellet pistol would
do the job just as well.

And even then, the throughput of the A/D converter is going to
determine to a large extent what limitations my *system* would have.

I try to stay digital all the way. Keep analog out of it, and you've
solved a lot of problems before they even got off the launch pad.


You're missing the point. The ADC is _not_ what makes rendering take
many, many hours. In fact, the output of a 13.7 GB digital video (DV)
is fed directly into a computer via FireWire ports. Digital at all
times once it's in the camcorder, so digital at all times in the
computer.

What takes time is the resampling/resizing/reformatting/rewhatevering
to "fit" 13.7 GB of digital data, for example, into 4.7 GB of space on
the blank DVD. This is akin to when a printer driver says "The
image will not fit on this page...would you like to crop, scale, or
abort?" Except with 13.7 GB that's a lot of data, a lot of swapping to
disk, a lot of work.
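
To put rough numbers on that squeeze, here is a back-of-the-envelope
Python sketch. The figures (one hour of footage, 13.7 GB of DV, a
4.7 GB disc) are assumptions for illustration, not anything a
particular encoder reports:

# Why 13.7 GB of DV has to be re-encoded to fit a 4.7 GB DVD.
DV_SIZE_BYTES = 13.7e9      # assumed: roughly one hour of DV tape
DVD_CAPACITY_BYTES = 4.7e9  # single-layer DVD
DURATION_SECONDS = 3600     # assumed: one hour of footage

source_bps = DV_SIZE_BYTES * 8 / DURATION_SECONDS
target_bps = DVD_CAPACITY_BYTES * 8 / DURATION_SECONDS

print("source ~%.1f Mbit/s" % (source_bps / 1e6))    # ~30.4 Mbit/s
print("target ~%.1f Mbit/s" % (target_bps / 1e6))    # ~10.4 Mbit/s
print("ratio  ~%.1f:1" % (source_bps / target_bps))  # ~2.9:1

The encoder has to hit that target rate by recompressing every single
frame, which is why the job is CPU-bound for hours, not ADC-bound.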

And I mentioned rendering only as one frequently-encountered "overnight
job." There are a lot of things which actually take measurable time.
The point being that your "I can do a factorial in the blink of an eye
so I obviously don't need a faster computer!" example is silly.
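
To make the factorial point concrete, a toy Python sketch (the 200 MB
buffer is an arbitrary stand-in, not anyone's actual workload): timing
a trivial CPU-bound call tells you nothing about a job that has to
chew through a large working set.

import time

def factorial(n):
    result = 1
    for i in range(2, n + 1):
        result *= i
    return result

# The "blink of an eye" case: tiny working set, finishes instantly.
t0 = time.perf_counter()
factorial(1000)
print("factorial(1000):      %.3f s" % (time.perf_counter() - t0))

# A crude stand-in for rendering: stride through a 200 MB buffer,
# touching data the way an encoder touches every frame.
buf = bytearray(200 * 1024 * 1024)
t0 = time.perf_counter()
for i in range(0, len(buf), 64):
    buf[i] = (buf[i] + 1) & 0xFF
print("one pass over 200 MB: %.3f s" % (time.perf_counter() - t0))

The first number prints in well under a second; the second takes
visibly longer, and a real render does the equivalent against 13.7 GB
with compression on top.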

And if you never need a faster computer, why did you even buy a 2.5 GHz
machine in the _first_ place?


--Tim May