Posted to misc.consumers,alt.home.repair,rec.food.cooking
From: Paul M. Cook
Subject: OT Wrong advertised specifications


"Doug Miller" wrote in message
...
In article LyLDi.1770$s06.155@trnddc04, "Paul M. Cook" wrote:

"Doug Miller" wrote in message
et...
In article INEDi.12859$sf1.7349@trnddc01, "Paul M. Cook" wrote:

"Doug Miller" wrote in message
gy.net...
In article o2EDi.7076$3R5.943@trnddc05, "Paul M. Cook" wrote:

"Doug Miller" wrote in message
t...
In article , "HeyBub"
wrote:
Doug Miller wrote:
In article ORxDi.12718$sf1.3859@trnddc01, "Paul M. Cook" wrote:

Nope. It is not. Shortness means nothing, speed means everything.

You think the length of the signal path has nothing to do with speed?

Waves hand!

I do! I do! Pick me!

You are aware, aren't you, that the speed of signal propagation is finite?

First, you have something called a clock in the computer. All computers
have a clock; they cannot run without one. Second, the signals can only be
passed during a clock cycle. The speed of light is far faster than any
clock we can employ

You think so, do you?

1GHz clock rate = 1 nanosecond cycle length. How far do you suppose light
moves in a nanosecond?

[Lack of response noted]


11.8 inches. I happen to have one of Grace Hopper's nanoseconds. It is a
length of wire 11.8 inches long. I got it from her when I attended a speech
she gave at the DODARPA office I worked at in 1985.
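
(As a quick sanity check on that 11.8-inch figure, here is an illustrative
back-of-the-envelope sketch; the snippet and constants are added for this
write-up and are not part of the original thread.)

    # Illustrative sketch: how far light travels in one cycle of a 1 GHz clock.
    C = 299_792_458            # speed of light in a vacuum, metres per second
    CYCLE_S = 1e-9             # one clock cycle at 1 GHz, in seconds

    distance_m = C * CYCLE_S             # ~0.2998 metres
    distance_in = distance_m / 0.0254    # ~11.8 inches
    print(f"{distance_in:.1f} inches per nanosecond")   # prints 11.8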



therefore we are not dealing with theoretical limits, we are dealing with
practical limits, i.e. the duration of each clock cycle. So in the case of
a 2 inch wire trace, it would not matter if the trace were 1 inch because
you can't get the data into the CPU any faster than it already is.
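
(A rough sketch of the comparison Paul is describing - trace propagation
delay versus clock period - added here for illustration, not part of the
original post. It assumes vacuum light speed; signals on real copper traces
move slower.)

    # Illustrative sketch: compare trace propagation delay with a clock period.
    C_IN_PER_NS = 11.8             # inches travelled per nanosecond

    def prop_delay_ns(length_in):
        """Propagation delay of a trace of the given length, in nanoseconds."""
        return length_in / C_IN_PER_NS

    clock_ghz = 1.0
    period_ns = 1.0 / clock_ghz    # 1 ns per cycle at 1 GHz

    for length in (1.0, 2.0):
        print(f"{length:.0f} in trace: {prop_delay_ns(length):.3f} ns delay "
              f"vs {period_ns:.3f} ns clock period")
    # Both delays (~0.085 ns and ~0.169 ns) fit well inside one 1 GHz cycle.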

I won't argue that the difference between one inch and two doesn't matter
at all -- YET -- but I'll leave it as an exercise for you to compute the
approximate clock speed at which the difference between two inches and
three *does*, and then invite you to explore the availability of existing
processors in that range.
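
(One rough way to work that exercise - a sketch added here, not part of the
original post; it assumes vacuum light speed, and signals on real copper
traces travel slower, which would lower the threshold.)

    # Illustrative sketch: clock frequency at which one extra inch of signal
    # path consumes a whole clock cycle.
    C_IN_PER_NS = 11.8

    extra_inches = 1.0                        # three-inch path minus two-inch path
    delay_ns = extra_inches / C_IN_PER_NS     # ~0.085 ns
    freq_ghz = 1.0 / delay_ns                 # ~11.8 GHz
    print(f"about {freq_ghz:.1f} GHz")        # one inch costs a full cycle here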

[Lack of substantive response noted]


What you should note is that you do not understand what I am saying because
you do not know what you are talking about. Did I mention I studied
computer science in college? We learned all kinds of stuff.


Oh, the old "argument from authority" fallacy. Too bad that formal logic
wasn't part of *your* computer science curriculum; it was in *mine*.


Doug, you lost the argument. You claimed that the shorter bus length made
for a faster data transfer.

No, I didn't. I disagreed -- and still do -- with your claim that
"shortness means nothing".


You lost the argument. Your claim is patently incorrect. It is wrong. It
suffers from a dearth of correctness. It is truth challenged. It is
factually insufficient. It's BS. You made a statement that was just plain
wrong.


So say you. You've provided nothing to back that up, though.

If we were talking photon switches (a theoretical possibility) then you'd
be right. Someday, someday - you will be right. For today, you are wrong.
The bottleneck in any computer is the CPU's ability to stay cool while you
ramp up the clock speed. Silicon melts into a puddle of molten glass at the
temperature generated by just the speeds we are talking about today. Try
running your computer without a heat sink and cooling fan and you'll see
what I mean.

We are nowhere near, not even close, to being able to run CPUs so fast they
can run at the speed of light *per* channel. Think of 186,000 mps raised to
the 32nd power then raise it by factors of 5286.

Again:

How far do you suppose light moves in a nanosecond?


11.8 inches

At what clock speed, approximately, does the difference between a two-inch
and three-inch signal path make a difference?


186,000 * 2^(-32). That should get close enough.


Lack of accurate response noted.

What is the clock speed of the fastest processor on the market today?


Which manufacturer? AMD and Intel are not the only manufacturers, you know?


Lack of response noted.

Give it up, Paul. You've lost the argument.



Yeah, ok whatever.
Paul