Posted to alt.engineering.electrical,alt.home.repair
Subject: Constitutionality of light bulb ban questioned - Environmental Protection Agency must be called for a broken bulb

Andrew Gabriel wrote:
> In article ,
> VWWall writes:
>> Beyond a certain CPU speed, other factors have a greater influence on
>> throughput. Connection lengths become important, as do parasitic circuit
>> elements. AMD first exploited this in emphasizing CPU architecture
>> rather than brute speed.


> Possibly in the x86 arena, but this idea originated elsewhere;
> Sun UltraSPARC IV predated it, as did the earlier work of Afara Websystems,
> which eventually led to Sun's original 8-core Niagara SPARC chip.


I was referring to CPUs in common use in PCs. There have been a lot of
special-purpose CPUs, none of which has a large following. I recall one
that didn't even have an increment instruction: instead of x++ you used
x=x+1! They claimed the addition instruction was faster.
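
(Not that the distinction buys anything now; a minimal C sketch, and any
modern optimizing compiler emits the same machine code for both forms:)

    #include <stdio.h>

    int main(void)
    {
        int x = 0;
        x++;        /* "increment" form */
        x = x + 1;  /* explicit-add form; compiles to the same code */
        printf("x = %d\n", x);  /* prints x = 2 */
        return 0;
    }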

>> Multiple CPUs and cache memory on chip are
>> good examples of this. A CPU cannot operate faster than the rate at
>> which data is supplied to it.
>>
>> Present 32-bit operating systems are not even capable of directly
>> addressing over 4GB of memory, even as memory is becoming faster and


> 32-bit OS's have been accessing over 4GB of memory for well over a decade.
> Even PCs, which were probably the last hardware platform to do so,
> introduced Intel's PAE with the Pentium Pro (1995?).


By "directly" I meant having a register capable of holding the address
of 4GB memory. There have been "segmented" memory programs since Bill
Gates said 940KB was plenty for anyone.
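
(The arithmetic behind that, as a minimal C sketch, nothing
system-specific assumed:)

    #include <stdio.h>
    #include <stdint.h>

    int main(void)
    {
        /* A 32-bit register holds 2^32 distinct values, so a flat
           32-bit address can name at most 2^32 bytes = 4GB; anything
           beyond that needs segments, PAE, or a wider register. */
        uint64_t flat_limit = (uint64_t)1 << 32;
        printf("flat 32-bit limit: %llu bytes (%llu GB)\n",
               (unsigned long long)flat_limit,
               (unsigned long long)(flat_limit >> 30));
        printf("pointer width on this build: %u bits\n",
               (unsigned)(sizeof(void *) * 8));
        return 0;
    }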

>> cheaper. There are very few applications that can use the advantage of
>> a 64-bit OS, even when that advantage is only the ability to use more memory.


> Databases and other applications accessing over 4GB of data are
> not exactly rare.


Photoshop, in the last few releases, can use memory instead of writing
to disk when running on a 64-bit OS.

>> In Windows XP x64, MS resorted to WOW64 (Windows on Windows 64) to allow
>> 32-bit applications to work properly. (It's still one of the better OSs
>> Microsoft has produced.)


> OK, 64-bit Windows might be of limited use, but don't tarnish all
> OS's with such a claim. The x86/PC architecture allows 32-bit and
> 64-bit applications to run together on the same OS (OS permitting).


This is written using 32-bit Linux (PCLOS) on a 64-bit AMD CPU. The
same box multi-boots WinXP x64 as well as a couple of 64-bit Linux
distros, openSUSE 11.0 being the latest. Actually, WinXP x64 runs most
of my applications better than the 32-bit version. The few 64-bit drivers
available are an improvement.
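
(A 32-bit program can ask whether it's running through WOW64; a minimal
sketch using kernel32's documented IsWow64Process call, assuming a
toolchain new enough to know about it:)

    #include <windows.h>
    #include <stdio.h>

    int main(void)
    {
        BOOL wow64 = FALSE;
        /* Reports whether this is a 32-bit process running under the
           WOW64 layer of a 64-bit Windows. */
        if (IsWow64Process(GetCurrentProcess(), &wow64))
            printf(wow64 ? "32-bit process under WOW64\n"
                         : "native process\n");
        else
            printf("IsWow64Process failed: %lu\n", GetLastError());
        return 0;
    }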

I've been running Linux with kernels capable of using 4GB of memory for
some time. The lack of 64-bit drivers, in addition to applications,
limits its usefulness. I had great hopes for 64-bit Vista, but it looks
like it's not doing much to encourage 64-bit development.

Many of the less expensive motherboards cannot handle 8GB of memory. I
have seen a couple that had the slots but slowed memory access when
fully populated.

>> With the present crop of PCs, the eventual bottleneck may become the
>> BIOS. It's been twiddled, patched, and augmented, but it is still much
>> like the one produced by IBM for the first "personal computer".


> I don't think any PC OS's have used the BIOS once booted for at least
> a decade, and in some cases nearer two decades.


They've added LBA48, INT 13h extensions, and ACPI. My old DOS debug
still runs on it!

>> Getting our computers to do more, faster, will depend more on better
>> input-output mechanisms and better applications than on faster CPUs.


> and better OS's (in multiple respects).

Someone once said the reason God could create the universe in six days
was that it didn't have to be backward compatible! :-)

--
Virg Wall