Metalworking (rec.crafts.metalworking): discussion of the various aspects of working with metal, such as machining, welding, metal joining, screw fastening, casting, hardening/tempering, blacksmithing/forging, spinning and hammer work, and sheet metal work.

#81 - Posted to rec.crafts.metalworking,misc.survivalism
Gunner
Subject: Linux is Driving me $#@!!!! nutz!!!

On Wed, 04 Jan 2006 13:57:54 GMT, SomeBody wrote:

What I would do at this point: once you connect to your ISP, open a
command prompt (terminal window) and "ping yahoo.com". If it times out, try
"ping 66.94.234.13" (yahoo); if that works, then you have a problem with
DNS lookup. Look at /etc/resolv.conf; it should have your ISP's DNS servers
listed,

i.e.

nameserver xxx.xxx.xxx.xxx
nameserver xxx.xxx.xxx.xxx

substitute your ISP DNS servers for xxx.xxx.xxx.xxx

It should have updated /etc/resolv.conf once connected. You could also
look in /etc/ppp for the ppp config files.

Good Luck..
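Condensed, the checks above look something like this (the yahoo.com
address is just the example given and may well be stale):

ping yahoo.com            # name lookup plus connectivity
ping 66.94.234.13         # raw IP, bypasses DNS
cat /etc/resolv.conf      # should list the ISP's nameservers
/sbin/route -n            # look for a "default" line pointing at ppp0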


What appears to be showing are locally configured DNS entries, rather
than the ISP's name servers.

Network live, eth0 active

Contents of resolv.conf

nameserver 206.13.28.12
nameserver 205.152.0.20
==========

Doing a route call from super user mode

root@1[~]# route
Kernel IP routing table
Destination     Gateway         Genmask         Flags Metric Ref    Use Iface
192.168.0.0     *               255.255.255.0   U     0      0        0 eth0
==========
Network stopped, eth0 stopped

Connected via modem (kppp) to earthlink
Contents of resolv.conf


domain earthlink.net #kppp temp entry
# nameserver 206.13.28.12 #entry disabled by kppp
# nameserver 205.152.0.20 #entry disabled by kppp
nameserver 207.217.126.81 #kppp temp entry
nameserver 207.217.77.82 #kppp temp entry
nameserver 207.217.95.8 #kppp temp entry
===============
Doing a route call from terminal

gunner@1[~]$ route
Kernel IP routing table
Destination     Gateway         Genmask         Flags Metric Ref    Use Iface
acn09.ca-pasade *               255.255.255.255 UH    0      0        0 ppp0
default         acn09.ca-pasade 0.0.0.0         UG    0      0        0 ppp0
gunner@1[~]$





I'm running dialup, and the box is configured to connect to the local
network here at home via a NIC card on eth0. For some reason it wants
to send internet requests to the LAN, rather than to the ISP (via
dialup), unless I turn off eth0. I run static IPs assigned to each
machine on the LAN in the 192.168.0.1-192.168.0.15 range, with a mask
of 255.255.255.0; there is no DHCP server assigning them.
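For reference, that symptom matches a missing default route whenever
eth0 is up. A rough check/fix, assuming pppd with kppp on a
Debian-based distro (exact option names may differ):

/sbin/route -n               # while dialed in with eth0 up; look for a "default" entry
route add default dev ppp0   # quick manual test if the default route is missing
# Longer term: make sure "defaultroute" is set in /etc/ppp/options, or that
# the kppp account's Gateway tab has its "assign the default route" box
# checked, and that the eth0 static config does not declare a gateway.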

I'm rather like a moderately retarded kitten flailing around with a
ball of yarn... and the yarn is winning.

OK, here is a big question: in DOS, we have the config.sys and
autoexec.bat files, where housekeeping settings are made, and then
various programs are run on boot. What is the Linux equivalent? What
is the boot sequence, and what is looked for as it boots?
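The short answer, roughly, on a Debian-based distro such as Mepis or
Knoppix (other distros lay this out a bit differently):

/etc/inittab                 # picks the default runlevel at boot
/etc/init.d/                 # one script per service -- the "programs run on boot"
/etc/rc2.d/                  # S* symlinks start services at runlevel 2, K* stop them
/etc/rcS.d/                  # run once at boot, before the runlevel scripts
/etc/network/interfaces      # where eth0's static IP/netmask live on Debian-based systems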

Example: on bootup it makes a call and tries to look up an outside time
server (ntpdate). Where is this configured and how do I edit it out?
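On a Debian-based install that ntpdate call is normally just a boot
script, so disabling it goes something like this (paths and tools vary
by distro):

ls /etc/rcS.d/ | grep -i ntp         # confirm it runs at boot (often S51ntpdate)
update-rc.d -f ntpdate remove        # the Debian way to drop it from the boot sequence
chmod -x /etc/init.d/ntpdate         # or, cruder: just make the script non-executable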

I really, really need to get some books on this, dammit...

I'm tempted to reinstall without the NIC card, let it autoconfigure
the dialup, then add the NIC card afterward.






On Mon, 02 Jan 2006 10:02:57 +0000, Gunner wrote:

Ok..for all you Linux junkies...this has been driving me nuts for
months and months and ... well you get the idea.
Originally..I thought this issue was hardware....but now...

Installing a number of distros of Linux:

Simply Mepis
Knoppix
Knottix
Fedo
Damned small Linux
Beatrix

When configuring PPP for dialup it's simple: set up your account,
modem, comm port, do a query, let it check... ok, no problem.

However, in each and every one of those distros, using 3 different
kinds of USR external modems, a Supra 56 external, a Speed modem and
even 2 different kinds of internal ISA and PCI modems...

I can dial out. The ISP connects, I get the proper password etc.
etc., it says I've connected at x speed, all the proper lights are lit
on the modem(s), I open my browsers (4)... and it just ****ing sits
there.

Eventually it times out and says "Unable to connect to blah blah.com" or
whatever I was trying to open... but that's all.

I open the details window of the PPP program, and the Received
box increments higher every so often, but the transmit window
normally shows it stalled at 148 packets. And there she stays.

I was thinking this was something unique to my box... but today I
farted around with two completely different boxes: a Compaq 700, and a
CopperMine clone.

All do the same thing. I've tried every browser configuration known
to me... etc. etc.

Everything works just hunky dory if I set up a proxy on another
Winblows machine and set the Linux browsers to the proper proxy
settings... then I can go wherever I want. Upload, download,
newsgroups, blah blah blah.

I've run off of CDs, done full HD installs... the freaking works... for
about 7 months now, off and on. I'd get ****ed and use it for a
server... then curiosity gets me by the shorthairs... and I try again.

Sometimes I do notice the activity light on the switch blinking a bit
more often than normal when I'm trying to connect. Like it's trying to
surf the local network rather than the internet... but not all the
time.

What the hell am I doing wrong???????

Mommy!!!!!!!!!!!!!!!!!!!

Gunner

"Pax Americana is a philosophy. Hardly an empire.
Making sure other people play nice and dont kill each other (and us)
off in job lots is hardly empire building, particularly when you give
them self determination under "play nice" rules.

Think of it as having your older brother knock the **** out of you
for torturing the cat." Gunner


"Pax Americana is a philosophy. Hardly an empire.
Making sure other people play nice and dont kill each other (and us)
off in job lots is hardly empire building, particularly when you give
them self determination under "play nice" rules.

Think of it as having your older brother knock the **** out of you
for torturing the cat." Gunner
#82 - Posted to rec.crafts.metalworking,misc.survivalism
Tim Wescott
Subject: Linux is Driving me $#@!!!! nutz!!!

Gunner wrote:

[snip -- Gunner's post above quoted in full]

Mid-posting. How charming.

"Linux In a Nutshell" by O'Reilly is a good one. I haven't tried it on
any networking problems yet -- at the moment I'm a Linux wannabe who
uses the Cygwin command line tools for software development in Windows
-- but so far it's answered almost all of my questions.

--

Tim Wescott
Wescott Design Services
http://www.wescottdesign.com
#83 - Posted to rec.crafts.metalworking,misc.survivalism
Gunner
Subject: Linux is Driving me $#@!!!! nutz!!!

On Wed, 04 Jan 2006 09:07:31 -0800, Tim Wescott wrote:

Mid-posting. How charming.

"Linux In a Nutshell" by O'Reilly is a good one. I haven't tried it on
any networking problems yet -- at the moment I'm a Linux wannabe who
uses the Cygwin command line tools for software development in Windows
-- but so far it's answered almost all of my questions.


I just downloaded it from one of the ebook-flood binary groups. I'll
dig into it, time permitting.

Gunner

"Pax Americana is a philosophy. Hardly an empire.
Making sure other people play nice and dont kill each other (and us)
off in job lots is hardly empire building, particularly when you give
them self determination under "play nice" rules.

Think of it as having your older brother knock the **** out of you
for torturing the cat." Gunner
#84 - Posted to rec.crafts.metalworking
J. Clarke
Subject: Linux is Driving me $#@!!!! nutz!!!

Pete C. wrote:

"J. Clarke" wrote:

I was enjoying this until you started ranting about "stolen alpha
technology". At that point you started coming across as a loon.


I guess you never followed the legal battle and eventual out of court
settlement between Intel and DEC (or was it Compaq by then, I forget).


I don't recall any "legal battle". Researching it I find that DECPaqard
claimed that Intel infringed on several of their patents and then rather
than collecting any kind of damages just sold the whole shebang to Intel.

Pete C.


--
--John
to email, dial "usenet" and validate
(was jclarke at eye bee em dot net)
#85 - Posted to rec.crafts.metalworking,misc.survivalism
Pete C.
Subject: Linux is Driving me $#@!!!! nutz!!!

Joseph Gwinn wrote:

In article ,
"Pete C." wrote:

Joseph Gwinn wrote:


snipped


On that, you are fundamentally wrong. They are indeed both computers,
and regardless of the UI on top, any even remotely useable OS operates
on the same fundamentals. If you understand one OS, you understand
essentially any other, it is only the UI that really differs.

It's true that down deep they must all do the same thing, but this level
is quite remote from ordinary users, who tear their hair trying to
figure out under which GUI rock the needed control is hidden. Or even
which control or controls cause the current annoying misbehaviour. And
Microsoft has a different theory of rocks than Apple, so if you have
spent too little time with such a GUI, it will all be so very
frustrating.


It's more a function of Apple having an incorrect theory that rocks
don't matter. When you know what is causing the problem, but the Mac UI
won't let you fix it 'cause Apple doesn't think anyone needs to adjust
that "rock" then there is a fundamental problem with the UI.


This isn't a problem I've encountered. Could you give some specific
examples?

Apple's usual strategy is to make the technical controls invisible to
the ordinary user, but one can do damn near anything from a
terminal window. Or, if one enables it, the root console.


I can't give you the exact details since it was a good year ago, but
much of the annoyances were related to network configuration and WiFi
configuration. The terminal window was about the only place I was able
to accomplish anything as the UI dialogs were either completely
mislabeled, refused to accept perfectly valid entries, or were missing
the setting options entirely.


On every attempt to actually accomplish anything on a Mac (usually
trying to help a Mac user who couldn't figure it out either) I have
consistently found that the language, structure and in many cases simply
the existence of proper configuration options was a significant issue on
Macs. These were not simple UI differences.

Um. I have no idea why you have such problems, but theorize that it's
simple lack of sufficient experience with the Mac GUI.


Experience with the Mac UI doesn't explain their mangling of the English
language or their complete mislabeling of some configuration items (I
seem to recall them calling encryption keys "passwords"), or having
certain settings that are part of a standard completely missing.


In other words, Apple jargon differs from Microsoft jargon? This is
sort of like complaining that Americans don't use quite the same words
as Brits.


Encryption keys and passwords are *not* the same thing, it's not just a
function of "jargon".


I do however have several friends that use Macs to varying degrees and
all have had plenty of problems. One friend is a teacher who uses both
Macs and PCs extensively and reports that the Macs crash at least as
often as the PCs.

With students messing with them? In education, that has been the prime
problem. Schools have always liked Macs because the teachers could keep
them running without needing an IT guy.

This teacher *is* an IT guy and reports no difference in the frequency
of crashes between the PCs and Macs.

My point was that most teachers are *not* IT guys, and are happy that
way.


That's nice, however it has nothing to do with the fact that an IT
literate teacher with significant experience with both Macs and PCs
reported no difference in the rate of crashes between the two.


If that teacher isn't the only one mucking with the computers, one would
see just this pattern. In fact, students mucking things up is *the*
problem that even university IT shops live with, never mind grade school.


Which once again proves my point that Macs are not more stable in
general use than Windows PCs.


Another friend uses Macs almost exclusively, and over 5 years and about
three Macs she had roughly 20 times the problems I had with Windoze
during that time. I did not see any decrease in the frequency of
problems with the switch to OSX either.

My experience was and is the exact opposite.

Which only goes to show that the stability of either system is most
dependent on the operator, not the OS.

I don't know what she was doing, but clearly you are far more the IT
guru than she. An aggressive or merely clumsy user with admin privilege
can make themselves lots of trouble.


Indeed that is what I've concluded. The myth that Macs are more stable
than PCs is simply that, a myth. I've also noticed that many Mac users
seem to under report the number of system issues, somehow not counting
the need to reinstall an application to get it to work properly as a
system problem.


So we return to the mainstream problem, that Windows systems are far
more fragile in practice than Macs, once one removes the effects of
clumsy meddling. PCs isolated from the world actually work tolerably
well these days, once correctly set up, but how many people want to be
isolated, or to spend their time managing multiple anti-virus and
anti-spyware programs?


Again you are pushing a myth that Windoze users need to spend all their
time managing multiple security programs; this simply isn't true. The
only time Windoze users need to do that is if they want to rely on
multiple freeware security packages vs. spending $40 on a single commercial
product. The fact that Windoze users have more options to choose from is
an asset, not a liability.



MacOS was total crap up until Apple finally realized they lacked the
expertise to write an OS and put their UI over someone else's Unix
core. Now instead of being a crappy UI on top of a crappy OS, it's a
crappy UI on top of a so-so OS.

Don't mistake me for a Windoze bigot either, ...

Could have fooled me. Listen to yourself, listen to the music.

How do you figure that? Anyone with any technical knowledge knows that
the pre OSX versions of MacOS were hopelessly deficient in many areas,
particularly the lack of memory management. OSX fixed many of the core
problems, but the UI that I can't stand (I hated the UI on the first
Lisa as well) remains. If I wanted an alternative to Windoze it
certainly would not be Mac as there is simply no advantage whatsoever
to MacOS over Linux or another Unix variant.

I submit that your answer above proves my point in spades. Listen to
the tone of voice, and parse the implicit assumptions.

Huh? Hardly. I find the Windoze UI vastly more tolerable than the Mac
UI, largely because I can customize the Windoze UI sufficiently to
eliminate the most annoying parts. This does not in any way indicate
that I am a Windoze fan or bigot, simply that I hate the Mac UI. My OS
preference is VMS, however there is a bit of a shortage of affordable
applications for the things I do.

You really don't hear it, do you? OK, I'll parse it a little:

We'll set the stage with such dispassionate, value neutral statements
like "MacOS was total crap until Apple finally realized they lacked the
expertise to write an OS" - It may be crap, but 25 million users rather
like it, and were known to say similarly unemotional things about DOS
and Windows.


How is an OS that had -no- memory management until the entire OS core
was scrapped and replaced with a Unix core not crap? Windows was
evolving memory management (which I consider to be a fundamental concept
for an OS) in the Win 3.1 days and had it working reasonably well long
before Apple gave up on their OS core.


Because neither original Windows (pre NT) nor original MacOS (pre 10)
had memory management.


Indeed, but Microsoft addressed the problem long (years) before Apple
did.


Microsoft solved the problem by stealing VMS technology from DEC,
yielding the original NT core. Books were written about this death march.


Indeed, it's amazing how much of the computing world originated from
DEC. I didn't follow this history in detail, but I thought that Microsoft
hired former DEC folks vs. the more outright stealing of Intel / Alpha
that resulted in a lawsuit.


Apple solved the problem by buying NeXT (getting Jobs back as part of
the package deal). NeXT was built from scratch by Jobs after he was
tossed out of Apple some ten years earlier. The OS core of NeXT was BSD
UNIX, from which Jobs and company built a thorough object-oriented
software platform. When NeXT came back, the solution was obvious -
replace the NeXT GUI with a Mac-like GUI, retaining the UNIX core from
NeXT.


That NeXT was an odd little thing that never seemed to go anywhere.


And will end with "Anyone with any technical knowledge knows that...".
In other words, anyone who disagrees by definition cannot have any
technical knowledge. Aside from the implicit ad hominem attack, this
assumes the truth of the very thing to be demonstrated, and thus is
circular.


Huh? It doesn't take a lot of technical knowledge to realize the
significant failings of pre OSX MacOS. While people complained about
Windoze bloat and the need to throw more CPU and memory at Windows, the
Mac world somehow accepted the need to throw more memory at a Mac, not
because of bloat, but because there was no memory management.


I give up. The above response is not relevant to the issue raised. You
don't seem to be able to separate technical issues from emotional issues.


What emotional issue? The Mac UI does not fit my thought process, so I
have never liked it. The pre OSX Mac OS had massive technical failings
as well. The Mac hardware was always overpriced relative to the rest of
the desktop computing market. The Mac hardware was always a closed
architecture and lacking many options that I required. The list goes on
and on; there is no shortage of technical reasons why a Mac would be
useless to me.


Ah. Now we come to the core. Keep your machine away from the internet,
and all is well. Well, Macs don't need to be protected against the web.

Huh? Where did you come up with that idea? Every one of my machines has
Internet access.

Downloading stuff (including napster) is very much a part of the net.
It shouldn't be possible for this to cause such problems, even if some
users are naive and some people out there are evil, because such people
have always existed, and always will.


That is perhaps one of the most absurd statements I have ever heard.
Nowhere else in life do people have such an absurd expectation that they
should be magically protected against their reckless actions.

Go walking through a dark alley in the bad part of town at night and you
will probably be mugged and nobody will say that shouldn't be possible.
Hop in your car and go careening down the road ignoring safety rules and
you're going to get in an accident and nobody will say that shouldn't be
possible.


Bad analogy. A better analogy would be to ask if you expect the Police
to keep bad people from coming to your town and mugging people, or
breaking into their homes.


That doesn't work in the real world either. The Police are only there
after the fact, not to prevent anything.


To say that it shouldn't be possible for a user's careless actions to
cause problems on a computer is utterly absurd. If you want that level
of "big brother" protection, then your Mac would simply prevent you from
downloading programs like Napster that had not been certified by Apple.


No way do I want some company, especially Microsoft but even Apple,
deciding what software I can and cannot run.


*Bingo* and that is the only way one platform could be more "protective"
than the other.


Most users are experts at something other than the computers that they
must use as a tool. It's expecting far too much to think that this will
ever change, if for no other reason that by definition the clueless
vastly outnumber the priesthood. But the money from vast unwashed
clueless masses folds and spends pretty good, and keeps the priesthood
in beer.


True, but has little to do with the technical merits of the Mac OS vs.
Windoze vs. whatever else.


The fact is that whether you are using a Mac or a PC, downloading (and
running) questionable junk such as Napster *will* cause problems and I
have seen plenty of examples of this on both platforms. The Mac provides
no more protection from these careless user actions than Windoze does.


Tell me, would using a VMS host to download stuff from even the worst
sites cause any danger? If not, why can't Windows do that too?


If you were to find a "worst site" that actually had malicious VMS code
and were to download and run it as a privileged user it certainly would.


For whatever reason, Macs are immune to this stuff. You can argue that
it's mere insignificance that protects the Macs, but the fact remains
that Macs are immune.


The Macs are in no way immune, there are simply fewer bugs for them to
catch currently. They are the "bubble boy" who does just fine while
isolated from the germs but will quickly die in the real world due to
their lack of defenses.


Your belief that Macs don't need to be "protected" from
the 'net is also false.

Most Macs do not run with any virus protection whatsoever, and are none
the worse for it.


And that is simply a function of threat volume and statistics, not a
function of platform security. Because PCs running Windoze outnumber
Macs 20:1 the volume of viruses targeting Windoze outnumbers those
targeting Macs by an even larger ratio and the probability of a
particular Windoze user being hit by one of those viruses is
consequently higher.


Whatever. See above.


Immunity and lack of exposure are not the same thing.


Do you not wear a seat belt in a Volvo because they are perceived as
safer? Do you not follow traffic safety rules because your Volvo will
protect you? You might get away with that false sense of security for a
while just due to statistics, but you *will* get nailed eventually.


I wear a seat belt because they demonstrably reduce injuries, regardless
of the make and model of car in question. It's still a car, and an
unbelted person will collide with something in a crash.

But how does this analogy apply to computers? In Macs, no such "seat
belt" is needed.


Completely false. Macs need security products just like Windoze PCs need
them and any other OS needs them. The relative threat level is lower,
but the threat still exists.




That has nothing to do with "isolationism", it has to do with product
quality. Do you purchase brake shoes for your car from some guy in a
dark alley? Would you expect them to be safe? Why would you expect any
different if you get your software from equally questionable sources?

I don't see the analogy. Are you claiming that Macs are bought only
with small unmarked bills from junkies in dark and fetid alleys? This
is quite the scoop - I always wondered about them.


Load garbage software (Napster et al) from suspect sources onto a
computer (Mac or Windows or any other OS for that matter) and you *will*
have problems, just as surely as putting cheap counterfeit parts on your
car *will* cause problems. Even a top grade ultra-secure OS such as VMS
will have problems if a privileged user were to execute malicious code
on it.


Rehash. See above.


Rehash and still true. A privileged user loading malicious code *will*
hose any OS. The only alternative to this is to keep the users
non-privileged and/or prevent the loading of non-certified code.


If you read the PC magazines (yes, PC magazines), you will see that they
consistently rate the reliability and quality of Macs at the top, with
Dell close behind. Apple beats all PC makers on quality of user
support. Consumer Reports backs these findings up.


Consumer Reports has -zero- credibility with me in -any- subject area.
As for reliability and quality, the PC magazines use some questionable
criteria and also exclude many PC lines from their comparisons. The same
goes for support as the comparisons typically exclude the "business"
lines from the PC manufacturers.


So, who do you believe?


Not CR certainly.


Consumer Reports sends out a survey to their subscribers every year. In
this survey, the subscribers report their experience with all manner of
products, and this experience is summarized and reported.

Specifically, on page 35 of the December 2005 issue of Consumer Reports
appears the "Brand Repair History" (based on 85,000 desktop systems and
49,000 laptops) and "Tech Support" (based on 6,500 responses for
desktops and 4,200 responses for laptops).

There is no better publicly available source of by-brand reliability
data. And the scores don't vary that much from year to year, as
reliability and tech support don't just happen; they are achieved if and
only if the company management thinks them important.


I'll have to look in the store to see the report.


As for security problems, there are tens of thousands of viruses et al
for Windows, maybe ten for MacOS (none that still work), and
essentially zero for most flavors of UNIX.

There are many, many security problems that affect most flavors of
Unix.

Yes and no. While it's true that no commonly used OS can long resist
knowing attack by experts, some are far harder than others, and the
first-order question is resistance to automated attack.

Er, please qualify that with "consumer" OS as there are a number of "non
consumer" OSs that do just fine against all attacks. Try my favorite VMS
which can give you C2 qualified security "out of the box".

Um. It's true that most consumer OSs lack Orange-Book (DoD 5200.28-STD)
certification while most server platforms do have such certs, but why is
that important? Nor do I see the relevance of VMS in this discussion.


The relevance is that you indicated that no commonly used OS can resist
attacks by experts which is not true. There are many commonly used OSs
that withstand attacks quite well and VMS is one of them (although it is
unfortunately slowly becoming less common).


There is a big difference between "quite well" and perfection. No CAPP
certified system has failed to yield to knowing attack by experts,
although in some cases they had to work at it. Nor are 5200.28 B-level
machines completely safe. If you read DoD 5200.28-STD, you will see
that it does not promise that penetration will be impossible, it instead
implicitly promises that penetration will take a lot of specific OS
knowledge, time, and general expertise. The levels in 5200.28 basically
specify the level of penetration effort to be thrown at the system under
test.


As we all know, there is no such thing as perfection.


Actually, the old DoD 5200.28 family of standards has been withdrawn by
the DoD, replaced by Common Criteria and DoD 5200.1 and 5200.2. The
formal equivalent to 5200.28 C2-level is CAPP (Controlled Access
Protection Profile) at EAL (Evaluation Assurance Level) 3 or better.

Recent versions of Windows have CAPP EAL3 certs, as do two Linux
distributions. All the major UNIX platforms have EAL3 or EAL4. I
haven't checked MacOS, but I imagine that Apple will get or has gotten
the certs, just so they can sell to DoD. With a BSD base, it won't be
hard.


The fact that Apple had to scrap their entire OS core speaks volumes to
their software expertise. I can't see the DoD buying Macs for what
essentially is nothing more than a UI.


Microsoft scrapped their entire Windows 3.x OS core as well, in favor of
NT.


Yes, but Microsoft rebuilt their core with newly hired expertise.


Apple would seek certification to be able to sell to DoD. If this will
work or not is quite another matter. But don't dismiss it out of hand:

There was a big flurry a few years ago when the US Army (at Ft. Monmouth
if I recall) dumped all their Windows webservers in favor of Mac
servers, mainly because the Army was tired of being hacked despite their
heroic efforts to keep the WinNT servers patched and running.


Oddly enough I run a Windoze web server, mostly 'cause I need that
machine up anyway to run an IVR server and general storage services.
It's been up 24x7 for a couple years now and despite frequent attack
attempts it's still doing fine. Nothing particularly special or heroic
on it either, I do the MS patch thing roughly monthly, have an ordinary
commercial software firewall on it and this server along with the rest
of my network is behind an ordinary Linksys NAT/Firewall router with
only port 80 mapped through to the web server.


As for the vulnerability of Macs, that is a false sense of security
simply based on the volume of attempted attacks. If the virus kiddies
decided there were enough Macs to be worth attacking on any scale that
sense of security would evaporate very quickly.

While it's true that Macs are less of a target because they are a
fraction of the market, it's also true that Macs are harder to
compromise, especially by script kiddies.


Simply because the folks who write the Windoze attack utilities for the
script kiddies aren't spending much time writing attack utilities
targeting Macs.


Rehash. See above.

A lot of this is due to the fact that the security base of MacOS is
BSD UNIX, and a lot is due to the fact that most dangerous things in
MacOS are locked down by default, and/or require an administrator
password to access. Windows has just started to implement this, with
fanfare.


This is nothing recent to Windoze, it has been common for a long time to
"lock down" Windows in business environments so that the users are less
able to compromise the systems.


The problem has been that a fully locked down Windows system is close to
useless, as many Windows apps won't run as anything other than admin.
This has just now started to change, but will take years to achieve what
MacOS now has.


I can't say as I've encountered any Windoze apps that have to run as
administrator. Sure they need to be installed as administrator, but the
ones I've used run fine as a regular user.


As I noted earlier, careless users will cause problems on both platforms,
and unless the Mac prevents the user from loading the non-certified junk
(Napster et al) then the Mac is not going to "protect" the user. I think
even the Mac users would have a fit if their beloved Macs gave them the
"I can't let you do that" response when they tried to load the latest
music piracy tools.


See above.


Simply put, viruses et al are practical problems only for Windows.
Because such malware spreads itself, the problem grows exponentially
and far faster than systems can be attacked manually.

See above. If Macs were more than a single digit percentage of the
computing world the issues would be vastly different. The perceived
security of a Mac is a function of their scarcity, not their security.

Not quite; see above. And below.

And as a class, Macs and UNIX boxes are far harder to manually
compromise than Windows anything, but none are totally secure. Nothing
that complex ever will be.

I can't recall the last time I heard of a VMS or Tandem or Stratus
system being compromised.

By your own logic, this must be only because with their miniscule market
share compared to Windows (and the Mac for that matter), they just were
not worth the trouble to break.


Hardly, since those three OSs control a sizable portion of the financial
world.


Ah. A new issue emerges.

So, we should use only these machines for web surfing. And if they can
achieve safety despite naive users, why can't Windows do the same?


For the same reason the Macs give the false appearance of being immune
to these issues - the simple fact that there are few threats targeting
them. Like I said, a privileged user downloading and running malicious
code will hose any OS.


If you want a secure OS, look at VMS or the Tandem and Stratus OSs.

Oh my, a blast from the past. True enough. VMS was my favorite
command-line OS of that era. If Ken Olsen had had a religious
conversion and had made VMS open in time, he might have killed UNIX in
the crib. But it didn't happen. So, now VMS has the security of the
dead.

Digital, Compaq and now HP have had no clue how to market VMS; indeed
Compaq and HP have no concept whatsoever of the "enterprise class" world
outside the PC realm.

I'd have to agree here. But it's also true that the whole idea of
"OpenVMS" arrived about ten years too late, so it's not clear that
Compaq and HP could do anything to reverse DEC's blunder.


I think Compaq could have if they had understood what they had and the
enterprise world. HP was hopelessly screwed up, with Carly running
around hyping every low-margin consumer toy while selling off or
otherwise destroying the diverse underpinnings of the company.


I suspect that you are right.


It's pretty sad, especially with the $45M "thanks for destroying the
company" parting gift.


Tandem and Stratus are still around I think, but sell into a very
specific niche, where perfect hardware reliability is needed. These
were used in some air traffic control systems, but have a key conceptual
flaw - the custom-built application software is the common cause of
failures, not hardware failures. So most ATC systems have total dual or
triple redundancy, and the hardware is just another (minor) cause of
failure and subsequent switchover (within one tenth of a second
typically).

VMS is still around as well. You don't hear about it much (like Tandem
or Stratus) because it's in applications that don't get hyped, and it
also "just works".

Software is most often the cause of problems and it's only getting worse
as the software gets both more complex and more poorly engineered.

Yes, but reliability is still a problem even if well engineered. It
usually takes a few years of intense post-delivery bug fixing to achieve
reliability.


Indeed, and this is something that MacOS is just as subject to as
Windoze.


Yes, they are still computers. But Apple seems to push it much farther
than Microsoft, and Apple has better control of the Mac ecosystem than
MS has of the Windows ecosystem.


Macs are socialists and PCs are capitalists, and that debate has gone on
for years with no resolution. Each can work, it's just a function of how
much liberty the user is willing to sacrifice in the name of security.


Because MacOS is only for the clueless, it cannot be that the lack of
trouble on Macs is due to clued-in users. So there must be some other,
simpler explanation.

I have not seen this purported lack of trouble on Macs. Every single
Mac user I have known (dozens) has reported plenty of problems.

You need a better grade of Mac users. By your own analysis, the
clueless make their own trouble; this will be platform independent.

Indeed, and this will likely be found in users who treat the machine as
a tool and not a toy. Since the Mac UI generally seems to appeal more to
the "creative" types vs. the "technical" types, presumably there are
plenty of writers and such that have Macs that are perfectly stable and
also devoid of questionable software. Presumably also if you gave these
same people the same applications on a PC and they treated it the same
(no questionable junk), they would likely see the same stability.

This is a bit self contradictory. Those flighty non-technical creative
types love the Mac but are clueless about IT, have no internet
discipline whatsoever, and yet they prosper.


I think the ones that "prosper" are the ones that keep work and personal
machines separate.


Should not be necessary, as mentioned above in multiple places.


Well, it *is* necessary to either keep the two separate, or have the
knowledge / self control to avoid what should be obvious threats.


Just think what stolid
uncreative technical types could do with such a tool.


Nothing, absolutely nothing, because the whole Mac concept is to prevent
anyone from doing anything technical.


Not so. The technical controls are at the terminal window (and root
console) level, and many controls are only at that level. One can argue
that this or that control should or should not be GUI accessible, and
I'm sure that there will be some migration in the coming years, but the
controls are there, just mostly kept away from naive users.


And that only came about in the past couple years. Pre OSX there was no
ability to do anything technical. Post OSX there is some, but post OSX
MacOS is just another Unix variant with a particular flavor of GUI
shell.


With the growth of Linux in the market, more commercial apps will
support Linux, so this advantage is likely to erode over time.

The upcoming homogenization of the hardware market will help this a
lot. The switch to OSX was one step towards Apple getting out of the
hardware business which they have never been very good at. Now they
have announced they are abandoning IBM's antiquated CPUs.

The PowerPC architecture is hardly "antiquated", and is about twice as
fast per CPU clock cycle as Intel.

The PPC architecture has been around for quite some time and was a
rehash of a retired workstation processor. Neither the Intel x86 nor PPC
CPUs are remotely as fast / efficient as the (DEC/Compaq/HP) Alpha and
indeed that's why Intel stole much of the Alpha design for the Itanium
before eventually reaching a "settlement" over the theft.

Be careful about who you call a "rehash" (nice neutral word that):
Intel processors are by the same token an absolute hash, retaining
compatibility with every past version, with bits encrusted upon bits.


They had been until the infusion of stolen Alpha technology.


Not exactly. IBM invented the RISC processor architecture, and the
first RISC CPU was the IBM 801.


Ok, but Intel's boost came / is coming from what they stole from Alpha.


Nor is the Alpha really a RISC machine, as it had to execute the VMS
instruction set. I no longer recall the details of what was done in
silicon and what was done by the compilers, but the VMS instruction set
is pretty big and complex, even if it is nicely designed. My
recollection is that DEC implemented all the one and two address
instructions in Alpha silicon, and everything else was emulated.


Microcode I believe. I never dug all that deep into it though since I'm
not a programmer.


Just like every ancient architecture. Did you ever write assembly on a
Univac 1100-series computer? They never threw anything away; every
instruction bit had three meanings, controlled by the current state of
the processor.


Nope, I don't go back that far. I did write assembler on VIC-20s and
II+s though. These days it's just the occasional stuff on something
small in the PIC family. I'm not a programmer and only write a little
assembler now and then to go with some hardware I built.


My first computer language was Basic, my second was Fortran, and my
third was Univac 1100 assembler. And embedded programmers almost
universally used assembly in those days, although C (not C++) is real
common nowadays.


I started with Basic of course and then went to assembler on the VIC and
II+ in order to get some speed in some subroutines like stepper driver
code.


The PowerPC is a clean new (in relative terms) design, with no
encrustations of prior architectures. That's why it's able to do twice
the computational work per clock cycle.


The stolen Alpha technology is bringing that to the Intel line.


Not exactly. See above.

And one part of the Alpha one would not wish to steal is the power
demand. I recall an effort to use Alpha chips in a line of MIL-SPEC
single board computers that failed because it was too hard to get that
much heat off the board and out of the cabinet. The comment of the day
was that we would have to cool the thing with boiling seawater.


You don't like the Frankenstein bolts coming out of the top of the Alpha
CPU (at least the earlier ones)? If I recall, the process / thermal /
clock speed issues are what Intel used in their argument to convince (was
it Compaq or HP by then?) that Intel could produce a better product by
combining the (stolen) Alpha technology with their better fab processes.


The problem is that IBM is more interested in making large massively
multiprocessor servers the size of commercial refrigerators than little
desktop systems, and so IBM's direction increasingly deviated from what
Apple needed to win the CPU horsepower races.

More importantly Apple has been realizing that they need to get off
proprietary hardware which regardless of any technical merit, they can
never be economically competitive with. OSX was the first step towards
making their OS portable to a generic hardware platform. The
announcement of the switch to Intel CPUs was the next step. In the not
too distant future will be the announcement of MacOS for the PC,
followed later by the announcement of the end of proprietary Mac
hardware.

When the consumer is able to select a "generic" computer platform of the
size, scalability and fault tolerance for their application, and then
independently select from a dozen or so OSes depending on their
preferences, the consumer will be well served.

It's true that Apple is arranging things so they can take advantage of
the whole PC hardware ecosystem, but it does not follow that Apple will
allow the MacOS to run on generic PCs.


I guess we'll have to wait and see, but my Apple predictions so far have
come true. I was predicting many years ago that Apple would eventually
have to face the fact that they did not have the expertise to write a
good OS core and would have to take their UI elsewhere and this has
happened. I was predicting years ago that Apple would have to abandon
their hardware platforms where they were typically behind and this has
happened in stages (first PPC now Intel).


OK. The market will tell, soon enough.

The common hardware platform will both drive down the hardware cost and
also let each OS stand on its own merits, independent of hardware
differences.


This deviation was particularly acute in laptops.

Also, as part of their "fit in but stand out" strategy, Apple wanted
Macs to be able to run Windows apps at full speed, rather than in
emulation at a fraction of full speed.

The need for emulation / Windoze support of course being a function of
market share. Few companies can afford to write Mac only software and
ignore 95% of the market.

Almost true. There are a few Mac only companies, but a common pattern
has been to develop their first product on the Mac (where the smaller
market means less competition and greater margins), and use the profits
from the Mac market to fund the launch into the much larger Windows
market.


True for the Mac-centric companies. For the Windoze-centric companies
it's (unfortunately for the Mac users) quite easy for them to stick with
the Windoze versions and ignore the small Mac base.


Yes and no. There are people switching. Like my kid sister. And
frustrated Windows users ask me from time to time, but it's more often
an initial choice than a later choice.


Of course, people switch one way or another for a variety of reasons and
I know at least three or four people that switched from Mac to PC. Since
the installed base of PCs is very high and since there are a number of
alternatives to Windoze that will run nicely on that existing hardware,
the bulk of people looking to switch off of Windoze are likely to follow
the low cost path to keep their existing hardware. When Apple introduces
MacOS for PC hardware then those users might consider it or by then they
may well be happy with one of the other Unix offerings and not be
enticed by the Mac UI.


The PowerPC architecture (from both IBM and Motorola) basically rules
the military and industrial embedded realtime markets, with something
like 70% market share.

Not sure where you got that figure, I follow the embedded world to some
extent and I see very few PPCs. In fact, a flip through the Dec '05
Circuit Cellar magazine revealed -0- references to PPC.

Um. Circuit Cellar is for hobbyists, not the military-industrial
complex.


Hardly. Circuit Cellar is for the embedded engineering world, but uses a
format with the hobby projects of those embedded engineers to highlight
a lot of the new stuff and keep the magazine interesting. As a useless
side note, I've provided some of the props and ideas for the covers in
recent years.


A lot of embedded projects are for microwave ovens and the like. These
are very small systems, and typically use very slow CPUs (because they
don't need anything faster). The Military Industrial Complex is solving
a very different set of problems.


Unfortunately those MI solutions keep getting blown up on the end of a
cruise missile.

There is a lot of embedded process control stuff as well.


If you look through magazines like Embedded Systems
Programming, you'll get a far different picture. For instance, the bulk
of the VMEbus SBCs (single-board computers) sold are made by Motorola and
use the PowerPC processor. The runner-up is Intel processors, and DOS
isn't dead.


Been quite a while since I've looked at those magazines.


OK. Some libraries have them.


Hmmm, I'm not even sure where the local library is since I moved a year
and a half ago. I should probably find it.


The Intel architecture is actually older than the PowerPC architecture,
by many years, so by longevity alone, Intel is antiquated. So what
exactly do you mean by "antiquated"?

Antiquated in large part means weighed down by "compatibility barnacles"
which limit the ability to adopt significant architectural changes. This
problem has affected both the Intel x86 and the IBM PPC lines.

Yes, the Intel is very much encrusted by backward compatibility. The
PPC is not yet encrusted, but give them time.


With the hardware abstraction trends the backward compatibility
barnacles should slowly evaporate across all the processor lines.


Um. Hardware abstraction layers are another form of barnacle, and kill
performance. We live with the performance cost for practical reasons,
but there is nonetheless a cost. Look into the history of microkernels
in operating-system design.


It's certainly a barnacle, but it's a bit less problematic than the
hardware version. At least you can get the better hardware, and then apps
that need the performance can run native code.


In the near future you will simply select a generic hardware platform
from the vendor of your choice and in the size / expandability / fault
tolerance for your application, and then select your favorite OS to run
on it from a field of dozens of variants that all run on the same
hardware platform.

For MacOS, it won't happen soon, as Apple makes far too much money on
hardware. Probably one will be able to run Windows on Mac intel
hardware, but will not be able to run MacOS on generic intel PCs.

I predict that MacOS will be available to run on generic PC hardware
within another 2 or 3 years. One of Apple's big problems is that they have
to make large profits on the Mac hardware since they sell so little of
it compared to the PC world. This causes them either to price
the product too high relative to the competition and try to hype reasons
it's worth the extra money, or to compromise to cut manufacturing
cost and risk reliability problems. We've seen examples of both paths
from Apple.

It won't happen anytime soon. This has been suggested for years, and
Steve Jobs (a founder and the current CEO of Apple) always says that
allowing MacOS to run on generic PC hardware would put Apple out of
business. I see no reason not to take him at his word.


Given Apple's various product duds and reversals of concepts like open
architecture to closed architecture and back to semi-open architecture,
I see no reason they won't eventually decide to exit the hardware arena.


You mean like the iPod?


I was thinking more of the IIc and Newton for duds. I also don't
consider the iPod or any other "consumer toy" in my overall analysis.


Anyway, let's wait and see.

Mac hardware is far less trouble to assemble and configure,

That's because it is largely non-assembleable and non-configurable. You
get saddled with a generic box, you have few choices for options and you
have to pay for included items you may never use.

Macs are about as configurable as Dell PCs, right down to configuring
and ordering from the Apple website. If you like, go to
http://www.apple.com/ and click on the Store tab. You can walk
through the entire choose, configure and price process without having
to register or provide a credit card. (What's in the stores is a
fraction of the configurations available from Apple.)


And exactly how much of that is *user* configurable?


I don't get your point. The Apple store cannot tell if you are an
unwashed user or an administrator, so long as your money spends good.


How much of that can you as the end user at home configure and
reconfigure as your needs change? My PCs go through many incarnations as
I get newer machines and migrate good hardware from older machines. I
also combine leftover bits of old machines into usable machines for
specific tasks. I don't need a GHz machine with dual monitors to run a
PIC programmer for example.


All products are packaged in some manner, so you always get more than
you absolutely wanted or needed. I guess I don't see your point.


In the PC world (regardless of OS), I have far greater flexibility to
configure a hardware platform to my exact needs.


If this is simply another way to observe that the Windows ecosystem is a
factor larger than the Mac ecosystem, I agree. But the question was
adequacy for a purpose, not ecosystem size per se.


See my comment above about the "investment protection" the ability to
migrate hardware from older systems and between systems provides.



and is far
more reliable than most PCs

I've not seen any hard data showing any greater hardware reliability for
a Mac vs. PC. All computer hardware these days is far more reliable than
any of the software that runs on it.

Look into Consumer Reports and also the PC (not Mac) magazines. The Mac
magazines also say this, but what else would they say - they must be
True Believers.


As noted, Consumer Reports is *not* a credible source for anything. Also
as noted, the magazine comparisons tend to leave out entire lines of PC
hardware making them inaccurate. And again, all hardware is pretty damn
reliable these days so a quality PC and a quality Mac should have little
difference in hardware reliability.


See above. Better yet, get the December 2005 issue and look, so we can
debate from the same page.


Depends what the cover price is, I'll check later since I have to go out
for groceries anyway.


, largely because in Macs there is a single
design agent, Apple, ensuring that it all fits together and meets
minimum standards of design and implementation quality.

... and incompatibility with the rest of the computing world.

If that's another way of saying that Macs are not Windows, OK. But both
are able to perform the same tasks, just like there are many brands of
truck, but they all use the public roads.


Yes and no. Macs could do the same tasks as PCs, but in some cases they
lack the available options (both hardware and software) to do so. It
also took Apple quite a while to join that "public road" and abandon
their proprietary networks and busses (SCSI being the only notable
exception).


And Firewire. And ethernet.


Firewire is far more recent. As for Ethernet, it took Apple an
excruciatingly long time to realize that Ethernet had to be a standard
feature. They clung to their Appletalk stuff for a long time, kind of
like IBM clinging to Token Ring. I also recall the $400 Ethernet cards
for the few Macs that could take them when the PC world had them for
$50.


Standards, quality and compatibility were issues in the PC world more
than a decade ago. These days quality and interoperability are quite
high. Only on the most complex systems do you run into any configuration
issues and that is infrequent and in areas where Macs simply aren't
applicable anyway.

Well, the PCs have gotten far better it's true, but the Macs were always
there.


Um, there have been plenty of problems with Macs along the way as well.


In absolute terms perhaps, but Macs are far less trouble in total.

And Windows interoperates well only with Windows. There have been a
number of court cases on this issue, and Microsoft is slowly yielding
ground.


That's not interoperability, that's openness to third-party software.
Interoperability is working with established standards, something that
Macs have been loath to do.


Um. Have you been following Microsoft's tangles with the EU antitrust
regulators? They are proposing fines of a million dollars a day.


The EU antitrust folks are about as far out in left field as they can
get. If I were Bill I would pull all MS products from the EU and void
the license agreements, but that's just me and I can be a bit harsh.


This is a major reason that people have been willing to pay somewhat
more for Apple hardware. It's simply less trouble.

That's the myth, not the reality. These days very few problems on either
platform are a result of hardware problems. Come to think of it, my Mac
friend did have a 17" Powerbook replaced under warranty when it failed
after about 3 months use. I don't have details on what actually failed,
but I know the machine was not physically abused.

Even the best of laptops have about a 10% failure rate, according to
Consumer Reports, so one can always find someone with a dead laptop.


I think Apple is the only one who had melting laptops though.


True, but lots of PC laptops get too hot to have on one's lap. In any
event, Consumer Reports shows that Apple desktops are by far more
reliable than any PC, but that Apple laptops are in the middle of the PC
range (which isn't that wide).


Dunno, I have or have had an Armada 1590-something, an Armada M700, a
Thinkpad 600X and a Dell 600-something, and none of them seem to have any
thermal issues.

Every Mac user I know (which is about five) has had hardware problems at
one point or another. A small sample, granted, but still a 100% problem
rate.



My kid sister just switched from Windows to MacOS. She is a Graphic
Artist, and that field is dominated by Macs, but her first husband was a
self-described DOS Bigot. Anyway, she got the big FedEx box just after
Christmas, and called to tell me how easy it was to get set up and
running. (Her current husband is not a computer guy at all.) It took
all of an hour.


Macs have been losing some ground in parts of the graphics world, due to
a number of factors. Since most of the same software is available for
Windoze and works just as well there, and since Windoze is still a
business essential and less expensive per-seat than Macs, some companies
have ditched Macs in favor of PCs / Windoze (CCI did this).


Macs are something like 60-70% of the Graphics Arts world.


And at one point they were like 95%. As noted, CCI dumped Macs from
their publishing work since they could get the same apps for Windoze and
eliminate the more expensive machines and additional support issues.


At a previous job we had one advertising / PR / graphics person who used
a Mac. It was a pain for us to support since it was the only Mac in an
office of several hundred PCs. We eventually got a decent PC, loaded it
with all the same applications that she used on the Mac and put it on
her desk next to the Mac with the hopes that she would give it a try and
eventually switch to it.

The end result is that after a few months with no extra prodding or
support other than instruction on how to access the same work files from
either platform, this user switched to the PC. Obviously she knew that's
what we were hoping for, but she did indicate that she found that some
tasks were easier on the PC and none were more difficult.


The social pressure must have been immense.


Probably, but she did not indicate anything was more difficult on the
PC.


I was the sole holdout at work for many years, long enough to have
managed to miss all the Windows 3.x and NT dramas my coworkers told me
about over lunch. It was a lot of fun, at least for me. After a while,
someone would notice that I had been uncommonly quiet, and would ask me
a question. I would reply with some variant of "I have a Mac, and it
just works, so I have nothing to report". Getting Windows to support
CD/ROM drives was a leading cause of wasted weekends.


That rather mirrors me watching the numerous problems on the zillion
Unix systems while my VMS systems "just work". Of course I also noted
how my Win95 desktop at a previous job would happily run for 45+ days at
a time while my coworkers identical desktops would crash every few days.


Yep. I'll probably get one of those $700 Dell boxes. Already got the
hardware firewall.

An old Optiplex GX100 (P3/733) runs Mach3 just fine under W2K on a
machine that will do 60IPM or so. Nice and cheap used as well.

Yes, but my heart is set on a Dell. For one thing, I want one company
to yell at.


The old Optiplex GX100 *is* a Dell. Just one I got used from a corporate
surplus source. Since most large companies cycle PCs out every three
years to coincide with the 3 yr warranty that is standard on most
manufacturers "business" lines, there is a steady stream of good cheap
PCs that are plenty capable for all but the most intensive applications.


OK.

It will be interesting to see what happens in the market when Macs can
run all these Windows-only apps at full speed, so there is some real
competition between platforms.

I don't think that will cause any real competition. What it will mostly
do is remove a handicap from those who prefer the Mac UI. I don't think
there are any significant numbers of people wanting to migrate to a Mac
but being held back by a lack of apps. Those wishing to migrate away
from Windows are more likely to explore free options like Linux that
will run on their existing hardware.

It will certainly remove that handicap. But it will also expose lots of
people to the Mac, and comparisons will be made.


Comparisons will be interesting as I see the Mac in its current OSX
form as nothing more than another UI shell available for a standard Unix
base.


The standard retort is that by the same token there is no difference
between a Porsche and a Chevy; they are both cars.


Nope, the comparison would be a Chevy and a Ford (or Dodge or Toyota,
etc.). Comparing a Chevy to a Porsche is like comparing an ordinary PC to
a multiprocessor, hot swappable, fault tolerant PC server, and a Mac is
not in that class either.


Microsoft itself does not agree that lack of applications is what
prevents migration away from Windows. This came out in spades in the
antitrust case, where they were caught doing all manner of illegal
things to preserve this barrier. It's all in the opinion handed down by
the Federal Appeals Court.


Any successful, dominant company is going to be attacked and for some of
the most bogus reasons. Yes Microsquish has done a few things that were
wrong, but they have also been bashed for doing things that they have
every right to do as far as I'm concerned.


More than a few things. It's all there in the court rulings.

How can you possibly justify forcing Microsquish to include competitors
products with their distributions? Is GM required to include Ford
products with the cars they sell just because some users may prefer to
put a Ford dashboard in their GM car? That's about on par with some of
the stuff pushed on Microsquish.


Any company that achieves ~90% market share in an important industry
will find its freedom of action curtailed. The classic example is AT&T,
which achieved a similar market share by methods that are now illegal,
but were common back then. The solution was to turn telephones into a
regulated utility. This is probably one of Microsoft's biggest fears,
but their current conduct makes such an outcome more and more likely.


Except that there is no comparison between a telephone utility and a
commodity OS. There was no choice for the telephone service since it was
delivered by a fixed network; anyone can run whatever OS they choose on
their PC hardware and they have quite a few choices. The fact that they
choose one particular brand of OS overwhelmingly does not make a
monopoly in my book.

Now that there are alternatives to traditional Telcos for voice
services, there is no longer a Telco monopoly either and interestingly
enough we see the old "Bell bits" recombining.


Datapoint: At its peak, IBM had only a 70% market share.


In which market? They cover many markets with vastly different needs.


Anyway, when barriers are removed, migration happens. Some will go to
Linux (if they like that unpolished an environment), and some will go to
MacOS (polished exterior, real UNIX available below).


Yep, Apple might reach 8% market share while Windows drops to 67% and
Linux/Unix rises to 25%.


Truth is, Apple (and Mac users) would be perfectly happy with ~10%
market share.


And that's likely what they will have. I just don't see any compelling
reasons why they would achieve more than this regardless of what
Microsoft does. In fact given totally free hardware and software, I
don't think the MacOS could ever capture more than 50% of the market
since it primarily appeals to the creative types and they are perhaps
50% of the population.


I think we are mixing unlike things here. The desire for independence
and freedom from lock-in exists regardless of the skill of the
programmer, especially as the programmer becomes experienced (and has
been screwed when something he depended upon is made unavailable).
Freedom from lock-in and abuse by marketing-driven companies is its own
good.

The pseudo-programmers I reference are not concerned with such things,
they exist to glue purchased MS code libraries into horrendous "business
apps" for just long enough for them to migrate into the "management"
world.

OK.


I've seen the headaches the poor "real" programmers who come along later
have trying to fix and maintain these horrible apps. It's not pretty and
makes me happy I'm on the systems end of things.


True enough, but it will never change.


Yep and it's a sad thing. There will always be "pretenders" in every
field just faking it to make their way into the management world where
they can live off the work of others.

Pete C.


  #86   Report Post  
Posted to rec.crafts.metalworking
Pete C.
 
Posts: n/a
Default Linux is Driving me $#@!!!! nutz!!!

"J. Clarke" wrote:

Pete C. wrote:

"J. Clarke" wrote:

I was enjoying this until you started ranting about "stolen alpha
technology". At that point you started coming across as a loon.


I guess you never followed the legal battle and eventual out of court
settlement between Intel and DEC (or was it Compaq by then, I forget).


I don't recall any "legal battle". Researching it I find that DECPaqard
claimed that Intel infringed on several of their patents and then rather
than collecting any kind of damages just sold the whole shebang to Intel.

Pete C.


--
--John
to email, dial "usenet" and validate
(was jclarke at eye bee em dot net)


Ah, yes, that would be it, clueless low margin consumer gadget oriented
Carley was hoodwinked by Intel into selling the whole lot instead of
putting Intel in their place for the theft / infringement.

Pete C.
  #87   Report Post  
Posted to rec.crafts.metalworking
Larry Jaques
 
Posts: n/a
Default Linux is Driving me $#@!!!! nutz!!!

On Wed, 04 Jan 2006 09:44:38 -0500, with neither quill nor qualm, "J.
Clarke" quickly quoth:

I was enjoying this until you started ranting about "stolen alpha
technology". At that point you started coming across as a loon.


I like the part where you guys quote all 861 lines of crap before
top-posting a 2 line reply. sigh

Some people's kids, I swear.


--
If you turn the United States on its side,
everything loose will fall to California.
--Frank Lloyd Wright
  #88   Report Post  
Posted to rec.crafts.metalworking
Larry Jaques
 
Posts: n/a
Default Linux is Driving me $#@!!!! nutz!!!

On Wed, 04 Jan 2006 06:54:30 GMT, with neither quill nor qualm, Gunner
quickly quoth:

On Tue, 03 Jan 2006 09:24:55 -0800, Larry Jaques
wrote:

On Tue, 03 Jan 2006 00:32:56 GMT, with neither quill nor qualm, "Pete
C." quickly quoth:

Gunner wrote:
Lots of suggestions..and Ill work on them later today. Im busy trying
to repair my roof..of which a fair amount blew off in the 70mph
winds we had Sunday. Damnit


Hey, Gunner, tried Linspire yet?

No..I sure havent. Tell me a bit about it?


I dunno. I just saw it advertised on cheap computers.
One example, and I don't know 'em from Adam:
http://www.koobox.com/ Pretty good deal!

The OS:
www.linspire.com

I'm waiting for my copies of Ubuntu I ordered last year, too.


--
If you turn the United States on its side,
everything loose will fall to California.
--Frank Lloyd Wright
  #89   Report Post  
Posted to rec.crafts.metalworking,misc.survivalism
Steve Ackman
 
Posts: n/a
Default Linux is Driving me $#@!!!! nutz!!!

["Followup-To:" header set to rec.crafts.metalworking.]
In , on Wed, 04 Jan 2006
16:56:10 GMT, Gunner wrote:

Im running dialup, and the Box is configured to connect to the local
network here at home, via nic card on eth0. For some reason..it wants
to send internet requests to the lan, rather than the ISP (via
dialup), unless I turn off eth0. I run static IPs assigned to each
machine in the LAN in the 192.168.0.1-192.168.0.15 range, with a mask
of 255.255.255.0, not having a DHCP server assigning them.

Im rather like a moderately retarded kitten flailing around with a
ball of yarn..and the yarn is winning.


Late entry into the thread, but, some of this may
prove useful anyway:

Checking back in my notes when I used to do the same
thing, I always configured by hand (machine ran 24x7,
so reboots only came when power outages outlasted my
UPS)
(note: # is the prompt indicating you're logged in as
root and $ is the prompt for an ordinary user... only
use the former when necessary)

# /sbin/ifconfig eth0 192.168.24.13 netmask 255.255.255.0 up
# /sbin/route add -net 192.168.24.0 netmask 255.255.255.0 eth0
# /sbin/route add default ppp0
# /sbin/ipchains -A forward -p all -s 192.168.24.0/24 -d 0.0.0.0/0 -i ppp0 -j MASQ

Adjust as necessary and give 'er a go. (you'll most
likely need an iptables command in place of the ipchains
command)

This is all fairly universal, so hopefully there
won't be any distribution-specific gotchas.

Ok..here is a big question...in dos, we have the config.sys and
autoexec.bat files, where house keeping settings are made, and then
various programs are run on boot. What is the Linux equivalent?


There isn't really a direct equivalent. They do things
quite differently. Generally system config files are found
in /etc while user specific commands are found by

$ ls ~/.*rc

What
is the boot sequence, and what is looked for as it boots?


Somewhat dependent on the particular distribution you're
running, but generally
$ man init
$ man inittab
should both be instructive.

Daemons to start on boot will be found in /etc/rc*
files, e.g. for run level 3:

Red Hat - /etc/rc.d/rc3.d/S25sshd
Debian - /etc/rc3.d/S20ssh

These are generally links to the actual daemon scripts
in /etc/init.d (Debian) or /etc/rc.d/init.d (Red Hat).

Example..it makes a call and tries to look up ntpdate..outside time
server on bootup. Where is this and how do I edit it out?


$ locate ntpdate | grep rc

should be able to find it for you. I can't test it
because I don't have ntpdate on either of my Linux
boxes. Change it from S99ntpdate to K99ntpdate (99
will most likely be a lower number)
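
For example, if locate turns up something like /etc/rc3.d/S23ntpdate
(a made-up name here; your number and run level directory will differ),
the change is just a rename:

# mv /etc/rc3.d/S23ntpdate /etc/rc3.d/K23ntpdate

Do the same in any other rc*.d directory it shows up in.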

I really really need to get some books on this damit...


Google for rute.pdf. I don't know if it's still being
maintained/updated, but no matter. It's an excellent
reference, and there are lots of sites that have it.
If you're running a recent distribution, try to get
one of the later versions of rute.pdf.
  #90   Report Post  
Posted to rec.crafts.metalworking,misc.survivalism
Joseph Gwinn
 
Posts: n/a
Default Linux is Driving me $#@!!!! nutz!!!

I have snipped areas where the debate has become cyclic.

In article ,
"Pete C." wrote:

Joseph Gwinn wrote:

In article ,
"Pete C." wrote:

Joseph Gwinn wrote:


snipped


Apple's usual strategy is to make the technical controls invisible to
the ordinary user, but one can do damn near anything from a
terminal window. Or, if one enables it, the root console.


I can't give you the exact details since it was a good year ago, but
much of the annoyances were related to network configuration and WiFi
configuration. The terminal window was about the only place I was able
to accomplish anything as the UI dialogs were either completely
mislabeled, refused to accept perfectly valid entries, or were missing
the setting options entirely.


I recall that WiFi was a problem for everybody, although the complaints
have died down.


I don't know what she was doing, but clearly you are far more the IT
guru than she. An aggressive or merely clumsy user with admin
privilege can make themselves lots of trouble.

Indeed that is what I've concluded. The myth that Macs are more stable
than PCs is simply that, a myth. I've also noticed that many Mac users
seem to under report the number of system issues, somehow not counting
the need to reinstall an application to get it to work properly as a
system problem.


So we return to the mainstream problem, that Windows systems are far
more fragile in practice than Macs, once one removes the effects of
clumsy meddling. PCs isolated from the world actually work tolerably
well these days, once correctly set up, but how many people want to be
isolated, or to spend their time managing multiple anti-virus and
anti-spyware programs?


Again you are pushing a myth that Windoze users need to spend all their
time managing multiple security programs, this simply isn't true. The
only time Windoze users need to do that is if they want to rely on
multiple freeware security packages vs. spend $40 on a single commercial
product. The fact that Windoze users have more options to choose from is
an asset, not a liability.


The PC mags would disagree. They recommend multiple security packages,
because no one does a good enough job by itself.


I submit that your answer above proves my point in spades. Listen
to the tone of voice, and parse the implicit assumptions.

Huh? Hardly. I find the Windoze UI vastly more tolerable than the Mac
UI, largely because I can customize the Windoze UI sufficiently to
eliminate the most annoying parts. This does not in any way indicate
that I am a Windoze fan or bigot, simply that I hate the Mac UI. My OS
preference is VMS, however there is a bit of a shortage of affordable
applications for the things I do.

You really don't hear it, do you? OK, I'll parse it a little:

We'll set the stage with such dispassionate, value neutral statements
like "MacOS was total crap until Apple finally realized they lacked the
expertise to write an OS" - It may be crap, but 25 million users rather
like it, and were known to say similarly unemotional things about DOS
and Windows.

How is an OS that had -no- memory management until the entire OS core
was scrapped and replaced with a Unix core not crap? Windows was
evolving memory management (which I consider to be a fundamental concept
for an OS) in the Win 3.1 days and had it working reasonably well long
before Apple gave up on their OS core.


Because neither original Windows (pre NT) nor original MacOS (pre 10)
had memory management.


Indeed, but Microsoft addressed the problem long (years) before Apple
did.


In a manner of speaking. The DOS core remained long after the supposed
conversion, causing endless memory problems. I think Win2K was the
first to have a clean memory architecture.


Microsoft solved the problem by stealing VMS technology from DEC,
yielding the original NT core. Books were written about this deathmarch.


Indeed, it's amazing how much of the computing world originated from
DEC. I didn't follow this history in detail, but I thought that Microsoft
hired former DEC folks vs. the more outright stealing of Intel / Alpha
that resulted in a lawsuit.


It's true that MS hired lots of key DEC folk. Don't know about Intel
and the Alpha. Do you have any cites?


Apple solved the problem by buying NeXT (getting Jobs back as part of
the package deal). NeXT was built from scratch by Jobs after he was
tossed out of Apple some ten years earlier. The OS core of NeXT was BSD
UNIX, from which Jobs and company built a thoroughly object-oriented
software platform. When NeXT came back, the solution was obvious -
replace the NeXT GUI with a Mac-like GUI, retaining the UNIX core from
NeXT.


That NeXT was an odd little thing that never seemed to go anywhere.


Yes, it only made a few hundred million dollars, so by Jobs' standards
it was a failure. Jobs founded three companies, Apple, NeXT, and Pixar
(Toy Story). Apple and Pixar are worth billions.


I give up. The above response is not relevant to the issue raised. You
don't seem to be able to separate technical issues from emotional issues.


What emotional issue? The Mac UI does not fit my thought process so I
have never liked it. The pre OSX Mac OS had massive technical failings
as well. The Mac hardware was always overpriced relative to the rest of
the desktop computing market. The Mac hardware was always a closed
architecture and lacking many options that I required. The list goes on
and on, there are no shortage of technical reasons why a Mac would be
useless to me.


As I said, I give up.


Downloading stuff (including napster) is very much a part of the net.
It shouldn't be possible for this to cause such problems, even if some
users are naive and some people out there are evil, because such people
have always existed, and always will.

That is perhaps one of the most absurd statements I have ever heard.
Nowhere else in life do people have such an absurd expectation that they
should be magically protected against their reckless actions.

Go walking through a dark alley in the bad part of town at night and you
will probably be mugged and nobody will say that shouldn't be possible.
Hop in your car and go careening down the road ignoring safety rules and
you're going to get in an accident and nobody will say that shouldn't be
possible.


Bad analogy. A better analogy would be to ask if you expect the Police
to keep bad people from coming to your town and mugging people, or
breaking into their homes.


That doesn't work in the real world either. The Police are only there
after the fact, not to prevent anything.


The analogy breaks down, but is still better than the dark-alley
analogy. And we do expect the Police to catch and badly hurt the perps,
so they won't do it again. We don't care if it's because they are
incapacitated, or merely deterred, so long as they stop.



Most users are experts at something other than the computers that they
must use as a tool. It's expecting far too much to think that this will
ever change, if for no other reason that by definition the clueless
vastly outnumber the priesthood. But the money from vast unwashed
clueless masses folds and spends pretty good, and keeps the priesthood
in beer.


True, but has little to do with the technical merits of the Mac OS vs.
Windoze vs. whatever else.


Not exactly. A system that doesn't need so much effort to make safe is
going to be very much easier to live with for those users that are not IT
gurus.


The fact is that whether you are using a Mac or a PC, downloading (and
running) questionable junk such as Napster *will* cause problems and I
have seen plenty of examples of this on both platforms. The Mac provides
no more protection from these careless user actions than Windoze does.


Tell me, would using a VMS host to download stuff from even the worst
sites cause any danger? If not, why can't Windows do that too?


If you were to find a "worst site" that actually had malicious VMS code
and were to download and run it as a privileged user it certainly would.


Misses the point.


For whatever reason, Macs are immune to this stuff. You can argue that
it's mere insignificance that protects the Macs, but the fact remains
that Macs are immune.


The Macs are in no way immune, there are simply fewer bugs for them to
catch currently. They are the "bubble boy" who does just fine while
isolated from the germs but will quickly die in the real world due to
their lack of defenses.


It may be true that if Macs were 90% of the market, they would fall
under the assault. But Macs aren't going to achieve 90%, so what's your
point?


Your belief that Macs don't need to be "protected" from
the 'net is also false.

Most Macs do not run with any virus protection whatsoever, and are none
the worse for it.

And that is simply a function of threat volume and statistics, not a
function of platform security. Because PCs running Windoze outnumber
Macs 20:1 the volume of viruses targeting Windoze outnumbers those
targeting Macs by an even larger ratio and the probability of a
particular Windoze user being hit by one of those viruses is
consequently higher.


Whatever. See above.


Immunity and lack of exposure are not the same thing.


True enough, but so what? See above.


Do you not wear a seat belt in a Volvo because they are perceived as
safer? Do you not follow traffic safety rules because your Volvo will
protect you? You might get away with that false sense of security for a while
just due to statistics, but you *will* get nailed eventually.


I wear a seat belt because they demonstrably reduce injuries, regardless
of the make and model of car in question. It's still a car, and an
unbelted person will collide with something in a crash.

But how does this analogy apply to computers? In Macs, no such "seat
belt" is needed.


Completely false. Macs need security products just like Windoze PCs need
them and any other OS needs them. The relative threat level is lower,
but the threat still exists.


I'm not so sure that Macs need security products (despite the thunder
from the security product vendors), because most seem to do just fine
without such products.


That has nothing to do with "isolationism", it has to do with product
quality. Do you purchase brake shoes for your car from some guy in a
dark alley? Would you expect them to be safe? Why would you expect
any
different if you get your software from equally questionable sources?

I don't see the analogy. Are you claiming that Macs are bought only
with small unmarked bills from junkies in dark and fetid alleys? This
is quite the scoop - I always wondered about them.

Load garbage software (Napster et al) from suspect sources onto a
computer (Mac or Windows or any other OS for that matter) and you *will*
have problems just as surely as putting cheap counterfeit parts on your
car *will* cause problems. Even a top grade ultra secure OS such as VMS
will have problems if a privileged user were to execute malicious code
on them.


Rehash. See above.


Rehash and still true. A privileged user loading malicious code *will*
hose any OS. The only alternative to this is to keep the users
non-privileged and/or prevent the loading of non-certified code.


We've drifted off the target. The original question was why it was that
the internet is so much more dangerous for PCs than Macs. If MS had
really copied VMS correctly, Windows would be essentially immune to all
the evil stuff floating around the web. But it's Macs and Linux/UNIX
that are essentially immune, not Windows. What happened?


If you read the PC magazines (yes, PC magazines), you will see that
they
consistently rate the reliability and quality of Macs at the top, with
Dell close behind. Apple beats all PC makers on quality of user
support. Consumer Reports backs these findings up.

Consumer Reports has -zero- credibility with me in -any- subject area.
As for reliability and quality, the PC magazines use some questionable
criteria and also exclude many PC lines from their comparisons. The same
goes for support as the comparisons typically exclude the "business"
lines from the PC manufacturers.


So, who do you believe?


Not CR certainly.


Some source beats no source, any day.


Consumer Reports sends out a survey to their subscribers every year. In
this survey, the subscribers report their experience with all manner of
products, and this experience is summarized and reported.

Specifically, on page 35 of the December 2005 issue of Consumer Reports
appears the "Brand Repair History" (based on 85,000 desktop systems and
49,000 laptops) and "Tech Support" (based on 6,500 responses for
desktops and 4,200 responses for laptops).

There is no better publicly available source of by-brand reliability
data. And the scores don't vary that much from year to year, as
reliability and tech support don't just happen, they are achieved if and
only if the company management thinks them important.


I'll have to look in the store to see the report.


OK.


Actually, the old DoD 5200.28 family of standards has been withdrawn by
the DoD, replaced by Common Criteria and DoD 5200.1 and 5200.2. The
formal equivalent to the old 5200.28 C2 level is CAPP (Controlled Access
Protection Profile) at EAL (Evaluation Assurance Level) 3 or better.

Recent versions of Windows have CAPP EAL3 certs, as do two Linux
distributions. All the major UNIX platforms have EAL3 or EAL4. I
haven't checked MacOS, but I imagine that Apple will get or has gotten
the certs, just so they can sell to DoD. With a BSD base, it won't be
hard.

The fact that Apple had to scrap their entire OS core speaks volumes to
their software expertise. I can't see the DoD buying Macs for what
essentially is nothing more than a UI.


Microsoft scrapped their entire Windows 3.x OS core as well, in favor of
NT.


Yes, but Microsoft rebuilt their core with newly hired expertise.


From DEC. So, we may conclude that MS also discovered that they lacked
the expertise to write an OS.


Apple would seek certification to be able to sell to DoD. If this will
work or not is quite another matter. But don't dismiss it out of hand:

There was a big flurry a few years ago when the US Army (at Ft. Monmouth
if I recall) dumped all their Windows webservers in favor of Mac
servers, mainly because the Army was tired of being hacked despite their
heroic efforts to keep the WinNT servers patched and running.


Oddly enough I run a Windoze web server, mostly 'cause I need that
machine up anyway to run an IVR server and general storage services.
It's been up 24x7 for a couple years now and despite frequent attack
attempts it's still doing fine. Nothing particularly special or heroic
on it either, I do the MS patch thing roughly monthly, have an ordinary
commercial software firewall on it and this server along with the rest
of my network is behind an ordinary Linksys NAT/Firewall router with
only port 80 mapped through to the web server.


Hardware firewalls help a great deal. As do expert IT efforts. How
many average users can manage this level of expertise?


This is nothing recent to Windoze, it has been common for a long time to
"lock down" Windows in business environments so that the users are less
able to compromise the systems.


The problem has been that a fully locked down Windows system is close to
useless, as many Windows apps won't run as anything other than admin.
This has just now started to change, but will take years to achieve what
MacOS now has.


I can't say as I've encountered any Windoze apps that have to run as
administrator. Sure they need to be installed as administrator, but the
ones I've used run fine as a regular user.


The PC and IT magazines discuss this from time to time, especially in
their security columns.


I can't recall the last time I heard of a VMS or Tandem or Stratus
system being compromised.

By your own logic, this must be only because with their miniscule market
share compared to Windows (and the Mac for that matter), they just were
not worth the trouble to break.

Hardly, since those three OSs control a sizable portion of the financial
world.


Ah. A new issue emerges.

So, we should use only these machines for web surfing. And if they can
achieve safety despite naive users, why can't Windows do the same?


For the same reason the Macs give the false appearance of being immune
to these issues - the simple fact that there are few threats targeting
them. Like I said, a privileged user downloading and running malicious
code will hose any OS.


I'll start to worry when Mac achieves 25% market share. Until then,
Macs are effectively immune.


I think Compaq could have if they had understood what they had and the
enterprise world. HP was hopelessly screwed up with Carley running
around hyping every low margin consumer toy while selling off or
otherwise destroying the diverse underpinnings of the company.


I suspect that you are right.


It's pretty sad, especially with the $45M "thanks for destroying the
company" parting gift.


Yes.



Software is most often the cause of problems and it's only getting worse
as the software gets both more complex and more poorly engineered.

Yes, but reliability is still a problem even if well engineered. It
usually takes a few years of intense post-delivery bug fixing to
achieve reliability.

Indeed, and this is something that MacOS is just as subject to as
Windoze.


Yes, they are still computers. But Apple seems to push it much farther
than Microsoft, and Apple has better control of the Mac ecosystem than
MS has of the Windows ecosystem.


Macs are socialists and PCs are capitalists, and that debate has gone on
for years with no resolution. Each can work, it's just a function of how
much liberty the user is willing to sacrifice in the name of security.


I'm not sure I buy the analogy, but it is lots of fun.


This is a bit self contradictory. Those flighty non-technical creative
types love the Mac but are clueless about IT, have no internet
discipline whatsoever, and yet they prosper.

I think the ones that "prosper" are the ones that keep work and personal
machines separate.


Should not be necessary, as mentioned above in multiple places.


Well, it *is* necessary to either keep the two separate, or have the
knowledge / self control to avoid what should be obvious threats.


Not everybody can be an IT guru.

So, the summary is that Windows is only for IT gurus, and Macs are for
everybody else?


Just think what stolid
uncreative technical types could do with such a tool.

Nothing, absolutely nothing, because the whole Mac concept is to prevent
anyone from doing anything technical.


Not quite. The objective is to hide technical details from those
uninterested in such details.

If the metric is access to *all* the technical details, you should
choose Linux.


Not so. The technical controls are at the terminal window (and root
console) level, and many controls are only at that level. One can argue
that this or that control should or should not be GUI accessible, and
I'm sure that there will be some migration in the coming years, but the
controls are there, just mostly kept away from naive users.


And that only came about in the past couple years. Pre OSX there was no
ability to do anything technical. Post OSX there is some, but post OSX
MacOS is just another Unix variant with a particular flavor of GUI
shell.


Yes and no. I had developer-level knowledge of pre-10 MacOS, and it was
all there, but mostly hidden from average users. It wasn't locked, it's
just that the average user wouldn't know which rock to look under.

MacOS 10 and later keeps all the good stuff under the UNIX rock.



Be careful about who you call a "rehash" (nice neutral word that):
Intel processors are by the same token an absolute hash, retaining
compatibility with every past version, with bits encrusted upon bits.

They had been until the infusion of stolen Alpha technology.


Not exactly. IBM invented the RISC processor architecture, and the
first RISC CPU was the IBM 801.


Ok, but Intel's boost came / is coming from what they stole from Alpha.


Can you cite a court case on this? I'd like to read the ruling.


Nor is the Alpha really a RISC machine, as it had to execute the VAX
instruction set. I no longer recall the details of what was done in
silicon and what was done by the compilers, but the VAX instruction set
is pretty big and complex, even if it is nicely designed. My
recollection is that DEC implemented all the one and two address
instructions in Alpha silicon, and everything else was emulated.


Microcode I believe. I never dug all that deep into it though since I'm
not a programmer.


Not microcode; this is anathema in the RISC world. In RISC, one has a
few very simple but exceedingly fast instructions, from which all else
is cobbled together. This is a large net win, largely because simple
instructions are so much more common than complex ones in actual
application code.




Of course, people switch one way or another for a variety of reasons and
I know at least three or four people that switched from Mac to PC. Since
the installed base of PCs is very high and since there are a number of
alternatives to Windoze that will run nicely on that existing hardware,
the bulk of people looking to switch off of Windoze are likely to follow
the low cost path to keep their existing hardware. When Apple introduces
MacOS for PC hardware then those users might consider it or by then they
may well be happy with one of the other Unix offerings and not be
enticed by the Mac UI.


People switch for multiple reasons, and low hardware cost isn't the only
reason.

And there are people that run multiple platforms.



Um. Circuit Cellar is for hobbyists, not the military industrial
complex.

Hardly. Circuit Cellar is for the embedded engineering world, but uses a
format with the hobby projects of those embedded engineers to highlight
a lot of the new stuff and keep the magazine interesting. As a useless
side note, I've provided some of the props and ideas for the covers in
recent years.


A lot of embedded projects are for microwave ovens and the like. These
are very small systems, and typically use very slow CPUs (because they
don't need anything faster). The Military Industrial Complex is solving
a very different set of problems.


Unfortunately those MI solutions keep getting blown up on the end of a
cruise missile.


Nah. VMEbus cards are too large, and won't fit.

There is a lot more to MI than missiles. Tends to be low volume, high
cost per item.


There is a lot of embedded process control stuff as well.


This is the Industrial end of things. Lots of VMEbus et al. Not
everything can be done on an 8051 microcontroller.


If you look through magazines like Embedded Systems
Programming, you'll get a far different picture. For instance, the bulk
of the VMEbus SBCs (single-board computers) sold are made by Motorola
and use the PowerPC processor. The runner-up is Intel processors, and DOS
isn't dead.

Been quite a while since I've looked at those magazines.


OK. Some libraries have them.


Hmmm, I'm not even sure where the local library is since I moved a year
and a half ago. I should probably find it.


Actually, this may be one that is advertiser supported, and thus is free
to anyone that claims to be in the field and thus a potential customer
of their advertisers.


Um. Hardware abstraction layers are another form of barnacle, and kill
performance. We live with the performance cost for practical reasons,
but there is nonetheless a cost. Look into the history of microkernels
in operating-system design.


It's certainly a barnacle, but it's a bit less problematic than the
hardware version. At least you can get the better hardware and then apps
that need the performance can run native code.


That's the reason people pay the price, but it is a price.


It won't happen anytime soon. This has been suggested for years, and
Steve Jobs (a founder and the current CEO of Apple) always says that
allowing MacOS to run on generic PC hardware would put Apple out of
business. I see no reason not to take him at his word.

Given Apple's various product duds and reversals of concepts like open
architecture to closed architecture and back to semi-open architecture,
I see no reason they won't eventually decide to exit the hardware arena.


You mean like the iPod?


I was thinking more of the IIc and Newton for duds. I also don't
consider the iPod or any other "consumer toy" in my overall analysis.


Long time ago. Apple actually has had lots of duds, because they are
always trying new things, and by definition new things aren't always a
success. Said another way, if there are no failures, there is no
innovation.



Macs are about as configurable as Dell PCs, right down to configuring
and ordering from the Apple website. If you like, go to
http://www.apple.com/ and click on the Store tab. You can walk
through the entire choose, configure, and price process without having
to register or provide a credit card. (What's in the stores is a
fraction of the configurations available from Apple.)

And exactly how much of that is *user* configurable?


I don't get your point. The Apple store cannot tell if you are a
unwashed user or an administrator, so long as your money spends good.


How much of that can you as the end user at home configure and
reconfigure as your needs change? My PCs go through many incarnations as
I get newer machines and migrate good hardware from older machines. I
also combine leftover bits of old machines into usable machines for
specific tasks. I don't need a GHz machine with dual monitors to run a
PIC programmer for example.


Skilled users can disassemble and remake Macs just the same as one can
with PCs. At this level, hardware is much the same. That said, I've
never found it really useful to do, although I have done it. The core
problem is that the hardware becomes obsolete so fast (in PCs and Macs)
that I don't really have any real hardware investment to protect after
three years. The money is in the application software, and my time.

I instead sold my old machines to a friend up the street for about 10%
of the original new price, and he gave these Macs to his daughters.


All products are packaged in some manner, so you always get more than
you absolutely wanted or needed. I guess I don't see your point.

In the PC world (regardless of OS), I have far greater flexibility to
configure a hardware platform to my exact needs.


If this is simply another way to observe that the Windows ecosystem is a
factor larger than the Mac ecosystem, I agree. But the question was
adequacy for a purpose, not ecosystem size per se.


See my comment above about the "investment protection" the ability to
migrate hardware from older systems and between systems provides.


See above.


Yes and no, Macs could do the same tasks as PCs, but in some cases they
lack the available options (both hardware and software) to do so. It
also took Apple quite a while to join that "public road" and abandon
their proprietary networks and busses (SCSI being the only notable
exception).


And Firewire. And ethernet.


Firewire is far more recent. As for Ethernet, it took Apple an
excruciatingly long time to realize that Ethernet had to be a standard
feature. They clung to their Appletalk stuff for a long time, kind of
like IBM clinging to Token Ring. I also recall the $400 Ethernet cards
for the few Macs that could take them when the PC world had them for
$50.


I don't know. Appletalk on ethernet was used widely, and didn't cost
$400.

Also note that Macs had Appletalk and true networking from 1984, long
before PCs discovered any networking. And Appletalk just worked. That
friend up the street had something like six machines networked in his
home in the late 1980s to early 1990s.



That's not interoperability, that's openness to third party software.
Interoperability is working with established standards, something that
Macs have been loath to do.


Um. Have you been following Microsoft's tangles with the EU antitrust
regulators? They are proposing fines of a million dollars a day.


The EU antitrust folks are about as far out in left field as they can
get. If I were Bill I would pull all MS products from the EU and void
the license agreements, but that's just me and I can be a bit harsh.


Well the EU antitrust folk may be in left field, but they are the law.



Even the best of laptops have about a 10% failure rate, according to
Consumer Reports, so one can always find someone with a dead laptop.

I think Apple is the only one who had melting laptops though.


True, but lots of PC laptops get too hot to have on one's lap. In any
event, Consumer Reports shows that Apple desktops are by far more
reliable than any PC, but that Apple laptops are in the middle of the PC
range (which isn't that wide).


Dunno, I have or have had a Armada 1590-something, an Armada M700, a
Thinkpad 600X and a Dell 600-something and none of them seem to have any
thermal issues.

Every Mac user I know (which is about five) has had hardware problems
at one point or another. A small sample, granted, but still a 100%
problem rate.


Lots of my friends have had hardware problems, especially with laptops.
But the stats are clear - Mac desktops are far more reliable than PC
desktops. Laptops are all about equally bad.


Macs have been losing some ground in parts of the graphics world, due to
a number of factors. Since most of the same software is available for
Windoze and works just as well there, and since Windoze is still a
business essential and less expensive per-seat than Macs, some companies
have ditched Macs in favor of PCs / Windoze (CCI did this).


Macs are something like 60-70% of the Graphics Arts world.


And at one point they were like 95%. As noted, CCI dumped Macs from
their publishing work since they could get the same apps for Windoze and
eliminate the more expensive machines and additional support issues.


There was some switching a few years ago, but the market share has been
holding steady recently.


I was the sole holdout at work for many years, long enough to have
managed to miss all the Windows 3.x and NT dramas my coworkers told me
about over lunch. It was a lot of fun, at least for me. After a while,
someone would notice that I had been uncommonly quiet, and would ask me
a question. I would reply with some variant of "I have a Mac, and it
just works, so I have nothing to report". Getting Windows to support
CD/ROM drives was a leading cause of wasted weekends.


That rather mirrors me watching the numerous problems on the zillion
Unix systems while my VMS systems "just work".


VMS (like many mainframes of the day) was quite solid. Too bad it's no
longer mainstream.


Of course I also noted
how my Win95 desktop at a previous job would happily run for 45+ days at
a time while my coworkers identical desktops would crash every few days.


Bet you were popular. Again, there seems to be a difference in IT skill
at work here, but a machine intended for such wide use should require no
IT skill.


Comparisons will be interesting as I see the Mac in its current OSX
form as nothing more than another UI shell available for a standard Unix
base.


The standard retort is that by the same token there is no difference
between a Porsche and a Chevy; they are both cars.


Nope, the comparison would be a Chevy and a Ford (or Dodge or Toyota,
etc.). Comparing a Chevy to a Porsche is like comparing an ordinary PC to
a multiprocessor, hot swappable, fault tolerant PC server, and a Mac is
not in that class either.


OK, a Chevy and a Cadillac.


How can you possibly justify forcing Microsquish to include competitors
products with their distributions? Is GM required to include Ford
products with the cars they sell just because some users may prefer to
put a Ford dashboard in their GM car? That's about on par with some of
the stuff pushed on Microsquish.


Any company that achieves ~90% market share in an important industry
will find its freedom of action curtailed. The classic example is AT&T,
which achieved a similar market share by methods that are now illegal,
but were common back then. The solution was to turn telephones into a
regulated utility. This is probably one of Microsoft's biggest fears,
but their current conduct makes such an outcome more and more likely.


Except that there is no comparison between a telephone utility and a
commodity OS. There was no choice for the telephone service since it was
delivered by a fixed network; anyone can run whatever OS they choose on
their PC hardware and they have quite a few choices. The fact that they
choose one particular brand of OS overwhelmingly does not make a
monopoly in my book.

Now that there are alternatives to traditional Telcos for voice
services, there is no longer a Telco monopoly either and interestingly
enough we see the old "Bell bits" recombining.


Missing the point. There was no equivalent to the telephone before it
was invented. The issue is that if something becomes that important,
and yet is a "natural monopoly" (where either economics or technical
issues more or less demand that there be a single supplier), it isn't
long before the monopoly is either broken up (as in the oil industry) or
converted to a regulated utility (as in city gas, electricity, water,
telephones, and to some degree cable TV).


Datapoint: At its peak, IBM had only a 70% market share.


In which market? They cover many markets with vastly different needs.


Computers, which meant mainframes back then. Remember "IBM and the
Seven Dwarves"?


Anyway, when barriers are removed, migration happens. Some will go to
Linux (if they like that unpolished an environment), and some will go
to
MacOS (polished exterior, real UNIX available below).

Yep, Apple might reach 8% market share while Windows drops to 67% and
Linux/Unix rises to 25%.


Truth is, Apple (and Mac users) would be perfectly happy with ~10%
market share.


And that's likely what they will have. I just don't see any compelling
reasons why they would achieve more than this regardless of what
Microsoft does. In fact given totally free hardware and software, I
don't think the MacOS could ever capture more than 50% of the market
since it primarily appeals to the creative types and they are perhaps
50% of the population.


But they are far more fun.


Joe Gwinn


  #91   Report Post  
Posted to rec.crafts.metalworking
Robert Nichols
 
Posts: n/a
Default Linux is Driving me $#@!!!! nutz!!!

Note: Crosspost to misc.survivalism deleted. Sorry, but I don't
want to post there. See signature for email addr.

In article ,
Gunner wrote:
:
:Connected via modem (kppp) to earthlin
:Contents of resolve.config
:
:
:domain earthlink.net #kppp temp entry
:# nameserver 206.13.28.12 #entry disabled by kppp
:# nameserver 205.152.0.20 #entry disabled by kppp
:nameserver 207.217.126.81 #kppp temp entry
:nameserver 207.217.77.82 #kppp temp entry
:nameserver 207.217.95.8 #kppp temp entry
:===============
:Doing a route call from terminal
:
:gunner@1[~]$ route
:Kernel IP routing table
:Destination Gateway Genmask Flags Metric Ref Use Iface
:acn09.ca-pasade * 255.255.255.255 UH 0 0 0 ppp0
:default acn09.ca-pasade 0.0.0.0 UG 0 0 0 ppp0
:gunner@1[~]$
:
:Im running dialup, and the Box is configured to connect to the local
:network here at home, via nic card on eth0. For some reason..it wants
:to send internet requests to the lan, rather than the ISP (via
:dialup), unless I turn off eth0. I run static IPs assigned to each
:machine in the LAN in the 192.168.0.1-192.168.0.15 range, with a mask
:of 255.255.255.0, not having a DHCP server assigning them.
:
:Im rather like a moderately retarded kitten flailing around with a
:ball of yarn..and the yarn is winning.
:
:Ok..here is a big question...in dos, we have the config.sys and
:autoexec.bat files, where house keeping settings are made, and then
:various programs are run on boot. What is the Linux equivalent? What
:is the boot sequence, and what is looked for as it boots?
:
:Example..it makes a call and tries to look up ntpdate..outside time
:server on bootup. Where is this and how do I edit it out?
:
:I really really need to get some books on this damit...
:
:Im tempted to reinstall, without the nic card, let it autoconfigure
:the dialup, then add the nic card after.

The arrangement of configuration files and the GUI tools that you
normally use to set them up varies among distributions. As long as you
keep saying, "I've tried these 5 distributions ..." it's hard to give
you specific answers. I'm familiar with Fedora Core 3. I have no idea
which of the others might be similar. Unfortunately, it's been a long
time since I had to configure a PPP connection, and the last time I
did it was with a homebrew script and not the very primitive tools
that were available at the time.

The routing table you've shown looks just fine. It makes no mention of
eth0. Is this because you had eth0 disabled for this test and PPP was
actually working? If so, you need to show what the routing table looks
like when PPP is _not_ working properly. I suspect that your problem
would be solved by configuring eth0 _without_ a default route. The
PPP daemon typically won't override an existing default route.
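
If you want to test that by hand before touching any config files, check
for and remove the stray default route (the -n flag keeps route from
doing slow reverse DNS lookups), then bring the PPP link up:

$ /sbin/route -n
(look for a 0.0.0.0 destination line whose Iface is eth0)
# /sbin/route del default

With no default route already in place, pppd's "defaultroute" option
(normally set in /etc/ppp/options or passed by kppp) should install the
dialup link as the default when it connects.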

The nameservers that kppp is finding are working nameservers, and the
output from the 'route' command is showing the result of the reverse
lookup of the gateway IP address, so it looks like DNS queries, at
least, are being correctly routed over the ppp0 interface.
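
If you ever want to check that directly, you can point a lookup at one
of those nameservers by hand while the PPP link is up, e.g. using the
first earthlink nameserver from the resolv.conf shown above:

$ nslookup yahoo.com 207.217.126.81

If that answers but a plain "nslookup yahoo.com" does not, the problem
is in the resolv.conf handling rather than in the link itself.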

As for disabling functions like NTP, see if you have the command-line
tool 'chkconfig'. If you do (distribution-dependent, again), then
permanently turning off NTP is most likely just a matter of running

chkconfig --level 12345 ntpd off

from the command line. That's a commonly available low-level tool. Any
given distribution almost certainly has a nice GUI tool for enabling and
disabling system services and daemons.

--
Bob Nichols AT comcast.net I am "RNichols42"
  #92   Report Post  
Posted to rec.crafts.metalworking
Donnie Barnes
 
Posts: n/a
Default Linux is Driving me $#@!!!! nutz!!!

On Thu, 05 Jan, Robert Nichols wrote:
As for disabling functions like NTP, see if you have the command-line
tool 'chkconfig'. If you do (distribution-dependent, again), then
permanently turning off NTP is most likely just a matter of running

chkconfig --level 12345 ntpd off

from the command line. That's a commonly available low-level tool. Any
given distribution almost certainly has a nice GUI tool for enabling and
disabling system services and daemons.


That's "common" to all RH derived distributions, anyway. Not sure how many
others have adopted it, but that stuff was written by RH employees.

Also note that you can run 'ntsysv' to just edit your current runlevel in a
text based UI that's decent. If you aren't the type to ever change your
runlevel regularly and just want to kill or enable stuff in the "normal"
runlevel, ntsysv is for you.

Note too that neither ntsysv nor chkconfig will start or stop services; they
only change the flags that say whether those services will get started at
boot time! If ntpd is on and you want it off you have to do:

service ntpd stop

(There are other ways, but this is the easiest and best, though again it
is only really standard on systems that use 'service', which includes RH
and derived distros, of which Fedora is one.)

So, to turn ntpd off and to not have it come back after a reboot, you need
both the 'service' command above *and* the 'chkconfig' command (or run
ntsysv and uncheck the ntpd box).
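
So on a RH-style box the whole sequence, stopping the daemon now and
keeping it from coming back at boot, looks something like:

# service ntpd stop
# chkconfig ntpd off
# chkconfig --list ntpd

(chkconfig with no --level argument defaults to run levels 2345, which
is normally all you care about; the --list line is just to confirm it
now shows "off" for the run levels you boot into.)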


--Donnie

--
Donnie Barnes http://www.donniebarnes.com 879. V.
  #93   Report Post  
Posted to rec.crafts.metalworking,misc.survivalism
Pete C.
 
Posts: n/a
Default Linux is Driving me $#@!!!! nutz!!!

Joseph Gwinn wrote:

I have snipped areas where the debate has become cyclic.

In article ,
"Pete C." wrote:

Joseph Gwinn wrote:

In article ,
"Pete C." wrote:

Joseph Gwinn wrote:


snipped


Apple's usual strategy is to make the technical controls invisible to
the ordinary user, but one can do damn near anything from a
terminal window. Or, if one enables it, the root console.


I can't give you the exact details since it was a good year ago, but
most of the annoyances were related to network configuration and WiFi
configuration. The terminal window was about the only place I was able
to accomplish anything as the UI dialogs were either completely
mislabeled, refused to accept perfectly valid entries, or were missing
the setting options entirely.


I recall that WiFi was a problem for everybody, although the complaints
have died down.


Well, I've set up a bunch of WiFi stuff with Linksys, Netgear, Compaq and
Belkin branded stuff depending on what was on sale and I've not had a
single issue with configuration or interoperability. Only with the Mac
were there issues, not the least of which was the fact that unlike
everyone else, they did not use the proper terminology.


I don't know what she was doing, but clearly you are far more the IT
guru than she. An aggressive or merely clumsy user with admin
privilege can make themselves lots of trouble.

Indeed that is what I've concluded. The myth that Macs are more stable
than PCs is simply that, a myth. I've also noticed that many Mac users
seem to under report the number of system issues, somehow not counting
the need to reinstall an application to get it to work properly as a
system problem.

So we return to the mainstream problem, that Windows systems are far
more fragile in practice than Macs, once one removes the effects of
clumsy meddling. PCs isolated from the world actually work tolerably
well these days, once correctly set up, but how many people want to be
isolated, or to spend their time managing multiple anti-virus and
anti-spyware programs?


Again you are pushing a myth that Windoze users need to spend all their
time managing multiple security programs, this simply isn't true. The
only time Windoze users need to do that is if they want to rely on
multiple freeware security packages vs. spend $40 on a single commercial
product. The fact that Windoze users have more options to choose from is
an asset, not a liability.


The PC mags would disagree. They recommend multiple security packages,
because no one does a good enough job by itself.


Well, my experience differed I guess as I've found no issues with using
a single commercial package.


I submit that your answer above proves my point in spades. Listen
to the tone of voice, and parse the implicit assumptions.

Huh? Hardly. I find the Windoze UI vastly more tolerable than the Mac
UI, largely because I can customize the Windoze UI sufficiently to
eliminate the most annoying parts. This does not in any way indicate
that I am a Windoze fan or bigot, simply that I hate the Mac UI. My OS
preference is VMS, however there is a bit of a shortage of affordable
applications for the things I do.

You really don't hear it, do you? OK, I'll parse it a little:

We'll set the stage with such dispassionate, value neutral statements
like "MacOS was total crap until Apple finally realized they lacked the
expertise to write an OS" - It may be crap, but 25 million users rather
like it, and were known to say similarly unemotional things about DOS
and Windows.

How is an OS that had -no- memory management until the entire OS core
was scrapped and replaced with a Unix core not crap? Windows was
evolving memory management (which I consider to be a fundamental concept
for an OS) in the Win 3.1 days and had it working reasonably well long
before Apple gave up on their OS core.

Because neither original Windows (pre NT) nor original MacOS (pre 10)
had memory management.


Indeed, but Microsoft addressed the problem long (years) before Apple
did.


In a manner of speaking. The DOS core remained long after the supposed
conversion, causing endless memory problems. I think Win2K was the
first to have a clean memory architecture.


I don't recall having any memory issues when my machines were on NT
either.


Microsoft solved the problem by stealing VMS technology from DEC,
yielding the original NT core. Books were written about this deathmarch.


Indeed, it's amazing how much of the computing world originated from
DEC. I didn't follow this history in detail, but I thought that Microsoft
hired former DEC folks vs. the more outright stealing of Intel / Alpha
that resulted in a lawsuit.


It's true that MS hired lots of key DEC folk. Don't know about Intel
and the Alpha. Do you have any cites?


I don't have any specific cites, but it was in the national news when it
occurred. Something along the lines of Intel's new CPU design infringing
on a number of HPaqDEC patents shortly after the Intel engineers had had
some sort of meeting with the HPaqDEC engineers. I guess it occurred
after HP was on the scene and Intel was able to con clueless Carly into
just selling them the whole Alpha thing instead of enforcing the patent
rights.



Apple solved the problem by buying NeXT (getting Jobs back as part of
the package deal). NeXT was built from scratch by Jobs after he was
tossed out of Apple some ten years earlier. The OS core of NeXT was BSD
UNIX, from which Jobs and company built a thoroughly object-oriented
software platform. When NeXT came back, the solution was obvious -
replace the NeXT GUI with a Mac-like GUI, retaining the UNIX core from
NeXT.


That NeXT was an odd little thing that never seemed to go anywhere.


Yes, it only made a few hundred million dollars, so by Jobs' standards
it was a failure. Jobs founded three companies, Apple, NeXT, and Pixar
(Toy Story). Apple and Pixar are worth billions.


Pixar seems to be stumbling a bit these days as the novelty of CGI
movies has worn off.



snip


Downloading stuff (including napster) is very much a part of the net.
It shouldn't be possible for this to cause such problems, even if some
users are naive and some people out there are evil, because such people
have always existed, and always will.

That is perhaps one of the most absurd statements I have ever heard.
Nowhere else in life do people have such an absurd expectation that they
should be magically protected against their reckless actions.

Go walking through a dark alley in the bad part of town at night and you
will probably be mugged and nobody will say that shouldn't be possible.
Hop in your car and go careening down the road ignoring safety rules and
you're going to get in an accident and nobody will say that shouldn't be
possible.

Bad analogy. A better analogy would be to ask if you expect the Police
to keep bad people from coming to your town and mugging people, or
breaking into their homes.


That doesn't work in the real world either. The Police are only there
after the fact, not to prevent anything.


The analogy breaks down, but is still better than the dark-alley
analogy. And we do expect the Police to catch and badly hurt the perps,
so they won't do it again. We don't care if it's because they are
incapacitated, or merely deterred, so long as they stop.


That doesn't change the fact that *you* are individually responsible for
your own safety and security. The police may be there after the fact,
but if *you* expose yourself to a predictable threat i.e. walking down
that alley in the bad part of town, or running without adequate security
software, *you* will still suffer the consequences of *your*
carelessness.


Most users are experts at something other than the computers that they
must use as a tool. It's expecting far too much to think that this will
ever change, if for no other reason than that by definition the clueless
vastly outnumber the priesthood. But the money from vast unwashed
clueless masses folds and spends pretty good, and keeps the priesthood
in beer.


True, but has little to do with the technical merits of the Mac OS vs.
Windoze vs. whatever else.


Not exactly. A system that doesn't need so much effort to make safe is
going to be very much easier to live with for those users who are not IT
gurus.


In theory perhaps, but as we all know, the masses rarely make decisions
based on a careful analysis of technical details. The masses choose
Windoze largely because it does an adequate job and it's just what
everyone else uses.


The fact is that whether you are using a Mac or a PC, downloading (and
running) questionable junk such as Napster *will* cause problems and I
have seen plenty of examples of this on both platforms. The Mac provides
no more protection from these careless user actions than Windoze does.

Tell me, would using a VMS host to download stuff from even the worst
sites cause any danger? If not, why can't Windows do that too?


If you were to find a "worst site" that actually had malicious VMS code
and were to download and run it as a privileged user it certainly would.


Misses the point.


No, the point is there, the perceived security of the Mac is a myth that
would rapidly evaporate if Macs ever achieved any large market share. As
soon as there was a sufficient mass to make an attractive target for the
various virus writing scum, the Mac world would go down in flames until
they took security seriously. In the meantime the Macs enjoy simulated
security through obscurity.



For whatever reason, Macs are immune to this stuff. You can argue that
it's mere insignificance that protects the Macs, but the fact remains
that Macs are immune.


The Macs are in no way immune, there are simply fewer bugs for them to
catch currently. They are the "bubble boy" who does just fine while
isolated from the germs but will quickly die in the real world due to
their lack of defenses.


It may be true that if Macs were 90% of the market, they would fall
under the assault. But Macs aren't going to achieve 90%, so what's your
point?


My point is don't claim that Macs are secure or are immune to attack
when they are not. See "simulated security through obscurity" above.


Your belief that Macs don't need to be "protected" from
the 'net is also false.

Most Macs do not run with any virus protection whatsoever, and are none
the worse for it.

And that is simply a function of threat volume and statistics, not a
function of platform security. Because PCs running Windoze outnumber
Macs 20:1 the volume of viruses targeting Windoze outnumbers those
targeting Macs by an even larger ratio and the probability of a
particular Windoze user being hit by one of those viruses is
consequently higher.

Whatever. See above.


Immunity and lack of exposure are not the same thing.


True enough, but so what? See above.


Then claim that Macs benefit from their obscurity, not that they are
more immune / secure which they are not.


Do you not wear a seat belt in a Volvo because they are perceived as
safer? Do you not follow traffic safety rules because your Volvo will
protect you? You might get away with that false sense of security for a while
just due to statistics, but you *will* get nailed eventually.

I wear a seat belt because they demonstrably reduce injuries, regardless
of the make and model of car in question. It's still a car, and an
unbelted person will collide with something in a crash.

But how does this analogy apply to computers? In Macs, no such "seat
belt" is needed.


Completely false. Macs need security products just like Windoze PCs need
them and any other OS needs them. The relative threat level is lower,
but the threat still exists.


I'm not so sure that Macs need security products (despite the thunder
from the security product vendors), because most seem to do just fine
without such products.


They need security products if the user wants to help ensure they do not
experience an attack of one sort or another. They may get along fine for
some length of time without security products, but they will eventually
suffer an attack. Cars did "just fine" for quite a while before seat
belts became the norm too. Of course now they try to force the asinine
airbags on the unsuspecting masses, kind of like security software that
formats your hard drive if it detects a virus it doesn't know how to
clean.


That has nothing to do with "isolationism", it has to do with product
quality. Do you purchase brake shoes for your car from some guy in a
dark alley? Would you expect them to be safe? Why would you expect any
different if you get your software from equally questionable sources?

I don't see the analogy. Are you claiming that Macs are bought only
with small unmarked bills from junkies in dark and fetid alleys? This
is quite the scoop - I always wondered about them.

Load garbage software (Napster et al) from suspect sources onto a
computer (Mac or Windows or any other OS for that matter) and you *will*
have problems just as surely as putting cheap counterfeit parts on your
car *will* cause problems. Even a top-grade ultra-secure OS such as VMS
will have problems if a privileged user were to execute malicious code
on them.

Rehash. See above.


Rehash and still true. A privileged user loading malicious code *will*
hose any OS. The only alternative to this is to keep the users
non-privileged and/or prevent the loading of non-certified code.


We've drifted off the target. The original question was why it was that
the internet is so much more dangerous for PCs than Macs. If MS had
really copied VMS correctly, Windows would be essentially immune to all
the evil stuff floating around the web.


Not true at all. Even if Windoze incorporated just about all of VMS, with
it having the mass of Windoze to be a target of attack, and with all the
users operating as SYSTEM (administrator for Windoze) you'd have exactly
the same situation. Keep the users limited to minimal privileges and
you'd be ok for the most part, same with Windoze.

But it's Macs and Linux/UNIX
that are essentially immune, not Windows. What happened?


Linux/UNIX are not even remotely close to immune, they require even more
active effort than Windoze to attain any measure of security. Macs are
of course also not immune, they are simply obscure enough to not be much
of a target.


If you read the PC magazines (yes, PC magazines), you will see that they
consistently rate the reliability and quality of Macs at the top, with
Dell close behind. Apple beats all PC makers on quality of user
support. Consumer Reports backs these findings up.

Consumer Reports has -zero- credibility with me in -any- subject area.
As for reliability and quality, the PC magazines use some questionable
criteria and also exclude many PC lines from their comparisons. The same
goes for support as the comparisons typically exclude the "business"
lines from the PC manufacturers.

So, who do you believe?


Not CR certainly.


Some source beats no source, any day.


Invalid data is *not* better than no data.


Consumer Reports sends out a survey to their subscribers every year. In
this survey, the subscribers report their experience with all manner of
products, and this experience is summarized and reported.

Specifically, on page 35 of the December 2005 issue of Consumer Reports
appears the "Brand Repair History" (based on 85,000 desktop systems and
49,000 laptops) and "Tech Support" (based on 6,500 responses for
desktops and 4,200 responses for laptops).

There is no better publicly available source of by-brand reliability
data. And the scores don't vary that much from year to year, as
reliability and tech support don't just happen, they are achieved if and
only if the company management thinks them important.


I'll have to look in the store to see the report.


OK.


I looked and they only had the Jan issue.



Actually, the old DoD 5200.28 family of standards has been withdrawn by
the DoD, replaced by Common Criteria and DoD 5200.1 and 5200.2. The
formal equivalent to the 5200.28 C2 level is CAPP (Controlled Access
Protection Profile) EAL (Evaluation Assurance Level) 3 or better.

Recent versions of Windows have CAPP EAL3 certs, as do two Linux
distributions. All the major UNIX platforms have EAL3 or EAL4. I
haven't checked MacOS, but I imagine that Apple will get or has gotten
the certs, just so they can sell to DoD. With a BSD base, it won't be
hard.

The fact that Apple had to scrap their entire OS core speaks volumes to
their software expertise. I can't see the DoD buying Macs for what
essentially is nothing more than a UI.

Microsoft scrapped their entire Windows 3.x OS core as well, in favor of
NT.


Yes, but Microsoft rebuilt their core with newly hired expertise.


From DEC. So, we may conclude that MS also discovered that they lacked
the expertise to write an OS.


Indeed, but acquiring that expertise and just acquiring a ready-to-go OS
core are not exactly the same thing. Kind of like hiring a few mechanics
to help you build a race car vs. buying a race car and putting your
paint job on it.


Apple would seek certification to be able to sell to DoD. If this will
work or not is quite another matter. But don't dismiss it out of hand:

There was a big flurry a few years ago when the US Army (at Ft. Monmouth
if I recall) dumped all their Windows webservers in favor of Mac
servers, mainly because the Army was tired of being hacked despite their
heroic efforts to keep the WinNT servers patched and running.


Oddly enough I run a Windoze web server, mostly 'cause I need that
machine up anyway to run an IVR server and general storage services.
It's been up 24x7 for a couple years now and despite frequent attack
attempts it's still doing fine. Nothing particularly special or heroic
on it either, I do the MS patch thing roughly monthly, have an ordinary
commercial software firewall on it and this server along with the rest
of my network is behind an ordinary Linksys NAT/Firewall router with
only port 80 mapped through to the web server.


Hardware firewalls help a great deal. As do expert IT efforts. How
many average users can manage this level of expertise?


Um, just about any of them. I've got a Linksys router plugged into the
cable modem, and plain old Norton Personal Firewall on the server. The
only thing remotely "configured" is one port map on the router to map
port 80 to the address of the server. This is not brain surgery by any
means and it's worked just fine. The two lightning strikes that took out
two cable modems, two routers and a couple Ethernet ports are another
story.


This is nothing recent to Windoze, it has been common for a long time to
"lock down" Windows in business environments so that the users are less
able to compromise the systems.

The problem has been that a fully locked down Windows system is close to
useless, as many Windows apps won't run as anything other than admin.
This has just now started to change, but will take years to achieve what
MacOS now has.


I can't say as I've encountered any Windoze apps that have to run as
administrator. Sure they need to be installed as administrator, but the
ones I've used run fine as a regular user.


The PC and IT magazines discuss this from time to time, especially in
their security columns.


Dunno, haven't read them or run across any issues personally.


I can't recall the last time I heard of a VMS or Tandem or Stratus
system being compromised.

By your own logic, this must be only because with their miniscule market
share compared to Windows (and the Mac for that matter), they just were
not worth the trouble to break.

Hardly, since those three OSs control a sizable portion of the financial
world.

Ah. A new issue emerges.

So, we should use only these machines for web surfing. And if they can
achieve safety despite naive users, why can't Windows do the same?


For the same reason the Macs give the false appearance of being immune
to these issues - the simple fact that there are few threats targeting
them. Like I said, a privileged user downloading and running malicious
code will hose any OS.


I'll start to worry when Mac achieves 25% market share. Until then,
Macs are effectively immune.


Obscure, not immune.



snipped

Software is most often the cause of problems and it's only getting worse
as the software gets both more complex and more poorly engineered.

Yes, but reliability is still a problem even if well engineered. It
usually takes a few years of intense post-delivery bug fixing to
achieve reliability.

Indeed, and this is something that MacOS is just as subject to as
Windoze.

Yes, they are still computers. But Apple seems to push it much farther
than Microsoft, and Apple has better control of the Mac ecosystem than
MS has of the Windows ecosystem.


Macs are socialists and PCs are capitalists, and that debate has gone on
for years with no resolution. Each can work, it's just a function of how
much liberty the user is willing to sacrifice in the name of security.


I'm not sure I buy the analogy, but it is lots of fun.


Less control (i.e. freedom or access to technical settings) in exchange
for more protection from threats (i.e. a social security blanket, or
protection from screwing up your computer). I think the analogy fits.


This is a bit self contradictory. Those flighty non-technical creative
types love the Mac but are clueless about IT, have no internet
discipline whatsoever, and yet they prosper.

I think the ones that "prosper" are the ones that keep work and personal
machines separate.

Should not be necessary, as mentioned above in multiple places.


Well, it *is* necessary to either keep the two separate, or have the
knowledge / self control to avoid what should be obvious threats.


Not everybody can be an IT guru.


No, but it doesn't require an IT guru to keep work and personal machines
separate.


So, the summary is that Windows is only for IT gurus, and Macs are for
everybody else?


Nope, but it does appear that Windoze users may be better at keeping
work and personal machines separate.


Just think what stolid
uncreative technical types could do with such a tool.

Nothing, absolutely nothing, because the whole Mac concept is to prevent
anyone from doing anything technical.


Not quite. The objective is to hide technical details from those
uninterested in such details.


Up until OSX they were more than just hidden.


If the metric is access to *all* the technical details, you should
choose Linux.


Certainly the metric is access to the technical details needed to
perform what I would consider "normal" configuration tasks.


Not so. The technical controls are at the terminal window (and root
console) level, and many controls are only at that level. One can argue
that this or that control should or should not be GUI accessible, and
I'm sure that there will be some migration in the coming years, but the
controls are there, just mostly kept away from naive users.


And that only came about in the past couple years. Pre OSX there was no
ability to do anything technical. Post OSX there is some, but post OSX
MacOS is just another Unix variant with a particular flavor of GUI
shell.


Yes and no. I had developer-level knowledge of pre-10 MacOS, and it was
all there, but mostly hidden from average users. It wasn't locked, it's
just that the average user wouldn't know which rock to look under.


How much of it was potentially accessible to the user with the knowledge
of where to find it vs. accessible to a user that purchased additional
development utility packages?


MacOS 10 and later keeps all the good stuff under the UNIX rock.


Right, it is at least accessible without the need for any additional
software.


Be careful about who you call a "rehash" (nice neutral word that):
Intel processors are by the same token an absolute hash, retaining
compatibility with every past version, with bits encrusted upon bits.

They had been until the infusion of stolen Alpha technology.

Not exactly. IBM invented the RISC processor architecture, and the
first RISC CPU was the IBM 801.


Ok, but Intel's boost came / is coming from what they stole from Alpha.


Can you cite a court case on this? I'd like to read the ruling.


It was settled out of court after Intel managed to con HP's clueless CEO
into just selling them the whole Alpha lot. It was in the national news
when it occurred.


Nor is the Alpha really a RISC machine, as it had to execute the VMS
instruction set. I no longer recall the details of what was done in
silicon and what was done by the compilers, but the VMS instruction set
is pretty big and complex, even if it is nicely designed. My
recollection is that DEC implemented all the one and two address
instructions in Alpha silicon, and everything else was emulated.


Microcode I believe. I never dug all that deep into it though since I'm
not a programmer.


Not microcode; this is anathema in the RISC world. In RISC, one has a
few very simple but exceedingly fast instructions, from which all else
is cobbled together. This is a large net win, largely because simple
instructions are so much more common than complex ones in actual
application code.


Right, and to emulate the CISC processor you need microcode calls to
produce the equivalent of the CISC instructions.




Of course, people switch one way or another for a variety of reasons and
I know at least three or four people that switched from Mac to PC. Since
the installed base of PCs is very high and since there are a number of
alternatives to Windoze that will run nicely on that existing hardware,
the bulk of people looking to switch off of Windoze are likely to follow
the low cost path to keep their existing hardware. When Apple introduces
MacOS for PC hardware, those users might consider it, or by then they
may well be happy with one of the other Unix offerings and not be
enticed by the Mac UI.


People switch for multiple reasons, and low hardware cost isn't the only
reason.


Indeed, and for the general masses, being like everyone else is a big
factor. Sure there are those that try to be different to make a
statement, but they are a minority and somehow they all end up looking
the same as well.


And there are people that run multiple platforms.


Indeed there are, however I have not seen many people that run both PCs
and Macs for any length of time, generally one is abandoned because they
do not provide significantly different functionality for most people. Of
course for most specialized applications the PC wins since the apps
aren't available for a Mac.


Um. Circuit Cellar is for hobbyists, not the military industrial
complex.

Hardly. Circuit Cellar is for the embedded engineering world, but uses a
format with the hobby projects of those embedded engineers to highlight
a lot of the new stuff and keep the magazine interesting. As a useless
side note, I've provided some of the props and ideas for the covers in
recent years.

A lot of embedded projects are for microwave ovens and the like. These
are very small systems, and typically use very slow CPUs (because they
don't need anything faster). The Military Industrial Complex is solving
a very different set of problems.


Unfortunately those MI solutions keep getting blown up on the end of a
cruise missile.


Nah. VMEbus cards are too large, and won't fit.


And most anything with connectors is a bit of a problem with those G
forces and vibration.


There is a lot more to MI than missiles. Tends to be low volume, high
cost per item.


Some nifty stuff in the surveillance end of things like on the EP3s. I
think I made them nervous shooting detailed pics and video of everything
in that plane, but nobody told me to stop...


There is a lot of embedded process control stuff as well.


This is the Industrial end of things. Lots of VMEbus et al. Not
everything can be done on an 8051 microcontroller.


You don't see too much 8051 stuff in CCI these days.


If you look through magazines like Embedded Systems
Programming, you'll get a far different picture. For instance, the bulk
of the VMEbus SBCs (single-board computers) sold are made by Motorola
and use the PowerPC processor. The runner-up is Intel processors, and DOS
isn't dead.

Been quite a while since I've looked at those magazines.

OK. Some libraries have them.


Hmmm, I'm not even sure where the local library is since I moved a year
and a half ago. I should probably find it.


Actually, this may be one that is advertiser supported, and thus is free
to anyone that claims to be in the field and thus a potential customer
of their advertisers.


Most of them are advertiser supported outside CCI. I just really don't
need them and don't have a need to kill all those trees even if they are
renewable. Now perhaps if I started heating my house with a free
magazine fueled furnace...


Um. Hardware abstraction layers are another form of barnacle, and kill
performance. We live with the performance cost for practical reasons,
but there is nonetheless a cost. Look into the history of microkernels
in operating-system design.


It's certainly a barnacle, but it's a bit less problematic than the
hardware version. At least you can get the better hardware and then apps
that need the performance can use native code.


That's the reason people pay the price, but it is a price.


I'm not in the development world, but I think the languages and cross
compilers are making the code conversion fairly painless these days.


It won't happen anytime soon. This has been suggested for years, and
Steve Jobs (a founder and the current CEO of Apple) always says that
allowing MacOS to run on generic PC hardware would put Apple out of
business. I see no reason not to take him at his word.

Given Apple's various product duds and reversals of concepts like open
architecture to closed architecture and back to semi-open architecture,
I see no reason they won't eventually decide to exit the hardware arena.

You mean like the iPod?


I was thinking more of the IIc and Newton for duds. I also don't
consider the iPod or any other "consumer toy" in my overall analysis.


Long time ago. Apple actually has had lots of duds, because they are
always trying new things, and by definition new things aren't always a
success. Said another way, if there are no failures, there is no
innovation.


Except their duds mostly get swept under the rug, where Microsoft's duds
are always hyped and paraded.


Macs are about as configurable as Dell PCs, right down to configuring
and ordering from the Apple website. If you like, go to
http://www.apple.com/ and click on the Store tab. You can walk
through the entire choose, configure and price process without having
to register or provide a credit card. (What's in the stores is a
fraction of the configurations available from Apple.)

And exactly how much of that is *user* configurable?

I don't get your point. The Apple store cannot tell if you are an
unwashed user or an administrator, so long as your money spends good.


How much of that can you as the end user at home configure and
reconfigure as your needs change? My PCs go through many incarnations as
I get newer machines and migrate good hardware from older machines. I
also combine leftover bits of old machines into usable machines for
specific tasks. I don't need a GHz machine with dual monitors to run a
PIC programmer for example.


Skilled users can disassemble and remake Macs just the same as one can
with PCs. At this level, hardware is much the same. That said, I've
never found it really useful to do, although I have done it. The core
problem is that the hardware becomes obsolete so fast (in PCs and Macs)
that I don't really have any real hardware investment to protect after
three years. The money is in the application software, and my time.


Depends on the applications I guess. In the past I've moved things like
DVD-R drives and specialized video and I/O cards. Those whose
applications are software only would not have this need.


I instead sold my old machines to a friend up the street for about 10%
of the original new price, and he gave these Macs to his daughters.


I've both given away old machines, and reconfigured them for dedicated
tasks where they were still quite adequate.



snipped


Yes and no, Macs could do the same tasks as PCs, but in some cases they
lack the available options (both hardware and software) to do so. It
also took Apple quite a while to join that "public road" and abandon
their proprietary networks and busses (SCSI being the only notable
exception).

And Firewire. And ethernet.


Firewire is far more recent. As for Ethernet, it took Apple an
excruciatingly long time to realize that Ethernet had to be a standard
feature. They clung to their Appletalk stuff for a long time, kind of
like IBM clinging to Token Ring. I also recall the $400 Ethernet cards
for the few Macs that could take them when the PC world had them for
$50.


I don't know. Appletalk on ethernet was used widely, and didn't cost
$400.


Dunno, getting the one Mac in the office at the old job onto Ethernet
did cost a bloody fortune compared to all the PCs. Think it was one of
the Macs that had a non-standard bus, perhaps NuBus?


Also note that Macs had Appletalk and true networking from 1984, long
before PCs discovered any networking.


Mainstream PCs perhaps. Same with graphics since PCs were doing high end
graphics well before Macs even found color.

And Appletalk just worked. That
friend up the street had something like six machines networked in his
home in the late 1980s to early 1990s.


I had several PCs networked at home in that general timeframe as well.
It was towards the tail end of that timeframe, but that also corresponded
to when I first had more than one machine at home at all.


That's not interoperability, that's openness to third party software.
Interoperability is working with established standards, something that
Macs have been loath to do.

Um. Have you been following Microsoft's tangles with the EU antitrust
regulators? They are proposing fines of a million dollars a day.


The EU antitrust folks are about as far out in left field as they can
get. If I were Bill I would pull all MS products from the EU and void
the license agreements, but that's just me and I can be a bit harsh.


Well the EU antitrust folk may be in left field, but they are the law.


Which is why I would simply pull out of that market if I was Bill. I
certainly would not do business or license my products for use in a
country or countries that tried to force me to bundle competitors'
products with my own.


Even the best of laptops have about a 10% failure rate, according to
Consumer Reports, so one can always find someone with a dead laptop.

I think Apple is the only one who had melting laptops though.

True, but lots of PC laptops get too hot to have on one's lap. In any
event, Consumer Reports shows that Apple desktops are by far more
reliable than any PC, but that Apple laptops are in the middle of the PC
range (which isn't that wide).


Dunno, I have or have had an Armada 1590-something, an Armada M700, a
Thinkpad 600X and a Dell 600-something and none of them seem to have any
thermal issues.

Every Mac user I know, which is about five, has had hardware problems at
one point or another. A small sample granted, but still a 100% problem
rate.


Lots of my friends have had hardware problems, especially with laptops.
But the stats are clear - Mac desktops are far more reliable than PC
desktops. Laptops are all about equally bad.


Still 100% problem rate for the small sample of Mac desktops I know of.


Macs have been losing some ground in parts of the graphics world, due to
a number of factors. Since most of the same software is available for
Windoze and works just as well there, and the fact that Windoze is still
a business essential and less expensive per-seat than Macs has led some
companies to ditch Macs in favor of PCs / Windoze (CCI did this).

Macs are something like 60-70% of the Graphics Arts world.


And at one point they were like 95%. As noted, CCI dumped Macs for their
publishing work since they could get the same apps for Windoze and
eliminate the more expensive machines and additional support issues.


There was some switching a few years ago, but the market share has been
holding steady recently.


That may be more related to the general freeze in the business computer
world due to the economic issues. Since change has upfront costs even if
it may save money long term, businesses have been holding on to their
current systems. As the economy readjusts and new initiatives begin I
think there will be additional shifting taking place.

The fact that nearly all applications for the Mac are also available for
the PC makes it fairly easy to generate a long term savings by
eliminating "non-standard" platforms and the support personnel
associated with them.


I was the sole holdout at work for many years, long enough to have
managed to miss all the Windows 3.x and NT dramas my coworkers told me
about over lunch. It was a lot of fun, at least for me. After a while,
someone would notice that I had been uncommonly quiet, and would ask me
a question. I would reply with some variant of "I have a Mac, and it
just works, so I have nothing to report". Getting Windows to support
CD-ROM drives was a leading cause of wasted weekends.


That rather mirrors me watching the numerous problems on the zillion
Unix systems while my VMS systems "just work".


VMS (like many mainframes of the day) was quite solid. Too bad it's no
longer mainstream.


It is still quite solid, and oddly enough I keep hearing of new
installations here and there and upgrades to existing ones. It doesn't
seem to be growing, but it also doesn't seem to be disappearing either.
Where it's phased out in one company it seems to pop up somewhere else.


Of course I also noted
how my Win95 desktop at a previous job would happily run for 45+ days at
a time while my coworkers' identical desktops would crash every few days.


Bet you were popular. Again, there seems to be a difference in IT skill
at work here, but a machine intended for such wide use should require no
IT skill.


I'm always pop-pop-popular as the "certified jack of all trades".



Comparisons will be interesting as I see the Mac in its current OSX
form as nothing more than another UI shell available for a standard Unix
base.

The standard retort is that by the same token there is no difference
between a Porsche and a Chevy; they are both cars.


Nope, the comparison would be a Chevy and a Ford (or Dodge or Toyota,
etc.). Comparing a Chevy to a Porsche is like comparing an ordinary PC to
a multiprocessor, hot swappable, fault tolerant PC server, and a Mac is
not in that class either.


OK, a Chevy and a Cadillac.


At which point you have the same function and reliability, with the only
difference being in the cosmetics. I for one don't want my computer to
look stylish, I want it to do its task reliably and inexpensively. I'm
not the type who wants to put a "stylish" Mac on my neatly prop-arranged
desk to impress my friends, I put my server PCs, network gear and big
UPS in a standard 19" equipment rack in the back corner of my garage.


How can you possibly justify forcing Microsquish to include competitors'
products with their distributions? Is GM required to include Ford
products with the cars they sell just because some users may prefer to
put a Ford dashboard in their GM car? That's about on par with some of
the stuff pushed on Microsquish.

Any company that achieves ~90% market share in an important industry
will find its freedom of action curtailed. The classic example is AT&T,
which achieved a similar market share by methods that are now illegal,
but were common back then. The solution was to turn telephones into a
regulated utility. This is probably one of Microsoft's biggest fears,
but their current conduct makes such an outcome more and more likely.


Except that there is no comparison between a telephone utility and a
commodity OS. There was no choice for the telephone service since it was
delivered by a fixed network, anyone can run whatever OS they chose on
their PC hardware and they have quite a few choices. The fact that they
choose one particular brand of OS overwhelmingly does not make a
monopoly in my book.

Now that there are alternatives to traditional Telcos for voice
services, there is no longer a Telco monopoly either and interestingly
enough we see the old "Bell bits" recombining.


Missing the point. There was no equivalent to the telephone before it
was invented. The issue is that if something becomes that important,
and yet is a "natural monopoly" (where either economics or technical
issues more or less demand that there be a single supplier) , it isn't
long before the monopoly is either broken up (as in the oil industry) or
converted to a regulated utility (as in city gas, electricity, water,
telephones, and to some degree cable TV).


There have been alternatives to Windows for as long as there has been
Windows, hence no monopoly. Windows simply won the popularity contest
and is now attacked because of it.


Datapoint: At its peak, IBM had only a 70% market share.


In which market? They cover many markets with vastly different needs.


Computers, which meant mainframes back then. Remember "IBM and the
Seven Dwarves"?


I wasn't real involved in the computer world pre-midrange, not really
old enough at 36.


Anyway, when barriers are removed, migration happens. Some will go to
Linux (if they like that unpolished an environment), and some will go to
MacOS (polished exterior, real UNIX available below).

Yep, Apple might reach 8% market share while Windows drops to 67% and
Linux/Unix rises to 25%.

Truth is, Apple (and Mac users) would be perfectly happy with ~10%
market share.


And that's likely what they will have. I just don't see any compelling
reasons why they would achieve more than this regardless of what
Microsoft does. In fact given totally free hardware and software, I
don't think the MacOS could ever capture more than 50% of the market
since it primarily appeals to the creative types and they are perhaps
50% of the population.


But they are far more fun.


Dunno, us technical creative types can be fun too. Certainly my friends
are often amused at the over-the-top solutions I pull out of my
posterior for some insurmountable problems.

Pete C.
  #94   Report Post  
Posted to rec.crafts.metalworking
Jon Elson
 
Posts: n/a
Default Linux is Driving me $#@!!!! nutz!!!

Pete C. wrote:
Jon Elson wrote:

Cydrome Leader wrote:

In rec.crafts.metalworking Gunner wrote:


Ok..for all you Linux junkies...this has been driving me nuts for
months and months and ... well you get the idea.
Originally..I thought this issue was hardware....but now...


What are you trying to do that justifies all the time wasted on trying to
run anything but windows?


Umm, I've been running Linux systems here, where I have a web server
and a mail server online 24/7. I have not had a successful hacking
attack in over 2 years, but they try 20+ times a day. My (several)
systems are often up for 60 - 90 days before a power outage hits them.
I essentially have never had a real crash. Sometimes a particular
software component gets confused, and I need to manually reset it.
That happens maybe once or twice a year. I have never seen a
software system as reliable as Linux, in 35 years working with
computers. (DEC's VMS came pretty close.)

Yes, there are a few things that still need work, and network
configuration is one of the weak points. They expect you to be
a network guru. I think I did get a regular modem working once,
years ago, and it was a tricky bit of business to complete the
auto login with the scripts, automatically.

Jon



I've had VMS systems with uptimes in excess of 800 days.

You just about have to have a UPS for that. And, while some
special systems can stay up that long, we had a bunch of guys
compiling new (and often badly-written) programs all day. That
takes a toll on any OS. And, I know a LOT of ways to foul
up a VMS system. My favorite is to just let the file system
run, without maintenance ("rolling the disks") for 6 months
or so. Guaranteed system crash, and DEC NEVER came up with
a fix until past where we cared. Also, clustering was supposed
to INCREASE reliability and availability, but without a lot
of dedicated hardware, it actually REDUCED availability. If
any machine hosting disk volumes crashed, it would pretty
much lock up all other machines that opened even one file
on the failed CPU. We couldn't afford star couplers and
dedicated disk servers for our Vaxstations.

Jon

  #95   Report Post  
Posted to rec.crafts.metalworking
Jon Elson
 
Posts: n/a
Default Linux is Driving me $#@!!!! nutz!!!

Gunner wrote:

Well..with the kind help of you good folks..I managed to bumble my way
into getting this bitch online..but its still not right. If I close
eth0, it will allow me to go on line. The moment I start it..it cuts off
the internet. This after inputting the proper DNS numbers. Ive got some
clues where to start looking..but this is still a whole new ball game to
me.

Of course, this makes total sense. eth0 is probably set up to route
all destinations that are not explicitly routed to a known host.
Do the route command, and you'll see a list of what addresses get routed
where. 127.0.0.1 is the loopback route to the local machine.
0.0.0.0 or default is the route to anywhere not otherwise listed.

If you want to make your machine with the modem a local router for your
home net, you will have to delete the route entry for 0.0.0.0 on device
eth0. That would be a command like this:

route del default dev eth0

then you'd have to set up a route to the ppp "device", although if the
net works through the modem, that most likely is there already.
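Spelled out as a sketch (assuming the dialup link comes up as ppp0 --
check the route listing for the actual device name), the pair of
commands would be roughly:

route del default dev eth0
route add default dev ppp0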

If you are not using any other machines in the home that need to be
networked to the modem, then just leave eth0 disabled.

Jon



  #96   Report Post  
Posted to rec.crafts.metalworking
Pete C.
 
Posts: n/a
Default Linux is Driving me $#@!!!! nutz!!!

Jon Elson wrote:

Pete C. wrote:
Jon Elson wrote:

Cydrome Leader wrote:

In rec.crafts.metalworking Gunner wrote:


Ok..for all you Linux junkies...this has been driving me nuts for
months and months and ... well you get the idea.
Originally..I thought this issue was hardware....but now...


What are you trying to do that justifies all the time wasted on trying to
run anything but windows?

Umm, I've been running Linux systems here, where I have a web server
and a mail server online 24/7. I have not had a successful hacking
attack in over 2 years, but they try 20+ times a day. My (several)
systems are often up for 60 - 90 days before a power outage hits them.
I essentially have never had a real crash. Sometimes a particular
software component gets confused, and I need to manually reset it.
That happens maybe once or twice a year. I have never seen a
software system as reliable as Linux, in 35 years working with
computers. (DEC's VMS came pretty close.)

Yes, there are a few things that still need work, and network
configuration is one of the weak points. They expect you to be
a network guru. I think I did get a regular modem working once,
years ago, and it was a tricky bit of business to complete the
auto login with the scripts, automatically.

Jon



I've had VMS systems with uptimes in excess of 800 days.

You just about have to have a UPS for that. And, while some
special systems can stay up that long, we had a bunch of guys
compiling new (and often badly-written) programs all day. That
takes a toll on any OS. And, I know a LOT of ways to foul
up a VMS system. My favorite is to just let the file system
run, without maintenance ("rolling the disks") for 6 months
or so. Guaranteed system crash, and DEC NEVER came up with
a fix until past where we cared. Also, clustering was supposed
to INCREASE reliability and availability, but without a lot
of dedicated hardware, it actually REDUCED availability. If
any machine hosting disk volumes crashed, it would pretty
much lock up all other machines that opened even one file
on the failed CPU. We couldn't afford star couplers and
dedicated disk servers for our Vaxstations.

Jon


Banks tend to be serious about backups, whether they be the multiple
UPSes, generators, redundant disk controllers, mirroring / RAID, etc.

I don't recall any VAXstations that had CI capability, mostly DSSI.

Pete C.
  #97   Report Post  
Posted to rec.crafts.metalworking
Dave Hinz
 
Posts: n/a
Default Linux is Driving me $#@!!!! nutz!!!

On Fri, 06 Jan 2006 00:32:46 -0600, Jon Elson wrote:
Gunner wrote:

Well..with the kind help of you good folks..I managed to bumble my way
into getting this bitch online..but its still not right.


Of course, this makes total sense. eth0 is probably set up to route
all destinations that are not explicitly routed to a known host.
Do the route command, and you'll see a list of what addresses get routed
where. 127.0.0.1 is the loopback route to the local machine.
0.0.0.0 or default is the route to anywhere not otherwise listed.


Yeah, that makes sense, but the point remains that this is a very
unusual workaround, and in the...8? years I've been working with Linux
(I dunno, Redhat 5.1 days, whenever that was...) I've never needed to
screw around with that area, at all. Something is odd.

Gunner, have you tried something like a Knoppix liveCD on your existing
hardware? Run right from the CD, see how it works?

Dave Hinz
  #98   Report Post  
Posted to rec.crafts.metalworking
Gunner
 
Posts: n/a
Default Linux is Driving me $#@!!!! nutz!!!

On 6 Jan 2006 12:44:30 GMT, Dave Hinz wrote:

On Fri, 06 Jan 2006 00:32:46 -0600, Jon Elson wrote:
Gunner wrote:

Well..with the kind help of you good folks..I managed to bumble my way
into getting this bitch online..but its still not right.


Of course, this makes total sense. eth0 is probably set up to route
all destinations that are not explicitly routed to a known host.
Do the route command, and you'll see a list of what addresses get routed
where. 127.0.0.1 is the loopback route to the local machine.
0.0.0.0 or default is the route to anywhere not otherwise listed.


Yeah, that makes sense, but the point remains that this is a very
unusual workaround, and in the...8? years I've been working with Linux
(I dunno, Redhat 5.1 days, whenever that was...) I've never needed to
screw around with that area, at all. Something is odd.

Gunner, have you tried something like a Knoppix liveCD on your existing
hardware? Run right from the CD, see how it works?

Dave Hinz


Yes..see my original post

Gunner

The aim of untold millions is to be free to do exactly as they choose
and for someone else to pay when things go wrong.

In the past few decades, a peculiar and distinctive psychology
has emerged in England. Gone are the civility, sturdy independence,
and admirable stoicism that carried the English through the war years
.. It has been replaced by a constant whine of excuses, complaints,
and special pleading. The collapse of the British character has been
as swift and complete as the collapse of British power.

Theodore Dalrymple,
  #99   Report Post  
Posted to rec.crafts.metalworking
The Hurdy Gurdy Man
 
Posts: n/a
Default Linux is Driving me $#@!!!! nutz!!!

DoN. Nichols wrote:

If there is an existing default route, you will need to
remove it first. For the above line, that would be done by:

route delete default 10.0.0.50

followed by adding the default route to your ppp interface.

route add default (IP-address of the other end of your PPP interface)


Syntax of the route command varies between Linux and Solaris quite regularly,
to the point that I get bitten in the rear by it pretty frequently at work.
Linux often requires things like an extraneous 'gw' keyword when specifying
default routes, which is annoying to say the least. Plus it can be
distribution specific, which is even worse.
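For example, the same default route spelled both ways (10.0.0.1 here is
just a placeholder gateway address):

route add default gw 10.0.0.1     (Linux, net-tools route)
route add default 10.0.0.1        (Solaris)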

Anyway -- on both systems, there is an /etc/hostname.(interface-name)
file for each interface. In your case, you would need two. One for the
eth0 interface, and one for the ppp0 one. *Assuming that Linux does it
like Solaris and OpenBSD.


This is one area where Linux is pretty reliably different from Solaris. I
don't think I've ever seen that format used on any Linux distributions...
a lot of them now seem to inherit from the Red Hattishness that is putting
stuff under /etc/sysconfig, although I seem to recall that SuSE used to use
a gigantic rc file in /etc someplace. In the end, it can all be changed since
it's based on what the various start scripts launched by init during the
boot process were written to do, which is one of the reasons why Linux
distributions have such variability between them.
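For what it's worth, on the Red Hat-ish distributions a static eth0
usually ends up as a little key=value file, roughly like this (addresses
purely illustrative; the detail that matters for this thread is that no
GATEWAY line is set on eth0, so the dialup link gets to own the default
route):

# /etc/sysconfig/network-scripts/ifcfg-eth0
DEVICE=eth0
ONBOOT=yes
BOOTPROTO=none           # static address, no DHCP
IPADDR=192.168.0.10
NETMASK=255.255.255.0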

I think the one common thing I've seen with everything is how PPP utilities
function, though. It seems like they all simply launch a shell script for
bringing the link up and down, and that script is generally responsible for
switching the default route, plumbing (to use a Solaris term) interfaces,
and possibly even loading kernel modules. I'll have to dig through my
workstation, I might have kppp docs on it already. It'd be a nice break
from dealing with firewalls and VPN tunnels.
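As a rough sketch of what that hook script can end up doing (this
assumes a Red Hat style layout where pppd calls /etc/ppp/ip-up, which in
turn runs a local ip-up.local; the positional arguments are pppd's
standard ones, but check the local scripts before trusting any of this):

#!/bin/sh
# /etc/ppp/ip-up.local -- run by pppd once the link is up
# $1 = ppp interface (e.g. ppp0), $4 = local IP, $5 = remote (peer) IP
route del default dev eth0 2>/dev/null    # drop any default route the LAN grabbed
route add default gw "$5" dev "$1"        # point the default route at the dialup peer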

  #100   Report Post  
Posted to rec.crafts.metalworking
The Hurdy Gurdy Man
 
Posts: n/a
Default Linux is Driving me $#@!!!! nutz!!!

Gunner wrote:

I go through there at least twice a week. If I cant muddle through
it..Id love to have you look it over.


I'm located right on the 210 where it meets the 2, so it's an easy stopover
should you decide to drop by. Just fire me off an e-mail at bryan (at) aernovo
(dot) com if you decide to throw in the towel with the thing and would like
me to take a crack at it. I'd be happy to help.



  #101   Report Post  
Posted to rec.crafts.metalworking,misc.survivalism
Joseph Gwinn
 
Posts: n/a
Default Linux is Driving me $#@!!!! nutz!!!

In article ,
"Pete C." wrote:

Joseph Gwinn wrote:

I have snipped areas where the debate has become cyclic.

In article ,
"Pete C." wrote:

Joseph Gwinn wrote:

In article ,
"Pete C." wrote:

Joseph Gwinn wrote:



I recall that WiFi was a problem for everybody, although the complaints
have died down.


Well, I've setup a bunch of WiFi stuff with Linksys, Netgear, Compaq and
Belkin branded stuff depending on what was on sale and I've not had a
single issue with configuration or interoperability. Only with the Mac
were there issues, not the least of which was the fact that unlike
everyone else, they did not use the proper terminology.


Well, I think "proper terminology" is a matter of opinion, and I recall
the different jargons of the various computer vendors from the 1970s on.
It will always be thus.


Because neither original Windows (pre NT) nor original MacOS (pre 10)
had memory management.

Indeed, but Microsoft addressed the problem long (years) before Apple
did.


In a manner of speaking. The DOS core remained long after the supposed
conversion, causing endless memory problems. I think Win2K was the
first to have a clean memory architecture.


I don't recall having any memory issues when my machines were on NT
either.


The symptom was various kinds of odd and/or unstable behaviour. The
connection to memory management was not obvious to those not immersed in
kernel arcania. I no longer recall the details, but there was an area
within the kernel that was shared, and was essential to each and every
process (user or kernel) and this whole setup had descended in some
manner from the days of DOS.


Microsoft solved the problem by stealing VMS technology from DEC,
yielding the original NT core. Books were written about this
deathmarch.

Indeed, it's amazing how much of the computing world originated from
DEC. I didn't follow this history in detail, but I thought that Microsoft
hired former DEC folks vs. the more outright stealing of Intel / Alpha
that resulted in a lawsuit.


It's true that MS hired lots of key DEC folk. Don't know about Intel
and the Alpha. Do you have any cites?


I don't have any specific cites, but it was in the national news when it
occurred. Something along the lines of Intel's new CPU design infringing
on a number of HPaqDEC patents shortly after the Intel engineers had had
some sort of meeting with the HPaqDEC engineers. I guess it occurred
after HP was on the scene and Intel was able to con clueless Carly into
just selling them the whole Alpha thing instead of enforcing the patent
rights.


This has to have started long before Carly appeared. Do you recall the
timeframe?

While DEC may have alleged infringement, did the courts agree?


Apple solved the problem by buying NeXT (getting Jobs back as part of
the package deal). NeXT was built from scratch by Jobs after he was
tossed out of Apple some ten years earlier. The OS core of NeXT was
BSD UNIX, from which Jobs and company built a thoroughly object-oriented
software platform. When NeXT came back, the solution was obvious -
replace the NeXT GUI with a Mac-like GUI, retaining the UNIX core from
NeXT.

That NeXT was an odd little thing that never seemed to go anywhere.


Yes, it only made a few hundred million dollars, so by Jobs' standards
it was a failure. Jobs founded three companies, Apple, NeXT, and Pixar
(Toy Story). Apple and Pixar are [each] worth billions.


Pixar seems to be stumbling a bit these days as the novelty of CGI
movies has worn off.


I could live with an income in the billions, even with the stumbles.

Pixar is at base a movie studio, and studios have their ups and downs.


Bad analogy. A better analogy would be to ask if you expect the Police
to keep bad people from coming to your town and mugging people, or
breaking into their homes.

That doesn't work in the real world either. The Police are only there
after the fact, not to prevent anything.


The analogy breaks down, but is still better than the dark-alley
analogy. And we do expect the Police to catch and badly hurt the perps,
so they won't do it again. We don't care if it's because they are
incapacitated, or merely deterred, so long as they stop.


That doesn't change the fact that *you* are individually responsible for
your own safety and security. The police may be there after the fact,
but if *you* expose yourself to a predictable threat i.e. walking down
that alley in the bad part of town, or running without adequate security
software, *you* will still suffer the consequences of *your*
carelessness.


Individual responsibility is great for young, strong, and well-armed
people. This may be 5% of the population, in a really good year. But
even they will become old and weak.

One of the pillars of civilization is the idea, the realization that the
95% of the population that isn't young, strong, and well-armed can pay
~1% of the population to keep the criminal part of the population under
control. The ~1% are called cops.


Most users are experts at something other than the computers that they
must use as a tool. It's expecting far too much to think that this will
ever change, if for no other reason than that by definition the clueless
vastly outnumber the priesthood. But the money from vast unwashed
clueless masses folds and spends pretty good, and keeps the priesthood
in beer.

True, but has little to do with the technical merits of the Mac OS vs.
Windoze vs. whatever else.


Not exactly. A system that doesn't need so much effort to make safe is
going to be very much easier to live with for those users who are not IT
gurus.


In theory perhaps, but as we all know, the masses rarely make decisions
based on a careful analysis of technical details. The masses choose
Windoze largely because it does an adequate job and it's just what
everyone else uses.


Exactly! If only they were more thoughtful - they would pick MacOS, and
Bill would be pumping gas. Really.


Tell me, would using a VMS host to download stuff from even the worst
sites cause any danger? If not, why can't Windows do that too?

If you were to find a "worst site" that actually had malicious VMS code
and were to download and run it as a privileged user it certainly would.


Misses the point.


No, the point is there, the perceived security of the Mac is a myth that
would rapidly evaporate if Macs ever achieved any large market share. As
soon as there was a sufficient mass to make an attractive target for the
various virus writing scum, the Mac world would go down in flames until
they took security seriously. In the meantime the Macs enjoy simulated
security through obscurity.


It does not follow that because no operating system is impervious, that
there is no difference in resistance to attack. This has already been
discussed at length.

By the way, MacOS 10.3.6 got CAPP EAL3 certification from NSA in
November 2004. See http://niap.nist.gov/cc-scheme/st/ST_VID4012.html.


For whatever reason, Macs are immune to this stuff. You can argue that
it's mere insignificance that protects the Macs, but the fact remains
that Macs are immune.

The Macs are in no way immune, there are simply fewer bugs for them to
catch currently. They are the "bubble boy" who does just fine while
isolated from the germs but will quickly die in the real world due to
their lack of defenses.


It may be true that if Macs were 90% of the market, they would fall
under the assault. But Macs aren't going to achieve 90%, so what's your
point?


My point is don't claim that Macs are secure or are immune to attack
when they are not. See "simulated security through obscurity" above.

Your belief that Macs don't need to be "protected" from
the 'net is also false.

Most Macs do not run with any virus protection whatsoever, and are
none the worse for it.

And that is simply a function of threat volume and statistics, not a
function of platform security. Because PCs running Windoze outnumber
Macs 20:1, the volume of viruses targeting Windoze outnumbers those
targeting Macs by an even larger ratio and the probability of a
particular Windoze user being hit by one of those viruses is
consequently higher.

Whatever. See above.

Immunity and lack of exposure are not the same thing.


True enough, but so what? See above.


Then claim that Macs benefit from their obscurity, not that they are
more immune / secure which they are not.


Even if you were to be correct, my point was that the average user
doesn't care why the Macs are secure. It's enough that they are secure,
and they are.


But how does this analogy apply to computers? In Macs, no such "seat
belt" is needed.

Completely false. Macs need security products just like Windoze PCs need
them and any other OS needs them. The relative threat level is lower,
but the threat still exists.


I'm not so sure that Macs need security products (despite the thunder
from the security product vendors), because most seem to do just fine
without such products.


They need security products if the user wants to help ensure they do not
experience an attack of one sort or another. They may get along fine for
some length of time without security products, but they will eventually
suffer an attack.


Analogy fails. Macs simply don't need antivirus programs, very few have
them, and no infections result.

A new and major Windows threat emerged last week, on 1 Jan 06. No patch
available until 9 Jan, according to my company's IT dept, so we have
been instructed to minimize our use of the Internet. The issue is with
WMF (Windows Metafile) files, and perhaps JPEG and GIF. The notice I saw didn't get into
tech details, and I asked the IT folk if this was an ActiveX and/or
Internet Explorer problem, and they said that the problem was deeper
than that.


Cars did "just fine" for quite a while before seat
belts became the norm too. Of course now they try to force the asinine
airbags on the unsuspecting masses, kind of like security software that
formats your hard drive if it detects a virus it doesn't know how to
clean.


Car analogy irrelevant. See above.


Rehash. See above.

Rehash and still true. A privileged user loading malicious code *will*
hose any OS. The only alternative to this is to keep the users
non-privileged and/or prevent the loading of non-certified code.


We've drifted off the target. The original question was why it was that
the internet is so much more dangerous for PCs than Macs. If MS had
really copied VMS correctly, Windows would be essentially immune to all
the evil stuff floating around the web.


Not true at all. Even if Windoze incorporated just about all of VMS,
with it having the mass of Windoze to be a target of attack, and with
all the users operating as SYSTEM (administrator for Windoze), you'd
have exactly the same situation. Keep the users limited to minimal
privileges and you'd be OK for the most part, same as with Windoze.

But it's Macs and Linux/UNIX
that are essentially immune, not Windows. What happened?


Linux/UNIX are not even remotely close to immune, they require even more
active effort than Windoze to attain any measure of security. Macs are
of course also not immune, they are simply obscure enough to not be much
of a target.


Sorry. In practice, MacOS and Linux/UNIX are immune, compared to
Windows. As discussed above, no operating system is impervious to a
knowing attack by experts, but it does not follow from this that all
operating systems are equally vulnerable.

A good mechanical analogy is locks. (Metal content!) No lock is
pickproof: average people can open many cheap locks with a bobby pin, an
average locksmith can pick an average lock in tens of seconds, while
only a handful of very skilled and focused locksmiths have ever managed
to pick a Medeco, and it took those expert locksmiths no less than an
hour per lock; some locks took days.


If you read the PC magazines (yes, PC magazines), you will see that
they consistently rate the reliability and quality of Macs at the top,
with Dell close behind. Apple beats all PC makers on quality of user
support. Consumer Reports backs these findings up.

Consumer Reports has -zero- credibility with me in -any- subject area.
As for reliability and quality, the PC magazines use some questionable
criteria and also exclude many PC lines from their comparisons. The same
goes for support as the comparisons typically exclude the "business"
lines from the PC manufacturers.

So, who do you believe?

Not CR certainly.


Some source beats no source, any day.


Invalid data is *not* better than no data.


Consumer Reports published invalid data? That's a very strong
accusation, perhaps a libel. Upon what do you base this statement?

And still unanswered is what source you would instead recommend.


Consumer Reports sends out a survey to their subscribers every year.
In this survey, the subscribers report their experience with all manner of
products, and this experience is summarized and reported.

Specifically, on page 35 of the December 2005 issue of Consumer Reports
appears the "Brand Repair History" (based on 85,000 desktop systems and
49,000 laptops) and "Tech Support" (based on 6,500 responses for
desktops and 4,200 responses for laptops).

There is no better publicly available source of by-brand reliability
data. And the scores don't vary that much from year to year, as
reliability and tech support don't just happen, they are achieved if
and only if the company management thinks them important.

I'll have to look in the store to see the report.


OK.


I looked and they only had the Jan issue.


The article isn't that long, so one could xerox the relevant part from
the public library's copy.


The fact that Apple had to scrap their entire OS core speaks volumes
to their software expertise. I can't see the DoD buying Macs for what
essentially is nothing more than a UI.

Microsoft scrapped their entire Windows 3.x OS core as well, in favor
of NT.

Yes, but Microsoft rebuilt their core with newly hired expertise.


From DEC. So, we may conclude that MS also discovered that they lacked
the expertise to write an OS.


Indeed, but acquiring that expertise and just acquiring a ready-to-go OS
core are not exactly the same thing. Kind of like hiring a few mechanics
to help you build a race car vs. buying a race car and putting your
paint job on it.


Bad analogy. Apple bought NeXT the company, getting both NeXT the
operating system and the whole development team, both of which Jobs had
built from the ground up. It was a merger.


Oddly enough I run a Windoze web server, mostly 'cause I need that
machine up anyway to run an IVR server and general storage services.
It's been up 24x7 for a couple years now and despite frequent attack
attempts it's still doing fine. Nothing particularly special or heroic
on it either, I do the MS patch thing roughly monthly, have an ordinary
commercial software firewall on it and this server along with the rest
of my network is behind an ordinary Linksys NAT/Firewall router with
only port 80 mapped through to the web server.


Hardware firewalls help a great deal. As do expert IT efforts. How
many average users can manage this level of expertise?


Um, just about any of them. I've got a Linksys router plugged into the
cable modem, and plain old Norton Personal Firewall on the server. The
only thing remotely "configured" is one port map on the router to map
port 80 to the address of the server. This is not brain surgery by any
means and it's worked just fine. The two lightning strikes that took out
two cable modems, two routers and a couple Ethernet ports are another
story.


I've had my share of engineers who had to ask me how to configure their
Linksys, so I'd venture that this knowledge isn't exactly universal.


This is a bit self contradictory. Those flighty non-technical creative
types love the Mac but are clueless about IT, have no internet
discipline whatsoever, and yet they prosper.

I think the ones that "prosper" are the ones that keep work and
personal machines separate.

Should not be necessary, as mentioned above in multiple places.

Well, it *is* necessary to either keep the two separate, or have the
knowledge / self control to avoid what should be obvious threats.


Not everybody can be an IT guru.


No, but it doesn't require an IT guru to keep work and personal machines
separate.

So, the summary is that Windows is only for IT gurus, and Macs are for
everybody else?


Nope, but it does appear that Windoze users may be better at keeping
work and personal machines separate.


Because they must?


Just think what stolid
uncreative technical types could do with such a tool.

Nothing, absolutely nothing, because the whole Mac concept is to
prevent
anyone from doing anything technical.


Not quite. The objective is to hide technical details from those
uninterested in such details.


Up until OSX they were more than just hidden.


I think this is circular. I knew where those controls were, even the
controls that nobody but a developer would dare to touch, even if they
did happen to know how to find them. Apparently, they don't teach
Windows gurus where the MacOS controls are, but it's a stretch to
conclude that such controls don't exist.


Not so. The technical controls are at the terminal window (and root
console) level, and many controls are only at that level. One can argue
that this or that control should or should not be GUI accessible, and
I'm sure that there will be some migration in the coming years, but the
controls are there, just mostly kept away from naive users.

And that only came about in the past couple years. Pre OSX there was no
ability to do anything technical. Post OSX there is some, but post OSX
MacOS is just another Unix variant with a particular flavor of GUI shell.


Yes and no. I had developer-level knowledge of pre-10 MacOS, and it was
all there, but mostly hidden from average users. It wasn't locked, it's
just that the average user wouldn't know which rock to look under.


How much of it was potentially accessible to the user with the knowledge
of where to find it vs. accessible to a user that purchased additional
development utility packages?


In MacOS before 10, all the normal controls were in the control panels
plus resedit (the rough equivalent of the Windows Registry, but on a per
application basis).

You didn't really need much in the way of added utility packages to get
at the deeper controls, but most people did, for convenience. The tools
were not expensive.


MacOS 10 and later keeps all the good stuff under the UNIX rock.


Right, it is at least accessible without the need for any additional
software.


Yep. Again, people who do much of this will get specialized tools, for
convenience.


Be careful about who you call a "rehash" (nice neutral word that):
Intel processors are by the same token an absolute hash, retaining
compatibility with every past version, with bits encrusted upon
bits.

They had been until the infusion of stolen Alpha technology.

Not exactly. IBM invented the RISC processor architecture, and the
first RISC CPU was the IBM 801.

Ok, but Intel's boost came / is coming from what they stole from Alpha.


Can you cite a court case on this? I'd like to read the ruling.


It was settled out of court after Intel managed to con HP's clueless CEO
into just selling them the whole Alpha lot. It was in the national news
when it occurred.


Was this when she had been there a few months to a year? I vaguely
recall something of the like. It didn't make nearly the splash that the
Microsoft antitrust case made.


Nor is the Alpha really a RISC machine, as it had to execute the VMS
instruction set. I no longer recall the details of what was done in
silicon and what was done by the compilers, but the VMS instruction set
is pretty big and complex, even if it is nicely designed. My
recollection is that DEC implemented all the one and two address
instructions in Alpha silicon, and everything else was emulated.

Microcode I believe. I never dug all that deep into it though since I'm
not a programmer.


Not microcode; this is anathema in the RISC world. In RISC, one has a
few very simple but exceedingly fast instructions, from which all else
is cobbled together. This is a large net win, largely because simple
instructions are so much more common than complex ones in actual
application code.


Right, and to emulate the CISC processor you need microcode calls to
produce the equivalent of the CISC instructions.


No microcode. RISC processors are hardwired to execute their small
instruction set, and anything more complex (like the CISC instructions
of yore) is implemented using lots of machine-code instructions.
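
To make that concrete, here is a minimal sketch in C, treating each C
statement as a stand-in for one instruction (the mnemonics in the
comments are illustrative, not any particular vendor's): a CISC-style
memory-to-memory add, roughly what a VAX three-operand add does in a
single instruction, becomes a short sequence of simple hardwired RISC
operations, with no microcode anywhere.

#include <stdint.h>

/* One CISC-style operation: c = a + b, with all three operands in memory.
 * A RISC compiler emits a few simple load/add/store instructions instead.
 */
void cisc_style_add(int32_t *a, int32_t *b, int32_t *c)
{
    int32_t r1, r2;        /* stand-ins for registers */
    r1 = *a;               /* LOAD  r1, (a)           */
    r2 = *b;               /* LOAD  r2, (b)           */
    r1 = r1 + r2;          /* ADD   r1, r1, r2        */
    *c = r1;               /* STORE r1, (c)           */
}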


Of course, people switch one way or another for a variety of reasons and
I know at least three or four people that switched from Mac to PC. Since
the installed base of PCs is very high and since there are a number of
alternatives to Windoze that will run nicely on that existing hardware,
the bulk of people looking to switch off of Windoze are likely to follow
the low cost path to keep their existing hardware. When Apple introduces
MacOS for PC hardware, those users might consider it, or by then they
may well be happy with one of the other Unix offerings and not be
enticed by the Mac UI.


People switch for multiple reasons, and low hardware cost isn't the only
reason.


Indeed, and for the general masses, being like everyone else is a big
factor. Sure there are those that try to be different to make a
statement, but they are a minority and somehow they all end up looking
the same as well.


And there are people that run multiple platforms.


Indeed there are, however I have not seen many people that run both PCs
and Macs for any length of time, generally one is abandoned because they
do not provide significantly different functionality for most people. Of
course for most specialized applications the PC wins since the apps
aren't available for a Mac.


After watching what the IT folk at work have to do to keep the PCs
working, I'm not interested in spending my life that way. I'd rather be
making chips. (Metal content!) So, I run only MacOS at home, because
they require essentially no maintenance. I will get a PC to run some
PC-only programs, but this PC will not be allowed out of the house,
hiding behind the firewall, and will certainly not be allowed any
internet and email access.


There is a lot more to MI than missiles. Tends to be low volume, high
cost per item.


Some nifty stuff in the surveillance end of things like on the EP3s. I
think I made them nervous shooting detailed pics and video of everything
in that plane, but nobody told me to stop...


There is a lot of embedded process control stuff as well.


This is the Industrial end of things. Lots of VMEbus et al. Not
everything can be done on an 8051 microcontroller.


You don't see too much 8051 stuff in CCI these days.


I don't know the current market shares, but I still see ads for 8051
tools in the back of electronic design and embedded programming
magazines. I think that the old Motorola 68040, now called ColdFire, is
a market leader, and it's a whole lot nicer to program than an 8051.


Long time ago. Apple actually has had lots of duds, because they are
always trying new things, and by definition new things aren't always a
success. Said another way, if there are no failures, there is no
innovation.


Except their duds mostly get swept under the rug, whereas Microsoft's duds
are always hyped and paraded.


Hardly. The Mac magazines talk endlessly of such things, just like the
PC magazines in the Windows ecosystem. If you read only the PC mags,
you may not hear the gnashing of teeth in the Mac world.


I don't get your point. The Apple store cannot tell if you are an
unwashed user or an administrator, so long as your money spends good.

How much of that can you as the end user at home configure and
reconfigure as your needs change? My PCs go through many incarnations as
I get newer machines and migrate good hardware from older machines. I
also combine leftover bits of old machines into usable machines for
specific tasks. I don't need a GHz machine with dual monitors to run a
PIC programmer for example.


Skilled users can disassemble and remake Macs just the same as one can
with PCs. At this level, hardware is much the same. That said, I've
never found it really useful to do, although I have done it. The core
problem is that the hardware becomes obsolete so fast (in PCs and Macs)
that I don't really have any real hardware investment to protect after
three years. The money is in the application software, and my time.


Depends on the applications I guess. In the past I've moved things like
DVD-R drives and specialized video and I/O cards. Those whose
applications are software only would not have this need.


I have bought such cards (generally PCI) for Macs on the theory that I
would carry them forward to the next generation. Never happened,
because by the time the machine was old enough to replace or upgrade, I
wanted the new stuff. So, I stopped trying to solve anything but the
problem of the day.


I don't know. Appletalk on ethernet was used widely, and didn't cost
$400.


Dunno, getting the one Mac in the office at the old job onto Ethernet
did cost a bloody fortune compared to all the PCs. Think it was one of
the Macs that had a non-standard bus, perhaps NuBus?


Perhaps. I don't recall when the transition happened, but Macs had
built-in ethernet long before the PCI bus slots appeared. Macs still
have the AUI connector (that connects to the external module that
interfaces to the coax-cable varieties of ethernet).


Also note that Macs had Appletalk and true networking from 1984, long
before PCs discovered any networking.


Mainstream PCs perhaps. Same with graphics since PCs were doing high end
graphics well before Macs even found color.


In a manner of speaking. The reason the Graphics Arts world went to
Macs in the first place was that only Macs handled text and pictures at
all well back then. Don't forget that the comparison was DOS; Windows
was many years in the future.


And Appletalk just worked. That
friend up the street had something like six machines networked in his
home in the late 1980s to early 1990s.


I had several PCs networked at home in that general timeframe as well.
Towards the tail end of that timeframe, but that corresponded to me
having more than one machine at home at all.


PCs were not networked in 1984, while Macs came that way, straight from
the box, included in the original design. PCs really only got networked
when Novell came along. Eventually, MS wised up, and built networking
in, and crushed Novell. But this was far in the future.


That's not interoperability, that's openness to third party software.
Interoperability is working with established standards, something
that Macs have been loath to do.

Um. Have you been following Microsoft's tangles with the EU antitrust
regulators? They are proposing fines of a million dollars a day.

The EU antitrust folks are about as far out in left field as they can
get. If I were Bill I would pull all MS products from the EU and void
the license agreements, but that's just me and I can be a bit harsh.


Well the EU antitrust folk may be in left field, but they are the law.


Which is why I would simply pull out of that market if I were Bill. I
certainly would not do business or license my products for use in a
country or countries that tried to force me to bundle competitors'
products with my own.


It isn't going to happen.


Dunno, I have or have had an Armada 1590-something, an Armada M700, a
Thinkpad 600X and a Dell 600-something and none of them seem to have any
thermal issues.

Every Mac user I know, which is about five people, has had hardware
problems at one point or another. A small sample, granted, but still a
100% problem rate.


Lots of my friends have had hardware problems, especially with laptops.
But the stats are clear - Mac desktops are far more reliable than PC
desktops. Laptops are all about equally bad.


Still 100% problem rate for the small sample of Mac desktops I know of.


Maybe you need better friends? They seem to be very hard on computers.
Seriously, if this were representative of the failure rate of any
computer company's products, that company would soon disappear. But
Apple has not disappeared. This is an example of why one should not
generalize from anecdotal reports.

The Consumer Reports stats are that Mac desktops are far more reliable
than PC desktops, and Mac laptops are in the middle of the crowd with PC
laptops. The stats are also that laptops are far more likely to fail
than desktops.


There was some switching a few years ago, but the market share has been
holding steady recently.


That may be more related to the general freeze in the business computer
world due to the economic issues. Since change has upfront costs even if
it may save money long term, businesses have been holding to their current
systems. As the economy readjusts and new initiatives begin, I think
there will be additional shifting taking place.

The fact that nearly all applications for the Mac are also available for
the PC makes it fairly easy to generate a long term savings by
eliminating "non-standard" platforms and the support personnel
associated with them.


IT folk do think that way, and sometimes get away with this. But notice
the subtle switch here. We have gone from "Macs are bad" to "Macs are
too few to bother with (even if they are better)".


OK, a Chevy and a Cadillac.


At which point you have the same function and reliability, with the only
difference being in the cosmetics. I for one don't want my computer to
look stylish, I want it to do its task reliably and inexpensively. I'm
not the type who wants to put a "stylish" Mac on my neatly prop-arranged
desk to impress my friends, I put my server PCs, network gear and big
UPS in a standard 19" equipment rack in the back corner of my garage.


You haven't seen my desk.


Any company that achieves ~90% market share in an important industry
will find its freedom of action curtailed. The classic example is AT&T,
which achieved a similar market share by methods that are now illegal,
but were common back then. The solution was to turn telephones into a
regulated utility. This is probably one of Microsoft's biggest fears,
but their current conduct makes such an outcome more and more likely.

Except that there is no comparison between a telephone utility and a
commodity OS. There was no choice for the telephone service since it was
delivered by a fixed network, anyone can run whatever OS they choose on
their PC hardware and they have quite a few choices. The fact that they
choose one particular brand of OS overwhelmingly does not make a
monopoly in my book.

Now that there are alternatives to traditional Telcos for voice
services, there is no longer a Telco monopoly either and interestingly
enough we see the old "Bell bits" recombining.


Missing the point. There was no equivalent to the telephone before it
was invented. The issue is that if something becomes that important,
and yet is a "natural monopoly" (where either economics or technical
issues more or less demand that there be a single supplier), it isn't
long before the monopoly is either broken up (as in the oil industry) or
converted to a regulated utility (as in city gas, electricity, water,
telephones, and to some degree cable TV).


There have been alternatives to Windows for as long as there has been
Windows, hence no monopoly. Windows simply won the popularity contest
and is now attacked because of it.


You can't have it both ways. If all but Windows is insignificant, which
you have been insisting, and Windows' ~90% market share certainly
supports that conclusion, then Windows is a de facto monopoly. There is
no getting around it, and it makes no real difference how this position
was achieved. So, Microsoft will find itself unable to have the
freedoms of its youth.


Datapoint: At its peak, IBM had only a 70% market share.

In which market? They cover many markets with vastly different needs.


Computers, which meant mainframes back then. Remember "IBM and the
Seven Dwarves"?


I wasn't real involved in the computer world pre-midrange, not really
old enough at 36.


So you won't remember when IBM was taken to Federal Court for antitrust.
The case was eventually settled, but one school of thought holds that
the only reason that it was possible for the PC to escape IBM's control
(IBM invented the PC), or Microsoft was able to become the OS vendor,
was that IBM was paralyzed by the antitrust case. It may well be true.


Yep, Apple might reach 8% market share while Windows drops to 67% and
Linux/Unix rises to 25%.

Truth is, Apple (and Mac users) would be perfectly happy with ~10%
market share.

And that's likely what they will have. I just don't see any compelling
reasons why they would achieve more than this regardless of what
Microsoft does. In fact given totally free hardware and software, I
don't think the MacOS could ever capture more than 50% of the market
since it primarily appeals to the creative types and they are perhaps
50% of the population.


But they are far more fun.


Dunno, us technical creative types can be fun too. Certainly my friends
are often amused at the over-the-top solutions I pull out of my
posterior for some insurmountable problems.


Somehow, I don't think that our partying ability will impress the
creative types. One can only hope that they don't giggle too loudly.

Joe Gwinn
  #102   Report Post  
Posted to rec.crafts.metalworking,misc.survivalism
Martin H. Eastburn
 
Posts: n/a
Default Linux is Driving me $#@!!!! nutz!!!

Midwinter in Texas or the central plains means cold or cool - and dry.
Grass is brown and dry.

The summer brings green grass and green trees.
Coastal areas - west coast - are the inverse. The winter rains bring grass
and the hot summer kills it off. The pretty green hills of Ca. are winter pics.

Martin
Martin Eastburn
@ home at Lions' Lair with our computer lionslair at consolidated dot net
NRA LOH & Endowment Member
NRA Second Amendment Task Force Charter Founder



zadoc wrote:
On Tue, 03 Jan 2006 00:32:56 GMT, "Pete C."
wrote:

[snip, see original]


Don't you hate those annoying little distractions? I've got the
potential threat of grass fires to contend with here in the Dallas area.

Pete C.



The fires in Texas and Oklahoma are even making the TV news in Sydney.
As it is midwinter there, isn't this a bit unusual?

Are people there starting to wonder about climatic changes and global
warming?

One wonders what midsummer will be like.

Cheers,





Gunner

"Pax Americana is a philosophy. Hardly an empire.
Making sure other people play nice and dont kill each other (and us)
off in job lots is hardly empire building, particularly when you give
them self determination under "play nice" rules.

Think of it as having your older brother knock the **** out of you
for torturing the cat." Gunner




  #103   Report Post  
Posted to rec.crafts.metalworking,misc.survivalism
Pete C.
 
Posts: n/a
Default Linux is Driving me $#@!!!! nutz!!!

Joseph Gwinn wrote:


Well, I've set up a bunch of WiFi stuff with Linksys, Netgear, Compaq and
Belkin branded stuff depending on what was on sale and I've not had a
single issue with configuration or interoperability. Only with the Mac
were there issues, not the least of which was the fact that unlike
everyone else, they did not use the proper terminology.


Well, I think "proper terminology" is a matter of opinion, and I recall
the different jargons of the various computer vendors from the 1970s on.
It will always be thus.


Encryption keys are still *not* passwords.


I don't recall having any memory issues when my machines were on NT
either.


The symptom was various kinds of odd and/or unstable behaviour. The
connection to memory management was not obvious to those not immersed in
kernel arcana. I no longer recall the details, but there was an area
within the kernel that was shared, and was essential to each and every
process (user or kernel) and this whole setup had descended in some
manner from the days of DOS.


Perhaps, but my NT machines still ran without problems.


I don't have any specific cites, but it was in the national news when it
occurred. Something along the lines of Intel's new CPU design infringing
on a number of HPaqDEC patents shortly after the Intel engineers had had
some sort of meeting with the HPaqDEC engineers. I guess it occurred
after HP was on the scene and Intel was able to con clueless Carly into
just selling them the whole Alpha thing instead of enforcing the patent
rights.


This has to have started long before Carly appeared. Do you recall the
timeframe?


I'm not very chronologically oriented. Had to be '98 - '00 range I
think.


While DEC may have alleged infringement, did the courts agree?


It didn't get to that point. The suit I believe was filed, but then
dropped after Compaq or HP sold out.



Pixar seems to be stumbling a bit these days as the novelty of CGI
movies has worn off.


I could live with an income in the billions, even with the stumbles.


So could I; however, they need to adapt in order to sustain it.


Pixar is at base a movie studio, and studios have their ups and downs.


Perhaps, but now that the novelty of CGI movies has worn off they will
have to adapt. When was the last time a studio made a western?


That doesn't change the fact that *you* are individually responsible for
your own safety and security. The police may be there after the fact,
but if *you* expose yourself to a predictable threat i.e. walking down
that alley in the bad part of town, or running without adequate security
software, *you* will still suffer the consequences of *your*
carelessness.


Individual responsibility is great for young, strong, and well-armed
people. This may be 5% of the population, in a really good year.


More than that based on the stats, even in left leaning states.

But
even they will become old and weak.


And better armed.


One of the pillars of civilization is the idea, the realization that the
95% of the population that isn't young, strong, and well-armed can pay
~1% of the population to keep the criminal part of the population under
control. The ~1% are called cops.


Except that it doesn't really work very well. Indeed with the recent
examples in other parts of the world for inspiration, the criminal
element may begin to realize that the whole police / law structure can
collapse if it comes under attack.


In theory perhaps, but as we all know, the masses rarely make decisions
based on a careful analysis of technical details. The masses choose
Windoze largely because it does an adequate job and it's just what
everyone else uses.


Exactly! If only they were more thoughtful - they would pick MacOS, and
Bill would be pumping gas. Really.


If only they were more thoughtful they would chose neither.


No, the point is there, the perceived security of the Mac is a myth that
would rapidly evaporate if Macs ever achieved any large market share. As
soon as there was a sufficient mass to make an attractive target for the
various virus writing scum, the Mac world would go down in flames until
they took security seriously. In the meantime the Macs enjoy simulated
security through obscurity.


It does not follow that because no operating system is impervious, that
there is no difference in resistance to attack. This has already been
discussed at length.


Indeed and I still contend that MacOS is not more resistant than
Windoze. Given comparable attacks both will fail.


By the way, MacOS 10.3.6 got CAPP EAL3 certification from NSA in
November 2004. See http://niap.nist.gov/cc-scheme/st/ST_VID4012.html.


No time to look at that at the moment.

Then claim that Macs benefit from their obscurity, not that they are
more immune / secure which they are not.


Even if you were to be correct, my point was that the average user
doesn't care why the Macs are secure. It's enough that they are secure,
and they are.


They are *not* secure, they are simply not targeted often. Not the same
thing and an important fact to understand.

They need security product if the user wants to help insure they do not
experience an attack of one sort or another. They may get along fine for
some length of time without security products, but they will eventually
suffer an attack.


Analogy fails. Macs simply don't need antivirus programs, very few have
them, and no infections result.


Very few attacks are made. Give them an attack and the user will see why
it was foolish to operate without sufficient safeguards.


A new and major Windows threat emerged last week, on 1 Jan 06. No patch
available until 9 Jan, according to my company's IT dept, so we have
been instructed to minimize our use of the Internet. The issue is with
WMV files, and perhaps JPEG and GIF. The notice I saw didn't get into
tech details, and I asked the IT folk if this was an ActiveX and/or
Internet Explorer problem, and they said that the problem was deeper
than that.


Yep, a threat came and went, woop-de-do. There was a patch from a third
party source before the official MS patch was released BTW.


Cars did "just fine" for quite a while before seat
belts became the norm too. Of course now they try to force the asinine
airbags on the unsuspecting masses, kind of like security software that
formats your hard drive if it detects a virus it doesn't know how to
clean.


Car analogy irrelevant. See above.


What above?

Linux/UNIX are not even remotely close to immune, they require even more
active effort than Windoze to attain any measure of security. Macs are
of course also not immune, they are simply obscure enough to not be much
of a target.


Sorry. In practice, MacOS and Linux/UNIX are immune, compared to
Windows. As discussed above, no operating system is impervious to a
knowing attack by experts, but it does not follow from this that all
operating systems are equally vulnerable.


Well, I've personally had a Linux system attacked and compromised, the
same has not happened on any of my Windoze systems. The Linux
installation could have been made more secure given a fair amount of
time and effort, but it is/was most certainly not secure "out of the
box".


A good mechanical analogy is locks. (Metal content!) No lock is
pickproof, average people can open many cheap locks with a bobby pin, an
average locksmith can pick an average lock in tens of seconds, while
only a handful of very skilled and focused locksmiths have ever managed
to pick a Medeco, and it took these expert locksmiths no less than an
hour per lock, and some locks took days.


Not a real good analogy since in the computer world, the picking can
happen at blinding speeds and unseen, and neither Windoze nor MacOS has
intrusion detection warnings to indicate an attack is in progress. It
makes no difference if it takes 1 minute or 10 minutes to compromise the
OS when it can be done undetected.



Invalid data is *not* better than no data.


Consumer Reports published invalid data? That's a very strong
accusation, perhaps a libel. Upon what do you base this statement?


It is invalid because it is very often biased. We all know of their
rigged rollover "test", they may have escaped liability for that mostly
on first amendment grounds, but that in no way exonerated them. They are
frauds and I stand by that.


And still unanswered is what source you would instead recommend.


I'm afraid I don't recommend any source other than personal research,
which is the only thing that can be relied on to be objective or, if
biased, biased in a way that is acceptable.

I looked and they only had the Jan issue.


The article isn't that long, so one could xerox the relevant part from
the public library's copy.


Haven't found the library yet.


Indeed, but acquiring that expertise and just acquiring a ready-to-go OS
core are not exactly the same thing. Kind of like hiring a few mechanics
to help you build a race car vs. buying a race car and putting your
paint job on it.


Bad analogy. Apple bought NeXT the company, getting both NeXT the
operating system and the whole development team, both of which Jobs had
built from the ground up. It was a merger.


But NeXT didn't write the OS either. NeXT does not equal BSD.

Um, just about any of them. I've got a Linksys router plugged into the
cable modem, and plain old Norton Personal Firewall on the server. The
only this remotely "configured" is one port map on the router to map
port 80 to the address of the server. This is not brain surgery by any
means and it's worked just fine. The two lightning strikes that took out
two cable modems, two routers and a couple Ethernet ports are another
story.


I've had my share of engineers who had to ask me how to configure their
Linksys, so I'd venture that this knowledge isn't exactly universal.


Well, there is essentially nothing configured on it other than the port
map. What few other things I have configured aren't relevant to
security. Mapping port 80 to a web server is well documented on the
Linksys site, so anyone who can read and follow directions can set it up
in minutes.
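
For anyone who wants to check that the map actually works, here is a
minimal sketch in C (not from this exchange; the host name below is a
placeholder for your own public address) that just attempts a plain TCP
connect to port 80. Run it from outside the LAN, since many home routers
won't loop a connection to the public address back inside.

#include <stdio.h>
#include <string.h>
#include <unistd.h>
#include <netdb.h>
#include <sys/socket.h>

int main(void)
{
    const char *host = "your.public.address.example";  /* placeholder */
    struct addrinfo hints, *res;

    memset(&hints, 0, sizeof hints);
    hints.ai_family = AF_UNSPEC;        /* IPv4 or IPv6 */
    hints.ai_socktype = SOCK_STREAM;    /* plain TCP    */

    if (getaddrinfo(host, "80", &hints, &res) != 0) {
        fprintf(stderr, "name lookup failed\n");
        return 1;
    }

    int fd = socket(res->ai_family, res->ai_socktype, res->ai_protocol);
    if (fd >= 0 && connect(fd, res->ai_addr, res->ai_addrlen) == 0)
        printf("port 80 reachable - the port map is working\n");
    else
        printf("port 80 not reachable - check the forwarding rule\n");

    if (fd >= 0)
        close(fd);
    freeaddrinfo(res);
    return 0;
}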


Nope, but it does appear that Windoze users may be better at keeping
work and personal machines separate.


Because they must?


Nope, just because more of them have separate work machines. They keep
their work and personal lives and machines separate.


Up until OSX they were more than just hidden.


I think this is circular. I knew where those controls were, even the
controls that nobody but a developer would dare to touch, even if they
did happen to know how to find them. Apparently, they don't teach
Windows gurus where the MacOS controls are, but it's a stretch to
conclude that such controls don't exist.


I don't claim to be a Windoze guru, I simply use it as a tool like
everything else. In Windoze, even during significant changes like NT -
95 - 2K - XP I never had a problem finding the settings I was looking
for within 1 minute. The few times I've worked on a Mac I've dug for
tens of minutes and still not found the proper settings.

How much of it was potentially accessible to the user with the knowledge
of where to find it vs. accessible to a user that purchased additional
development utility packages?


In MacOS before 10, all the normal controls were in the control panels
plus resedit (the rough equivalent of the Windows Registry, but on a per
application basis).


I can't say as I've ever had to make changes in the Windoze registry to
make configuration changes, every setting I've needed has been in the
normal control panels or device manager where they belong.


You didn't really need much in the way of added utility packages to get
at the deeper controls, but most people did, for convenience. The tools
were not expensive.


Never spent enough time on Macs to look at add on software, the UI
turned me off too quickly.


MacOS 10 and later keeps all the good stuff under the UNIX rock.


Right, it is at least accessible without the need for any additional
software.


Yep. Again, people who do much of this will get specialized tools, for
convenience.


No, people who do much of this will be very happy with a CLI. To a tech,
a GUI is largely a hindrance, a CLI is fast, efficient and direct.

It was settled out of court after Intel managed to con HP's clueless CEO
into just selling them the whole Alpha lot. It was in the national news
when it occurred.


Was this when she had been there a few months to a year? I vaguely
recall something of the like. It didn't make nearly the splash that the
Microsoft antitrust case made.


Quite possible, I'm not chronologically oriented. It of course didn't
make a big news splash outside tech circles since nobody outside tech
circles has or had the slightest idea what Alpha is. Even though it
involved Intel which people know, it did not involve any CPU that was
actually released at the time so it was still obscure.

Right, and to emulate the CISC processor you need microcode calls to
produce the equivalent of the CISC instructions.


No microcode. RISK processors are hardwired to execute their small
instruction set, and anything more complex (like the CISC instructions
of yore) are implemented using lots of machine-code instructions.


Microcode would be machine-code instructions, just at a very low level
as a translation of a CISC opcode into the slew of RISC opcodes needed
to perform the same task.

And there are people that run multiple platforms.


Indeed there are, however I have not seen many people that run both PCs
and Macs for any length of time, generally one is abandoned because they
do not provide significantly different functionality for most people. Of
course for most specialized applications the PC wins since the apps
aren't available for a Mac.


After watching what the IT folk at work have to do to keep the PCs
working, I'm not interested in spending my life that way. I'd rather be
making chips. (Metal content!) So, I run only MacOS at home, because
they require essentially no maintenance. I will get a PC to run some
PC-only programs, but this PC will not be allowed out of the house,
hiding behind the firewall, and will certainly not be allowed any
internet and email access.


PC maintenance in a corporate environment and in a home environment are
vastly different. Don't get misguided trying to make the comparison as
there is none.

I only do email on one PC and that is the one with the AV software on
it, but all my machines have all the Internet access they need without
any issues. I readily surf out to look up information from my CNC control
PC (not while CNCing of course) and have no problems. I pull my G-code
from the PC in the house where I run the CAD software and also Mach3 for
preview purposes.

You don't see too much 8051 stuff in CCI these days.


I don't know the current market shares, but I still see ads for 8051
tools in the back of electronic design and embedded programming
magazines. I think that the old Motorola 68040, now called ColdFire, is
a market leader, and it's a whole lot nicer to program than an 8051.


I don't pay a whole lot of attention as the PIC line covers what little
I need to do.


Long time ago. Apple actually has had lots of duds, because they are
always trying new things, and by definition new things aren't always a
success. Said another way, if there are no failures, there is no
innovation.


Except their duds mostly get swept under the rug, where Microsoft's duds
are always hyped and paraded.


Hardly. The Mac magazines talk endlessly of such things, just like the
PC magazines in the Windows ecosystem. If you read only the PC mags,
you may not hear the gnashing of teeth in the Mac world.


I don't read the PC mags either since they seem to only be interested in
the latest gamer junk.

Depends on the applications I guess. In the past I've moved things like
DVD-R drives and specialized video and I/O cards. Those whose
applications are software only would not have this need.


I have bought such cards (generally PCI) for Macs on the theory that I
would carry them forward to the next generation. Never happened,
because by the time the machine was old enough to replace or upgrade, I
wanted the new stuff. So, I stopped trying to solve anything but the
problem of the day.


PCI went a long way towards lowering costs and making options available
across multiple platforms. The fact that you can use the same piece of
hardware in your PC, Mac, Sun, IBM, Alpha, etc. with only the drivers
being different was a big help.


I don't know. Appletalk on ethernet was used widely, and didn't cost
$400.


Dunno, getting the one Mac in the office at the old job onto Ethernet
did cost a bloody fortune compared to all the PCs. Think it was one of
the Macs that was a non-standard bus, perhaps nubus?


Perhaps. I don't recall when the transition happened, but Macs had
built-in ethernet long before the PCI bus slots appeared. Macs still
have the AUI connector (that connects to the external module that
interfaces to the coax-cable varieties of ethernet).


AUI has rather gone the way of the Dodo. With 10/100/1000 Ethernet
ports, an external fiber converter is about the only thing you might
want to add.


Also note that Macs had Appletalk and true networking from 1984, long
before PCs discovered any networking.


Mainstream PCs perhaps. Same with graphics since PCs were doing high end
graphics well before Macs even found color.


In a manner of speaking. The reason the Graphics Arts world went to
Macs in the first place was that only Macs handled text and pictures at
all well back then. Don't forget that the comparison was DOS; Windows
was many years in the future.


And the technical world went to PCs because Apple abandoned an open
architecture so they became useless to the tech world.


And Appletalk just worked. That
friend up the street had something like six machines networked in his
home in the late 1980s to early 1990s.


I had several PCs networked at home in that general timeframe as well.
Towards the tail end of that timeframe, but that corresponded to me
having more than one machine at home at all.


PC were not networked in 1984, while Macs came that way, straight from
the box, included in the original design. PCs really only got networked
when Novell came along. Eventually, MS wised up, and built networking
in, and crushed Novell. But this was far in the future.


Macs may have had some form of networking, but few people could afford
more than one making it largely irrelevant. At that point the whole
concept of why you might want more than one machine and a network
connection was still in its infancy. Oddly enough, pre-PC I was
RS232ing stuff between a Vic20 and a C64.

Well the EU antitrust folk may be in left field, but they are the law.


Which is why I would simply pull out of that market if I was Bill. I
certainly would not do business of license my products for use in a
country or countries that tried to force me to bundle competitors
products with my own.


It isn't going to happen.


I don't expect it would, it's just what I'd do. If I buy Windoze I
bought Windoze and I do not want junk from Netscape or Apple or anyone
else included with it. If I choose to run a third party component I will
download or purchase it and install it. I consider it completely absurd
to force one company to include advertising for its competitors in its
products.

Still 100% problem rate for the small sample of Mac desktops I know of.


Maybe you need better friends? They seem to be very hard on computers.
Seriously, if this were representative of the failure rate of any
computer company's products, that company would soon disappear. But
Apple has not disappeared. This is an example of why one should not
generalize from anecdotal reports.


Like anecdotal reports of Macs that aren't being attacked not being
compromised? Or PCs requiring superhuman efforts to make secure?


The Consumer Reports stats are that Macs desktops are far more reliable
than PC desktops,


Most of those Macs that friends had problems with were desktops. I don't
count the one that drowned from an aquarium accident either.

and Mac laptops are in the middle of the crowd with PC
laptops. The stats are also that laptops are far more likely to fail
than desktops.


Anything that is regularly moved around is at far greater risk. They
experience far more shocks, vibration, humidity and thermal extremes.

The fact that nearly all applications for the Mac are also available for
the PC makes it fairly easy to generate a long term savings by
eliminating "non-standard" platforms and the support personnel
associated with them.


IT folk do think that way, and sometimes get away with this. But notice
the subtle switch here. We have gone from "Macs are bad" to "Macs are
too few to bother with (even if they are better)".


More than that, there are some compelling reasons to use Windoze - lower
cost, the business standard, the technical applications - and few reasons
to use Macs: a GUI that is friendly to "creative" types, and nothing else
I can think of.


OK, a Chevy and a Cadillac.


At which point you have the same function and reliability, with the only
difference being in the cosmetics. I for one don't want my computer to
look stylish, I want it to do it's task reliably and inexpensively. I'm
not the type who wants to put a "stylish" Mac on my neatly prop-arranged
desk to impress my friends, I put my server PCs, network gear and big
UPS in a standard 19" equipment rack in the back corner of my garage.


You haven't seen my desk.


No, I haven't, but I have seen many styled centerpiece Mac desks of
"metro-computationals"?

There have been alternatives to Windows for as long as there has been
Windows, hence no monopoly. Windows simply won the popularity contest
and is now attacked because of it.


You can't have it both ways. If all but Windows is insignificant, which
you have been insisting, and Windows' ~90% market share certainly
supports that conclusion, then Windows is a de facto monopoly. There is
no getting around it, and it makes no real difference how this position
was achieved. So, Microsoft will find itself unable to have the
freedoms of its youth.


All but Windoze is indeed insignificant in the desktop arena, but that
still does not make a monopoly. A monopoly requires a lack of options as
was the case with telephones up until recently. Just because one company
has 90% market share does not make it a monopoly when there are a dozen
alternatives available to anyone who chooses to adopt them, it only
makes the company successful.


I wasn't real involved in the computer world pre-midrange, not really
old enough at 36.


So you won't remember when IBM was taken to Federal Court for antitrust.
The case was eventually settled, but one school of thought holds that
the only reason that it was possible for the PC to escape IBM's control
(IBM invented the PC), or Microsoft was able to become the OS vendor,
was that IBM way paralyzed by the antitrust case. It may well be true.


Oh goodie, OS/2 for all (ick!). I once had a voice mail system to manage
that ran on OS/2; while it was reliable enough, it was a real POS to
manage.

Dunno, us technical creative types can be fun too. Certainly my friends
are often amused at the over-the-top solutions I pull out of my
posterior for some insurmountable problems.


Somehow, I don't think that our partying ability will impress the
creative types. One can only hope that they don't giggle too loudly.


We "party" just fine, what we don't do is the social back-stabbing and
sucking-up of the "creative" types.

Pete C.
  #104   Report Post  
Posted to rec.crafts.metalworking,misc.survivalism
Nick Hull
 
Posts: n/a
Default Linux is Driving me $#@!!!! nutz!!!

In article ,
Joseph Gwinn wrote:

That doesn't change the fact that *you* are individually responsible for
your own safety and security. The police may be there after the fact,
but if *you* expose yourself to a predictable threat i.e. walking down
that alley in the bad part of town, or running without adequate security
software, *you* will still suffer the consequences of *your*
carelessness.


Individual responsibility is great for young, strong, and well-armed
people. This may be 5% of the population, in a really good year. But
even they will become old and weak.


'Men come in all sizes, but Colt makes eveners'
95% of the population + a good gun can take care of themselves.

One of the pillars of civilization is the idea, the realization that the
95% of the population that isn't young, strong, and well-armed can pay
~1% of the population to keep the criminal part of the population under
control. The ~1% are called cops.


And who protects the citizens from the cops? A 'good' cop is one who is
silent when a 'bad' cop commits crime. Citizens need cops like slaves
need masters.

--
Free men own guns, slaves don't
www.geocities.com/CapitolHill/5357/
  #105   Report Post  
Posted to rec.crafts.metalworking,misc.survivalism
Joseph Gwinn
 
Posts: n/a
Default Linux is Driving me $#@!!!! nutz!!!

In article ,
"Pete C." wrote:

Joseph Gwinn wrote:


I don't have any specific cites, but it was in the national news when it
occurred. Something along the lines of Intel's new CPU design infringing
on a number of HPaqDEC patents shortly after the Intel engineers had had
some sort of meeting with the HPaqDEC engineers. I guess it occurred
after HP was on the scene and Intel was able to con clueless Carley into
just selling them the whole Alpha thing instead of enforcing the patent
rights.


This has to have started long before Carly appeared. Do you recall the
timeframe?


I'm not very chronologically oriented. Had to be '98 - '00 range I
think.


While DEC may have alleged infringement, did the courts agree?


It didn't get to that point. The suit I believe was filed, but then
dropped after Compaq or HP sold out.


OK. So we will never know the truth of it.


That doesn't change the fact that *you* are individually responsible for
your own safety and security. The police may be there after the fact,
but if *you* expose yourself to a predictable threat i.e. walking down
that alley in the bad part of town, or running without adequate security
software, *you* will still suffer the consequences of *your*
carelessness.


Individual responsibility is great for young, strong, and well-armed
people. This may be 5% of the population, in a really good year.


More than that based on the stats, even in left leaning states.

But even they will become old and weak.


And better armed.


But slow and with poor vision.


One of the pillars of civilization is the idea, the realization that the
95% of the population that isn't young, strong, and well-armed can pay
~1% of the population to keep the criminal part of the population under
control. The ~1% are called cops.


Except that it doesn't really work very well. Indeed with the recent
examples in other parts of the world for inspiration, the criminal
element may begin to realize that the whole police / law structure can
collapse if it comes under attack.


It works pretty well in many countries, those with real governments. We
are fortunate to live in such a country, despite all our complaints
about that government getting too big for its britches.


By the way, MacOS 10.3.6 got CAPP EAL3 certification from NSA in
November 2004. See http://niap.nist.gov/cc-scheme/st/ST_VID4012.html.


No time to look at that at the moment.


It takes no time to read a one-page certificate.


Then claim that Macs benefit from their obscurity, not that they are
more immune / secure which they are not.


Even if you were to be correct, my point was that the average user
doesn't care why the Macs are secure. It's enough that they are secure,
and they are.


They are *not* secure, they are simply not targeted often. Not the same
thing and an important fact to understand.


They are secure in practice, and that's good enough for most people.
Tomorrow is tomorrow's problem.


They need security products if the user wants to help ensure they do not
experience an attack of one sort or another. They may get along fine for
some length of time without security products, but they will eventually
suffer an attack.


Analogy fails. Macs simply don't need antivirus programs, very few have
them, and no infections result.


Very few attacks are made. Give them an attack and the user will see why
it was foolish to operate without sufficient safeguards.


Antivirus products are available for Macs. When the first real attack
happens, people will install such products.


A new and major Windows threat emerged last week, on 1 Jan 06. No patch
available until 9 Jan, according to my company's IT dept, so we have
been instructed to minimize our use of the Internet. The issue is with
WMV files, and perhaps JPEG and GIF. The notice I saw didn't get into
tech details, and I asked the IT folk if this was an ActiveX and/or
Internet Explorer problem, and they said that the problem was deeper
than that.


Yep, a threat came and went, woop-de-do. There was a patch from a third
party source before the official MS patch was released BTW.


They are getting better, but it took years, while all the while the Mac
and Linux/UNIX worlds were peacefully going about their business.


Cars did "just fine" for quite a while before seat
belts became the norm too. Of course now they try to force the asinine
airbags on the unsuspecting masses, kind of like security software that
formats your hard drive if it detects a virus it doesn't know how to
clean.


Car analogy irrelevant. See above.


What above?


I've lost the thread too.

A good mechanical analogy is locks. (Metal content!) No lock is
pickproof, average people can open many cheap locks with a bobby pin, an
average locksmith can pick an average lock in tens of seconds, while
only a handful of very skilled and focused locksmiths have ever managed
to pick a Medeco, and it took these expert locksmiths no less than an
hour per lock, and some locks took days.


Not a real good analogy since in the computer world, the picking can
happen at blinding speeds and unseen, and neither Windoze nor MacOS has
intrusion detection warnings to indicate an attack is in progress. It
makes no difference if it takes 1 minute or 10 minutes to compromise the
OS when it can be done undetected.


The point is that while everything made by man can be undone by man, not
all things are equally easy to undo. And the greater the required
skill, the fewer the people that can participate. This is universally
true.


Invalid data is *not* better than no data.


Consumer Reports published invalid data? That's a very strong
accusation, perhaps a libel. Upon what do you base this statement?


It is invalid because it is very often biased. We all know of their
rigged rollover "test", they may have escaped liability for that mostly
on first amendment grounds, but that in no way exonerated them. They are
frauds and I stand by that.


The first amendment? What does that have to do with it? The first
amendment does not protect one from a libel suit by an aggrieved
billion-dollar manufacturer, with a building full of lawyers to press
their case.


And still unanswered is what source you would instead recommend.


I'm afraid I don't recommend any source other than personal research
which is the only thing that can be relied on to be objective and if
biased, biased in a way that is acceptable.


Acceptable bias is only that which parallels one's own bias?


I looked and they only had the Jan issue.


The article isn't that long, so one could xerox the relevant part from
the public library's copy.


Haven't found the library yet.


As for reliability figures, it's hard to see how asking 80,000 people if
they have had a problem, and reporting the fraction that did have a
problem, could be biased, unless one claims that CR has it in for
Windows machines, and lied.


Indeed, but acquiring that expertise and just acquiring a ready-to-go OS
core are not exactly the same thing. Kind of like hiring a few mechanics
to help you build a race car vs. buying a race car and putting your
paint job on it.


Bad analogy. Apple bought NeXT the company, getting both NeXT the
operating system and the whole development team, both of which Jobs had
built from the ground up. It was a merger.


But NeXT didn't write the OS either. NeXT does not equal BSD.


NeXT the company took the BSD core (which is open source) and added the
stuff (GUI, OO development system, etc) needed to make an operating
system for general use. The unexpected thing was that a major market
for NeXT was the Financial world, where people used NeXT machines to
develop and run complex financial models.


Um, just about any of them. I've got a Linksys router plugged into the
cable modem, and plain old Norton Personal Firewall on the server. The
only thing remotely "configured" is one port map on the router to map
port 80 to the address of the server. This is not brain surgery by any
means and it's worked just fine. The two lightning strikes that took out
two cable modems, two routers and a couple Ethernet ports are another
story.


I've had my share of engineers who had to ask me how to configure their
Linksys, so I'd venture that this knowledge isn't exactly universal.


Well, there is essentially nothing configured on it other than the port
map. What few other things I have configured aren't relevant to
security. Mapping port 80 to a web server is well documented on the
Linksys site, so anyone who can read and follow directions can set it up
in minutes.


I didn't claim that the problem was complex for those who understand
routers.


Nope, but it does appear that Windoze users may be better at keeping
work and personal machines separate.


Because they must?


Nope, just because more of them have separate work machines. They keep
their work and personal lives and machines separate.


Then why all the thunder about the necessity of keeping these separate?


Up until OSX they were more than just hidden.


I think this is circular. I knew where those controls were, even the
controls that nobody but a developer would dare to touch, even if they
did happen to know how to find them. Apparently, they don't teach
Windows gurus where the MacOS controls are, but it's a stretch to
conclude that such controls don't exist.


I don't claim to be a Windoze guru, I simply use it as a tool like
everything else. In Windoze, even during significant changes like NT -
95 - 2K - XP I never had a problem finding the settings I was looking
for within 1 minute. The few times I've worked on a Mac I've dug for
tens of minutes and still not found the proper settings.


I couldn't find them in tens of minutes either, when I first started.


MacOS 10 and later keeps all the good stuff under the UNIX rock.

Right, it is at least accessible without the need for any additional
software.


Yep. Again, people who do much of this will get specialized tools, for
convenience.


No, people who do much of this will be very happy with a CLI. To a tech,
a GUI is largely a hindrance, a CLI is fast, efficient and direct.


A CLI (command line interface) works best with config files that are
plain ascii, while the config files in MacOS 9 and before are binary.

MacOS 10 and later follow the UNIX rule, so CLI is back.


Right, and to emulate the CISC processor you need microcode calls to
produce the equivalent of the CISC instructions.


No microcode. RISC processors are hardwired to execute their small
instruction set, and anything more complex (like the CISC instructions
of yore) is implemented using lots of machine-code instructions.


Microcode would be machine-code instructions, just at a very low level
as a translation of a CISC opcode into the slew of RISC opcodes needed
to perform the same task.


No, microcode is at a deeper hardware level than machine code, although
many people use the term microcode loosely. Microcode is one way one
can design the CPU, and direct wiring is another. In both cases, the
CPU executes the machine code generated by the assembler and/or compiler.


And there are people that run multiple platforms.

Indeed there are, however I have not seen many people that run both PCs
and Macs for any length of time, generally one is abandoned because they
do not provide significantly different functionality for most people. Of
course for most specialized applications the PC wins since the apps
aren't available for a Mac.


After watching what the IT folk at work have to do to keep the PCs
working, I'm not interested in spending my life that way. I'd rather be
making chips. (Metal content!) So, I run only MacOS at home, because
they require essentially no maintenance. I will get a PC to run some
PC-only programs, but this PC will not be allowed out of the house,
hiding behind the firewall, and will certainly not be allowed any
internet and email access.


PC maintenance in a corporate environment and in a home environment are
vastly different. Don't get misguided trying to make the comparison as
there is none.


The difference is that at work, they have an entire IT department full
of full-time experts. At home, it's just me, myself, and I.


I only do email on one PC and that is the one with the AV software on
it, but all my machines have all the Internet access they need without
any issues. I readily surf out to lookup information from my CNC control
PC (not while CNCing of course) and have no problems. I pull my G-code
from the PC in the house where I run the CAD software and also Mach3 for
preview purposes.

You don't see too much 8051 stuff in CCI these days.


I don't know the current market shares, but I still see ads for 8051
tools in the back of electronic design and embedded programming
magazines. I think that the old Motorola 68040, now called ColdFire, is
a market leader, and it's a whole lot nicer to program than an 8051.


I don't pay a whole lot of attention as the PIC line covers what little
I need to do.


Actually, PICs are different from embedded CPUs like 8051 and 68040.
But yes, these appear only buried in disk drives and the like on the
computers of interest here.


I don't know. Appletalk on ethernet was used widely, and didn't cost
$400.

Dunno, getting the one Mac in the office at the old job onto Ethernet
did cost a bloody fortune compared to all the PCs. Think it was one of
the Macs that was a non-standard bus, perhaps nubus?


Perhaps. I don't recall when the transition happened, but Macs had
built-in ethernet long before the PCI bus slots appeared. Macs still
have the AUI connector (that connects to the external module that
interfaces to the coax-cable varieties of ethernet).


AUI has rather gone the way of the Dodo. With 10/100/1000 Ethernet
ports, an external fiber converter is about the only thing you might
want to add.


It's true that AUI ports are yesterday's story, but we were discussing
yesterday.


Also note that Macs had Appletalk and true networking from 1984, long
before PCs discovered any networking.

Mainstream PCs perhaps. Same with graphics since PCs were doing high end
graphics well before Macs even found color.


In a manner of speaking. The reason the Graphics Arts world went to
Macs in the first place was that only Macs handled text and pictures at
all well back then. Don't forget that the comparison was DOS; Windows
was many years in the future.


And the technical world went to PCs because Apple abandoned an open
architecture so they became useless to the tech world.


Huh? Apple never had an open architecture to abandon back then. It was
all closed; that was the complaint. Now, with MacOS 10, it is mostly
open, being based on BSD UNIX. The BSD core of MacOS 10 (called darwin)
is open source.


And Appletalk just worked. That
friend up the street had something like six machines networked in his
home in the late 1980s to early 1990s.

I had several PCs networked at home in that general timeframe as well.
Towards the tail end of that timeframe, but that corresponded to me
having more than one machine at home at all.


PCs were not networked in 1984, while Macs came that way, straight from
the box, included in the original design. PCs really only got networked
when Novell came along. Eventually, MS wised up, and built networking
in, and crushed Novell. But this was far in the future.


Macs may have had some form of networking, but few people could afford
more than one making it largely irrelevant.


Oh, I don't know about that. My friend up the street had everything
networked. He is not a techie, but had no problem getting it all to
work. I never had to help him, and I lived about 0.5 miles down the
road.


At that point the whole
concept of why you might want more than one machine and a network
connection was still in its infancy. Oddly enough, pre-PC I was
RS232ing stuff between a Vic20 and a C64.


Networking was happening in mainframes, but was pretty expensive. A few
companies developed low-cost but useful versions.



Still 100% problem rate for the small sample of Mac desktops I know of.


Maybe you need better friends? They seem to be very hard on computers.
Seriously, if this were representative of the failure rate of any
computer company's products, that company would soon disappear. But
Apple has not disappeared. This is an example of why one should not
generalize from anecdotal reports.


Like anecdotal reports of Macs that aren't being attacked not being
compromised? Or PCs requiring superhuman efforts to make secure?


One person's personal experience is anecdotal, as no single person can
see that much of the whole. The aggregated reports from many people can
achieve statistical significance, if there are enough people involved.


There have been alternatives to Windows for as long as there has been
Windows, hence no monopoly. Windows simply won the popularity contest
and is now attacked because of it.


You can't have it both ways. If all but Windows is insignificant, which
you have been insisting, and Windows' ~90% market share certainly
supports that conclusion, then Windows is a de facto monopoly. There is
no getting around it, and it makes no real difference how this position
was achieved. So, Microsoft will find itself unable to have the
freedoms of its youth.


All but Windoze is indeed insignificant in the desktop arena, but that
still does not make a monopoly. A monopoly requires a lack of options as
was the case with telephones up until recently. Just because one company
has 90% market share does not make it a monopoly when there are a dozen
alternatives available to anyone who chooses to adopt them, it only
makes the company successful.


In the Microsoft antitrust case, the Federal Appeals Court specifically
found that Microsoft was a monopoly, in law and in fact. This was an
explicit finding, not a passing inference.


I wasn't real involved in the computer world pre-midrange, not really
old enough at 36.


So you won't remember when IBM was taken to Federal Court for antitrust.
The case was eventually settled, but one school of thought holds that
the only reason that it was possible for the PC to escape IBM's control
(IBM invented the PC), or Microsoft was able to become the OS vendor,
was that IBM was paralyzed by the antitrust case. It may well be true.


Oh goodie, OS2 for all (ick!). I once had a voice mail system to manage
that ran on OS2, while it was reliable enough, it was a real POS to
manage.


OS2 and the Microchannel Bus were IBM's attempt to wrest control back.
Didn't work; people were on to them.


Joe Gwinn


  #106   Report Post  
Posted to rec.crafts.metalworking,misc.survivalism
Pete C.
 
Posts: n/a
Default Linux is Driving me $#@!!!! nutz!!!

Joseph Gwinn wrote:

In article ,
"Pete C." wrote:

Joseph Gwinn wrote:


I don't have any specific cites, but it was in the national news when it
occurred. Something along the lines of Intel's new CPU design infringing
on a number of HPaqDEC patents shortly after the Intel engineers had had
some sort of meeting with the HPaqDEC engineers. I guess it occurred
after HP was on the scene and Intel was able to con clueless Carly into
just selling them the whole Alpha thing instead of enforcing the patent
rights.

This has to have started long before Carly appeared. Do you recall the
timeframe?


I'm not very chronologically oriented. Had to be '98 - '00 range I
think.


While DEC may have alleged infringement, did the courts agree?


It didn't get to that point. The suit I believe was filed, but then
dropped after Compaq or HP sold out.


OK. So we will never know the truth of it.


Probably not, but the courts aren't much for the truth anyway, more for
what suits the political climate.


That doesn't change the fact that *you* are individually responsible for
your own safety and security. The police may be there after the fact,
but if *you* expose yourself to a predictable threat i.e. walking down
that alley in the bad part of town, or running without adequate security
software, *you* will still suffer the consequences of *your*
carelessness.

Individual responsibility is great for young, strong, and well-armed
people. This may be 5% of the population, in a really good year.


More than that based on the stats, even in left leaning states.

But even they will become old and weak.


And better armed.


But slow and with poor vision.


Dunno, I've seen plenty of old folks who are fast and have good vision.
Nothing is universal.


One of the pillars of civilization is the idea, the realization that the
95% of the population that isn't young, strong, and well-armed can pay
~1% of the population to keep the criminal part of the population under
control. The ~1% are called cops.


Except that it doesn't really work very well. Indeed with the recent
examples in other parts of the world for inspiration, the criminal
element may begin to realize that the whole police / law structure can
collapse if it comes under attack.


It works pretty well in many countries, those with real governments. We
are fortunate to live in such a country, despite all our complaints
about that government getting too big for its britches.


It works in those countries only because the vast majority of the
population agrees with the system. The examples we have seen however
make one wonder what would happen if even a few hundred people in one of
the "working" countries began actively attacking the police and
military. Not a pretty picture...



By the way, MacOS 10.3.6 got CAPP EAL3 certification from NSA in
November 2004. See http://niap.nist.gov/cc-scheme/st/ST_VID4012.html.


No time to look at that at the moment.


It takes no time to read a one-page certificate.


"Apple Mac OS X v10.3.6 and Mac OS X Server v10.3.6 provides a moderate
level of independently assured security in a conventional TOE and is
suitable for a cooperative non-hostile environment." rather says it all,
it does ok in a non-hostile environment. That's pretty much what I've
said, it does ok because it isn't attacked often.


Then claim that Macs benefit from their obscurity, not that they are
more immune / secure which they are not.

Even if you were to be correct, my point was that the average user
doesn't care why the Macs are secure. It's enough that they are secure,
and they are.


They are *not* secure, they are simply not targeted often. Not the same
thing and an important fact to understand.


They are secure in practice, and that's good enough for most people.
Tomorrow is tomorrow's problem.


That may be true, but the claims need to match the reality, and claiming
the Mac is immune is false.


They need security products if the user wants to help ensure they do not
experience an attack of one sort or another. They may get along fine for
some length of time without security products, but they will eventually
suffer an attack.

Analogy fails. Macs simply don't need antivirus programs, very few have
them, and no infections result.


Very few attacks are made. Give them an attack and the user will see why
it was foolish to operate without sufficient safeguards.


Antivirus products are available for Macs. When the first real attack
happens, people will install such products.


Ok, and that is fine if not the best practice, however claiming immunity
when there is none is a lie.


A new and major Windows threat emerged last week, on 1 Jan 06. No patch
available until 9 Jan, according to my company's IT dept, so we have
been instructed to minimize our use of the Internet. The issue is with
WMV files, and perhaps JPEG and GIF. The notice I saw didn't get into
tech details, and I asked the IT folk if this was an ActiveX and/or
Internet Explorer problem, and they said that the problem was deeper
than that.


Yep, a threat came and went, woop-de-do. There was a patch from a third
party source before the official MS patch was released BTW.


They are getting better, but it took years, while all the while the Mac
and Linux/UNIX worlds were peacefully going about their business.


Not at all true, the Mac world was going about their business simply as
a result of their obscurity, but the Linux/UNIX worlds went through the
same junk Windoze has, they just didn't get mainstream publicity.


Cars did "just fine" for quite a while before seat
belts became the norm too. Of course now they try to force the asinine
airbags on the unsuspecting masses, kind of like security software that
formats your hard drive if it detects a virus it doesn't know how to
clean.

Car analogy irrelevant. See above.


What above?


I've lost the thread too.


Unraveling...


A good mechanical analogy is locks. (Metal content!) No lock is
pickproof, average people can open many cheap locks with a bobby pin, an
average locksmith can pick an average lock in tens of seconds, while
only a handful of very skilled and focused locksmiths have ever managed
to pick a Medeco, and it took these expert locksmiths no less than an
hour per lock, and some locks took days.


Not a real good analogy since in the computer world, the picking can
happen at blinding speeds and unseen, and neither Windoze nor MacOS has
intrusion detection warnings to indicate an attack is in progress. It
makes no difference if it takes 1 minute or 10 minutes to compromise the
OS when it can be done undetected.


The point is that while everything made by man can be undone by man, not
all things are equally easy to undo. And the greater the required
skill, the fewer the people that can participate. This is universally
true.


Bad analogy, skill is not a requirement in the computer attack world as
evidenced by the script kiddies. Unlike the physical skill required to
pick that Medeco lock, the tools required to attack a computer can be
readily transferred to non-skilled users. One skilled user is all that
is required to identify a vulnerability and then disseminate the code to
exploit it to the script kiddies.


Invalid data is *not* better than no data.

Consumer Reports published invalid data? That's a very strong
accusation, perhaps a libel. Upon what do you base this statement?


It is invalid because it is very often biased. We all know of their
rigged rollover "test", they may have escaped liability for that mostly
on first amendment grounds, but that in no way exonerated them. They are
frauds and I stand by that.


The first amendment? What does that have to do with it? The first
amendment does not protect one from a libel suit by an aggrieved
billion-dollar manufacturer, with a building full of lawyers to press
their case.


Then how did they get off the hook? It was 100% clear that CR had rigged
the test to parameters outside real world conditions.


And still unanswered is what source you would instead recommend.


I'm afraid I don't recommend any source other than personal research
which is the only thing that can be relied on to be objective and if
biased, biased in a way that is acceptable.


Acceptable bias is only that which parallels one's own bias?


Yes. When assessing products yourself, the only bias that can enter is
your own which is inherently acceptable to you.


I looked and they only had the Jan issue.

The article isn't that long, so one could xerox the relevant part from
the public library's copy.


Haven't found the library yet.


As for reliability figures, it's hard to see how asking 80,000 people if
they have had a problem, and reporting the fraction that did have a
problem, could be biased, unless one claims that CR has it in for
Windows machines, and lied.


They may or may not have lied, however their sample, while relatively
large, only represents the responses of their readers, which is not a
valid sampling of the computer user population as a whole. Not seeing
the report I also don't know if it made any attempt to validate actual
hardware problems vs. user error.


Indeed, but acquiring that expertise and just acquiring a ready-to-go OS
core are not exactly the same thing. Kind of like hiring a few mechanics
to help you build a race car vs. buying a race car and putting your
paint job on it.

Bad analogy. Apple bought NeXT the company, getting both NeXT the
operating system and the whole development team, both of which Jobs had
built from the ground up. It was a merger.


But NeXT didn't write the OS either. NeXT does not equal BSD.


NeXT the company took the BSD core (which is open source) and added the
stuff (GUI, OO development system, etc) needed to make an operating
system for general use. The unexpected thing was that a major market
for NeXT was the Financial world, where people used NeXT machines to
develop and run complex financial models.


Dunno, I've been at a large bank the last 7+ years and I've not seen a
trace of anything NeXT, or anything Apple for that matter. Our server
counts are in the tens of thousands BTW.


Um, just about any of them. I've got a Linksys router plugged into the
cable modem, and plain old Norton Personal Firewall on the server. The
only thing remotely "configured" is one port map on the router to map
port 80 to the address of the server. This is not brain surgery by any
means and it's worked just fine. The two lightning strikes that took out
two cable modems, two routers and a couple Ethernet ports are another
story.

I've had my share of engineers who had to ask me how to configure their
Linksys, so I'd venture that this knowledge isn't exactly universal.


Well, there is essentially nothing configured on it other than the port
map. What few other things I have configured aren't relevant to
security. Mapping port 80 to a web server is well documented on the
Linksys site, so anyone who can read and follow directions can set it up
in minutes.


I didn't claim that the problem was complex for those who understand
routers.


I don't think you need to understand routers to lookup the "how do I
setup the router so the world can see my web server" FAQ on the Linksys
site (not sure that's the actual title).
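
For anyone doing the same thing with a Linux box acting as the router
instead of a Linksys, a minimal sketch of the equivalent port map using
iptables might look like this (the ppp0 interface name and the
192.168.0.5 server address are placeholders for illustration only;
substitute your own):

  # send incoming TCP port 80 on the WAN interface to the internal web server
  iptables -t nat -A PREROUTING -i ppp0 -p tcp --dport 80 \
           -j DNAT --to-destination 192.168.0.5:80
  # and let the forwarded traffic through the FORWARD chain
  iptables -A FORWARD -i ppp0 -p tcp -d 192.168.0.5 --dport 80 -j ACCEPT

Either way the idea is the same: open one narrow hole and leave
everything else closed.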


Nope, but it does appear that Windoze users may be better at keeping
work and personal machines separate.

Because they must?


Nope, just because more of them have separate work machines. They keep
their work and personal lives and machines separate.


Then why all the thunder about the necessity of keeping these separate?


Because it's the prudent thing to do no matter what OS you run. If
you're going to run software from unknown sources on the machine in
pursuit of your music pirating activities, it is the prudent thing to
*not* do that on the machine you rely on for work.


Up until OSX they were more than just hidden.

I think this is circular. I knew where those controls were, even the
controls that nobody but a developer would dare to touch, even if they
did happen to know how to find them. Apparently, they don't teach
Windows gurus where the MacOS controls are, but it's a stretch to
conclude that such controls don't exist.


I don't claim to be a Windoze guru, I simply use it as a tool like
everything else. In Windoze, even during significant changes like NT -
95 - 2K - XP I never had a problem finding the settings I was looking
for within 1 minute. The few times I've worked on a Mac I've dug for
tens of minutes and still not found the proper settings.


I couldn't find them in tens of minutes either, when I first started.


Which confirms that the Mac UI is deficient. Even when the Windoze UI
changed from NT to 2K I still found the settings I wanted in seconds.



MacOS 10 and later keeps all the good stuff under the UNIX rock.

Right, it is at least accessible without the need for any additional
software.

Yep. Again, people who do much of this will get specialized tools, for
convenience.


No, people who do much of this will be very happy with a CLI. To a tech,
a GUI is largely a hindrance, a CLI is fast, efficient and direct.


A CLI (command line interface) works best with config files that are
plain ascii, while the config files in MacOS 9 and before are binary.


Not at all, CLI utilities manipulate config files (binary or ascii) just
fine; for example: $ ucx set config name_service /server=xxx.xxx.xxx.xxx
/domain=xxxx.xxx


MacOS 10 and later follow the UNIX rule, so CLI is back.


Indeed, but MacOS 10 and later *are* UNIX, with just a Mac UI shell, so
you may as well just run any of the UNIX variants with any of the
various shells, and save the cost of proprietary hardware.


Right, and to emulate the CISC processor you need microcode calls to
produce the equivalent of the CISC instructions.

No microcode. RISC processors are hardwired to execute their small
instruction set, and anything more complex (like the CISC instructions
of yore) is implemented using lots of machine-code instructions.


Microcode would be machine-code instructions, just at a very low level
as a translation of a CISC opcode into the slew of RISC opcodes needed
to perform the same task.


No, microcode is at a deeper hardware level than machine code, although
many people use the term microcode loosely. Microcode is one way one
can design the CPU, and direct wiring is another. In both cases, the
CPU executes the machine code generated by the assembler and/or compiler.


It's also a way to emulate opcodes the hardware itself doesn't handle
directly. RISC opcodes execute directly and CISC opcodes call microcode
to do the actual RISC opcodes to get the task done.


And there are people that run multiple platforms.

Indeed there are, however I have not seen many people that run both PCs
and Macs for any length of time, generally one is abandoned because they
do not provide significantly different functionality for most people. Of
course for most specialized applications the PC wins since the apps
aren't available for a Mac.

After watching what the IT folk at work have to do to keep the PCs
working, I'm not interested in spending my life that way. I'd rather be
making chips. (Metal content!) So, I run only MacOS at home, because
they require essentially no maintenance. I will get a PC to run some
PC-only programs, but this PC will not be allowed out of the house,
hiding behind the firewall, and will certainly not be allowed any
internet and email access.


PC maintenance in a corporate environment and in a home environment are
vastly different. Don't get misguided trying to make the comparison as
there is none.


The difference is that at work, they have an entire IT department full
of full-time experts. At home, it's just me, myself, and I.


They also have a company full of users trying to find the latest way to
screw up a machine. If they had a company full of Macs they would still
have the same headaches. It takes a dedicated IT dept to maintain a few
thousand desktop computers regardless of OS. It was only different in
the days of the "dumb" terminal.


I only do email on one PC and that is the one with the AV software on
it, but all my machines have all the Internet access they need without
any issues. I readily surf out to lookup information from my CNC control
PC (not while CNCing of course) and have no problems. I pull my G-code
from the PC in the house where I run the CAD software and also Mach3 for
preview purposes.

You don't see too much 8051 stuff in CCI these days.

I don't know the current market shares, but I still see ads for 8051
tools in the back of electronic design and embedded programming
magazines. I think that the old Motorola 68040, now called ColdFire, is
a market leader, and it's a whole lot nicer to program than an 8051.


I don't pay a whole lot of attention as the PIC line covers what little
I need to do.


Actually, PICs are different from embedded CPUs like 8051 and 68040.
But yes, these appear only buried in disk drives and the like on the
computers of interest here.


Only marginally different, PICs have internal memory and peripherals
while CPUs like the 8051 and 68040 require separate memory and
peripherals. PICs are just more efficient for small apps.


I don't know. Appletalk on ethernet was used widely, and didn't cost
$400.

Dunno, getting the one Mac in the office at the old job onto Ethernet
did cost a bloody fortune compared to all the PCs. Think it was one of
the Macs that was a non-standard bus, perhaps nubus?

Perhaps. I don't recall when the transition happened, but Macs had
built-in ethernet long before the PCI bus slots appeared. Macs still
have the AUI connector (that connects to the external module that
interfaces to the coax-cable varieties of ethernet).


AUI has rather gone the way of the Dodo. With 10/100/1000 Ethernet
ports, an external fiber converter is about the only thing you might
want to add.


It's true that AUI ports are yesterday's story, but we were discussing
yesterday.


Well, the Mac in question did not have an AUI connection as standard and
required an expensive Ethernet card to get it networked. This was in I
think about '97 or so, I don't recall the Mac details, but it was
probably only a couple years old at most.


Also note that Macs had Appletalk and true networking from 1984, long
before PCs discovered any networking.

Mainstream PCs perhaps. Same with graphics since PCs were doing high end
graphics well before Macs even found color.

In a manner of speaking. The reason the Graphics Arts world went to
Macs in the first place was that only Macs handled text and pictures at
all well back then. Don't forget that the comparison was DOS; Windows
was many years in the future.


And the technical world went to PCs because Apple abandoned an open
architecture so they became useless to the tech world.


Huh? Apple never had an open architecture to abandon back then. It was
all closed; that was the complaint. Now, with MacOS 10, it is mostly
open, being based on BSD UNIX. The BSD core of MacOS 10 (called darwin)
is open source.


Um, what about the then very popular II+? People built I/O cards for the
II+ and wrote assembler to control what they built. Apple abandoned the
open architecture with the Lisa and then Mac and to some extent the IIc.



And Appletalk just worked. That
friend up the street had something like six machines networked in his
home in the late 1980s to early 1990s.

I had several PCs networked at home in that general timeframe as well.
Towards the tail end of that timeframe, but that corresponded to me
having more than one machine at home at all.

PCs were not networked in 1984, while Macs came that way, straight from
the box, included in the original design. PCs really only got networked
when Novell came along. Eventually, MS wised up, and built networking
in, and crushed Novell. But this was far in the future.


Macs may have had some form of networking, but few people could afford
more than one making it largely irrelevant.


Oh, I don't know about that. My friend up the street had everything
networked. He is not a techie, but had no problem getting it all to
work. I never had to help him, and I lived about 0.5 miles down the
road.


What does that have to do with affording multiple expensive Macs?


At that point the whole
concept of why you might want more than one machine and a network
connection was still in its infancy. Oddly enough, pre-PC I was
RS232ing stuff between a Vic20 and a C64.


Networking was happening in mainframes, but was pretty expensive. A few
companies developed low-cost but useful versions.


And DEC was developing Ethernet along with who was it? They were also
developing DSSI which became SCSI and oddly enough came full circle with
SCSI's return to differential connections like DSSI had to begin with.


Still 100% problem rate for the small sample of Mac desktops I know of.

Maybe you need better friends? They seem to be very hard on computers.
Seriously, if this were representative of the failure rate of any
computer company's products, that company would soon disappear. But
Apple has not disappeared. This is an example of why one should not
generalize from anecdotal reports.


Like anecdotal reports of Macs that aren't being attacked not being
compromised? Or PCs requiring superhuman efforts to make secure?


One person's personal experience is anecdotal, as no single person can
see that much of the whole. The aggregated reports from many people can
achieve statistical significance, if there are enough people involved.


They can, but it takes more than just volume. A large sample from one
specific population only shows their impressions and may not reflect the
truth.


There have been alternatives to Windows for as long as there has been
Windows, hence no monopoly. Windows simply won the popularity contest
and is now attacked because of it.

You can't have it both ways. If all but Windows is insignificant, which
you have been insisting, and Windows' ~90% market share certainly
supports that conclusion, then Windows is a de facto monopoly. There is
no getting around it, and it makes no real difference how this position
was achieved. So, Microsoft will find itself unable to have the
freedoms of its youth.


All but Windoze is indeed insignificant in the desktop arena, but that
still does not make a monopoly. A monopoly requires a lack of options as
was the case with telephones up until recently. Just because one company
has 90% market share does not make it a monopoly when there are a dozen
alternatives available to anyone who chooses to adopt them, it only
makes the company successful.


In the Microsoft antitrust case, the Federal Appeals Court specifically
found that Microsoft was a monopoly, in law and in fact. This was an
explicit finding, not a passing inference.


Well, I disagree with them and as I've noted courts are not as much
about finding the truth as they are about finding what is acceptable in
the political climate. It is my belief and contention that there can be
no monopoly when there are alternatives.


I wasn't real involved in the computer world pre-midrange, not really
old enough at 36.

So you won't remember when IBM was taken to Federal Court for antitrust.
The case was eventually settled, but one school of thought holds that
the only reason that it was possible for the PC to escape IBM's control
(IBM invented the PC), or Microsoft was able to become the OS vendor,
was that IBM was paralyzed by the antitrust case. It may well be true.


Oh goodie, OS2 for all (ick!). I once had a voice mail system to manage
that ran on OS2, while it was reliable enough, it was a real POS to
manage.


OS2 and the Microchannel Bus were IBM's attempt to wrest control back.
Didn't work; people were on to them.


Thankfully Microchannel and even Token Ring have gone the way of the
dodo.

Pete C.
  #107   Report Post  
Posted to rec.crafts.metalworking
Martin H. Eastburn
 
Posts: n/a
Default Linux is Driving me $#@!!!! nutz!!!

Week after this coming looks like a cold trend is to be here.
So prepare Pete - going to be back to normal fast!

Martin
Martin Eastburn
@ home at Lions' Lair with our computer lionslair at consolidated dot net
NRA LOH & Endowment Member
NRA Second Amendment Task Force Charter Founder



Wayne Cook wrote:
On Tue, 03 Jan 2006 02:06:02 GMT, "Pete C."
wrote:


zadoc wrote:

On Tue, 03 Jan 2006 00:32:56 GMT, "Pete C."
wrote:

[snip, see original]


Don't you hate those annoying little distractions? I've got the
potential threat of grass fires to contend with here in the Dallas area.

Pete C.

The fires in Texas and Oklahoma are even making the TV news in Sydney.
As it is midwinter there, isn't this a bit unusual?

Are people there starting to wonder about climatic changes and global
warming?

One wonders what midsummer will be like.

Cheers,


Dunno, it's apparently a bit warmer than normal. I bailed out of the
frozen northeast to come down here and I'm loving the weather. It's 57
at the moment, was near 70 today, and I just talked to my mother in CT
where it's 33 and snowing heavily. This past summer had a few 104 degree
days, but the humidity was like 20% so as long as you were in the shade
it was just fine.



The areas that are having problems are used to more moisture than
they got last year. As for temperature, well, it's staying warmer than
it usually does. Normally we will get cold spells followed by warm
spells. This year there's been very few cold spells (at least that's
how the old timers would put it).

Up here it's slightly drier than it really should be but not
extremely so. We're used to being rather dry up here. As for the fires
well we had a pretty bad one right next to my house Sun during all the
wind. In fact it did a good job of trying to burn the whole town down
and controlling it was a bear with 40-50 mph winds. But we are no
strangers to that type of fire up here thus all the fire departments
have a mutual assistance pact and there's even a joint command
structure worked out already. Thus they can fight fires like this as
effectively as possible. What really messed them up though was the front
coming through right in the middle of the fire. When it started, the
wind was out of the west at a pretty steady 30-40 mph. This spread the
fire down the river where it was difficult to fight. Then the wind
turned out of the north and at times got even higher. This caused the
start and end of the first burn to take off in a southern direction
(fortunately the middle of the fire had been controlled enough that it
didn't take off, real fortunate for me since my house would have been
right in line for it). They managed to stop the west end of the fire
right at I-40. If it had managed to jump that it would have ended up in
town. Another good fortune was the rain that came after the front came
through (though it took an hour after the wind changed for it to come).
If it hadn't come then they would have had a much harder time of
getting the rather spread-out (by that time) fire under control.

Wayne Cook
Shamrock, TX
http://members.dslextreme.com/users/waynecook/index.htm


  #108   Report Post  
Posted to rec.crafts.metalworking
Pete C.
 
Posts: n/a
Default Linux is Driving me $#@!!!! nutz!!!

"Martin H. Eastburn" wrote:

Week after this coming looks like a cold trend is to be here.
So prepare Pete - going to be back to normal fast!

Martin
Martin Eastburn
@ home at Lions' Lair with our computer lionslair at consolidated dot net
NRA LOH & Endowment Member
NRA Second Amendment Task Force Charter Founder


Ick. I hate cold weather. Not as much as I hate wet weather, but close.

Pete C.
  #109   Report Post  
Posted to rec.crafts.metalworking
Martin H. Eastburn
 
Posts: n/a
Default Linux is Driving me $#@!!!! nutz!!!

I'm with you there - but I lived in a rain forest for 17 years and sorta
got used to the ferns growing on my boots!

Actually, I kinda like rain once a week. Keeps the dust down and fire back.
Martin [ down in deep East Texas ]
Martin Eastburn
@ home at Lions' Lair with our computer lionslair at consolidated dot net
NRA LOH & Endowment Member
NRA Second Amendment Task Force Charter Founder



Pete C. wrote:
"Martin H. Eastburn" wrote:

Week after this coming looks like a cold trend is to be here.
So prepare Pete - going to be back to normal fast!

Martin
Martin Eastburn
@ home at Lions' Lair with our computer lionslair at consolidated dot net
NRA LOH & Endowment Member
NRA Second Amendment Task Force Charter Founder



Ick. I hate cold weather. Not as much as I hate wet weather, but close.

Pete C.


  #110   Report Post  
Posted to rec.crafts.metalworking,misc.survivalism
Joseph Gwinn
 
Posts: n/a
Default Linux is Driving me $#@!!!! nutz!!!

In article ,
"Pete C." wrote:

Joseph Gwinn wrote:

In article ,
"Pete C." wrote:

Joseph Gwinn wrote:

One of the pillars of civilization is the idea, the realization that the
95% of the population that isn't young, strong, and well-armed can pay
~1% of the population to keep the criminal part of the population under
control. The ~1% are called cops.

Except that it doesn't really work very well. Indeed with the recent
examples in other parts of the world for inspiration, the criminal
element may begin to realize that the whole police / law structure can
collapse if it comes under attack.


It works pretty well in many countries, those with real governments. We
are fortunate to live in such a country, despite all our complaints
about that government getting too big for its britches.


It works in those countries only because the vast majority of the
population agrees with the system. The examples we have seen however
make one wonder what would happen if even a few hundred people in one of
the "working" countries began actively attacking the police and
military. Not a pretty picture...


It happens from time to time. They are crushed.


By the way, MacOS 10.3.6 got CAPP EAL3 certification from NSA in
November 2004. See
http://niap.nist.gov/cc-scheme/st/ST_VID4012.html.

No time to look at that at the moment.


It takes no time to read a one-page certificate.


"Apple Mac OS X v10.3.6 and Mac OS X Server v10.3.6 provides a moderate
level of independently assured security in a conventional TOE and is
suitable for a cooperative non-hostile environment." rather says it all,
it does ok in a non-hostile environment. That's pretty much what I've
said, it does ok because it isn't attacked often.


Standard posterior-protection boilerplate. But they did pass the CAPP
certification. That said, CAPP (and Orange book before it) is silent on
network security issues.


The point is that while everything made by man can be undone by man, not
all things are equally easy to undo. And the greater the required
skill, the fewer the people that can participate. This is universally
true.


Bad analogy, skill is not a requirement in the computer attack world as
evidenced by the script kiddies. Unlike the physical skill required to
pick that Medeco lock, the tools required to attack a computer can be
readily transferred to non-skilled users. One skilled user is all that
is required to identify a vulnerability and then disseminate the code to
exploit it to the script kiddies.


The same is true of mechanical locks, if the attack is really that
simple. Try googling on the exploit to open a kryptonite bike lock
using a bic pen barrel.


Invalid data is *not* better than no data.

Consumer Reports published invalid data? That's a very strong
accusation, perhaps a libel. Upon what do you base this statement?

It is invalid because it is very often biased. We all know of their
rigged rollover "test", they may have escaped liability for that mostly
on first amendment grounds, but that in no way exonerated them. They are
frauds and I stand by that.


The first amendment? What does that have to do with it? The first
amendment does not protect one from a libel suit by an aggrieved
billion-dollar manufacturer, with a building full of lawyers to press
their case.


Then how did they get off the hook? It was 100% clear that CR had rigged
the test to parameters outside real world conditions.


How can you be so sure that the allegation was true?

The first amendment governs only the US Government, not ordinary
citizens and companies. And it confers zero protection against a libel
suit.


And still unanswered is what source you would instead recommend.

I'm afraid I don't recommend any source other than personal research
which is the only thing that can be relied on to be objective and if
biased, biased in a way that is acceptable.


Acceptable bias is only that which parallels one's own bias?


Yes. When assessing products yourself, the only bias that can enter is
your own which is inherently acceptable to you.


Um. I would hope for better, to learn things I wasn't born just
*knowing*.


I looked and they only had the Jan issue.

The article isn't that long, so one could xerox the relevant part from
the public library's copy.

Haven't found the library yet.


As for reliability figures, it's hard to see how asking 80,000 people if
they have had a problem, and reporting the fraction that did have a
problem, could be biased, unless one claims that CR has it in for
Windows machines, and lied.


They may or may not have lied, however their sample, while relatively
large, only represents the responses of their readers, which is not a
valid sampling of the computer user population as a whole. Not seeing
the report I also don't know if it made any attempt to validate actual
hardware problems vs. user error.


So, go see the report. It would take quite the conspiracy for all those
people to have told the same lie.


NeXT the company took the BSD core (which is open source) and added the
stuff (GUI, OO development system, etc) needed to make an operating
system for general use. The unexpected thing was that a major market
for NeXT was the Financial world, where people used NeXT machines to
develop and run complex financial models.


Dunno, I've been at a large bank the last 7+ years and I've not seen a
trace of anything NeXT, or anything Apple for that matter. Our server
counts are in the tens of thousands BTW.


Finance is not the same thing as banking. Think Wall Street and rocket
scientists.



Up until OSX they were more than just hidden.

I think this is circular. I knew where those controls were, even the
controls that nobody but a developer would dare to touch, even if they
did happen to know how to find them. Apparently, they don't teach
Windows gurus where the MacOS controls are, but it's a stretch to
conclude that such controls don't exist.

I don't claim to be a Windoze guru, I simply use it as a tool like
everything else. In Windoze, even during significant changes like NT -
95 - 2K - XP I never had a problem finding the settings I was looking
for within 1 minute. The few times I've worked on a Mac I've dug for
tens of minutes and still not found the proper settings.


I couldn't find them in tens of minutes either, when I first started.


Which confirms that the Mac UI is deficient. Even when the Windoze UI
changed from NT to 2K I still found the settings I wanted in seconds.


Not exactly. You have proven only that you don't realize how much you
know. And I have to ask the gurus where to find various Windows
controls.


MacOS 10 and later keeps all the good stuff under the UNIX rock.

Right, it is at least accessible without the need for any additional
software.

Yep. Again, people who do much of this will get specialized tools, for
convenience.

No, people who do much of this will be very happy with a CLI. To a tech,
a GUI is largely a hindrance, a CLI is fast, efficient and direct.


A CLI (command line interface) works best with config files that are
plain ascii, while the config files in MacOS 9 and before are binary.


Not at all, CLI utilities manipulate config files (binary or ascii) just
fine; for example: $ ucx set config name_service /server=xxx.xxx.xxx.xxx
/domain=xxxx.xxx


In UNIX, the standard approach is for config files to be plain ascii,
which one manipulates using a standard text editor.
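
As a concrete illustration of that plain-ascii approach (a sketch only;
the file and the nameserver address below are just examples), the same
change one would make in a text editor can be scripted from the shell:

  # show the current nameserver entries in a plain-text config file
  grep '^nameserver' /etc/resolv.conf
  # rewrite them in place (GNU sed; BSD/Mac sed wants -i '' instead)
  sed -i 's/^nameserver .*/nameserver 192.168.0.1/' /etc/resolv.conf

Binary config files, by contrast, can only be changed through a
dedicated utility that understands their format.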


MacOS 10 and later follow the UNIX rule, so CLI is back.


Indeed, but MacOS 10 and later *are* UNIX, with just a Mac UI shell, so
you may as well just run any of the UNIX variants with any of the
various shells, and save the cost of proprietary hardware.


If you like that rough a ride, yes.


No, microcode is at a deeper hardware level than machine code, although
many people use the term microcode loosely. Microcode is one way one
can design the CPU, and direct wiring is another. In both cases, the
CPU executes the machine code generated by the assembler and/or compiler.


It's also a way to emulate opcodes the hardware itself doesn't handle
directly. RISC opcodes execute directly and CISC opcodes call microcode
to do the actual RISC opcodes to get the task done.


Not necessarily, but books are written on this.


PC maintenance in a corporate environment and in a home environment are
vastly different. Don't get misguided trying to make the comparison as
there is none.


The difference is that at work, they have an entire IT department full
of full-time experts. At home, it's just me, myself, and I.


They also have a company full of users trying to find the latest way to
screw up a machine. If they had a company full of Macs they would still
have the same headaches. It takes a dedicated IT dept to maintain a few
thousand desktop computers regardless of OS. It was only different in
the days of the "dumb" terminal.


Possibly, but what has this to do with the original question?


Actually, PICs are different from embedded CPUs like 8051 and 68040.
But yes, these appear only buried in disk drives and the like on the
computers of interest here.


Only marginally different, PICs have internal memory and peripherals
while CPUs like the 8051 and 68040 require separate memory and
peripherals. PICs are just more efficient for small apps.


Well, the ColdFire et al do have onboard memory, et al. As I recall,
the PICs had a sharply limited instruction set, to simplify programming,
as the theory went.


AUI has rather gone the way of the Dodo. With 10/100/1000 Ethernet
ports, an external fiber converter is about the only thing you might
want to add.


It's true that AUI ports are yesterday's story, but we were discussing
yesterday.


Well, the Mac in question did not have an AUI connection as standard and
required an expensive Ethernet card to get it networked. This was in I
think about '97 or so, I don't recall the Mac details, but it was
probably only a couple years old at most.


Well, I missed that phase.


In a manner of speaking. The reason the Graphics Arts world went to
Macs in the first place was that only Macs handled text and pictures at
all well back then. Don't forget that the comparison was DOS; Windows
was many years in the future.

And the technical world went to PCs because Apple abandoned an open
architecture so they became useless to the tech world.


Huh? Apple never had an open architecture to abandon back then. It was
all closed; that was the complaint. Now, with MacOS 10, it is mostly
open, being based on BSD UNIX. The BSD core of MacOS 10 (called darwin)
is open source.


Um, what about the then very popular II+? People built I/O cards for the
II+ and wrote assembler to control what they built. Apple abandoned the
open architecture with the Lisa and then Mac and to some extent the IIc.


People could build for the nubus too. The interface was fully
documented; I had the specs. Not many did, though. Probably because the
nubus was too complex for small companies and homebrew folk to handle.


Macs may have had some form of networking, but few people could afford
more than one making it largely irrelevant.


Oh, I don't know about that. My friend up the street had everything
networked. He is not a techie, but had no problem getting it all to
work. I never had to help him, and I lived about 0.5 miles down the
road.


What does that have to do with affording multiple expensive Macs?


My friend up the street isn't wealthy, and yet he managed. He was
running a business, and it made business sense to him. There is more to
the cost of something than its price. He didn't want to have to become
an IT guy - no value added for him, it's pure overhead.


At that point the whole
concept of why you might want more than one machine and a network
connection was still in its infancy. Oddly enough, pre-PC I was
RS232ing stuff between a Vic20 and a C64.


Networking was happening in mainframes, but was pretty expensive. A few
companies developed low-cost but useful versions.


And DEC was developing Ethernet along with who was it? They were also
developing DSSI which became SCSI and oddly enough came full circle with
SCSIs return to differential connections like DSSI had to begin with.


Xerox and Intel, if memory serves. Didn't realize that SCSI came from
DEC, though. They did their level best to cripple it in their own
computer line, mainly because it threatened their control of disks on
VAX systems. DEC disks cost twice the market price.


Still 100% problem rate for the small sample of Mac desktops I know
of.

Maybe you need better friends? They seem to be very hard on computers.
Seriously, if this were representative of the failure rate of any
computer company's products, that company would soon disappear. But
Apple has not disappeared. This is an example of why one should not
generalize from anecdotal reports.

Like anecdotal reports of Macs that aren't being attacked not being
compromised? Or PCs requiring superhuman efforts to make secure?


One person's personal experience is anecdotal, as no single person can
see that much of the whole. The aggregated reports from many people can
achieve statistical significance, if there are enough people involved.


They can, but it takes more than just volume. A large sample from one
specific population only shows their impressions and may not reflect the
truth.


Well, when one gets to 80,000 people, it gets pretty close to truth, as
close as one can get.


In the Microsoft antitrust case, the Federal Appeals Court specifically
found that Microsoft was a monopoly, in law and in fact. This was an
explicit finding, not a passing inference.


Well, I disagree with them and as I've noted courts are not as much
about finding the truth as they are about finding what is acceptable in
the political climate. It is my belief and contention that there can be
no monopoly when there are alternatives.


Ah, well, they don't care - they are the law.


Joe Gwinn


  #111   Report Post  
Posted to rec.crafts.metalworking,misc.survivalism
Pete C.
 
Posts: n/a
Default Linux is Driving me $#@!!!! nutz!!!

Joseph Gwinn wrote:

In article ,
"Pete C." wrote:

Joseph Gwinn wrote:


It works pretty well in many countries, those with real governments. We
are fortunate to live in such a country, despite all our complaints
about that government getting too big for its britches.


It works in those countries only because the vast majority of the
population agrees with the system. The examples we have seen however
make one wonder what would happen if even a few hundred people in one of
the "working" countries began actively attacking the police and
military. Not a pretty picture...


It happens from time to time. They are crushed.


I haven't seen instances that met the full scale guerilla war criteria
in one of the "working" countries. I suspect that those conditions would
cause a sizable reduction in the ranks of the regular police and the
military would have great difficulty operating among civilian
population. This fits what we have been seeing in Iraq and one of the
modern "working" countries would present an even greater challenge.


"Apple Mac OS X v10.3.6 and Mac OS X Server v10.3.6 provides a moderate
level of independently assured security in a conventional TOE and is
suitable for a cooperative non-hostile environment." rather says it all,
it does ok in a non-hostile environment. That's pretty much what I've
said, it does ok because it isn't attacked often.


Standard posterior-protection boilerplate. But they did pass the CAPP
certification. That said, CAPP (and Orange book before it) is silent on
network security issues.


So basically all it's saying is that MacOS provides moderate protection
against a casual walk-up intrusion i.e. login username and password,
something that Windoze also provides.


The point is that while everything made by man can be undone by man, not
all things are equally easy to undo. And the greater the required
skill, the fewer the people that can participate. This is universally
true.


Bad analogy, skill is not a requirement in the computer attack world as
evidenced by the script kiddies. Unlike the physical skill required to
pick that Medeco lock, the tools required to attack a computer can be
readily transferred to non-skilled users. One skilled user is all that
is required to identify a vulnerability and then disseminate the code to
exploit it to the script kiddies.


The same is true of mechanical locks, if the attack is really that
simple. Try googling on the exploit to open a kryptonite bike lock
using a bic pen barrel.


Still doesn't really compare to the ease of transfer to script kiddies
to launch large scale automated attacks.

It is invalid because it is very often biased. We all know of their
rigged rollover "test", they may have escaped liability for that mostly
on first amendment grounds, but that in no way exonerated them. They are
frauds and I stand by that.

The first amendment? What does that have to do with it? The first
amendment does not protect one from a libel suit by an aggrieved
billion-dollar manufacturer, with a building full of lawyers to press
their case.


Then how did they get off the hook? It was 100% clear that CR had rigged
the test to parameters outside real world conditions.


How can you be so sure that the allegation was true?


I watched their video of their tests and it was pretty obvious. One of
the CR spokespeople also made a passing semi admission that their test
was pretty extreme if I recall.


The first amendment governs only the US Government, not ordinary
citizens and companies. And it confers zero protection against a libel
suit.


I think CR escaped liability by the thinnest of hairs due to the fact
that while their test was extreme and not really representative of real
world conditions, it was not physically rigged like the infamous gas
tank video from another source.

Extreme steering input by a professional driver, specifically designed
to roll the vehicle may be outside the region of what would reasonably
be expected in the real world from an average driver, but it is
theoretically possible I guess.


And still unanswered is what source you would instead recommend.

I'm afraid I don't recommend any source other than personal research
which is the only thing that can be relied on to be objective and if
biased, biased in a way that is acceptable.

Acceptable bias is only that which parallels one's own bias?


Yes. When assessing products yourself, the only bias that can enter is
your own which is inherently acceptable to you.


Um. I would hope for better, to learn things I wasn't born just
*knowing*.


And that is what you'd get if your research skills are decent. The main
thing is that you will not fall victim to the biases of others in any
case.


They may or may not have lied, however their sample, while relatively
large only represents the responses of their readers which is not a
valid sampling of the computer user population as a whole. Not seeing
the report I also don't know if it made any attempt to validate actual
hardware problems vs. user error.


So, go see the report. It would take quite the conspiracy for all those
people to have told the same lie.


Different groups of people can and do have different experiences than
those of a truly random sampling of the population. It's not a lie, it's
simply a function of a non-random sample and a sample consisting only of
CR subscribers who took the time to respond is not a random sample.


NeXT the company took the BSD core (which is open source) and added the
stuff (GUI, OO development system, etc) needed to make an operating
system for general use. The unexpected thing was that a major market
for NeXT was the Financial world, where people used NeXT machines to
develop and run complex financial models.


Dunno, I've been at a large bank the last 7+ years and I've not seen a
trace of anything NeXT, or anything Apple for that matter. Our server
counts are in the tens of thousands BTW.


Finance is not the same thing as banking. Think Wall Street and rocket
scientists.


A large bank includes those areas, retail, commercial, investment, high
value, trading, etc.

I don't claim to be a Windoze guru, I simply use it as a tool like
everything else. In Windoze, even during significant changes like NT -
95 - 2K - XP I never had a problem finding the settings I was looking
for within 1 minute. The few times I've worked on a Mac I've dug for
tens of minutes and still not found the proper settings.

I couldn't find them in tens of minutes either, when I first started.


Which confirms that the Mac UI is deficient. Even when the Windoze UI
changed from NT to 2K I still found the settings I wanted in seconds.


Not exactly. You have proven only that you don't realize how much you
know. And I have to ask the gurus where to find various Windows
controls.


I perhaps don't realize how much I know, I seem to have a defective ego.
The point though is still that when the Windoze UI changed significantly
I still found what I needed quickly where I was not able to find what I
need quickly under the Mac UI.

No, people who do much of this will be very happy with a CLI. To a tech,
a GUI is largely a hindrance, a CLI is fast, efficient and direct.

A CLI (command line interface) works best with config files that are
plain ascii, while the config files in MacOS 9 and before are binary.


Not at all, CLI utilities manipulate config files (binary or ascii) just
fine. For example:

$ ucx set config name_service /server=xxx.xxx.xxx.xxx /domain=xxxx.xxx


In UNIX, the standard approach is for config files to be plain ascii,
which one manipulates using a standard text editor.
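
(To make that concrete, a rough sketch only: on a typical Linux box the
DNS resolver settings are exactly that sort of plain text file, so any
editor or one-liner can inspect or change them. Exact paths can vary by
distro.)

$ cat /etc/resolv.conf                 # the nameserver lines, plain ascii
$ grep nameserver /etc/resolv.conf     # pull out just the lines of interest
$ su -c 'vi /etc/resolv.conf'          # edit as root with any text editor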


That example was a reference to the VMS TCP/IP services package, which
derived from the Unix (Tru64) side of the house. Many of the config
files are text and readily editable, but the CLI utilities make it a bit
easier and in some cases one CLI command will update more than one file.


MacOS 10 and later follow the UNIX rule, so CLI is back.


Indeed, but MacOS 10 and later *are* UNIX, with just a Mac UI shell, so
you may as well just run any of the UNIX variants with any of the
various shells, and save the cost of proprietary hardware.


If you like that rough a ride, yes.


I don't find the CDE shell or any of the others I've tried to be "rough"
by any means. The Mac UI is a little more cosmetically polished than
most, though not all, of the shells, but I don't count cutesy stylized icons
as a feature that in any way improves function, any more than a cutesy
stylized case for the machine improves function.


No, microcode is at a deeper hardware level than machine code, although
many people use the term microcode loosely. Microcode is one way one
can design the CPU, and direct wiring is another. In both cases, the
CPU executes the machine code generated by the assembler and/or compiler.


It's also a way to emulate opcodes the hardware itself doesn't handle
directly. RISC opcodes execute directly and CISC opcodes call microcode
to do the actual RISC opcodes to get the task done.


Not necessarily, but books are written on this.


I've got a couple of those Alpha architecture books somewhere. Somewhere
I ended up with "Writing OpenVMS Alpha device drivers in C" which sounds
pretty scary.


PC maintenance in a corporate environment and in a home environment are
vastly different. Don't get misguided trying to make the comparison as
there is none.

The difference is that at work, they have an entire IT department full
of full-time experts. At home, it's just me, myself, and I.


They also have a company full of users trying to find the latest way to
screw up a machine. If they had a company full of Macs they would still
have the same headaches. It takes a dedicated IT dept to maintain a few
thousand desktop computers regardless of OS. It was only different in
the days of the "dumb" terminal.


Possibly, but what has this to do with the original question?


Dunno, it went a bit sideways, but the point still is that it is just
not possible to draw a comparison between the support needs of a
corporate environment and a home environment.


Actually, PICs are different from embedded CPUs like 8051 and 68040.
But yes, these appear only buried in disk drives and the like on the
computers of interest here.


Only marginally different, PICs have internal memory and peripherals
while CPUs like the 8051 and 68040 require separate memory and
peripherals. PICs are just more efficient for small apps.


Well, the ColdFire et al do have onboard memory, et al. As I recall,
the PICs had a sharply limited instruction set, to simplify programming,
as the theory went.


PICs have a RISC instruction set, not a limited one. One unique feature
they do have (or did have, I haven't looked at the latest PICs) is
single-word opcodes: the instruction word and instruction memory were 12
or 14 bits wide, and the entire opcode plus its parameter, if any, fit in
a single word. This, along with the lack of interrupts on the earlier
PICs, made the instruction timing very predictable. It didn't really have
anything to do with simplifying programming.



AUI has rather gone the way of the Dodo. With 10/100/1000 Ethernet
ports, an external fiber converter is about the only thing you might
want to add.

It's true that AUI ports are yesterday's story, but we were discussing
yesterday.


Well, the Mac in question did not have an AUI connection as standard and
required an expensive Ethernet card to get it networked. This was in I
think about '97 or so, I don't recall the Mac details, but it was
probably only a couple years old at most.


Well, I missed that phase.


Lucky you. Not having followed Macs closely since I never liked the UI,
I don't know how long that phase lasted, only that it did exist.


Huh? Apple never had an open architecture to abandon back then. It was
all closed; that was the complaint. Now, with MacOS 10, it is mostly
open, being based on BSD UNIX. The BSD core of MacOS 10 (called darwin)
is open source.


Um, what about the then very popular II+? People built I/O cards for the
II+ and wrote assembler to control what they built. Apple abandoned the
open architecture with the Lisa and then Mac and to some extent the IIc.


People could build for the nubus too. The interface was fully
documented; I had the specs. Not many did, though. Probably because the
nubus was too complex for small companies and homebrew folk to handle.


Probably because the tech world abandoned the Mac well before nubus came
along and didn't look back. Homebrew folks are doing PCI these days
which is at least as complicated as nubus was.


Macs may have had some form of networking, but few people could afford
more than one making it largely irrelevant.

Oh, I don't know about that. My friend up the street had everything
networked. He is not a techie, but had no problem getting it all to
work. I never had to help him, and I lived about 0.5 miles down the
road.


What does that have to do with affording multiple expensive Macs?


My friend up the street isn't wealthy, and yet he managed. He was
running a business, and it made business sense to him. There is more to
the cost of something than its price. He didn't want to have to become
an IT guy - no value added for him, it's pure overhead.


Ok, so it was a business even if it was home based, which puts it in a
different realm. In those days few non-business home users could afford
more than one machine, be it PC or Mac.


At that point the whole
concept of why you might want more than one machine and a network
connection was still in it's infancy. Oddly enough, pre-PC I was
RS232ing stuff between a Vic20 and a C64.

Networking was happening in mainframes, but was pretty expensive. A few
companies developed low-cost but useful versions.


And DEC was developing Ethernet along with who was it? They were also
developing DSSI which became SCSI and oddly enough came full circle with
SCSIs return to differential connections like DSSI had to begin with.


Xerox and Intel, if memory serves. Didn't realize that SCSI came from
DEC, though. They did their level best to cripple it in their own
computer line, mainly because it threatened their control of disks on
VAX systems. DEC disks cost twice the market price.


Not sure about that last part, the DEC disk controllers (HSD/HSJ/HSG)
have used SCSI disks for quite a while. The direct DSSI disks and the
SDI stuff were quite a ways back. There was good reason for not using
SCSI directly for quite a while until SCSI came back around to a
differential bus. Any of the non-desktop machines really need the bus
length afforded by the differential DSSI or other busses like CI in
order to connect to their storage and storage controllers.

Like anecdotal reports of Macs that aren't being attacked not being
compromised? Or PCs requiring superhuman efforts to make secure?

One person's personal experience is anecdotal, as no single person can
see that much of the whole. The aggregated reports from many people can
achieve statistical significance, if there are enough people involved.


They can, but it takes more than just volume. A large sample from one
specific population only shows their impressions and may not reflect the
truth.


Well, when one gets to 80,000 people, it gets pretty close to truth, as
close as one can get.


Only with a random sample. Sample size alone does not guarantee an
unbiased result. Ask 80,000 people of religion X if they believe in god
X and you'll get a 100% positive response. Ask 80,000 people sampled at
random from the world's population if they believe in god X and you'll
get perhaps a 20% positive response.
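
The back-of-envelope arithmetic behind that (a sketch, assuming an ideal
random sample): the usual rough margin of error is about 1/sqrt(n), so
80,000 responses pin random sampling error down to a fraction of a
percent, but sample size does nothing to shrink bias from a skewed
sampling frame.

$ echo 'scale=4; 1/sqrt(80000)' | bc -l    # prints .0035, i.e. roughly 0.35%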


In the Microsoft antitrust case, the Federal Appeals Court specifically
found that Microsoft was a monopoly, in law and in fact. This was an
explicit finding, not a passing inference.


Well, I disagree with them and as I've noted courts are not as much
about finding the truth as they are about finding what is acceptable in
the political climate. It is my belief and contention that there can be
no monopoly when there are alternatives.


Ah, well, they don't care - they are the law.


Still doesn't make them correct. I recall American Express complaining
that the big M/C / Visa monopoly was shutting them out, this at a time
when AmEx didn't even offer a comparable product. How the hell can a
monopoly be shutting you out of a market when you don't even offer a
product for that market?

Pete C.
  #112   Report Post  
Posted to rec.crafts.metalworking,misc.survivalism
Davebt
 
Posts: n/a
Default Linux is Driving me $#@!!!! nutz!!!

[total snip!]

When I had a dual booting system - Win98 and Suse 9.0Pro, the Suse
installation found all my ethernet home network, and my win partitions (the
latter with a little editing of fstab). Then I upgraded to 9.1, and it
couldn't find the network at all! Nor did 9.2, and now 10.0 says - "what
network?"!

I think this is backwards forgetfulness, not backwards compatibility!

Dave (UK)


  #113   Report Post  
Posted to rec.crafts.metalworking,misc.survivalism
Tamper proof
 
Posts: n/a
Default Linux is Driving me $#@!!!! nutz!!!

Davebt wrote:

[total snip!]

When I had a dual booting system - Win98 and Suse 9.0Pro, the Suse
installation found all my ethernet home network, and my win partitions (the
latter with a little editing of fstab). Then I upgraded to 9.1, and it
couldn't find the network at all! Nor did 9.2, and now 10.0 says - "what
network?"!

I think this is backwards forgetfulness, not backwards compatibility!

Dave (UK)


Just the opposite here with SuSE. I've had it since 7.3 and dual-booted
until 9.0 and it *never* had any problem setting up and seeing my home
network. Then with 9.2 I completely erased M$' crapware off my system and am
100% Linux, now up to 9.3 and still the same.
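
(For what it's worth, a rough first round of checks when an upgrade
"loses" the network, using era-typical tools; the config path is the
usual SuSE location but may differ by release.)

$ /sbin/lspci | grep -i ethernet        # is the NIC still seen on the PCI bus?
$ /sbin/ifconfig -a                     # is eth0 there but just unconfigured?
$ ls /etc/sysconfig/network/ifcfg-*     # per-interface settings live here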
  #114   Report Post  
Posted to rec.crafts.metalworking,misc.survivalism
Joseph Gwinn
 
Posts: n/a
Default Linux is Driving me $#@!!!! nutz!!!

We seem to have achieved the perfect cyclic debate. I will let you have
the last word after this.

In article ,
"Pete C." wrote:

Joseph Gwinn wrote:

In article ,
"Pete C." wrote:

Joseph Gwinn wrote:


It works pretty well in many countries, those with real governments.
We are fortunate to live in such a country, despite all our complaints
about that government getting too big for its britches.

It works in those countries only because the vast majority of the
population agrees with the system. The examples we have seen however
make one wonder what would happen if even a few hundred people in one of
the "working" countries began actively attacking the police and
military. Not a pretty picture...


It happens from time to time. They are crushed.


I haven't seen instances that met the full scale guerilla war criteria
in one of the "working" countries. I suspect that those conditions would
cause a sizable reduction in the ranks of the regular police and the
military would have great difficulty operating among civilian
population. This fits what we have been seeing in Iraq and one of the
modern "working" countries would present an even greater challenge.


True enough, but the fact that some countries are ramshackle doesn't
prove that policing doesn't work. In fact, the US is pretty peaceful.
Watching the world news is pretty convincing.


"Apple Mac OS X v10.3.6 and Mac OS X Server v10.3.6 provides a moderate
level of independently assured security in a conventional TOE and is
suitable for a cooperative non-hostile environment." rather says it all,
it does ok in a non-hostile environment. That's pretty much what I've
said, it does ok because it isn't attacked often.


Standard posterior-protection boilerplate. But they did pass the CAPP
certification. That said, CAPP (and Orange book before it) is silent on
network security issues.


So basically all it's saying is that MacOS provides moderate protection
against a casual walk-up intrusion i.e. login username and password,
something that Windoze also provides.


That's not a fair summary of what I said at all. But I won't repeat the
arguments.


The point is that while everything made by man can be undone by man,
not all things are equally easy to undo. And the greater the required
skill, the fewer the people that can participate. This is universally
true.

Bad analogy, skill is not a requirement in the computer attack world as
evidenced by the script kiddies. Unlike the physical skill required to
pick that Medeco lock, the tools required to attack a computer can be
readily transferred to non-skilled users. One skilled user is all that
is required to identify a vulnerability and then disseminate the code to
exploit it to the script kiddies.


The same is true of mechanical locks, if the attack is really that
simple. Try googling on the exploit to open a kryptonite bike lock
using a bic pen barrel.


Still doesn't really compare to the ease of transfer to script kiddies
to launch large scale automated attacks.


True that automated attacks aren't so easy with mechanical locks (there
are automated safe-crackers), but this brings me back to my original
point, that script kiddies really cannot do that much with MacOS and
Linux/UNIX.


The first amendment? What does that have to do with it? The first
amendment does not protect one from a libel suit by an aggrieved
billion-dollar manufacturer, with a building full of lawyers to press
their case.

Then how did they get off the hook? It was 100% clear that CR had rigged
the test to parameters outside real world conditions.


How can you be so sure that the allegation was true?


I watched their video of their tests and it was pretty obvious. One of
the CR spokespeople also made a passing semi admission that their test
was pretty extreme if I recall.


How did you know which video to believe?


The first amendment governs only the US Government, not ordinary
citizens and companies. And it confers zero protection against a libel
suit.


I think CR escaped liability by the thinnest of hairs due to the fact
that while their test was extreme and not really representative of real
world conditions, it was not physically rigged like the infamous gas
tank video from another source.

Extreme steering input by a professional driver, specifically designed
to roll the vehicle may be outside the region of what would reasonably
be expected in the real world from an average driver, but it is
theoretically possible I guess.


Bingo! By law, this is *not* a libel or a fraud, whatever one may think
of it. It wasn't even close legally.

But in any event, I don't see the applicability to a survey of their
subscribers as to their experience with various products, computers
included.


And still unanswered is what source you would instead recommend.

I'm afraid I don't recommend any source other than personal research
which is the only thing that can be relied on to be objective and if
biased, biased in a way that is acceptable.

Acceptable bias is only that which parallels one's own bias?

Yes. When assessing products yourself, the only bias that can enter is
your own which is inherently acceptable to you.


Um. I would hope for better, to learn things I wasn't born just
*knowing*.


And that is what you'd get if your research skills are decent. The main
thing is that you will not fall victim to the biases of others in any
case.


Only to fall victim to one's own biases? This seems a bit pointless.

All humans are biased in one way or another. So, we must simply deal
with people as we find them.


They may or may not have lied, however their sample, while relatively
large only represents the responses of their readers which is not a
valid sampling of the computer user population as a whole. Not seeing
the report I also don't know if it made any attempt to validate actual
hardware problems vs. user error.


So, go see the report. It would take quite the conspiracy for all those
people to have told the same lie.


Different groups of people can and do have different experiences than
those of a truly random sampling of the population. It's not a lie, it's
simply a function of a non-random sample and a sample consisting only of
CR subscribers who took the time to respond is not a random sample.


With that size of a statistical sample, the biases and lies will cancel
out, unless there is a massive conspiracy, and yet 80,000 people managed
to keep the secret perfectly. Seems a bit remote.


NeXT the company took the BSD core (which is open source) and added the
stuff (GUI, OO development system, etc) needed to make an operating
system for general use. The unexpected thing was that a major market
for NeXT was the Financial world, where people used NeXT machines to
develop and run complex financial models.

Dunno, I've been at a large bank the last 7+ years and I've not seen a
trace of anything NeXT, or anything Apple for that matter. Our server
counts are in the tens of thousands BTW.


Finance is not the same thing as banking. Think Wall Street and rocket
scientists.


A large bank includes those areas, retail, commercial, investment, high
value, trading, etc.


Yes, but the market was Wall Street rocket scientists, the sort of
people who will never work for a bank, with the possible exception of
Citibank.


Not exactly. You have proven only that you don't realize how much you
know. And I have to ask the gurus where to find various Windows
controls.


I perhaps don't realize how much I know, I seem to have a defective ego.
The point though is still that when the Windoze UI changed significantly
I still found what I needed quickly where I was not able to find what I
need quickly under the Mac UI.


It's not the lack of ego that's the problem, it's the lack of education.


MacOS 10 and later follow the UNIX rule, so CLI is back.

Indeed, but MacOS 10 and later *are* UNIX, with just a Mac UI shell, so
you may as well just run any of the UNIX variants with any of the
various shells, and save the cost of proprietary hardware.


If you like that rough a ride, yes.


I don't find the CDE shell or any of the others I've tried to be "rough"
by any means. The Mac UI is a little more cosmetically polished than
most, though not all, of the shells, but I don't count cutesy stylized icons
as a feature that in any way improves function, any more than a cutesy
stylized case for the machine improves function.


Again, the issue is the likely experience of gurus versus ordinary,
non-technical users.


PC maintenance in a corporate environment and in a home environment
are
vastly different. Don't get misguided trying to make the comparison
as
there is none.

The difference is that at work, they have an entire IT department full
of full-time experts. At home, it's just me, myself, and I.

They also have a company full of users trying to find the latest way to
screw up a machine. If they had a company full of Macs they would still
have the same headaches. It takes a dedicated IT dept to maintain a few
thousand desktop computers regardless of OS. It was only different in
the days of the "dumb" terminal.


Possibly, but what has this to do with the original question?


Dunno, it went a bit sideways, but the point still is that it is just
not possible to draw a comparison between the support needs of a
corporate environment and a home environment.


I don't know why it's different - in neither case does the user want
their computer crippled by an infection.



Huh? Apple never had an open architecture to abandon back then. It
was all closed; that was the complaint. Now, with MacOS 10, it is mostly
open, being based on BSD UNIX. The BSD core of MacOS 10 (called
darwin) is open source.

Um, what about the then very popular II+? People built I/O cards for the
II+ and wrote assembler to control what they built. Apple abandoned the
open architecture with the Lisa and then Mac and to some extent the IIc.


People could build for the nubus too. The interface was fully
documented; I had the specs. Not many did, though. Probably because the
nubus was too complex for small companies and homebrew folk to handle.


Probably because the tech world abandoned the Mac well before nubus came
along and didn't look back. Homebrew folks are doing PCI these days
which is at least as complicated as nubus was.


What is this "tech world"? Macs lasted where I worked well past the
nubus days. IT only managed to drive them out when Apple went through
the bad patch, just before Jobs returned, bringing NeXT. IT was gunning
for a single platform for years, so Apple's problems were the excuse,
not the reason.


And DEC was developing Ethernet along with who was it? They were also
developing DSSI which became SCSI and oddly enough came full circle with
SCSIs return to differential connections like DSSI had to begin with.


Xerox and Intel, if memory serves. Didn't realize that SCSI came from
DEC, though. They did their level best to cripple it in their own
computer line, mainly because it threatened their control of disks on
VAX systems. DEC disks cost twice the market price.


Not sure about that last part, the DEC disk controllers (HSD/HSJ/HSG)
have used SCSI disks for quite a while. The direct DSSI disks and the
SDI stuff were quite a ways back. There was good reason for not using
SCSI directly for quite a while until SCSI came back around to a
differential bus. Any of the non-desktop machines really need the bus
length afforded by the differential DSSI or other busses like CI in
order to connect to their storage and storage controllers.


In the 1980s I was on a radar project that was considering VAXs as the
main computer. But we got nowhere because the only SCSI controller
available for this computer was for the PDP-11 bus, with two adapters,
one to get to the BI bus, the other to get from BI to the system bus
(whose name I forget). The max theoretical bandwidth of this SCSI
controller was 100 kilobytes/sec (on the PDP-11 bus). My home machine
of the day, a Mac SE, did 600 kilobytes/sec. We planned to interface
the radar hardware by making it look like a big SCSI tape drive. I
looked at this Rube Goldberg setup, and decided that I would never
manage to get it to work. We never did figure out how to hook the VAX
to the radar, and gave up, instead using Harris computers. With a SCSI
interface to the radar.


Like anecdotal reports of Macs that aren't being attacked not being
compromised? Or PCs requiring superhuman efforts to make secure?

One person's personal experience is anecdotal, as no single person can
see that much of the whole. The aggregated reports from many people
can
achieve statistical significance, if there are enough people involved.

They can, but it takes more than just volume. A large sample from one
specific population only shows their impressions and may not reflect the
truth.


Well, when one gets to 80,000 people, it gets pretty close to truth, as
close as one can get.


Only with a random sample. Sample size alone does not guarantee an
unbiased result. Ask 80,000 people of religion X if they believe in god
X and you'll get a 100% positive response. Ask 80,000 people sampled at
random from the world's population if they believe in god X and you'll
get perhaps a 20% positive response.


Somehow, I don't think this analysis is relevant to the subscribers of
Consumer Reports.


In the Microsoft antitrust case, the Federal Appeals Court specifically
found that Microsoft was a monopoly, in law and in fact. This was an
explicit finding, not a passing inference.

Well, I disagree with them and as I've noted courts are not as much
about finding the truth as they are about finding what is acceptable in
the political climate. It is my belief and contention that there can be
no monopoly when there are alternatives.


Ah, well, they don't care - they are the law.


Still doesn't make them correct. I recall American Express complaining
that the big M/C / Visa monopoly was shutting them out, this at a time
when AmEx didn't even offer a comparable product. How the hell can a
monopoly be shutting you out of a market when you don't even offer a
product for that market?


Whatever. They are the law, and they have the sword.

You might wish to read the appeals court ruling. It's quite
informative, and can be read by a non-lawyer. The judges had to have
known that their ruling would be widely read outside of the legal world.
The ruling is on the web, and can be downloaded for free.

Here is one source: http://www.esp.org/misc/legal/USCA-DC_00-5212.pdf.

Here is another, in plain ascii text:
http://pacer.cadc.uscourts.gov/common/opinions/200106/00-5212a.txt.


Joe Gwinn
  #115   Report Post  
Posted to rec.crafts.metalworking,misc.survivalism
Pete C.
 
Posts: n/a
Default Linux is Driving me $#@!!!! nutz!!!

Joseph Gwinn wrote:

We seem to have achieved the perfect cyclic debate.


Mostly.

I will let you have
the last word after this.


Alrighty.


How can you be so sure that the allegation was true?


I watched their video of their tests and it was pretty obvious. One of
the CR spokespeople also made a passing semi admission that their test
was pretty extreme if I recall.


How did you know which video to believe?


It was the CR video that I saw that convinced me their "test" was far
outside real world parameters.

They may or may not have lied, however their sample, while relatively
large only represents the responses of their readers which is not a
valid sampling of the computer user population as a whole. Not seeing
the report I also don't know if it made any attempt to validate actual
hardware problems vs. user error.

So, go see the report. It would take quite the conspiracy for all those
people to have told the same lie.


Different groups of people can and do have different experiences than
those of a truly random sampling of the population. It's not a lie, it's
simply a function of a non-random sample and a sample consisting only of
CR subscribers who took the time to respond is not a random sample.


With that size of a statistical sample, the biases and lies will cancel
out, unless there is a massive conspiracy, and yet 80,000 people managed
to keep the secret perfectly. Seems a bit remote.

NeXT the company took the BSD core (which is open source) and added the
stuff (GUI, OO development system, etc) needed to make an operating
system for general use. The unexpected thing was that a major market
for NeXT was the Financial world, where people used NeXT machines to
develop and run complex financial models.

Dunno, I've been at a large bank the last 7+ years and I've not seen a
trace of anything NeXT, or anything Apple for that matter. Our server
counts are in the tens of thousands BTW.

Finance is not the same thing as banking. Think Wall Street and rocket
scientists.


A large bank includes those areas, retail, commercial, investment, high
value, trading, etc.


Yes, but the market was Wall Street rocket scientists, the sort of
people who will never work for a bank, with the possible exception of
Citibank.


I'm not sure of the latest rankings, but we are definitely in the top
10, probably top 5.

What is this "tech world"? Macs lasted where I worked well past the
nubus days. IT only managed to drive them out when Apple went through
the bad patch, just before Jobs returned, bringing NeXT. IT was gunning
for a single platform for years, so Apple's problems were the excuse,
not the reason.


The "tech world" is those running technical applications, as opposed to
the "tech support world" supporting desktop machines.

In the 1980s I was on a radar project that was considering VAXs as the
main computer. But we got nowhere because the only SCSI controller
available for this computer was for the PDP-11 bus, with two adapters,
one to get to the BI bus, the other to get from BI to the system bus
(whose name I forget). The max theoretical bandwidth of this SCSI
controller was 100 kilobytes/sec (on the PDP-11 bus). My home machine
of the day, a Mac SE, did 600 kilobytes/sec. We planned to interface
the radar hardware by making it look like a big SCSI tape drive. I
looked at this Rube Goldberg setup, and decided that I would never
manage to get it to work. We never did figure out how to hook the VAX
to the radar, and gave up, instead using Harris computers. With a SCSI
interface to the radar.


Which model VAX? Perhaps SCSI was the wrong thing to look at as there
were parallel I/O cards available for Q-bus at least, DMA I think too.
They were used in a check processing system by Banktec and I'm sure
plenty of other applications.

Only with a random sample. Sample size alone does not guarantee an
unbiased result. Ask 80,000 people of religion X if they believe in god
X and you'll get a 100% positive response. Ask 80,000 people sampled at
random from the worlds population if they believe in god X and you'll
get perhaps a 20% positive response.


Somehow, I don't think this analysis is relevant to the subscribers of
Consumer Reports.


I think it is. I think the type of person who has faith in CR represents
one specific segment of the population. It certainly excludes the many
like myself who saw the bias in their reports prior and then discounted
them entirely after the rollover fiasco.



I just saw the CNN.com piece on the new Intel Macs. One thing semi Apple
related that I find troubling is the massive volume of money being
poured into essentially useless (certainly not practical) toys like MP3
players and cell phone ring tones. It points to the sad state of
personal financial management / responsibility in this country that
people can pour that much money into useless junk and at the same time
not manage to pay their bills.

Pete C.


  #116   Report Post  
Posted to rec.crafts.metalworking
Abrasha
 
Posts: n/a
Default Linux is Driving me $#@!!!! nutz!!!

Gunner wrote:


What the hell am I doing wrong???????

Mommy!!!!!!!!!!!!!!!!!!!

Gunner


Who knows what you're doing wrong. You may never get to the bottom of it.

I've been struggling to get a couple of brand new Winblow$ XP computers
to run fine for the better part of a year now.

I just spent a few days at Mac World.

My next computer will be a Mac. Not that I think that my computer woes
will be completely over, but at least as far as multimedia is concerned
all will be a lot easier.

Have any of you ever tried to do digital video on a PC? A major headache
at best. It's a lot easier on the Mac, because of full integration.

Gunner, unless you absolutely must have a Linux box, get a Mac. Lots
of open source software is available for Macs also.

--
Abrasha
http://www.abrasha.com