Woodworking (rec.woodworking) Discussion forum covering all aspects of working with wood. All levels of expertise are encouraged to participate.

#121 - Duane Bozarth

Dave Hinz wrote:

On Fri, 27 May 2005 19:03:55 -0500, Duane Bozarth wrote:
Dave Hinz wrote:

On Fri, 27 May 2005 12:55:20 -0500, Duane Bozarth wrote:
Dave Hinz wrote:

...

I just couldn't stand the whole "doing things 4 times" aspect of it.
Hey, I'm gonna use a variable. Here is what it's called. Here is what
it's set to. OK, now, use it.
...

I've used Fortran in many incarnations since before there was a Standard,
but I fail to recognize the above complaint???

Well, to be fair, it's been ...24 years since I took that class, so my
memory may be "a bit rusty". I do remember not liking it at all.


Might be time to look at F90/95 then?


I'd rather remove my own spleen with a spoon, thank you very much.


Nothing ventured, nothing gained...
#122 - Tim Douglass

On Tue, 31 May 2005 11:09:59 -0000,
(Robert Bonomi) wrote:

In article ,
Doug Miller wrote:
In article ,
(Robert Bonomi) wrote:
[long snip of interesting idea]
But trying to explain _how_ that trivial little "50*period" incantation
accomplishes that magic is *very* involved.


In similar situations, I have been known once or twice to write comments along
the lines of "Trust me: this works. If you can't figure out why, you probably
shouldn't modify it."

Those comments occasionally produced some chuckles... but never any
complaints.


I did that once. The boss spent the *entire* week-end trying to figure out
_how_ it did what it did. Including setting up a test-bed program to verify
that it really did do what I claimed.

Monday morning, I get called into his office. Whereupon he makes the request
to 'explain this thing to me', and then would I _please_ not do things like
that 'late in the week' -- that it was hard on management when they discovered
it after I was gone for the week-end.

After that, he made me write up complete comments on *how* it worked.
That was the full page of comments for the one-line (one machine-instruction)
code.


Probably what I would have done had I been your manager. :-) When I
was actually writing code I wasn't the guy who created the ultra-cool
algorithm, I was the one who wrote some god-awful monstrosity to
accomplish the (supposedly) impossible in a ridiculously short period
of time. I always believed that anything I wanted the machine to do I
could make it do. It wasn't always pretty, but it always worked and
could always be maintained. Generally others came along and replaced
my quick and dirty with something elegant - but I got it out the door
on time.

When I moved to managing programmers I came to loathe those who wrote
"clever" algorithms. My ability to read and understand code was well
above average and anything I couldn't figure out in a reasonable
length of time with the included comments was rejected. Those who
persisted in producing those things found employment elsewhere. My
experience has been that the more clever the algorithm the more likely
it is to cause problems a couple years down the road. A good example
is undocumented truth-table algorithms - which were a favorite with
*my* boss when he wrote code (which I inherited). It wasn't hard at
all to lose a week or more just trying to add one option. As hardware
speeds increased I dumped the elegant in favor of brute force that
could be easily understood and maintained.
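For illustration, a minimal C sketch of that kind of truth-table dispatch (the flags and actions here are hypothetical): packed into a lookup table it's compact, but unless every row carries a comment, adding a fourth option means reverse-engineering all eight entries.

#include <stdio.h>

/* Three hypothetical option flags, packed into a 3-bit table index. */
#define OPT_A 0x1
#define OPT_B 0x2
#define OPT_C 0x4

/* One action code per flag combination.  Compact -- and opaque,
 * unless every row carries a comment like these. */
static const int action[8] = {
    0,  /* --- : do nothing     */
    1,  /* A-- : fast path      */
    1,  /* -B- : fast path      */
    2,  /* AB- : full recompute */
    0,  /* --C : do nothing     */
    2,  /* A-C : full recompute */
    2,  /* -BC : full recompute */
    3,  /* ABC : flag an error  */
};

int main(void) {
    int flags = OPT_A | OPT_B;
    printf("action = %d\n", action[flags & 7]);
    return 0;
}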

I always envied the algorithmic geniuses and the games they played,
but in the interests of making money for the long haul their code
wasn't my favorite.

--
"We need to make a sacrifice to the gods, find me a young virgin... oh, and bring something to kill"

Tim Douglass

http://www.DouglassClan.com
#123 - Dave Hinz

On Tue, 31 May 2005 11:10:48 -0500, Duane Bozarth wrote:
Dave Hinz wrote:

On Fri, 27 May 2005 19:03:55 -0500, Duane Bozarth wrote:


Might be time to look at F90/95 then?


I'd rather remove my own spleen with a spoon, thank you very much.


Nothing ventured, nothing gained...


Exactly. I'll stay over here, fully spleened and programming in the
Pathologically Eclectic Rubbish Lister, thanks.

#124 - Duane Bozarth

Dave Hinz wrote:

On Tue, 31 May 2005 11:10:48 -0500, Duane Bozarth wrote:
Dave Hinz wrote:

On Fri, 27 May 2005 19:03:55 -0500, Duane Bozarth wrote:


Might be time to look at F90/95 then?

I'd rather remove my own spleen with a spoon, thank you very much.


Nothing ventured, nothing gained...


Exactly. I'll stay over here, fully spleened and programming in the
Pathologically Eclectic Rubbish Lister, thanks.


Would seem to be for a differing class of problem than where I would use
Fortran...

I've done quite a lot of Tcl, not much Perl
#125 - Dave Hinz

On Tue, 31 May 2005 13:11:01 -0500, Duane Bozarth wrote:
Dave Hinz wrote:

On Tue, 31 May 2005 11:10:48 -0500, Duane Bozarth wrote:
Dave Hinz wrote:

On Fri, 27 May 2005 19:03:55 -0500, Duane Bozarth wrote:


Might be time to look at F90/95 then?

I'd rather remove my own spleen with a spoon, thank you very much.

Nothing ventured, nothing gained...


Exactly. I'll stay over here, fully spleened and programming in the
Pathologically Eclectic Rubbish Lister, thanks.


Would seem to be for a differing class of problem than where I would use
Fortran...


Yes; in this case "something I'll work with" vs. "something I'd rather
not".

I've done quite a lot of Tcl, not much Perl


I can tweak and/or copy Tcl, but not write from scratch in it.



#126 - Robert Bonomi

In article ,
Dave Hinz wrote:
On Mon, 30 May 2005 14:23:17 -0000, Robert Bonomi
wrote:

*GRIN* I had a reputation of "If you want something done, hunt up Bonomi,
tell him it's 'impossible', and stay out of his hair for a couple of weeks."


Yeah, I used to do that too, until I twigged to the fact that people
were doing it intentionally to me. In college, (mumble) years ago, for
one of the fabrication classes the assignment was to make a 2-sided PCB
for a logic analyzer. "Can't be done single-sided, so don't even try".
Well, it wasn't _that_ tough to do (he forgot that components are also
jumpers), so I did it single-sided. He threatened me with an
incomplete, because he had specified double-sided...


That's when you hand over the layout for the other side, showing the
etched-away surrounds for all the holes *except* one, which is plated-through
from ground on the other side. And explain it's a start towards TEMPEST
protection. grin

so I handed him the
other version done the way he wanted it.

Point is, the "It's impossible, you can't do it" is a tactic that people
use to get people like you and me to work on something. Just so you
know. Not saying it's a problem, because those are usually the fun
projects anyway, but just something to be aware of.


*snort* I'd tumbled to _that_ by about fifth grade. Never got sucked into
anything unless I *wanted* to.

It's like you say, though -- for those who enjoy "problem solving" those are
the 'fun' projects. Beats the h*ll out of doing routine support work
any day.

Then there was the day I made the 'mistake' of suggesting to the boss that
I might be able to improve the performance of our 'critical path' project
scheduling software (which was believed to be the fastest package then
in existence for that work). The boss was doubtful -- *very* doubtful in
fact -- giving me permission to try, but saying "I'm 'Thomas'". I found
out *later* that that program was his pride-and-joy, that he had sweated
blood optimizing it. Well, 4 days later, I'm back to see him. Requesting
an appointment with "Mr. Thomas". Puzzlement, he can't place any client
named "Thomas". then he remembers "Oh??" "Yup!" (with a *BIG* grin.)
And show him a quick-and-dirty that is more than *one*hundred*times* faster
than the production software. Disbelief was rampant, to put it mildly.
It took numerous runs, with radical changes to the data, before he'd believe
that it was calculating the entire solution each time.

Boss had a *real* problem with that piece of work. Terribly conflicted.
*SERIOUS* "pride of authorship" in the work he had done on the one side, and
cupidity on the other. On same-size networks, on roughly equivalent CPUs,
competing software would have run-times measured in hours. Our production
product gave you enough time to go have a somewhat leisurely cup of coffee
(i.e. the 10-15 minute range). My 'improved' version reduced the calculation
time to under 10 *seconds*. Not even enough time to stand up. The
'economic advantage' of _that_ was obvious to him.

It did get a little funny though -- the performance of the whole package when
I got through 'playing with it' was so high that there was a degree of
disbelief that it could "really" be doing what it claimed. One example:
IBM was building a new chip-fabrication plant on the East Coast. They had
a project-scheduling package that ran on their mid-range systems. The
complexity of the fab plant construction was exceeding the limits of
that software, _and_ taking almost an entire shift (8 hours) to calculate
a set of solutions. The general contractor was _insisting_ that they
use our service; so, eventually, an entire troop of IBM big-wigs came
out to our offices for the 'dog and pony show'. We had already input
the data for their project into the system for the demonstration.
*I* am the "lucky" guy who gets to do the demo. I'm showing the data-entry
capabilities, review and editing stuff, etc. to establish that it's the
real project data, and then I do the 'generate solution' command.
( About this point, the top dog of the visiting crew gets distracted by
what's going on outside the window of our offices -- mid-level of a
high-rise, with a _new_ building going up on the adjacent property;
construction up to about our floor-level.)
12 seconds later, our computer says the solution is ready; so I'm showing
those who are paying attention that we do have a solution. "Now, let's
change some data and re-solve." Doing this for several iterations, I push
the 'critical path' all around the project, leaving no doubt that actual
solutions are being generated. One final set of data edits, to remove
all the changes, and another solution generation. About this point, the
'top dog' pulls his awareness back to the demo -- 'how long till we see
some answers?' he asks. His disbelief, when *his* people insist to him
that it's =already= been done, and not just once, but _half_a_dozen_times_,
was a sight to see. He came "thisclose" to calling the first guy a liar
right to his face; only when the rest of the crew all jumped in,
insisting it 'really was true', did he calm down. I re-ran a couple
of change cycles, _while_ he was paying attention, and he got *very* quiet.

Needless to say, we got that project. And a fair number more from IBM.


Note: the more the boss looked at what I had done to that software, the
*more* ****ed-off he got. He *knew*, for example, that the fastest way
to search is a 'binary search'. And had carefully organized the data in
sorted order so that his finely tuned and optimized binary search would
operate at maximum efficiency. Whereas _I_, the freshly-out-of-college
'young twerp', wasn't bothering to maintain any kind of ordering in the
data, and was using brute-force _linear_ searches. The fact that he was
doing a binary search of a _disk_file_, and I was doing a linear search
of an array _in_memory_ had a *great* deal to do with my approach being
faster. I _could_ have done binary searches, too, but this was a *very*
restricted memory environment (total address-space of 32k 16-bit words)
and the additional _space_ consumed by the binary-search code - and the
requisite sorting of the data - wasn't worth the space penalty.

The boss understood _programming_. I understood *systems*. He had done
his best optimizing the software -- and the algorithm _was_ very highly
optimized. I looked at the _system_, and engineered away the _systemic_
bottlenecks. I/O is *slow*, so you minimize it. 'Record-based' I/O also
has lots of overhead, so you do large block I/O instead. Organize the data
in the way the _machine_ uses it, rather than in 'people-rational' style,
etc., etc.
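A minimal C sketch of that contrast (file name, record layout, and sizes are hypothetical): one block read pulls the whole table into memory for a brute-force scan, where an on-disk binary search pays a seek plus a record-sized read per probe.

#include <stdio.h>
#include <stdlib.h>

struct rec { int key; int value; };

/* Load the whole table with one large read, then scan it linearly.
 * An on-disk binary search pays a seek plus a read per probe; this
 * pays one I/O for the entire table. */
int lookup(const char *path, int key, int *out) {
    FILE *f = fopen(path, "rb");
    if (!f) return -1;
    fseek(f, 0, SEEK_END);
    long bytes = ftell(f);
    rewind(f);
    size_t n = (size_t)bytes / sizeof(struct rec);
    struct rec *tab = malloc((size_t)bytes);
    if (!tab || fread(tab, sizeof(struct rec), n, f) != n) {
        free(tab);
        fclose(f);
        return -1;
    }
    fclose(f);
    int rc = -1;
    for (size_t i = 0; i < n; i++) {      /* brute force, unordered */
        if (tab[i].key == key) { *out = tab[i].value; rc = 0; break; }
    }
    free(tab);
    return rc;
}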

By the time I got _done_ playing, performance was just over 3 orders of
magnitude better than the prior version -- on a 'maximum-size' network,
*and* the maximum size of the project it could handle was _60%_ larger.
(This is a textbook "order n-squared" problem, so the 60% size increase
alone is worth another 1.6^2 = 2.56x of work; 1000x times 2.56 is roughly
10^3.4, hence a total performance differential of about 3.5 orders of
magnitude.) Not too shabby!

#127 - Duane Bozarth

Dave Hinz wrote:

....
Yes; in this case "something I'll work with" vs. "something I'd rather
not".


I meant more in the sense that Fortran (or C/C++/...) is not what I
would think of as the solution language for what I would consider
suitable for Perl/Tcl/..., not that any one of them is a personal
choice within the general class...


I've done quite a lot of Tcl, not much Perl


I can tweak and/or copy Tcl, but not write from scratch in it.


That's about my state wrt Perl, although if tweaking were to get at all
involved, "I'd rather not".

Out of curiosity, have you ever looked at current Fortran, or are your
past experiences the extent of your knowledge? Many complaints cross-posted
to comp.lang.fortran apparently arise from such long-held perceptions.
But C++ it isn't, even though forms of OOP are making it into
current proposals for the next Standard. (Of course, it isn't intended
to be.)
#129 - Duane Bozarth

Robert Bonomi wrote:

....
"sufficiently advanced PERL is indistinguishable from line-noise".o

grin


The old ploy of typing in your name or other phrase into TECO comes to
mind?
#130 - Dave Hinz

On Tue, 31 May 2005 18:31:26 -0000, Robert Bonomi wrote:
In article ,
Dave Hinz wrote:
On Mon, 30 May 2005 14:23:17 -0000, Robert Bonomi
wrote:

*GRIN* I had a reputation of "If you want something done, hunt up Bonomi,
tell him it's 'impossible', and stay out of his hair for a couple of weeks."


Yeah, I used to do that too, until I twigged to the fact that people
were doing it intentionally to me. In college, (mumble) years ago, for
one of the fabrication classes the assignment was to make a 2-sided PCB
for a logic analyzer. "Can't be done single-sided, so don't even try".
Well, it wasn't _that_ tough to do (he forgot that components are also
jumpers), so I did it single-sided. He threatened me with an
incomplete, because he had specified double-sided...


That's when you hand over the layout for the other side, showing the
etched-away surrounds for all the holes *except* one, which is plated-through
from ground on the other side. And explain it's a start towards TEMPEST
protection. grin


Once again, I bow to the superior deviousness. Deviousity.
Deveveiance. Whatever. Well played, Sir, is what I'm saying here.

Point is, the "It's impossible, you can't do it" is a tactic that people
use to get people like you and me to work on something. Just so you
know. Not saying it's a problem, because those are usually the fun
projects anyway, but just something to be aware of.


*snort* I'd tumbled to _that_ by about fifth grade. Never got sucked into
anything unless I *wanted* to.


Just checking. Because I used to work with a really dumb smart guy who
still hadn't figured it out by age 30, and the sight of it dawning on
him as I explained what was going on, was _priceless_.

It's like you say, though -- for those who enjoy "problem solving" those are
the 'fun' projects. Beats the h*ll out of doing routine support work
any day.


I don't mind the end-users so much now that I don't have to deal with
them, unless there's a "Sherlock Holmes" thing going on.

Then there was the day I made the 'mistake' of suggesting to the boss that
I might be able to improve the performance of our 'critical path' project
scheduling software (which was believed to be the fastest package then
in existence for that work). The boss was doubtful -- *very* doubtful in
fact -- giving me permission to try, but saying "I'm 'Thomas'". I found
out *later* that that program was his pride-and-joy, that he had sweated
blood optimizing it. Well, 4 days later, I'm back to see him. Requesting
an appointment with "Mr. Thomas". Puzzlement, he can't place any client
named "Thomas". then he remembers "Oh??" "Yup!" (with a *BIG* grin.)
And show him a quick-and-dirty that is more than *one*hundred*times* faster
than the production software. Disbelief was rampant, to put it mildly.
It took numerous runs, with radical changes to the data, before he'd believe
that it was calculating the entire solution each time.


Easy to step on someone's toes by making a radical change to something
they've been improving incrementally. Not that that stops me, but it's
understandable. I'm told that at times like that, there's this thing
called "tact" that I should try to use (shrug?) whatever that is.

Boss had a *real* problem with that piece of work. Terribly conflicted.
*SERIOUS* "pride of authorship" in the work he had done on the one side, and
cupidity on the other. On same-size networks, on roughly equivalent CPUs,
competing software would have run-times measured in hours. Our production
product gave you enough time to go have a somewhat leisurely cup of coffee
(i.e. the 10-15 minute range). My 'improved' version reduced the calculation
time to under 10 *seconds*. Not even enough time to stand up. The
'economic advantage' of _that_ was obvious to him.


Lovely, that!

It did get a little funny though -- the performance of the whole package when
I got through 'playing with it' was so high that there was a degree of
disbelief that it could "really" be doing what it claimed. One example:
IBM was building a new chip-fabrication plant on the East Coast. They had
a project-scheduling package that ran on their mid-range systems. The
complexity of the fab plant construction was exceeding the limits of
that software, _and_ taking almost an entire shift (8 hours) to calculate
a set of solutions. The general contractor was _insisting_ that they
use our service; so, eventually, an entire troop of IBM big-wigs came
out to our offices for the 'dog and pony show'.


That's normally a clear setup for a catastrophic failure during a demo.
As I'm sure you know.

We had already input
the data for their project into the system for the demonstration.
*I* am the "lucky" guy who gets to do the demo. I'm showing the data-entry
capabilities, review and editing stuff, etc. to establish that it's the
real project data, and then I do the 'generate solution' command.


....and here it comes...

12 seconds later, our computer says the solution is ready; so I'm showing
those who are paying attention that we do have a solution. "Now, let's
change some data and re-solve." Doing this for several iterations, I push
the 'critical path' all around the project, leaving no doubt that actual
solutions are being generated. One final set of data edits, to remove
all the changes, and another solution generation. About this point, the
'top dog' pulls his awareness back to the demo -- 'how long till we see
some answers?' he asks. His disbelief, when *his* people insist to him
that it's =already= been done, and not just once, but _half_a_dozen_times_,
was a sight to see.


I _love_ that look.

The fact that he was
doing a binary search of a _disk_file_, and I was doing a linear search
of an array _in_memory_ had a *great* deal to do with my approach being
faster.


Still applies today. Index that database & keep the index in RAM.
Seems blisteringly obvious, but at the time, the field was still full of
surprises. What am I saying - it's still full of surprises...

The boss understood _programming_. I understood *systems*.


Right. The best possible improvement to the wrong solution still gives
you the wrong solution, in the end. In his case the fundamental design
flaw was masked by the improvements he was making to it.

By the time I got _done_ playing, performance was just over 3-orders of
magnitude better than the prior version -- on a 'maximum-size' network,
*and* the maximum-size of the project it could handle was _60%_ larger.
(this is a textbook "order n-squared" problem, so the total performance
differential was about 3.5 orders of magnitude.) Not too shabby!


Fun stuff...


#131 - Dave Hinz

On Tue, 31 May 2005 13:34:33 -0500, Duane Bozarth wrote:
Dave Hinz wrote:

...
Yes; in this case "something I'll work with" vs. "something I'd rather
not".


I meant more in the sense that Fortran (or C/C++/...) is not what I
would think of as the solution language for what I would consider
suitable for Perl/Tcl/..., not that any one of them is a personal
choice within the general class...


I've done quite a lot of Tcl, not much Perl


I can tweak and/or copy Tcl, but not write from scratch in it.


That's about my state wrt Perl, although if tweaking were to get at all
involved, "I'd rather not".

Out of curiosity, have you ever looked at current Fortran, or are your
past experiences the extent of your knowledge?


It's been over 20 years, and I really don't want to go back there.
I'm doing sysadmin-project stuff now, so between shell scripting and
perl, I can do 95% of what I want to do.

Many complaints cross-posted
to comp.lang.fortran apparently arise from such long-held perceptions.
But C++ it isn't, even though forms of OOP are making it into
current proposals for the next Standard. (Of course, it isn't intended
to be.)


Right, C++ is an entirely different set of problems.


#132 - Dave Hinz

On Tue, 31 May 2005 18:38:30 -0000, Robert Bonomi wrote:
In article ,
Dave Hinz wrote:


I used to work with a guy whose Perl is like that. (Sean, I'm gonna
kill you.)


"sufficiently advanced PERL is indistinguishable from line-noise".o


I wouldn't exactly call the PERL in question "advanced", but the guy who
wrote it would probably loudly disagree.

grin


It's not funny; trust me on this one.

#134 - Dave Hinz

On Tue, 31 May 2005 16:27:19 -0400, Norman D. Crow wrote:

I wasn't in the software end of the business, but 27yr. with NCR Corp. as
large scale system tech. produced a lot of funnies. First one was an
insurance company, program running solid for about 3yr. suddenly starts
blowing up, always at the same address. Turned out it was a situation which
the original programmer knew could occur, but had never debugged!


You mean like the 9/9/99 date bug that I personally wrote into more than
a few applications? I mean, come on, who in their right mind would
still be using this POS in 15 years...

....and oddly enough, in September '99, I got a phone call and found out.
It was OK though, the guy wasn't _in_ his right mind. "Hey, your
program broke!" "Um, OK, can you be less vague please?" "Yeah, this is
(name that rhymes with, oh, let's say, 'Dennis'). That program you
wrote for me just died."
"Let's see - that was....1983? 84? I think the warranty period
is over. My consulting rate these days is..."

The conversation was over, just like the previous time he called me for
immediate help. On Easter Sunday. He doesn't call me any more.


#135 - Duane Bozarth

Dave Hinz wrote:
....
It's been over 20 years, and I really don't want to go back there.
I'm doing sysadmin-project stuff now, so between shell scripting and
perl, I can do 95% of what I want to do.

....

No, wasn't implying that at all...just that what started this sub-thread
was a comment based on 20+ year old perceptions of a language that has
changed dramatically...


#136 - Dave Hinz

On Tue, 31 May 2005 16:09:19 -0500, Duane Bozarth wrote:
Dave Hinz wrote:
...
It's been over 20 years, and I really don't want to go back there.
I'm doing sysadmin-project stuff now, so between shell scripting and
perl, I can do 95% of what I want to do.


No, wasn't implying that at all...just that what started this sub-thread
was a comment based on 20+ year old perceptions of a language that has
changed dramatically...


Oh, I'm sure it has, and it definitely has a place. GE's MRI scanners
are still using Fortran in some of the heavy-duty math processing, or at
least were a few years ago. No reason to change it - it's great for
what it's great for.

#137 - Robert Bonomi

In article ,
Norman D. Crow wrote:

"Tim Douglass" wrote in message
On Tue, 31 May 2005 11:09:59 -0000,
(Robert Bonomi) wrote:

When I moved to managing programmers I came to loathe those who wrote
"clever" algorithms. My ability to read and understand code was well
above average and anything I couldn't figure out in a reasonable
length of time with the included comments was rejected. Those who
persisted in producing those things found employment elsewhere. My
experience has been that the more clever the algorithm the more likely
it is to cause problems a couple years down the road. A good example
is undocumented truth-table algorithms - which were a favorite with
*my* boss when he wrote code (which I inherited). It wasn't hard at
all to lose a week or more just trying to add one option. As hardware
speeds increased I dumped the elegant in favor of brute force that
could be easily understood and maintained.


I wasn't in the software end of the business, but 27yr. with NCR Corp. as
large scale system tech. produced a lot of funnies. First one was an
insurance company, program running solid for about 3yr. suddenly starts
blowing up, always at the same address. Turned out it was a situation which
the original programmer knew could occur, but had never debugged!


I released a program with a 'known bug' in it. Once. The boss made me
do it. I'd spent two solid weeks trying to chase it down, _without_ any
luck. A memory corruption problem. Brute-force tracking. scatter a
bunch of output statements through the program and see where the thing
gets clobbered. then scatter more output between where the last point it
wasn't clobbered, and where the first place it _was_ clobbered. Only to
find that *now* that the crime is happening "somewhere else". This was
a 'bug' with *legs* -- it MOVED!

In desperation, played with the order in which the modules were linked,
found a sequence where (presumably *still* happening) memory corruption
did not affect the answers coming out, and released it that way. Put a
*LOUD* note in the source, documenting the existence of the problem; that
if the existing code was linked in _this_ order, the problem was 'harmless',
and that _any_ modification of any component was "highly risky".

This was documentation that *literally* started off:
"Warning, Will Robinson. WARNING!!"

and ended with "All hope abandon, ye who press 'enter' here."

It seemed appropriate.
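The middle of that block is long gone; rendered here as a C-style comment, and with everything between the two quoted lines being reconstruction rather than the original, it would have read something like:

/**********************************************************************
 * Warning, Will Robinson.  WARNING!!
 *
 * This program contains a known memory-corruption bug that two solid
 * weeks of instrumented debugging could not localize.  With the
 * modules linked in EXACTLY the order given in the build procedure,
 * the corruption does not affect the answers coming out.  ANY
 * modification of ANY component is highly risky.
 *
 * All hope abandon, ye who press 'enter' here.
 **********************************************************************/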


#138 - Robert Bonomi

In article ,
Dave Hinz wrote:
On Tue, 31 May 2005 18:31:26 -0000, Robert Bonomi
wrote:
In article ,
Dave Hinz wrote:
On Mon, 30 May 2005 14:23:17 -0000, Robert Bonomi


That's when you hand over the layout for the other side, showing the
etched-away surrounds for all the holes *except* one, which is plated-through
from ground on the other side. And explain it's a start towards TEMPEST
protection. grin


Once again, I bow to the superior deviousness. Deviousity.
Deveveiance. Whatever. Well played, Sir, is what I'm saying here.


grin

I've had *LOTS* of practice.

It also runs in the family. My brother had a college course in Mech. Eng.,
where on one exam, they had to design a mechanism for keeping a particular
pressure differential between two tanks. And there was a list of components
that you had to use -- anything else in addition was OK, but you had to use
_those_ pieces. My brother looks at the problem, realizes that there is an
'absurdly simple' solution, involving nothing more than a couple of
appropriately-sized pieces of pipe, a free cylinder, and a couple of pin-holes
in the pipes. *But* he has to use the required hardware. So he adds a
pulley, with a wire over it, hangs all the electronics on the end of the
wire -- as dead-weight *ballast*, to provide for some hysteresis in the
movement of the free cylinder. He told us, later, that there was a *LOT*
of discussion on how to grade that answer.


I never managed one _quite_ like that, although I did manage to sabotage
a final in a course I wasn't even taking. Freshman year, and I'm struggling
through the regular 2nd-quarter integral calculus course. An acquaintance
is in the "honors" advanced calculus course. They had a 'take home' final
in that course, with all of 5 questions on it. And an entire _week_ to
do it in. Explicit rules were "Anything goes" for references, etc. except
for consulting other students in the class. He brings it with him to lunch
in the cafeteria. I look it over, don't even _understand_ the first 4
problems; look at #5 and say "Oh! that one's easy!" Not quite an "AHA!"
solution, but close. *IF* you approached the problem the way it was laid
out it was a real doozy. If you took "two steps back", and squinted at it
sideways, there was a far, far simpler method of getting at the answer:
elementary integral calculus. I didn't "know enough" to tackle it the
hard way, but, in my ignorance, immediately saw the simple approach.
It turned out that the professor had overlooked the existence of the simple
approach when designing the problem. That was the problem that "nobody"
was expected to get entirely right. OOPS!

For some reason, I was _not_ terribly popular with TAs, and even some
professors, in college.

OTOH, many of the profs and I got along fabulously. Intro assembler-language
course, and about the 3rd assignment is to write a sort routine -- trivial
one -- given an array of numbers in memory, sort into order. Everybody else
is struggling with a simple bubble sort -- I turn in an insertion sort that
doesn't use _any_ additional storage; everything is in registers. Comes
back from grading with a single comment across the top:
"And now, for something COMPLETELY different..."
I _liked_ that professor!
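For reference, an insertion sort really does need nothing beyond a couple of scalar temporaries -- the ones the assembler version kept in registers. A C rendering of the same idea:

/* In-place insertion sort: no auxiliary storage at all, just two
 * scalar temporaries. */
void insertion_sort(int a[], int n) {
    for (int i = 1; i < n; i++) {
        int key = a[i];
        int j = i - 1;
        while (j >= 0 && a[j] > key) {  /* shift larger items right */
            a[j + 1] = a[j];
            j--;
        }
        a[j + 1] = key;
    }
}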


On the other hand, there was the 'algorithms' class where, at one point,
we had to write, and *time*, various sorting algorithms. In the language
of our choice. So we could 'see for ourselves' the relative performance
of the various algorithms. I chose to write in assembler, and the prof
got upset. Two reasons: 1) he didn't understand assembler, 2) my results
didn't show what they were supposed to show -- he was convinced I'd done
something wrong. We had to turn in the programs, so he could critique the
style, and verify the performance. My "inconsistent" performance numbers
*did* verify, somewhat to his consternation. He had to go consult some
of the gurus over at the computer center about the code. They looked at
it, and verified that they were all 'good quality' implementations of the
algorithms.

So, *why* was 'bubble sort' _faster_ than all the other algorithms, for
anything that would fit in memory on that machine? Things *AIN'T* supposed
to work that way! _Hardware_ quirk: that machine had a very small
instruction cache. Just barely big enough to hold a carefully coded
bubble sort. Any 'smarter' algorithm didn't fit in the cache, and you
had constant reloading from main memory. Plus, physical memory was
small enough that, _within the constraints of available memory_, you
didn't get enough advantage from the lower 'order' to make up for the
much larger constant.
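A C sketch of why that can happen (the cache-size claim is the anecdote's, not something this code proves): the whole working loop of a bubble sort compiles to a handful of instructions, small enough to stay resident in a tiny I-cache, where a quicksort or heapsort body would not fit.

/* The entire working loop is a few dozen machine instructions once
 * compiled -- small enough to stay resident in a tiny I-cache, so
 * no pass pays for instruction reloads from main memory. */
void bubble_sort(int a[], int n) {
    for (int swapped = 1; swapped; ) {
        swapped = 0;
        for (int i = 1; i < n; i++) {
            if (a[i - 1] > a[i]) {
                int t = a[i - 1]; a[i - 1] = a[i]; a[i] = t;
                swapped = 1;
            }
        }
        n--;    /* largest remaining element is now in place */
    }
}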

That Prof was seriously miffed at having the 'real world' trample all over
his nice theories.

Point is, the "It's impossible, you can't do it" is a tactic that people
use to get people like you and me to work on something. Just so you
know. Not saying it's a problem, because those are usually the fun
projects anyway, but just something to be aware of.


*snort* I'd tumbled to _that_ by about fifth grade. Never got sucked into
anything unless I *wanted* to.


Just checking. Because I used to work with a really dumb smart guy who
still hadn't figured it out by age 30, and the sight of it dawning on
him as I explained what was going on, was _priceless_.


I daresay.

It's like you say, though -- for those who enjoy "problem solving" those are
the 'fun' projects. Beats the h*ll out of doing routine support work
any day.


I don't mind the end-users so much now that I don't have to deal with
them, unless there's a "Sherlock Holmes" thing going on.


I was talking about 'maintenance coding', not user hand-holding. I generally
enjoyed that, too. Except for the guy who got it into his head that I was
a Lotus 1-2-3 guru. A program I knew *NOTHING* about. He'd ask me to come
out to his desk, and say "I've got this problem. I'm trying to do thus-
and-such, and it won't work." My invariable response was "Give me the 1-2-3
manual. Lessee, the index says 'thus-and-such' is on page xxx. Page xxx
says..." and I would read verbatim from the manual. "Gee. THANKS!!! That
does it!" *sigh* I'm glad _he_ understood it. Frequently _I_ didn't!
But, that's how a consultant gets a reputation as an expert. wry grin


Then there was the day I made the 'mistake' of suggesting to the boss that
I might be able to improve the performance of our 'critical path' project
scheduling software (which was believed to be the fastest package then
in existence for that work). The boss was doubtful -- *very* doubtful in
fact -- giving me permission to try, but saying "I'm 'Thomas'". I found
out *later* that that program was his pride-and-joy, that he had sweated
blood optimizing it. Well, 4 days later, I'm back to see him. Requesting
an appointment with "Mr. Thomas". Puzzlement, he can't place any client
named "Thomas". then he remembers "Oh??" "Yup!" (with a *BIG* grin.)
And show him a quick-and-dirty that is more than *one*hundred*times* faster
than the production software. Disbelief was rampant, to put it mildly.
It took numerous runs, with radical changes to the data, before he'd believe
that it was calculating the entire solution each time.


Easy to step on someone's toes by making a radical change to something
they've been improving incrementally. Not that that stops me, but it's
understandable. I'm told that at times like that, there's this thing
called "tact" that I should try to use (shrug?) whatever that is.


This wasn't a tact issue -- it was just that he'd been tuning that program
for a number of years, and it was known to be the fastest thing going, vs.
what _any_ of the competition had. The new version simply _couldn't_ be
*that*much* faster. He _had_ sort of lit up (friendly!) when he figured
out the "Mr. Thomas" reference -- I think figuring I might have found a
way to squeeze out maybe 10-20% faster runs. 100x, on the other hand,
wasn't credible. 'Incredible' it was, in the *literal* meaning of the word.

Boss had a *real* problem with that piece of work. Terribly conflicted.
*SERIOUS* "pride of authorship" in the work he had done on the one side, and
cupidity on the other. On same-size networks, on roughly equivalent CPUs,
competing software would have run-times measured in hours. Our production
product gave you enough time to go have a somewhat leisurely cup of coffee
(i.e. the 10-15 minute range). My 'improved' version reduced the calculation
time to under 10 *seconds*. Not even enough time to stand up. The
'economic advantage' of _that_ was obvious to him.


Lovely, that!


Yeah. Unfortunately *he* didn't realize/recognize the depth of the conflict.
While cupidity governed his primary decisions, there _was_ still the other
stuff going on.

It did get a little funny though -- the performance of the whole package when
I got through 'playing with it' was so high that there was a degree of
disbelief that it could "really" be doing what it claimed. One example:
IBM was building a new chip-fabrication plant on the East Coast. They had
a project-scheduling package that ran on their mid-range systems. The
complexity of the fab plant construction was exceeding the limits of
that software, _and_ taking almost an entire shift (8 hours) to calculate
a set of solutions. The general contractor was _insisting_ that they
use our service; so, eventually, an entire troop of IBM big-wigs came
out to our offices for the 'dog and pony show'.


That's normally a clear setup for a catastrophic failure during a demo.
As I'm sure you know.


We had already input
the data for their project into the system for the demonstration.
*I* am the "lucky" guy who gets to do the demo. I'm showing the data-entry
capabilities, review and editing stuff, etc. to establish that it's the
real project data, and then I do the 'generate solution' command.


...and here it comes...

12 seconds later, our computer says the solution is ready; so I'm showing
those who are paying attention that we do have a solution. "Now, let's
change some data and re-solve." Doing this for several iterations, I push
the 'critical path' all around the project, leaving no doubt that actual
solutions are being generated. One final set of data edits, to remove
all the changes, and another solution generation. About this point, the
'top dog' pulls his awareness back to the demo -- 'how long till we see
some answers?' he asks. His disbelief, when *his* people insist to him
that it's =already= been done, and not just once, but _half_a_dozen_times_,
was a sight to see.


I _love_ that look.


My boss, back in the corner, was thoroughly enjoying it.
I *really* didn't know enough to appreciate it. 2nd job, fresh out of college.

The IBM bigwig _really_ thought the guy was pulling his leg, at first.

He knew what kind of 'toy' equipment we were running on (Data General mini),
and had been using the IBM in-house product for years -- where the turn-
around time on _much_smaller_ projects was 10s of minutes, if not
hours. This *exact* dataset -- on their (much bigger) platform -- had a
run-time that was just over eight _hours_.

I _think_ that crew came out with the intention of a quick once-over, and
then dismissing it as 'unworkable' for any reason they could think up. You
realize that this was IBM corporate, coming to a _non-IBM-equipment_ shop
for data-processing services. What kind of odds would you offer _against_
that =actually= happening?

They went away sold. They hired us.

The fact that he was
doing a binary search of a _disk_file_, and I was doing a linear search
of an array _in_memory_ had a *great* deal to do with my approach being
faster.


Still applies today. Index that database & keep the index in RAM.
Seems blisteringly obvious, but at the time, the field was still full of
surprises. What am I saying - it's still full of surprises...


Part of it was 'database organization', too. Split the database into
multiple parallel tables (although this violates 'standard' database design),
so that you can pull in *only* the critical fields for any particular
operation. Then pull the entire table into memory in -one- operation,
rather than a row-by-row read. (Not forgetting the 'incidentals', like
declaring that disk allocation for that file was to be contiguous, and
creating it at maximum size.)

Even the report generator worked the same way: compose an entire block
of pages _in_memory_ (I had room for 5 print pages), and then issue _one_
system call to output the whole mess. Made the report writer more than
100x faster, just due to the I/O reductions.
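A minimal C sketch of that batching (buffer sizes are illustrative, and lines are assumed shorter than the buffer): format everything into one big block, then flush with a single write.

#include <string.h>
#include <unistd.h>

#define PAGE_BYTES      4096    /* illustrative page size */
#define PAGES_PER_FLUSH 5       /* "room for 5 print pages" */

static char buf[PAGES_PER_FLUSH * PAGE_BYTES];
static size_t used = 0;

/* Append one formatted line; only when the block fills does the
 * OS see a single, large write instead of a call per line. */
void report_line(int fd, const char *line) {
    size_t len = strlen(line);
    if (used + len > sizeof buf) {
        (void)write(fd, buf, used);   /* ONE system call, ~5 pages */
        used = 0;
    }
    memcpy(buf + used, line, len);
    used += len;
}

void report_flush(int fd) {
    if (used) { (void)write(fd, buf, used); used = 0; }
}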


When somebody was using that package, it was *immediately* obvious on
the system status monitoring tools. Bam, the system I/O rate goes through
the roof (like 8x the normal load rate); BANG! CPU utilization hits 100%,
and stays there, while I/O plummets back to normal; Then the CPU load
drops to normal, with another big spike in I/O. Nothing else on the system
came close to touching _either_ the CPU utilization or the I/O rates.
It _was_ fun to watch.

The boss understood _programming_. I understood *systems*.


Right. The best possible improvement to the wrong solution still gives
you the wrong solution, in the end. In his case the fundamental design
flaw was masked by the improvements he was making to it.


Even deadlier. He'd started out developing something to solve _small_
networks (that did fit into memory). As he got the software "good",
he needed to be able to deal with more network elements than there was
memory available. "Paging" the data to disk was an 'obvious' solution.
(no hardware VM, no O/S support, memory management _was_ the domain of
the application program.)

I had a significant advantage -- looking at it initially as a 'big' problem.
And, if you didn't have room for all the data and all the code, you
keep only the 'critical' part for any given step in memory (but keep _all_
of that), and do the same thing with the code: modularize it, with only
a single module in memory at a time. Not even 'overlays' -- a whole
flock of stand-alone programs, and an entirely separate 'user interface'
program that did nothing in and of itself, just the function menu display,
and invoked the other tools as needed -- some menu options invoked only
one tool, others invoked multiple tools in succession.

Running with a _total_ address-space of 32 k words, most of the tools had
declared data areas in excess of 27K words; this left d*mn little room
for code.
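A minimal C sketch of that driver structure (the tool names are hypothetical): a tiny menu program that owns almost no memory itself and runs each stand-alone tool to completion, singly or in succession.

#include <stdio.h>
#include <sys/types.h>
#include <sys/wait.h>
#include <unistd.h>

/* Run one stand-alone tool to completion.  The driver itself stays
 * tiny, so each tool gets nearly the whole address space. */
static void run(const char *tool) {
    pid_t pid = fork();
    if (pid == 0) {
        execl(tool, tool, (char *)NULL);
        _exit(127);                       /* exec failed */
    }
    int status;
    waitpid(pid, &status, 0);
}

int main(void) {
    char choice[16];
    for (;;) {
        printf("1) edit data  2) solve  3) report  q) quit\n> ");
        if (!fgets(choice, sizeof choice, stdin) || choice[0] == 'q')
            break;
        switch (choice[0]) {
        case '1': run("./edit");  break;
        case '2': run("./solve"); break;            /* one tool...   */
        case '3': run("./solve"); run("./report");  /* ...or several */
        }
    }
    return 0;
}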


By the time I got _done_ playing, performance was just over 3-orders of
magnitude better than the prior version -- on a 'maximum-size' network,
*and* the maximum-size of the project it could handle was _60%_ larger.
(this is a textbook "order n-squared" problem, so the total performance
differential was about 3.5 orders of magnitude.) Not too shabby!


Fun stuff...



#140 - lgb

In article , (Robert Bonomi) says...
And show him a quick-and-dirty that is more than *one*hundred*times* faster
than the production software. Disbelief was rampant, to put it mildly.
It took numerous runs, with radical changes to the data, before he'd believe
that it was calculating the entire solution each time.

Back in '58 or so, I worked for SRA, the folks who did the National
Merit Scholarship tests. They did correlations between test
questions/categories a pair at a time. I wrote a program (on a Readix
computer for all you real oldtimers) that did a 7x7 matrix of
correlations. I could only get 14 scores on an 80 column card :-).

I don't think anyone but the Readix rep ever believed it worked, no
matter how many times I tested it.

Being the only high school dropout in a shop of PhDs might have had
something to do with it :-).

--
BNSF = Build Now, Seep Forever


#141 - Mark & Juanita

On 31 May 2005 20:54:07 GMT, Dave Hinz wrote:

On Tue, 31 May 2005 16:27:19 -0400, Norman D. Crow wrote:

I wasn't in the software end of the business, but 27yr. with NCR Corp. as
large scale system tech. produced a lot of funnies. First one was an
insurance company, program running solid for about 3yr. suddenly starts
blowing up, always at the same address. Turned out it was a situation which
the original programmer knew could occur, but had never debugged!


You mean like the 9/9/99 date bug that I personally wrote into more than
a few applications? I mean, come on, who in their right mind would
still be using this POS in 15 years...


You mean you broke one of the first rules of programming? I.e., don't use
conditions that are possible, even if highly improbable, to terminate
loops. Or at least it was one of the first ones in our intro to
programming course when I was in college 20+ years ago.

My first assignment out of college was designing and building automated
test equipment. I was assigned to a "senior" engineer; I think he had
around 8 years of experience :-) My assignment was writing the code to
control an EPROM programmer; he was designing and building a much more
sophisticated interface panel, and designing the code for it -- a panel
meant to serve as an interface between the system being tested and a
controlling computer. He had someone writing the assembler code for this
panel to his design. When I finished my assignment I wound up helping him
debug the code and hardware for the interface panel. While I had taken a
modular, test-a-step-at-a-time approach to my design, this more senior guy
took a more holistic approach: he had the first interface panel hardware
assembled, did a cursory visual power/signal check, blew all 30k bytes of
assembler code into EPROM, then turned on the panel expecting to get some
sort of readout on the front panel -- he didn't. It took several months of
work to get through the hardware errors before we even got to the point of
having the software start functioning.

One of the funniest parts of the assembler code: he had a module of code
that had the comment, "should not have gotten here, set a flag so we don't
get here again".

After he left the company, I inherited this monstrosity -- I convinced my
supervisor that it would be cheaper and better for all involved if I just
started over and redesigned the whole thing, hardware, code and all from
the ground up. My supervisor agreed. The resulting system did *not* have
any places where I set a flag to make sure I didn't get somewhere I should
not have gotten in the first place.




...and oddly enough, in September '99, I got a phone call and found out.
It was OK though, the guy wasn't _in_ his right mind. "Hey, your
program broke!" "Um, OK, can you be less vague please?" "Yeah, this is
(name that rhymes with, oh, let's say, 'Dennis'). That program you
wrote for me just died."
"Let's see - that was....1983? 84? I think the warranty period
is over. My consulting rate these days is..."

The conversation was over, just like the previous time he called me for
immediate help. On Easter Sunday. He doesn't call me any more.




+--------------------------------------------------------------------------------+

If you're gonna be dumb, you better be tough

+--------------------------------------------------------------------------------+
#142 - Dave Hinz

On Tue, 31 May 2005 22:35:31 -0000, Robert Bonomi wrote:
Only to
find that *now* the crime is happening "somewhere else". This was
a 'bug' with *legs* -- it MOVED!


Sounds like a timing problem (race condition)?

In desperation, played with the order in which the modules were linked,
found a sequence where (presumably *still* happening) memory corruption
did not affect the answers coming out, and released it that way. Put a
*LOUD* note in the source, documenting the existence of the problem; that
if the existing code was linked in _this_ order, the problem was 'harmless',
and that _any_ modification of any component was "highly risky".


I love finding those.

This was documentation that *literally* started off:
"Warning, Will Robinson. WARNING!!"
and ended with "All hope abandon, ye who press 'enter' here."


My personal favorite is writing error messages like "There is no way
that you should ever see this error message." But, you don't want an
unhandled exception, even if it's "impossible".


#144 - Dave Hinz

On Tue, 31 May 2005 20:28:19 -0700, Mark & Juanita wrote:
On 31 May 2005 20:54:07 GMT, Dave Hinz wrote:

On Tue, 31 May 2005 16:27:19 -0400, Norman D. Crow wrote:

I wasn't in the software end of the business, but 27yr. with NCR Corp. as
large scale system tech. produced a lot of funnies. First one was an
insurance company, program running solid for about 3yr. suddenly starts
blowing up, always at the same address. Turned out it was a situation which
the original programmer knew could occur, but had never debugged!


You mean like the 9/9/99 date bug that I personally wrote into more than
a few applications? I mean, come on, who in their right mind would
still be using this POS in 15 years...


You mean you broke one of the first rules of programming? I.e., don't use
conditions that are possible, even if highly improbable, to terminate
loops. Or at least it was one of the first ones in our intro to
programming course when I was in college 20+ years ago.


I specifically remember being taught _to_ use 9999 as a terminator, to
say "We're done processing data now". I'm sure that that particular
terminator was responsible for more than that Y2K bug.
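A minimal C sketch of the trap (the condensed date encoding here is hypothetical): the sentinel that "no real record will ever carry" eventually arrives as real data.

#include <stdio.h>

#define END_OF_DATA 9999   /* "no real record will ever carry this" */

int main(void) {
    /* Date fields condensed to four digits: 9 Sep 1999 comes out as
     * 9999 -- identical to the sentinel -- so the run stops two
     * records early, silently. */
    int dates[] = { 9089, 9099, 9999 /* 9/9/99, a real record */,
                    9109, 9119, END_OF_DATA };
    int count = 0;
    for (int i = 0; dates[i] != END_OF_DATA; i++, count++)
        printf("processing record dated %04d\n", dates[i]);
    printf("processed %d of 5 records\n", count);
    return 0;
}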

My first assignment out of college was designing and building automated
test equipment. I was assigned to a "senior" engineer; I think he had
around 8 years of experience :-) My assignment was writing the code to
control an EPROM programmer; he was designing and building a much more
sophisticated interface panel, and designing the code for it -- a panel
meant to serve as an interface between the system being tested and a
controlling computer. He had someone writing the assembler code for this
panel to his design. When I finished my assignment I wound up helping him
debug the code and hardware for the interface panel. While I had taken a
modular, test-a-step-at-a-time approach to my design, this more senior guy
took a more holistic approach: he had the first interface panel hardware
assembled, did a cursory visual power/signal check, blew all 30k bytes of
assembler code into EPROM, then turned on the panel expecting to get some
sort of readout on the front panel -- he didn't.


Ouch. I tend to over-test and iterate as I'm writing something, myself.
Makes the problem more immediate with less "Now WTF did I do wrong and
when" head-scratching.

It took several months of work to get through the hardware errors before
we even got to the point of having the software start functioning.


Ow. Sounds like the run of multi-layer boards they had made, before
debugging the design. Turns out the component library had the wrong
info for a dual J/K flip-flop - so all the resets were tied to the wrong
place through their resistors. I spent a non-trivial amount of time
making transparencies of the layers, tracing traces, and finding out
where I could drill which trace to get the right layer and not the wrong
layers, so I could avoid fly-wiring the whole damn thing. Engineer
never _looked at_ the design or he would have said "Hey, waitaminute,
you can't tie that there, it'll stay in reset all the time".

One of the funniest parts of the assembler code: he had a module of code
that had the comment, "should not have gotten here, set a flag so we don't
get here again".


I must remember that one, thank you. I've found old code of mine which
includes comments like:

# Now we're going to do (thing). I'm not quite sure just yet how I'm
# going to pull this off, so let's read on and find out together, shall
# we?

(And I just learned that this version of vi automatically continues
comment blocks by adding the # after a cr/lf. That's kinda cool...)

After he left the company, I inherited this monstrosity -- I convinced my
supervisor that it would be cheaper and better for all involved if I just
started over and redesigned the whole thing, hardware, code and all from
the ground up. My supervisor agreed. The resulting system did *not* have
any places where I set a flag to make sure I didn't get somewhere I should
not have gotten in the first place.


A total rewrite is hard to agree to, but when it's needed, it's needed.
We're re-writing something right now (ref: earlier death threats to
Sean), which unfortunately has to be done without breaking prod - but,
to the guy's credit, at least it's somewhat modular so we can fix one
chunk at a time, just keep the gazintas and gazottas the same.


#146 - Robert Bonomi

In article ,
Dave Hinz wrote:
On Tue, 31 May 2005 22:35:31 -0000, Robert Bonomi
wrote:
Only to
find that *now* the crime is happening "somewhere else". This was
a 'bug' with *legs* -- it MOVED!


Sounds like a timing problem (race condition)?


Highly doubtful. *Absolutely* reliable for occurring, and at what location, in
any particular build.

It was just that inserting/changing the diagnostic instrumentation caused
the location to migrate.

Also, hard to get a timing problem with a single, self-contained module
that is merely doing a sh*tload of computations.

In desperation, played with the order in which the modules were linked,
found a sequence where (presumably *still* happening) memory corruption
did not affect the answers coming out, and released it that way. Put a
*LOUD* note in the source, documenting the existence of the problem; that
if the existing code was linked in _this_ order, the problem was 'harmless',
and that _any_ modification of any component was "highly risky".


I love finding those.


I figured that the next guy along deserved the information.
Heck, he makes some trivial one-line change, and the "Heisen-bug"
re-appears, and there he goes, trying to figure out how *THAT* change
causes the error "somewhere *totally* unrelated" to what he did.
Possibly even happens _before_ the program gets to the point where he
made the change.

This was documentation that *literally* started off:
"Warning, Will Robinson. WARNING!!"
and ended with "All hope abandon, ye who press 'enter' here."


My personal favorite is writing error messages like "There is no way
that you should ever see this error message." But, you don't want an
unhandled exception, even if it's "impossible".

I did enough of those that I had a standard format for it:
"Impossible error {{number}}
Internal program logic error.
Notify programming immediately!"
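A C rendering of that convention (the macro name and the numbering scheme are my own invention, not the original): every "can't happen" branch goes through one reporter instead of falling off the end.

#include <stdio.h>
#include <stdlib.h>

/* Standard "can't happen" reporter: each impossible branch gets a
 * unique number, so the message alone localizes the failure. */
#define IMPOSSIBLE(num)                                         \
    do {                                                        \
        fprintf(stderr,                                         \
                "Impossible error %d (%s:%d)\n"                 \
                "Internal program logic error.\n"               \
                "Notify programming immediately!\n",            \
                (num), __FILE__, __LINE__);                     \
        abort();                                                \
    } while (0)

enum state { IDLE, RUNNING, DONE };

const char *state_name(enum state s) {
    switch (s) {
    case IDLE:    return "idle";
    case RUNNING: return "running";
    case DONE:    return "done";
    }
    IMPOSSIBLE(42);   /* unreachable -- unless memory got stomped */
}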


#147 - Dave Hinz

On Thu, 02 Jun 2005 11:42:05 -0000, Robert Bonomi wrote:
In article ,
Dave Hinz wrote:
On Tue, 31 May 2005 22:35:31 -0000, Robert Bonomi
wrote:
Only to
find that *now* the crime is happening "somewhere else". This was
a 'bug' with *legs* -- it MOVED!


Sounds like a timing problem (race condition)?


Highly doubtful. *Absolutely* reliable for occurring, and at what location, in
any particular build.
It was just that inserting/changing the diagnostic instrumentation caused
the location to migrate.


That's what we call 'Heisenberging the system' here. Change its
behavior by trying to figure out WTF is going on.

Also, hard to get a timing problem with a single, self-contained module
that is merely doing a sh*tload of computations.


True, that.

I love finding those.


I figured that the next guy along deserved the information.
Heck, he makes some trivial one-line change, and the "Heisen-bug"


Ah, OK, so that's not a unique term. Figured it couldn't be.

re-appears, and there he goes, trying to figure out how *THAT* change
causes the error "somewhere *totally* unrelated" to what he did.
Possibly even happens _before_ the program gets to the point where he
made the change.


And people wonder why IT folks drink.

My personal favorite is writing error messages like "There is no way
that you should ever see this error message." But, you don't want an
unhandled exception, even if it's "impossible".


I did enough of those that I had a standard format for it:
"Impossible error {{number}}
Internal program logic error.
Notify programming immediately!"


Very nice. I should expand my error message vocabulary accordingly.
