Home Repair (alt.home.repair) For all homeowners and DIYers with many experienced tradesmen. Solve your toughest home fix-it problems.

  #41
Posted to alt.home.repair
Posts: 2,405
Default Whirlpool Fridge lock up during an apparent power glitch

On Wed, 20 Apr 2016 16:27:59 -0700, Don Y
wrote:

On 4/20/2016 8:02 AM, Vic Smith wrote:
For all the anti-Windows talk, nobody has come up with *their* replacement.
Not even close.


Have you heard of iOS and Android? Together, they represent 93% of
the mobile market. "Windows Phone" accounts for all of 2.5% of
that market.

I was referring to desktop/laptop OSes. I don't use other OSes.
My wife has an Android phone though, and it seems to work.
  #42

On 4/20/2016 8:18 PM, Vic Smith wrote:
On Wed, 20 Apr 2016 16:27:59 -0700, Don Y
wrote:

On 4/20/2016 8:02 AM, Vic Smith wrote:
For all the anti-Windows talk, nobody has come up with *their* replacement.
Not even close.


Have you heard of iOS and Android? Together, they represent 93% of
the mobile market. "Windows Phone" accounts for all of 2.5% of
that market.

I was referring to desktop/laptop OSes. I don't use other OSes.
My wife has an Android phone though, and it seems to work.


The fact that desktop/laptop OS's are in decline suggests the market
has moved past them in favor of other implementations -- that the
market considers "more attractive".

  #43

"Don Y" wrote in message
...
On 4/19/2016 9:38 PM, Robert Green wrote:
Without seeing one shred of code, documentation or list of clients or
projects, things are whatever you say they are your qualifications.
It's like judging a photo contest over the radio.


So, its obviously not worth a moment of my time to respond, here.

Thanks!


Hey, facts are facts. I've hired lots of coders. Many talk a very good game,
but the proof is always in the code, the specs, and the documentation.

You're only on the hook because both Trader and I agree that what you've
said about "doing it right" and code re-use isn't right. For Trader and me
to agree is pretty damn rare. I can tell from his comments that he's
actually been involved in projects where people have to consider resources
and make design tradeoffs.

You've set yourself up as a programmer that doesn't make the mistakes other
people do. You're a smart guy Don. You know that's an invitation to "put
up or shut up."

I forgot to add one more thing about lone cowboy/sole practitioner
programmers. They usually don't tolerate criticism well at all. (-:

--
Bobby G.


  #44

"trader_4" wrote in message

[stuff snipped: code reuse v. starting from scratch]

Never mind MS and Oracle. It's not true even for embedded applications,
like the fridge or a piece of communications gear. Unless the existing
code is a total wreck because it was done by amateurs or there is some
specific need to redo the whole thing, the existing software is typically
what's used as a starting point.


It's actually that way two stories up, so to speak, because almost all
programming languages I'm familiar with come with code libraries of some
sort. OS's like Windows have libraries as well -- DLL stands for Dynamic
Link Library. So reusability is important at multiple levels. I believe
the opposite of what I think Don is saying: it's libraries that introduce
stability to code and provide a similar user experience. It drives me
crazy that phone/tablet apps are all over the map. It's apparently no
longer trendy to put a close-dialog X in the corner, where everyone in the
world might look first to close a window. Sigh.
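The DLL point can be sketched in a few lines. This is a hypothetical illustration (Python's ctypes, with `find_library` locating the C runtime on the host) of reusing a shared library's strlen() instead of re-implementing it -- the same mechanism a Windows DLL provides:

```python
import ctypes
import ctypes.util

# Reuse, not rewrite: call the C runtime's strlen() through a shared
# library (the POSIX analogue of a Windows DLL) rather than coding our own.
libc = ctypes.CDLL(ctypes.util.find_library("c"))  # e.g. libc.so.6
libc.strlen.argtypes = [ctypes.c_char_p]
libc.strlen.restype = ctypes.c_size_t

length = libc.strlen(b"refrigerator")
print(length)  # 12
```

Every program on the machine that links the same library gets the same, already-debugged behavior -- which is exactly the "stability" argument above.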

Being able to keep, reuse that existing
code is usually the top priority, over anything else.


It may not be for a sole practitioner like I assume Don to be. If each
project is a one-off deal, why not code with home-grown everything? The
catch is that it takes 10 times as long, and not many people are willing
to pay for that in either time or money.

For example, hardware
engineers might want to switch to a different microprocessor based on
performance, cost, power, etc. Software managers, project managers,
engineering management
will overrule them because the cost and investment on new software, starting
from scratch, exceeds those other considerations by an order of magnitude.


When I read that, I know you've been involved in such a project. I have,
too. It's very, very seductive to want to change to a new, more capable
microprocessor, but in most instances I know of, if there was an upgrade, it
was to the same chip family so that old code could still run.

Sure, there are exceptions, if it's some simple design, where the software
is no big deal. But even for a fridge, the natural starting point would
likely be the existing code, not start with new code.


One thing that I think could trigger a "fresh start" migration would be to
make such a fridge web-enabled where you had to switch to a controller that
could interface with ethernet or WiFi. It's been my experience that the
micro is spec'ed for the original task and adding WiFi support will overtax
it. At least I can remember my microcoders complaining frequently about
being out of interrupt lines, memory or time.

The fridge is one device that can make use of the net to report out-of-bounds
conditions on the unit to someone who can come and investigate. There are
already monitors to do that, so integrating them in the future is likely.
Consumers like Art will discover they have saved a lot of money when the
fridge texts their cellphone and says "I'm getting warm" and they can
intervene in time.
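That "I'm getting warm" text is trivial to prototype. A minimal sketch (the limit value and function names are invented for illustration, and the actual SMS gateway call is left out); the latch keeps the fridge from texting you every minute once it has tripped:

```python
HIGH_LIMIT_C = 7.0  # assumed safe ceiling; a real fridge would use the spec'ed value

def check_fridge(temp_c, already_alerted):
    """Return (message_or_None, new_alert_state) for one sensor reading."""
    if temp_c > HIGH_LIMIT_C and not already_alerted:
        # First excursion above the limit: send one alert, then latch.
        return "I'm getting warm: %.1f C" % temp_c, True
    if temp_c <= HIGH_LIMIT_C:
        # Back in bounds: clear the latch so a future excursion re-alerts.
        return None, False
    return None, already_alerted  # still warm, but we already texted

msg, latched = check_fridge(9.2, already_alerted=False)
print(msg)  # I'm getting warm: 9.2 C
```

The interesting part isn't the comparison, it's the latch -- without it the "feature" becomes a 3 AM text storm.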

--
Bobby G.


  #45

On Wed, 20 Apr 2016 20:44:49 -0700, Don Y
wrote:

On 4/20/2016 8:18 PM, Vic Smith wrote:
On Wed, 20 Apr 2016 16:27:59 -0700, Don Y
wrote:

On 4/20/2016 8:02 AM, Vic Smith wrote:
For all the anti-Windows talk, nobody has come up with *their* replacement.
Not even close.

Have you heard of iOS and Android? Together, they represent 93% of
the mobile market. "Windows Phone" accounts for all of 2.5% of
that market.

I was referring to desktop/laptop OSes. I don't use other OSes.
My wife has an Android phone though, and it seems to work.


The fact that desktop/laptop OS's are in decline suggests the market
has moved past them in favor of other implementations -- that the
market considers "more attractive".


More like "If I can do what I do on a phone (browse the internet) why
should I pay up for a PC?"
Many people used to buy a PC solely to browse the internet and steal
music.
Now they can do that on a cheap phone - if their eyes are good enough.
I built PC's for all my kids, but 4 of 7 now rely on mobile devices
only.
Desktops/laptops aren't going away. The market is simply adjusting.
And Windows remains the king OS of desktops/laptops.



  #46

On Wed, 20 Apr 2016 23:48:33 -0500, Vic Smith wrote:

I built PC's for all my kids, but 4 of 7 now rely on mobile devices


You named your kid after a Star Trek character?

--
http://mduffy.x10host.com/index.htm
  #47

On 4/20/2016 9:48 PM, Vic Smith wrote:
On Wed, 20 Apr 2016 20:44:49 -0700, Don Y
wrote:

On 4/20/2016 8:18 PM, Vic Smith wrote:
On Wed, 20 Apr 2016 16:27:59 -0700, Don Y
wrote:

On 4/20/2016 8:02 AM, Vic Smith wrote:
For all the anti-Windows talk, nobody has come up with *their* replacement.
Not even close.

Have you heard of iOS and Android? Together, they represent 93% of
the mobile market. "Windows Phone" accounts for all of 2.5% of
that market.

I was referring to desktop/laptop OSes. I don't use other OSes.
My wife has an Android phone though, and it seems to work.


The fact that desktop/laptop OS's are in decline suggests the market
has moved past them in favor of other implementations -- that the
market considers "more attractive".


More like "If I can do what I do on a phone (browse the internet) why
should I pay up for a PC?"
Many people used to buy a PC solely to browse the internet and steal
music.


Or send email. Or, look at photos -- that they took on their CAMERA.
Etc.

The phone offers the same sorts of capabilities without the bulk and
lack of mobility that the desktop/laptop imposes.

And, you can make phone calls/SMS with your phone (a relatively rare
activity for a laptop/desktop).

Now they can do that on a cheap phone - if their eyes are good enough.
I built PC's for all my kids, but 4 of 7 now rely on mobile devices
only.
Desktops/laptops aren't going away. The market is simply adjusting.
And Windows remains the king OS of desktops/laptops.


The office space is where desktops remain popular. If you look at usage
patterns for work days vs. weekend, you can see this most prominently.
On the weekend, folks aren't sitting down at their laptop to steal
music, browse the internet, review their photos (taken on a different
device), teleconference, etc.

You'll eventually see a "dock" for phones that lets you use the computational
horsepower in the phone with the user interface of a desktop. This will offer
a cheaper alternative to businesses TRAPPED into MS's perpetual hardware and
software upgrade cycle. It will also allow employees to "take their desk
with them" when they walk to another part of the building, GO HOME, etc.

"The Cloud" has just as much appeal to a mobile user as to a desktop user.

Windows has been in the phone market (and the PDA market before that). And,
the phone market has opted to replace it -- with "something better".
Despite the fact that Windows was there, first!

  #48

On Wed, 20 Apr 2016 22:06:34 -0700, Don Y
wrote:



The office space is where desktops remain popular. If you look at usage
patterns for work days vs. weekend, you can see this most prominently.
On the weekend, folks aren't sitting down at their laptop to steal
music, browse the internet, review their photos (taken on a different
device), teleconference, etc.


We do that - save the music stealing - all the time on our desktops at
my house. My wife and MIL are on Skype teleconferencing with friends
and relatives in Poland.
But I spend quite a bit of time gaming too.

You'll eventually see a "dock" for phones that lets you use the computational
horsepower in the phone with the user interface of a desktop. This will offer
a cheaper alternative to businesses TRAPPED into MS's perpetual hardware and
software upgrade cycle. It will also allow employees to "take their desk
with them" when they walk to another part of the building, GO HOME, etc.


People are always upgrading their phones, more often than I undergo a
PC upgrade. PC upgrade cycles have lengthened quite a bit. I think I've
been upgrading every 5 years or so, due to the technological demands
of games.

"The Cloud" has just as much appeal to a mobile user as to a desktop user.


Don't use any cloud storage.

Windows has been in the phone market (and the PDA market before that). And,
the phone market has opted to replace it -- with "something better".
Despite the fact that Windows was there, first!


I guess they miscalculated somehow.

  #49

On 4/20/2016 10:56 PM, Vic Smith wrote:
On Wed, 20 Apr 2016 22:06:34 -0700, Don Y
wrote:

The office space is where desktops remain popular. If you look at usage
patterns for work days vs. weekend, you can see this most prominently.
On the weekend, folks aren't sitting down at their laptop to steal
music, browse the internet, review their photos (taken on a different
device), teleconference, etc.


We do that - save the music stealing - all the time on our desktops at
my house. My wife and MIL are on Skype teleconferencing with friends
and relatives in Poland.


We "visit" with people with whom we want to chat. Our long distance
costs are in the single digits for any given year (e.g., a $20 phone
card). SWMBO spends a few hours a year on the phone with her sister;
usually just before or after one of the sister's trips ("China
was great! I think I'll head back there, next year! But, I'm
busy packing for Africa, next week...")

Most of the folks with whom I converse are local and/or I see them
regularly. Clients are invariably out of town and contact is via
email (phone calls take up too much time and usually result in
either or both parties waiting for the other to think something
through; email forces you to do your thinking up front!)

I use PC's to rip CD's and convert between file formats. And, to
access and maintain my music archive (e.g., so I can copy selections
onto particular PMP's). My music tastes haven't changed in decades
so there is very little that is "available" that I don't already
have (though I have many LPs that I still need to rip). But,
I don't LISTEN to any of that music "on" a PC. Even when working
*at* a PC, I'll defer to a nearby PMP, instead.

But I spend quite a bit of time gaming too.

You'll eventually see a "dock" for phones that lets you use the computational
horsepower in the phone with the user interface of a desktop. This will offer
a cheaper alternative to businesses TRAPPED into MS's perpetual hardware and
software upgrade cycle. It will also allow employees to "take their desk
with them" when they walk to another part of the building, GO HOME, etc.


People are always upgrading their phones, more often than I undergo a
PC upgrade. PC upgrades have lengthened quite a bit. I think I've
been upgrading every 5 years or so, due to the technological demands
of games.


I see businesses replacing entire fleets of PC's every 3 years. I've
known some to do so every 18 months! The local University regularly pushes
reasonably new machines out as grants (typ a year or two) come to an end
(the terms of the grant usually require any equipment purchased to be
sold at auction -- you can't just "inherit" the equipment when the grant
expires)

Friends and colleagues seem to similarly "upgrade" every few years
(this is how I acquired most of my laptops).

I am slow to update simply because of the cost (in time) to do so
(the effort required to install all of the applications, fetch new
licenses, etc.) and the potential for losing capabilities that always
ensues:
- lack of a particular driver support in a new OS (rendering the
associated hardware prematurely obsolete)
- lack of support for a particular application
- lack of particular hardware capabilities in the new machine
(lack of bus slots or particular types of bus slots)
- vendor not making new licenses available for a legacy application

I am contractually obligated to support each of my past clients/projects
"indefinitely". As most folks aren't prudent enough to preserve their
development environments, INTACT, moving forward (so they can revise
an existing product at a moment's notice), it behooves me to make sure
*I* can -- failing to do so leaves me reliant on *their* efforts.

[If a casino operator claims "guests" (gamblers) have found an exploit
that is allowing them to STEAL hundreds of dollars every MINUTE -- from
EACH machine -- I need to be able to defend *my* implementation:
"Here's a copy of the code *I* delivered. Show me how the exploit
affects *it*? Ah, so the exploit is only applicable to the codebase
that YOU modified! And, how is this MY problem? Why would *I* be a
logical choice to troubleshoot a problem introduced by YOUR staff?"
(No, I'm not interested in that job, thankyouverymuch -- it's neither
interesting nor rewarding, for ME)]

And, as most applications (esp anything that runs under Windows!) are
incapable of supporting "previous file formats", implementing some
"little change" might prove to be a major ordeal! E.g., I was asked
to change the type of battery used on an old design. This is a no-brainer
job -- replace one battery holder with another. There's no real
"engineering" involved. And, there are only two "wires" (contacts)
that the PCB layout would have to accommodate.

Plus, any client would KNOW that sort of thing. Refusing to "lend a
hand with this simple change" (while not a contractual obligation)
starts to look "****y":
"You can't afford to give us an HOUR of your time? Even if we pay
you for the *day*?"

Do you explain to the client that you'll have to resurrect some ancient
application -- and the OS on which it ran! -- in order to make that
change? So, that "hour" is really more like a *day*?

Do you claim that you don't even HAVE that capability available? And,
have the client thinking that you're "irresponsible" -- despite the
fact that THEY ALSO have lost that capability?

Better business practice is to HELP the client, earn additional good will,
be seen as "resourceful" and "reliable". And, with that in mind, take
it on myself to make these sorts of activities "less expensive" -- to
me, personally.

[I'll still have to bear the cost of being diverted from my CURRENT
project to address this "more pressing issue". A day is easy to swallow.
SEVERAL days? Not so much so. Current clients might grumble -- to
which you reply: "If YOU contact me a year hence with a similar request,
wouldn't you want me to make time for your 'pressing need'?"]

I don't play games online or on any of my PC's. I'd rather turn on
an (arcade) pinball machine or video game -- why set aside the space
for them if you aren't going to PLAY them? (There's also a certain
nostalgia involved -- an entirely different sort of user experience.)

"The Cloud" has just as much appeal to a mobile user as to a desktop user.


Don't use any cloud storage.


Nor do I -- except to occasionally post photos for others to access in
"public" discussions so I don't have to email to multiple recipients;
I tend to guard my email accounts pretty jealously -- discarding them
frequently (save for my business accounts)

But, the cloud has appeal to businesses. I see many adopting it -- even
those folks who are paranoid about the security/privacy issues involved.
People (esp businesses) have a hard time saying NO to something that is
(relatively) "free".

[In this case, "free" pertains to the capability to access that data
from a multitude of locations, simultaneously. I.e., we are repeating the
perpetual "big-centralized-computer-with-terminal-clients vs.
distributed-smart-workstations Flip-Flop that the industry has experienced
since its inception. Now, the "terminals" are PC's (and phones) while
the "big centralized computer" is The Cloud.]

Windows has been in the phone market (and the PDA market before that). And,
the phone market has opted to replace it -- with "something better".
Despite the fact that Windows was there, first!


I guess they miscalculated somehow.


They "missed" the mobile OS market AND appear to have missed the mobile
application market! Just like they missed the "advertising" aspect that
the 'net presented.

  #50

On Thursday, April 21, 2016 at 12:48:38 AM UTC-4, Vic Smith wrote:
On Wed, 20 Apr 2016 20:44:49 -0700, Don Y
wrote:

On 4/20/2016 8:18 PM, Vic Smith wrote:
On Wed, 20 Apr 2016 16:27:59 -0700, Don Y
wrote:

On 4/20/2016 8:02 AM, Vic Smith wrote:
For all the anti-Windows talk, nobody has come up with *their* replacement.
Not even close.

Have you heard of iOS and Android? Together, they represent 93% of
the mobile market. "Windows Phone" accounts for all of 2.5% of
that market.

I was referring to desktop/laptop OSes. I don't use other OSes.
My wife has an Android phone though, and it seems to work.


The fact that desktop/laptop OS's are in decline suggests the market
has moved past them in favor of other implementations -- that the
market considers "more attractive".


More like "If I can do what I do on a phone (browse the internet) why
should I pay up for a PC?"
Many people used to buy a PC solely to browse the internet and steal
music.
Now they can do that on a cheap phone - if their eyes are good enough.
I built PC's for all my kids, but 4 of 7 now rely on mobile devices
only.
Desktops/laptops aren't going away. The market is simply adjusting.
And Windows remains the king OS of desktops/laptops.


I mostly agree. In the early days of the PC in the consumer market
you typically had either no PC or one PC per home. As the market
grew and costs came down, that shifted to many homes having multiple PCs.
Smartphones and tablets have now changed that, with those replacing
some of those multiple PCs. But I'd sure still want at least one real
PC in my house. And Windows has about a 10%+ market share in tablets,
so there is some Windows presence in that market too. IDK what will
ultimately become of Windows, but it's going to be here for a long
time to come. And I also agree that for what it does, it works well
for me and I have few complaints, nor do I see anything else that does
as much, supports so many different hardware configurations, etc.


  #51

On Thursday, April 21, 2016 at 1:56:45 AM UTC-4, Vic Smith wrote:


Windows has been in the phone market (and the PDA market before that). And,
the phone market has opted to replace it -- with "something better".
Despite the fact that Windows was there, first!


I guess they miscalculated somehow.


I think what you really missed was when Microsoft was in the cell phone
business with Windows before the cell phone companies had phones out that
supported data, email, etc. under their own software. I missed it too,
because it never happened.
  #52

On Thursday, April 21, 2016 at 12:26:26 AM UTC-4, Robert Green wrote:


Being able to keep, reuse that existing
code is usually the top priority, over anything else.


It may not be for a sole practitioner like I assume Don to be. If each
project is a one-off deal, why not code with home-grown everything?


Because even there, if the "new" product is similar to the existing
one, which is often the case, it's easier, faster, cheaper to use
the existing code. Again that assumes that it's not a complete hack
job, that it's documented, etc. If the new project shares little in
common with the existing, then you start from scratch. But those are
pretty much the same rules for any project. For example, if you
have a driver for a communications interface, be it USB, Ethernet,
Wifi, etc, why re-write that?
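A toy illustration of that kind of reuse (all names here are invented, not any real driver API): the framing/checksum layer is written once and carried from project to project untouched, while only the thin transport underneath it changes:

```python
def frame(payload):
    """Shared, reusable framing layer: length byte + payload + checksum byte."""
    checksum = sum(payload) % 256
    return bytes([len(payload)]) + payload + bytes([checksum])

class LoopbackTransport:
    """Stand-in for the USB/Ethernet/WiFi link; only this class is
    rewritten when the hardware changes between projects."""
    def __init__(self):
        self.wire = b""
    def send(self, payload):
        self.wire += frame(payload)  # the framing code is reused as-is

link = LoopbackTransport()
link.send(b"\x01\x02")
print(link.wire.hex())  # 02010203
```

Swap LoopbackTransport for a real serial or socket transport and frame() never needs re-validation -- which is the whole point of not rewriting the driver.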



The
catch is that takes 10 times as long and not many people are willing to pay
for that in either time or money.


That would depend on who's paying the bill and what they are paying for.
More typical is for anyone who owns the rights to the software to reuse
what they have a right to reuse and charge what the market will bear.
In other words, if I already have software for a project where it will
save 80% of the time of development, I would use that, then come up
with a price for the whole thing, including the new work.




For example, hardware
engineers might want to switch to a different microprocessor based on
performance, cost, power, etc. Software managers, project managers,
engineering management
will overrule them because the cost and investment on new software, starting
from scratch, exceeds those other considerations by an order of magnitude.


When I read that, I know you've been involved in such a project. I have,
too. It's very, very seductive to want to change to a new, more capable
microprocessor, but in most instances I know of, if there was an upgrade, it
was to the same chip family so that old code could still run.


Absolutely. In many markets, the cost of one CPU vs. a competing alternative
could be huge, but they won't port the code over, because the cost of
doing that -- the time, the validation, etc. -- would be staggering. The cost
of the CPUs is almost irrelevant. Now, if it's some simple app on some
non-critical consumer $10 item, then the rules obviously can change.
But even there, if it's, say, some child's toy, I would expect Mattel, or
whoever builds it for Mattel, to re-use what they have, provided it's
similar.


Sure, there are exceptions, if it's some simple design, where the software
is no big deal. But even for a fridge, the natural starting point would
likely be the existing code, not start with new code.


One thing that I think could trigger a "fresh start" migration would be to
make such a fridge web-enabled where you had to switch to a controller that
could interface with ethernet or WiFi. It's been my experience that the
micro is spec'ed for the original task and adding WiFi support will overtax
it. At least I can remember my microcoders complaining frequently about
being out of interrupt lines, memory or time.


Exactly. There the chipset that supports the WiFi may have a CPU
that could also do the other simple control stuff to run the fridge.
That's justification for starting a new software design.



The fridge is one device that can make use of the net to report out of
bounds conditions on the unit to someone who can come and investigate.
There are already monitors to do that, so integrating them in the future is
likely. Consumers like Art will discover they have saved a lot of money
when the fridge texts their cellphone and says "I'm getting warm" and they
can intervene in time.

--
Bobby G.


That's a good point. I was talking with a friend the other day about
"The Internet of Things" and WTF that meant. I cited hooking up
thermostats and alarms as examples. I was having trouble thinking of
another example; I did think of fridges, actually, but dismissed them,
not thinking of the out-of-temp alarm possibility. So, yes, fridges are
an example. I'd pay some small premium for a fridge that had a wifi
alarm feature.
  #53

On Thu, 21 Apr 2016 06:29:27 -0700 (PDT), trader_4 wrote:

[...] it's easier, faster, cheaper to use
the existing code. Again that assumes that it's not a complete hack
job, that it's documented, etc. If the new project shares little in
common with the existing, then you start from scratch. But those are
pretty much the same rules for any project.


A large part of my career was as what you call a "code cowboy": working
completely by myself on small (a few days, a few weeks, occasionally a few
months) programs which were ALWAYS specified as being one-shot deals. No
need then to extensively test for inputs out of range of the given
datasets, no need to document for easy re-use by myself, and no need to
collaborate with others.

Most of the time though, a request would eventually be made to resurrect
the code with modified requirements, or input data that had not been
subject to the same levels of quality control as previous runs, etc.

The only thing more vexing than code that is difficult to follow is when
you can't even complain to the author about how ****ty it is.

So I got into the habit of using descriptive variable names, adding
comments, documenting formats, and most importantly, little messages to my
future self about why a seemingly simpler method cannot be used.
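A sketch of that last habit (the data format here is made up purely for illustration): the comment's job is to record *why* the obvious one-liner can't be used, so the future maintainer doesn't "simplify" it back into a bug:

```python
def parse_amount(field):
    # NOTE to future self: do NOT simplify this to int(field). The
    # upstream export pads fields with trailing spaces and sometimes
    # includes a thousands separator, so the "simpler method" raises
    # ValueError on real data (e.g. "1,250 ").
    return int(field.strip().replace(",", ""))

print(parse_amount("1,250 "))  # 1250
```

Without the note, the function looks needlessly fussy; with it, the seemingly-simpler method is ruled out in one glance.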


fridges are an example, I'd pay some small premium for a fridge
that had a wifi alarm feature.


What we really need is a little robot that crawls around inside the fridge
looking for stuff that hasn't been touched for a while or is emitting decay
byproducts.

--
http://mduffy.x10host.com/index.htm
  #54

On Thursday, April 21, 2016 at 10:19:41 AM UTC-4, Mike Duffy wrote:
On Thu, 21 Apr 2016 06:29:27 -0700 (PDT), trader_4 wrote:

[...] it's easier, faster, cheaper to use
the existing code. Again that assumes that it's not a complete hack
job, that it's documented, etc. If the new project shares little in
common with the existing, then you start from scratch. But those are
pretty much the same rules for any project.


A large part of my career was as what you call a "code cowboy": working
completely by myself on small (a few days, a few weeks, occasionally a few
months) programs which were ALWAYS specified as being one-shot deals. No
need then to extensively test for inputs out of range of the given
datasets, no need to document for easy re-use by myself, and no need to
collaborate with others.

Most of the time though, a request would eventually be made to resurrect
the code with modified requirements, or input data that had not been
subject to the same levels of quality control as previous runs, etc.

The only thing more vexing than code that is difficult to follow is when
you can't even complain to the author about how ****ty it is.

So I got into the habit of using descriptive variable names, adding
comments, documenting formats, and most importantly, little messages to my
future self about why a seemingly simpler method cannot be used.


A wise man once said:

"Code Tells You How, Comments Tell You Why

I hang out in an Excel forum and often write macros for other people.
I use descriptive variables, add comments, etc. for 2 reasons:

1 - To help the requester understand what the code is doing (Many of them
want to learn how to write macros, so the comments help "teach" them.)

2 - Months, and even years later, I often get 2nd requests (either from
the original requester or from someone who found the code in the archives)
asking for a modification to the code.

If I didn't include the comments when I first wrote it, I'd often have no
clue what the original purpose was or why I wrote it that way. Heck, I don't
even recognize some of the code that I use at work everyday. :-) Some of my
macros are 5 - 10 years old. Luckily they still work for their original
purpose and rarely need updating.

The worst part of the "forum job" comes from those who ask for a macro to do
a specific task and then keep adding on requirements. I fulfill their
original requirements and they come back with "Hey, thanks! That does
exactly what I wanted. Now can you make it do 'this' also?"

This often happens more than once, which means I sometimes need to start
from scratch to avoid just bolting on a bunch of haphazard instructions and
ending up with bloated, inefficient code.

When it gets out of hand, I will politely ask the requester what they think
would happen if they had signed a contract for a "product" based on a set of
requirements and then kept changing the requirements every time the product
was delivered.

Of course, that assumes that I get requirements that can actually be used.
Sometimes it's not much more than "I need macro to create an invoice from
my data sheet. Can someone please post some code?" Uh...no.
  #55

"Don Y" wrote in message
On 4/20/2016 8:18 PM, Vic Smith wrote:


stuff snipped

I was referring to desktop/laptop OSes. I don't use other OSes.
My wife has an Android phone though, and it seems to work.


The fact that desktop/laptop OS's are in decline suggests the market
has moved past them in favor of other implementations -- that the
market considers "more attractive".


That probably has nothing to do with software superiority or attractiveness
and everything to do with a laptop not being able to fit in your shirt
pocket or purse.

--
Bobby G.






  #56

"Vic Smith" wrote in message
...
On Wed, 20 Apr 2016 00:38:23 -0400, "Robert Green"
wrote:



Unlike some others here, I believe the fact that Windows and everything
that runs on it *mostly* works is a miracle of our time and one that's
almost over. Do you know how many different types of motherboards and
board configurations something like XP is expected to run on?


+1
For all the anti-Windows talk, nobody has come up *their* replacement.
Not even close.


It's too massive an effort. Google Chromebooks and Android phones skim the
email/browser end of the market but CAD/CAM, DTP and lots of other tasks
will still be done on PC's.

MS didn't foresee the growth of the mobile end of the market and was too
wedded to the Windows interoperability paradigm to come out with a clean,
innovative OS for phones. Ironically, they did come out with a tablet version
of Windows very early on, when tablets weren't quite portable enough to
penetrate the market.

Apple's got it easy compared to Windows. iOS only has to support Apple's
carefully controlled hardware. Windows has to support hundreds of machines and
configurations. Android has it even easier, having only to support phones
and tablets in limited ways. AFAIK, you still can't even hook up a scanner
to a Chromebook, so they are not really Windows replacements, just
cream-skimmers. (-:

--
Bobby G.


  #57

On 4/21/2016 7:19 AM, Mike Duffy wrote:
On Thu, 21 Apr 2016 06:29:27 -0700 (PDT), trader_4 wrote:

[...] it's easier, faster, cheaper to use
the existing code. Again that assumes that it's not a complete hack
job, that it's documented, etc. If the new project shares little in
common with the existing, then you start from scratch. But those are
pretty much the same rules for any project.


A large part of my career was as what you call a "code cowboy". Working
completely by myself on small (a few days, a few weeks, occasionally a few
months) programs which were ALWAYS specified as being one-shot deals. No
need then, to extensively test for inputs out of range of the given
datasets, no need to document for easy re-use by myself, and no need to
collaborate with others.


That approach doesn't work when your product processes large amounts of
CASH; tells a patient they have/don't have a particular disease process;
controls a mechanism moving at a high rate of speed (or large masses);
produces a product that people INGEST; pilots a vessel; is relied upon
to produce tens of thousands of dollars of saleable (vs SCRAPPED!) product
each hour; etc. Folks tend to want designs that can be PROVEN to work.
And/or, go through formal (legally required) certification/validation
processes.

It also falls down when "your part" has to interface seamlessly with a
part that another developer in another part of the world is producing
in parallel with your activities.

Or, when the prototype hardware costs $1M and you can, at best, *borrow*
a few hours per month on which to "test" your designs.

Of course, you also have scheduling and cost factors to consider. So,
you REALLY want to be able to pull something "proven" off a shelf and
use it as is -- without RE-writing it or RE-bugging it. "How do I know
this works? *HOW* does it work? etc."

Most of the time though, a request would eventually be made to resurrect
the code with modified requirements, or input data that had not been
subject to the same levels of quality control as previous runs, etc.


Or, you will forget that there is some inherent limitation (or assumption)
embodied in the implementation (hardware or software) when you opt to
modify/repurpose it for some later use.

The only thing more vexing than code that is difficult to follow is when
you can't even complain to the author about how ****ty it is.


I have a distinctive "style" to the way I write code, design hardware and
design "systems". But, have been frequently complimented:
"Once I realized how you do things, everything was OBVIOUS!"
There is value to consistency and thoroughness.

(Do you write "++x;" or "x++;"? Are you consistent in your choice?)
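(For the record: as standalone statements the two are equivalent -- the
choice only changes meaning when the value of the expression is used,
which is exactly why consistency matters. A quick sketch:)

```c
/* Post-increment yields the OLD value, pre-increment the NEW one.
 * As standalone statements ("x++;" vs "++x;") they are equivalent,
 * which is why consistency there is purely a style choice. */
int post_then_pre(void)
{
    int x = 5;
    int a = x++;    /* a = 5, then x becomes 6 */
    int b = ++x;    /* x becomes 7, b = 7 */
    return a * 100 + b * 10 + x;    /* 5, 7, 7 packed as 577 */
}
```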

So I got into the habit of using descriptive variable names, adding
comments, documenting formats, and most importantly, little messages to my
future self about why a seemingly simpler method cannot be used.


And, chances are, there are holes large enough to drive a truck through
in the thoroughness of your documentation, test suite, etc. Because you
weren't PLANNING on reusing it. You succumb to the "just throw something
together" thinking.

This is even worse with PHBs -- most of whom have never written anything
of even 10 KLoC complexity (yet somehow feel they have The Inside Track on
how to manage 1 MLoC projects -- cuz they read about some "technique du jour").
They bow to marketing, manufacturing, stockholder, etc. pressures to
"get something out the door". Then, they are perpetually playing "catch-up"
and "whack-a-mole" with bugs that COULD have been caught in the specification
stage.

But, they thought spending the time to write formal specifications would
let too much calendar time slip away in which they "aren't being productive".

[No, a marketing document is not a specification; it's a mealy-mouthed wish
list that says absolutely nothing about what the product will do and how
it will be EXPECTED to perform]

There's this attitude that "We're not sure how well this product will be
received; so, we aren't keen on making a big commitment to it. If
the market likes it, we'll go back and fix all these things -- or design
Model 2". Of course, if they cut too many corners, the product is crap
and no one *wants* it! OTOH, if the market sees a value and embraces
it, they find themselves too busy trying to shoehorn in the upgrades
(because now competitors see an opportunity!) and the product quality
eventually suffers -- and no one wants it.

For a fun exercise, try to track down any formal documentation for ANY
product (software and/or hardware) that you've been involved with.
Big difference when you have a financial/contractual arrangement with
someone (client/provider) and that document DEFINES the work to be done!

[Employees can rarely DEMAND such a document; and employers rarely have
to provide it -- they can just change their mind "on a whim" -- knowing
the employees will somehow have to make it work (and, of course, all of
the work they've done up to that point should be magically usable in the
new project definition! : ]

What happens if the pointer I pass to free(2c) is NOT a pointer previously
obtained from malloc(2c)? [I'll wait here while you see if you can find
a description of the EXPECTED outcome -- for that library that you
PURCHASED : ] Does your test suite check for this condition? Does your
code/development environment ENSURE you CAN'T encounter that situation?
Or, if it can't prevent it, how does your code/system respond WHEN that
happens? Or, is this just the same old bug that will rear its ugly
head 2, 6 or 10 months down the road?

[For fun, try passing a pointer to nonexistent memory and see how long it
takes AFTER that for your code to crash. Or, try a "legitimate" pointer
that you'd already previously free(2c)-ed!]
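One defensive idiom is to wrap the allocator so that a bogus or repeated
release can at least be refused instead of silently corrupting the heap.
A sketch (names invented; a real version would use a hash table, locking,
and graceful handling of a full table):

```c
#include <stdlib.h>

#define SLOTS 64
static void *live[SLOTS];   /* pointers currently on loan */

void *checked_alloc(size_t n)
{
    void *p = malloc(n);
    if (p)
        for (int i = 0; i < SLOTS; i++)
            if (!live[i]) { live[i] = p; break; }
    return p;
}

/* 0 on success; -1 if p was never handed out, or was already
 * released -- exactly the cases the library leaves "undefined". */
int checked_free(void *p)
{
    if (!p)
        return 0;           /* free(NULL) is defined to be a no-op */
    for (int i = 0; i < SLOTS; i++)
        if (live[i] == p) { live[i] = NULL; free(p); return 0; }
    return -1;              /* refuse: this WOULD be undefined behavior */
}
```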

I'm always amused at how many misconceptions folks have regarding floating
point data types; ignorance of overflow/underflow, cancellation, etc.
As if "simply" using floats absolves them from understanding how their
equations (and data) perform. "Where did this outrageous number come from?
Clearly that's not the result of the equation that I wrote!" (really?
did you think the machine just fabricated random numbers of its own
free will??)
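Two of the classic surprises, for anyone who wants to see them first-hand
(IEEE-754 doubles assumed):

```c
/* Absorption: 1.0 vanishes next to 1e16 because a double carries
 * only about 15-16 significant decimal digits. */
double absorbed(void)
{
    return (1e16 + 1.0) - 1e16;     /* 0.0, not 1.0 */
}

/* Non-associativity: the same three terms, summed in a different
 * order, give a different answer -- the equation you wrote is not
 * exactly the one the machine evaluates. */
int same_sum_both_ways(void)
{
    double a = (0.1 + 0.2) + 0.3;
    double b = 0.1 + (0.2 + 0.3);
    return a == b;                  /* 0: they differ in the last bit */
}
```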

And, of course, folks who are ignorant of the underlying iron won't
see any difference in these two code fragments:

for (row = 0; row < MAX_ROW; row++)
    for (col = 0; col < MAX_COL; col++)
        data[row][col] = (row == col) ? 1 : 0;

for (col = 0; col < MAX_COL; col++)
    for (row = 0; row < MAX_ROW; row++)
        data[row][col] = (row == col) ? 1 : 0;

And, folks who are ignorant of the underlying OS's capabilities
will be even MORE clueless!

When you encounter this IN some code, will there be a comment
drawing attention to why the particular idiom was employed?
Will you be expected to know? Will the original author have
known??

When you "lift" the code for some other similar application,
will you think about whether or not the assumptions in place
in the code BEFORE you lifted it are still valid in the new
application? Will YOU leave a comment indicating that?

And, when you go to reuse a piece of software, will you KNOW
(or, be able to accurately determine) which of those issues
are pertinent to the reuse effort?

If it's all wrapped up in a library, will you even have a CHOICE
(will the library document this or will it be another detail
that you stumble on after the fact -- and then try to purchase
sources for the libraries so you can FIX the problem, as well
as identify other similar problems that are lurking and yet to
bite you?) Will the library that the NEXT compiler vendor
supplies behave similarly?

fridges are an example, I'd pay some small premium for a fridge
that had a wifi alarm feature.


What we really need is a little robot that crawls around inside the fridge
looking for stuff that hasn't been touched for a while or is emitting decay
byproducts.


(some) New refrigerators do inventory control. It's not hard to go from that
to an expert system that knows how long particular products (or classes of
products) can be kept in particular storage conditions. [This would be
actually relatively simple to do with a small set of productions] It would
also be easy to know the EXACT conditions incurred by the items in the 'frig
(YOUR particular temperature setting, humidification in the crisper drawer,
etc.). No more "does this smell spoiled, to you?".

And, from that, to advertisers pitching coupons to you based on their
observations of your usage ("He's almost out of milk. Let's arrange
for a SERIALIZED coupon for soy milk to appear in the next 'mailing'
and we can later see if he actually uses it -- by tracking the serial
number on the coupon")

How much is that worth? Or, is it a nuisance? If I could check the
contents of my refrigerator while I *happen* to be at the store, can
I save myself an extra trip back there, later, to pick up something
that I didn't yet know I needed? Can the refrigerator tell me about
the latest "recalls" (Trader Joes seems to have a new one each week!)
and whether it applies to any items that I currently have in the
refrigerator/cupboard? If it keeps me from spending a weekend
praying to the porcelain god, how much is that worth to me?
  #58

On Thu, 21 Apr 2016 10:47:56 -0700, Don Y wrote:

That approach doesn't work when
[...] CASH [...] disease [...] a product that people INGEST
[...] pilots a vessel [...] thousands of dollars [...] each hour; etc.


True, true, true, etc. My app was analysing data to produce summary results
& graphics for publication in scientific journals. A fault in my
programming would not cost or hurt anything but the pride of those who
trusted me.


Folks tend to want designs that can be PROVEN to work.
And/or, go through formal (legally required) certification
/validation processes.


True again. In my case though, a publishing deadline was usually coming
up, and the scientist(s) realized that simple equations can be complicated
to program, and Excel graphics can be a little short of publication quality
without putting a lot of time into learning how to program using VBS.


It also falls down when "your part" has to interface seamlessly with a
part that another developer in another part of the world is producing
in parallel with your activities.


Very true, especially with language and time zone barriers. I was lucky to
be the 'lone wolf' on all such projects.


Or, when the prototype hardware costs $1M and you can, at best, *borrow*
a few hours per month on which to "test" your designs.


Money talks. In my case, there was never any hard 'deliverable' that used
any of my code. Otherwise there WOULD be legal requirements for reliability
and all those things you mentioned previously.


Of course, you also have scheduling and cost factors to consider.


Like I said, deadlines were approaching and there was neither money nor time
to let a contract specifying everything in detail. But there was a guy on
staff whose job description actually included a section about programming
to aid scientific research, and that guy (me) actually had a degree
majoring in physics unlike all of the other computer people who had IT
degrees.


(Do you write "++x;" or "x++;"? Are you consistent in your choice?)


This depends a LOT on the semantic context of the surrounding code.


And, chances are, there are holes large enough to drive a truck through
in the thoroughness of your documentation, test suite, etc.


Guilty. But then again, the main onus was to get the job done ASAP.

I found that I could save a LOT of time just by doing bulk QC on the inputs
before looking at the real task at hand.


Because you weren't PLANNING on reusing it. You succumb to the
"just throw something together" thinking.


Like I said earlier, I started to assume that the code would be re-visited
despite the assurances that it was a quick & dirty request. I did this just
to make my own future more pleasant.


For a fun exercise, try to track down any formal documentation for ANY
product (software and/or hardware) that you've been involved with.
Big difference when you have a financial/contractual arrangement with
someone (client/provider) and that document DEFINES the work to be done!



The stories I could tell would probably not surprise you. In a previous
job, I was involved with contract monitoring for major ($M) government
projects. Also, I was on the team that developed operational code for all
the 24hr Canadian Government GOES ground stations for meteorological
satellite images. As far as I know, the software ran okay for decades with
no problems, even transiting the dreaded Y2K apocalypse.


I'm always amused at how many misconceptions folks have regarding floating
point data types; ignorance of overflow/underflow, cancellation, etc.


I did take a "Numerical Approximation" course at college. (I have a CS minor
as well) True, what you say. Some people have no idea. Scientists seem to
have a bad oversight regarding their own misunderstanding about things they
have never studied in detail. In general, this is because they are actually
more intelligent than others. But intelligence combined with ignorance is a
pretty good match for stupidity.

If you are at all interested in the sort of code I'm happy to put my name
on (i.e. not the crappy 'throwaway' code I described earlier), check out
the sources available in the 'Programs' section of my personal
not-for-profit website referenced in my signature. The Javascript for the
gauge demo is in 'wlib.js', the other sources are downloadable.

--
http://mduffy.x10host.com/index.htm
  #59

On Thu, 21 Apr 2016 06:01:38 -0700 (PDT), trader_4
wrote:

Smartphones and tablets have now changed that, with those replacing
some of those multiple PCs. But I'd sure still want at least one real
PC in my house.


Imagine running intensive applications on a small device. They can't run
Autodesk, or Photoshop if I get it.
  #60

On 4/21/2016 12:50 PM, Mike Duffy wrote:
On Thu, 21 Apr 2016 10:47:56 -0700, Don Y wrote:

That approach doesn't work when
[...] CASH [...] disease [...] a product that people INGEST
[...] pilots a vessel [...] thousands of dollars [...] each hour; etc.


True, true, true, etc. My app was analysing data to produce summary results
& graphics for publication in scientific journals. A fault in my
programming would not cost or hurt anything but the pride of those who
trusted me.


You (or your "user") also get a chance to check your results:
"Gee, that graph doesn't seem to correlate with my IMPRESSION
of the raw data. Let me spot check a few values..."

If my device is monitoring the production of a pharmaceutical
product (e.g., tablets) -- at rates approaching 1,000,000/hour -- you
can't even BEGIN to think you can sort through even a minute's
worth of production (at 200/second) to verify the device hasn't
screwed up somewhere in that batch.

If I'm accepting dollar bills from a user (gambler) and screw
up and pay out thousands of dollars, erroneously, do you really
think the user is going to "complain to management"? :
Wanna *bet* "Management" would complain to *me*! (once they
examine the logs and notice the "rate of return" for certain machines
is "way down"!)

Folks tend to want designs that can be PROVEN to work.
And/or, go through formal (legally required) certification
/validation processes.


True again. In my case though, a publishing deadline was usually coming
up, and the scientist(s) realized that simple equations can be complicated
to program, and Excel graphics can be a little short of publication quality
without putting a lot of time into learning how to program using VBS.


Yes. We had to go through that exercise with some health data just
last night. "I'm SURE there is a way to get the program to produce
the graphs we want; why don't they look right?"

It also falls down when "your part" has to interface seamlessly with a
part that another developer in another part of the world is producing
in parallel with your activities.


Very true, especially with language and time zone barriers. I was lucky to
be the 'lone wolf' on all such projects.


As I'm writing this, I'm discussing with a colleague in Germany the draft of
a specification that I wrote. Not only do I have to ensure there are no
errors in the design of the specification, but I also have to ensure
it is clear and unambiguous. *And*, as folks in many nations will be
using it as "Gospel", I have to hope I haven't relied on some particular
language idioms that don't make sense in some other culture. Or, that
the design itself won't be "impractical"/unacceptable in other parts
of the world.

I designed a system with a partner firm abroad. Made perfect sense
in technological terms and economic terms -- *here*. But, pricing
in the US is considerably different for bits of tech than it is in
other places! Things that were cheap/affordable here were essentially
Unobtanium, there. And, in still other markets, actually subject to
export restrictions (how many folks have sat down with their corporate
lawyer to determine what aspects of their products might not be
legally exportable?)

Some years ago, a friend designed a video game with graphics/characters
suitable for the US market. Among his implementation choices was the
use of skull & crossbones to signify "player death". (Hey, it's not
a REAL death! Chill out!) The game had to be redesigned with different
graphics as some markets found the symbol offensive.

You don't learn these things as a lone wolf. Or, in MOST "teams",
regardless of size.

Or, when the prototype hardware costs $1M and you can, at best, *borrow*
a few hours per month on which to "test" your designs.


Money talks. In my case, there was never any hard 'deliverable' that used
any of my code. Otherwise there WOULD be legal requirements for reliability
and all those things you mentioned previously.


I "build THINGS". My code only runs "in a workstation" for test purposes
(or under a simulator). It ultimately runs *in* a piece of equipment.
There's usually no keyboard or (general purpose) display. No way for me
to tell a user that the software "divided by zero" -- or tried to free()
unallocated memory, etc.

A software "bug" is interpreted as "IT is broken" -- not "there's a bug
in the software that makes it what it is"

Of course, you also have scheduling and cost factors to consider.


Like I said, deadlines were approaching and there was neither money nor time
to let a contract specifying everything in detail. But there was a guy on
staff whose job description actually included a section about programming
to aid scientific research, and that guy (me) actually had a degree
majoring in physics unlike all of the other computer people who had IT
degrees.


I was part of a ~$5M project for a software-driven device... that hadn't
considered the need for a "programmer"! They eventually assigned the
task to a technician -- because he tinkered with software, at home!

[Typical PHB scenario -- clueless as to what the real issues in a
particular project were!]

(Do you write "++x;" or "x++;"? Are you consistent in your choice?)


This depends a LOT on the semantic context of the surrounding code.

And, chances are, there are holes large enough to drive a truck through
in the thoroughness of your documentation, test suite, etc.


Guilty. But then again, the main onus was to get the job done ASAP.


And that's what happens in most organizations. "We don't have time to
do it right -- but, we'll have time to do it OVER!"

I spend about 40% of my "effort" writing specifications -- BEFORE I
write a line of production code! I may hack together some code
fragments to test the effectiveness and viability of particular
ideas before committing them to the specification; but, that code
is all disposable, back-of-the-napkin stuff. In most shops, that
code finds its way into the product -- because they consider
discarding it to be a "waste" of the time that it took to hack together!

(would you draw a sketch of your "dream house" -- and expect to find
the final blueprints drawn OVER that sketch??)

I found that I could save a LOT of time just by doing bulk QC on the inputs
before looking at the real task at hand.


Every function/module that I build begins with a stanza of invariants.
These formally (re)declare the interface requirements (for example,
"denominator != 0") AND ENFORCE THEM! So, if some other piece of code
violates that contractual interface, the code "tells me". EVERY TIME
the function is invoked!
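A minimal sketch of such an invariant stanza in C (assert() is the
simplest enforcement mechanism; a fielded system might log and degrade
gracefully instead of aborting -- the function name here is invented):

```c
#include <assert.h>

/* Contract: den must be nonzero, scale must be positive.  The
 * stanza restates the interface requirements AND enforces them,
 * so a caller that violates the contract hears about it on the
 * very first call -- not months later as a mystery result. */
double scaled_ratio(double num, double den, double scale)
{
    assert(den != 0.0);     /* interface requirement */
    assert(scale > 0.0);    /* interface requirement */

    return scale * (num / den);
}
```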

Because you weren't PLANNING on reusing it. You succumb to the
"just throw something together" thinking.


Like I said earlier, I started to assume that the code would be re-visited
despite the assurances that it was a quick & dirty request. I did this just
to make my own future more pleasant.


-----^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

Exactly. Employers don't care about how comfortable you are in your
job. Nor how effective you are (as a consequence of THEIR policies).
You work for them. You put up with your work conditions in exchange
for a paycheck. If you ever find the balance unfavorable, you leave!

[This is what drove me to work on my own; the apparent willingness of
employers to waste "my life" (i.e., my time) doing and redoing things
that they COULD HAVE done right in the first place. But were afraid
to do so or unable to recognize. With clients, each job is an
agreed upon mutual arrangement. If I don't like what they are asking me
to do -- or if they are unhappy with the way I do it -- then we
simply choose not to enter into an Agreement. No hard feelings.]

It has been fun to "rib" old employers and clients regarding
the bad decisions they made. You don't have to explicitly SAY
"I told you so" for them to understand the intent! :

OTOH, it's disheartening to see them repeat the same mistakes
despite acknowledging the truth of that reality! :

For a fun exercise, try to track down any formal documentation for ANY
product (software and/or hardware) that you've been involved with.
Big difference when you have a financial/contractual arrangement with
someone (client/provider) and that document DEFINES the work to be done!


The stories I could tell would probably not surprise you. In a previous
job, I was involved with contract monitoring for major ($M) government
projects. Also, I was on the team that developed operational code for all
the 24hr Canadian Government GOES ground stations for meteorological
satellite images. As far as I know, the software ran okay for decades with
no problems, even transiting the dreaded Y2K apocalypse.


Software (esp prepackaged libraries) is sold like automobiles; all
of the "detail" is missing. You're just supposed to think of it as a
magical turn-key product. Despite the fact that it all has definite
usage constraints. Those constraints are never spelled out (at
least you might be able to find them in a vehicle's workshop
manual!)

Notice how many "undefined behaviors" are listed in interface specs.
And, how many of the nitty-gritty details are just "omitted"!

If I provide a clean heap and perform a series of allocations and
releases, can you tell me what state the heap *will* be in?
(it's a deterministic machine; you SHOULD be able to predict this!)
Ah, "little" details missing from the description of that subsystem,
eh? How can you know, at design time, that it WILL meet your
needs AT RUNTIME? If you can't predict how it operates (because
you don't have the sources; or, if you have the sources but not
the test suite; or, the test suite but no temporal performance
data; or... : )

By claiming something is "undefined", the spec writer has made his
(and the implementor's) job easier -- but YOURS, harder!

And, the subsystem covered by the specification is "designed and
implemented ONCE; reused (in theory) endlessly!" Why make a one-time
job easier at the expense of a recurring activity??

I'm always amused at how many misconceptions folks have regarding floating
point data types; ignorance of overflow/underflow, cancellation, etc.


I did take a "Numerical Approximation" course at college. (I have a CS minor
as well) True, what you say. Some people have no idea. Scientists seem to
have a bad oversight regarding their own misunderstanding about things they
have never studied in detail. In general, this is because they are actually
more intelligent than others. But intelligence combined with ignorance is a
pretty good match for stupidity.


"People don't know what they DON'T know". The few managers/clients
that I've respected were aware of their limitations. Most are too
insecure to admit same.

[Goldberg (I forget his first name) has an excellent treatment of
the sorts of issues that folks writing floating point code should
be aware of. I can probably find the exact title...]

I have to provide a "numerical capability" to users in my home
automation system that doesn't require them to understand the limitations
of the implementation. I.e., N * 1/N should be 1 for all N != 0
(among other guarantees). It would be arrogant to expect homeowners
(business owners) to have to understand that order of operations
bears on the resulting operation of the calculation, etc.

So, my implementation (written ONCE) takes care of that so the users'
applications (written MANY times) don't have to!
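One way such a guarantee can be honored is to do the arithmetic exactly,
e.g. with rationals rather than floats. A sketch (overflow of the
components is ignored for brevity; names invented):

```c
typedef struct { long num, den; } rat;

static long gcd_l(long a, long b)
{
    while (b) { long t = a % b; a = b; b = t; }
    return a < 0 ? -a : a;
}

/* Multiply two rationals and reduce to lowest terms, so that
 * (n/1) * (1/n) collapses to exactly 1/1 -- no rounding, ever. */
rat rat_mul(rat x, rat y)
{
    rat r = { x.num * y.num, x.den * y.den };
    long g = gcd_l(r.num, r.den);
    if (g) { r.num /= g; r.den /= g; }
    return r;
}
```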

If you are at all interested in the sort of code I'm happy to put my name
on (i.e. not the crappy 'throwaway' code I described earlier), check out
the sources available in the 'Programs' section of my personal
not-for-profit website referenced in my signature. The Javascript for the
gauge demo is in 'wlib.js', the other sources are downloadable.


Thanks, I'll have a peek (probably later tonight as I'm running out of
"workday hours" with my German colleague).

[To rephrase my comments in light of your last comment, here, I have
to be "happy to put my name on" EVERY line of code that I write.
Others will judge the quality of my work from it -- after all, *it*
is the PRODUCT that I ultimately "deliver"!]


  #61

On Thu, 21 Apr 2016 10:47:56 -0700, Don Y wrote:

[Goldberg (I forget his first name) has an excellent treatment of
the sorts of issues that folks writing floating point code should
be aware of. I can probably find the exact title...]


https://docs.oracle.com/cd/E19957-01/806-3568/ncg_goldberg.html

*EXCELLENT* resource! (It's easier to understand if you've actually
written a floating point package)

The comments on the quadratic formula should have damn near everyone
rethinking how they've "solved that" in the past! :
  #62

On Thursday, April 21, 2016 at 1:48:24 PM UTC-4, Don Y wrote:
On 4/21/2016 7:19 AM, Mike Duffy wrote:
On Thu, 21 Apr 2016 06:29:27 -0700 (PDT), trader_4 wrote:

[...] it's easier, faster, cheaper to use
the existing code. Again that assumes that it's not a complete hack
job, that it's documented, etc. If the new project shares little in
common with the existing, then you start from scratch. But those are
pretty much the same rules for any project.


A large part of my career was as what you call a "code cowboy". Working
completely by myself on small (a few days, a few weeks, occasionally a few
months) programs which were ALWAYS specified as being one-shot deals. No
need then, to extensively test for inputs out of range of the given
datasets, no need to document for easy re-use by myself, and no need to
collaborate with others.


That approach doesn't work when your product processes large amounts of
CASH; tells a patient they have/don't have a particular disease process;
controls a mechanism moving at a high rate of speed (or large masses);
produces a product that people INGEST; pilots a vessel; is relied upon
to produce tens of thousands of dollars of saleable (vs SCRAPPED!) product
each hour; etc. Folks tend to want designs that can be PROVEN to work.
And/or, go through formal (legally required) certification/validation
processes.


And it's your experience and your opinion that companies in
those product areas don't value and re-use their existing code base?
That with each new product or each new version of a product,
they don't typically start with the existing code, and use
much of it for the new product? That instead, they pretty
much write all new code, each time? That's my take on what
you're arguing here.



There's this attitude that "We're not sure how well this product will be
received; so, we aren't keen on making a big commitment to it. If
the market likes it, we'll go back and fix all these things -- or design
Model 2".


That's not my experience. Is that how they build products in all
those product areas you listed, copied below, from your experience?

"product processes large amounts of
CASH; tells a patient they have/don't have a particular disease process;
controls a mechanism moving at a high rate of speed (or large masses);
produces a product that people INGEST; pilots a vessel; is relied upon
to produce tens of thousands of dollars of saleable (vs SCRAPPED!) product
each hour; "






(some) New refrigerators do inventory control.


Show us one example. How exactly does this miracle inventory
system work? How does it know I just put a lettuce in the fridge,
or a carton of eggs, or how many eggs are left, etc?


It's not hard to go from that
to an expert system that knows how long particular products (or classes of
products) can be kept in particular storage conditions. [This would
actually be relatively simple to do with a small set of productions.] It would
also be easy to know the EXACT conditions incurred by the items in the 'frig
(YOUR particular temperature setting, humidification in the crisper drawer,
etc.). No more "does this smell spoiled, to you?".
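
The "small set of productions" idea can be sketched in a few lines. This is
a hypothetical illustration only - the product classes, temperatures, and
day counts are invented for the example, not food-safety data:

```python
# Toy "expert system" for shelf life: a handful of productions mapping a
# product class and its storage temperature to days of safe storage.
# All values here are made up for illustration.
from datetime import date, timedelta

# (product_class, max_temp_F, safe_days); first matching rule wins
RULES = [
    ("milk",    40, 7),
    ("milk",    50, 2),   # warmer storage -> shorter life
    ("lettuce", 40, 10),
    ("eggs",    40, 35),
]

def safe_until(product_class, temp_f, stored_on):
    """Return the estimated use-by date, or None if no rule matches."""
    for cls, max_temp, days in RULES:
        if cls == product_class and temp_f <= max_temp:
            return stored_on + timedelta(days=days)
    return None

print(safe_until("milk", 38, date(2016, 4, 1)))   # 2016-04-08
print(safe_until("milk", 45, date(2016, 4, 1)))   # 2016-04-03
```

A real fridge would evaluate rules like these against its actual sensor
readings (compartment temperature, crisper humidity) rather than a fixed
temperature argument.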

And, from that, to advertisers pitching coupons to you based on their
observations of your usage ("He's almost out of milk. Let's arrange
for a SERIALIZED coupon for soy milk to appear in the next 'mailing'
and we can later see if he actually uses it -- by tracking the serial
number on the coupon")

  #63   Report Post  
Posted to alt.home.repair

"trader_4" wrote in message

stuff snipped

Because even there, if the "new" product is similar to the existing
one, which is often the case, it's easier, faster, cheaper to use
the existing code. Again that assumes that it's not a complete hack
job, that it's documented, etc. If the new project shares little in
common with the existing, then you start from scratch. But those are
pretty much the same rules for any project. For example, if you
have a driver for a communications interface, be it USB, Ethernet,
It's very, very seductive to want to change to a new, more capable
microprocessor, but in most instances I know of, if there was an upgrade, it
was to the same chip family so that old code could still run.


Absolutely. In many markets, the cost of one CPU vs a competing
alternative could be huge, but they won't port the code over, because the
cost of doing that, the time, the validation, etc. would be staggering.
The cost of the CPUs is almost irrelevant.


I know it's painful for microcoders to see all the new (usually dirt cheap)
chips/chipsets but not be able to use them because the rewrite and retesting
effort would be massive. I've only seen it happen when the device is so
popular that people think up additional things it could do and there's a
potential for greater sales. Even then, the bean counters have been known
to say no - better the devil we know.

--
Bobby G.


  #64   Report Post  
Posted to alt.home.repair

"Mike Duffy" wrote in message

stuff snipped

A large part of my career was as, you call it, a "code cowboy". Working
completely by myself on small (a few days, a few weeks, occasionally a few
months) programs which were ALWAYS specified as being one-shot deals. No
need then, to extensively test for inputs out of range of the given
datasets, no need to document for easy re-use by myself, and no need to
collaborate with others.


As the expert system said, "I were one, too." I think I hurt Don's feelings
when I said I wouldn't hire him, but I wouldn't hire me either! At least not
back then, when I was a "sole practitioner" - and that's *because* I was a
sole practitioner.

Most of the time though, a request would eventually be made to resurrect
the code with modified requirements, or input data that had not been
subject to the same levels of quality control as previous runs, etc.


Some of my most profitable gigs were converting stand-alone versions of
programs I had written into networkable ones. Obtuse coding got me the
repeat business, because they shopped the upgrade elsewhere and were told
"it would take us weeks to figure out what it does, let alone make it do
something else." It was "ahead of its time" coding - outputting dBase II
data in neatly formatted boxes using ASCII line-drawing characters. We
burned out at least one dot-matrix printer a month because the client
spec'ed 30 active jobs but there were really 300.

If you're dealing with a limited data entry pool, you can skip a lot of
bulletproofing, I agree. For something that doesn't face hostile actors,
time is money, and I worked 16 hours a day as a lone cowboy. Making it
bulletproof wasn't as important as making it work.
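
As a rough illustration of that tradeoff, here is the same record parser
written both ways. The "job_id,hours" record layout is invented for the
example:

```python
# "One-shot" vs "bulletproofed" handling of the same input record.
# The field layout (job_id,hours) is a made-up example.

def parse_job_quick(line):
    # One-shot style: trust the dataset completely.
    # Fast to write; blows up unhelpfully on malformed input.
    job_id, hours = line.split(",")
    return int(job_id), float(hours)

def parse_job_safe(line):
    # Defensive style: validate and report, at the cost of more code.
    parts = line.split(",")
    if len(parts) != 2:
        raise ValueError("expected 'job_id,hours', got %r" % line)
    job_id, hours = parts
    if not job_id.strip().isdigit():
        raise ValueError("job_id must be numeric: %r" % job_id)
    h = float(hours)
    if not 0 <= h <= 24:
        raise ValueError("hours out of range: %r" % hours)
    return int(job_id), h

print(parse_job_quick("42,8.5"))  # (42, 8.5)
print(parse_job_safe("42,8.5"))   # (42, 8.5)
```

On a trusted, fixed dataset the quick version is all you need; the safe
version earns its keep the first time someone feeds the program data that
wasn't quality-controlled.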

To top it off, big jobs seemed to come in all at the same time so I really
had to work as fast as I could all the time because I might lose a big job
because I was too busy.

The only thing more vexing than code that is difficult to follow is when
you can't even complain to the author about how ****ty it is.


Even if the code isn't great, a second pass is never a bad thing. "Build
one to throw away - you will anyway." -- Mythical Man Month

I do agree that the first time you have to update your own quick and dirty
code is the time you learn to be a little slower and a little cleaner.

So I got into the habit of using descriptive variable names, adding
comments, documenting formats, and most importantly, little messages to my
future self about why a seemingly simpler method cannot be used.


Ain't it great to have to rediscover an issue that you solved months or
years ago and work around and around it until you remember why you had to do
it the way you did?
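
An example of the kind of note that heads off that rediscovery - the
currency scenario here is invented, but the comment records exactly why the
"simpler" method can't be used:

```python
# A "note to future self" in action: the comment explains why the
# obvious simplification was rejected.  The ledger scenario is made up.

def total_cents(prices):
    # NOTE: do NOT "simplify" this to sum(prices) on the raw floats.
    # Binary floats drift on repeated addition (0.10 + 0.20 is not
    # exactly 0.30); rounding each price to integer cents first keeps
    # the ledger total exact.
    return sum(round(p * 100) for p in prices)

print(total_cents([0.10, 0.20]))  # 30
```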

fridges are an example, I'd pay some small premium for a fridge
that had a wifi alarm feature.


What we really need is a little robot that crawls around inside the fridge
looking for stuff that hasn't been touched for a while or is emitting
decay byproducts.


Just opened a tub of in-date yogurt that had a nice big green mold spot
where the foil didn't seal. That would be hard to catch, I suspect. A true
self cleaning fridge would be great, though. I've taken to writing the date
I open pickle jars and such on the bottom of the jar with a sharpie.
Sometimes it shocks me how old some stuff is. (0:

--
Bobby G.


  #65   Report Post  
Posted to alt.home.repair

"Don Y" wrote in message

stuff snipped

Instead, consider the software INSIDE your:
- keyboard
- mouse
- CD/DVD drive
- hard disk drive
- monitor, etc.


I'm pretty sure that's firmware, not software. (-:

stuff snipped about how many devices have CPUs of some sort in them

The thing that all of these appliances have in common (for the most part)
is that they can't be easily/inexpensively/compassionately "updated"!


But these are usually very close to trivial programs that operate with
little user input and interact with relatively few sensors - at least
soldering irons and other items like that. And I would expect that, as with
new cars, updating will become more and more feasible because so many new
appliances will be web-enabled.

It's still quite unfair to compare a microwave's program to something like
Linux or Windows, which has to host a nearly infinite variety of
applications on hardware that's very, very variable. It's largely a
function of the number of lines of code. As that rises, the likelihood of
finding all the bugs drops in proportion.
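
A back-of-envelope sketch of that proportionality. The defect-injection
rate and pre-release find rate below are assumed round numbers for
illustration, not measured industry data:

```python
# Rough model: bugs injected scale with code size; some fraction are
# caught before release; the rest ship.  Both rates are assumptions.

def expected_shipped_bugs(kloc, defects_per_kloc=15, find_rate=0.95):
    """Estimated residual bugs = injected bugs * fraction not found."""
    injected = kloc * defects_per_kloc
    return injected * (1 - find_rate)

print(expected_shipped_bugs(5))     # microwave-class firmware: ~3.75
print(expected_shipped_bugs(5000))  # desktop-OS-class code base: ~3750
```

Even with identical quality practices, three orders of magnitude more code
means three orders of magnitude more residual bugs - which is the point
about comparing appliance firmware to a desktop OS.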

You can't just "turn on automatic updates" and magically expect bugs
to be patched. So, there needs to be extra care up front to ensure
problems/bugs don't get out "into the wild".


How much extra care? If it's extra care, doesn't that imply extra cost?
Which gets us back to the comments I took issue with. To be more careful
requires more cost and more time. Companies reuse code and build code
libraries to be competitive. If they had to write everything from scratch
every time it would be prohibitively expensive. Can a "sole practitioner"
afford to take more time and care? Yes, of course, but even then the more
time you spend on one program the less you can spend on other programs (and
sources of income). So making programs as perfect as you can make them cuts
into your revenue stream. Is it worth the tradeoff? Maybe it is if your
client reputation shines as a result and you get more business and clients
who are willing to pay for the increased reliability.

--
Bobby G.

