UK diy (uk.d-i-y) For the discussion of all topics related to diy (do-it-yourself) in the UK. All levels of experience and proficiency are welcome to join in to ask questions or offer solutions.
#1
Posted to uk.d-i-y
BA -----"Power Surge" was to blame???
What a load of ********.
#2
Posted to uk.d-i-y
On 30/05/2017 08:03, harry wrote:
What a load of ********. OK a power surge 'could' take out one data centre, but as they have class 3 independent power feeds - never been known. They have huge battery backup which keeps all running while own generators run up. However how would it take out other data centres as the newtork would be dual redundant in Geographically remote locations. Be interesting to see what comes out of this. |
#3
Posted to uk.d-i-y
On 30/05/17 08:20, rick wrote:
On 30/05/2017 08:03, harry wrote: What a load of ********. [...] However, how would it take out other data centres, as the network would be dual redundant in geographically remote locations? Be interesting to see what comes out of this. crap design by incompetent yes-men in middle management -- "If you don't read the newspaper, you are uninformed. If you read the newspaper, you are misinformed." Mark Twain
#4
Posted to uk.d-i-y
On 30/05/2017 08:38, The Natural Philosopher wrote:
On 30/05/17 08:20, rick wrote: [...] However, how would it take out other data centres, as the network would be dual redundant in geographically remote locations? It appears that for some reason it took out everything, including their phone system, and that it took almost forever to get it going again. Way longer than you would expect for any PLC recovery plan. Something went badly wrong - last remaining sysop wizard away for the Bank Holiday? Be interesting to see what comes out of this. crap design by incompetent yes-men in middle management Possibly, or more likely I think that they made redundant all the people who actually knew how things worked and how to maintain it. The system might well have been OK if it had been maintained and rebooted by people who actually knew what they were doing. -- Regards, Martin Brown
#5
Posted to uk.d-i-y
In article, Martin Brown '''newspam'''@nezumi.demon.co.uk scribeth thus:
On 30/05/2017 08:38, The Natural Philosopher wrote: [...] Possibly, or more likely I think that they made redundant all the people who actually knew how things worked and how to maintain it. The system might well have been OK if it had been maintained and rebooted by people who actually knew what they were doing. Indeed, but since when has senior "management" ever known anything about IT? -- Tony Sayer
#7
Posted to uk.d-i-y
On 30/05/17 10:50, Terry Casey wrote:
In article, says... However, how would it take out other data centres, as the network would be dual redundant in geographically remote locations? It's been suggested over at uk.railway that: "BA appear to have both their main data centres on the same industrial site near Heathrow. Presumably both on the same mains distribution feed!" That would make sense to an airline, wouldn't it? After all, you wouldn't have the co-pilot flying in a different plane to the pilot, would you? Sigh. It seems that 'power surge' = "We power-cycled the kit, like what you do with Windows and Netgear, innit?" and it didn't all come up and play nice. -- "I guess a rattlesnake ain't risponsible fer bein' a rattlesnake, but ah puts mah heel on um jess the same if'n I catches him around mah chillun".
#8
Posted to uk.d-i-y
In article, The Natural Philosopher wrote: [...] It seems that 'power surge' = "We power-cycled the kit, like what you do with Windows and Netgear, innit?" and it didn't all come up and play nice. I suffered a "power surge" at home 2 weeks ago. One of my computers started emitting magic smoke. -- from KT24 in Surrey, England
#9
Posted to uk.d-i-y
On 30-May-17 8:54 AM, Martin Brown wrote:
.... The system might well have been OK if it had been maintained and rebooted by people who actually knew what they were doing. From what was being said last night, it might have been OK if somebody had thought to check whether the backup system actually worked when needed. -- Colin Bignell
#10
Posted to uk.d-i-y
In article, tony sayer wrote: [...] Indeed, but since when has senior "management" ever known anything about IT? They apparently know enough about it to decide it will be better outsourced to India. Hope the penalties etc. they have to pay after this fiasco show them that a headline cost may not always be the true one. -- *Ham and Eggs: Just a day's work for a chicken, but a lifetime commitment* Dave Plowman London SW To e-mail, change noise into sound.
#11
Posted to uk.d-i-y
On Tue, 30 May 2017 11:08:36 +0100, Nightjar wrote:
From what was being said last night, it might have been OK if somebody had thought to check whether the backup system actually worked when needed. Wouldn't be the first time (or the last) that the "regularly tested" backup generator fails to start (or the auto-start system doesn't), or, if it does start, falls over within minutes. Reasons: it takes a brave man to truly simulate a mains failure by opening the breaker on the incoming supply(s) without warning or any preparation of downstream systems(*). It's "safer" to leave the supply(s) connected, press "start" on the generator and, when warmed and up to speed, manually operate the changeover, run on generator for a couple of minutes, manually drop back to the supplies (which have never been isolated) and shut down the generator. How many things *doesn't* that test? (*) One assumes BA would have doubly redundant, hot-spare, core systems, so "just in case" you make sure that the breaker(s) you are about to pull are only feeding the "non-live" core system. Whatever it was, I've seen one report that there had to be some "hardware changes", followed by databases being out of sync with each other. A few hours ought to sort the hardware; what really screws you up is bad/missing/mismatched data. -- Cheers Dave.
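The gap between the two test procedures described above can be sketched as a simple checklist. This is an illustrative model only; the component names are hypothetical and not from any real BA or data-centre runbook.

```python
# Illustrative sketch: what a genuine "pull the breaker" test exercises
# versus the "safe" manual test described above. Names are made up.

FULL_CHAIN = [
    "mains_failure_detection",  # does monitoring notice the outage at all?
    "ups_ride_through",         # do the batteries carry the load unaided?
    "generator_auto_start",     # does the set start without a human?
    "auto_changeover",          # does the transfer switch operate itself?
    "load_acceptance",          # does the generator hold the real load?
]

# The "safe" test: mains never isolated, generator started and the
# changeover operated by hand, so only load acceptance is really proven.
MANUALLY_TESTED = {"load_acceptance"}

untested = [step for step in FULL_CHAIN if step not in MANUALLY_TESTED]
print(untested)
```

Four of the five links in the chain go unexercised, which is exactly where the surprises come from on the day.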
#12
Posted to uk.d-i-y
On 30/05/17 11:17, Dave Plowman (News) wrote: [...] They apparently know enough about it to decide it will be better outsourced to India. Hope the penalties etc. they have to pay after this fiasco show them that a headline cost may not always be the true one. Some years ago, in the early 80s, I was in Pakistan on business, doing some equipment trials. I decided to test the equipment in the hotel room. I was in a major city. I had a simple linear power supply which 'dropped' the mains voltage to that required for the equipment. The equipment itself would normally work over a wide range of input voltages, but it wouldn't power up correctly. Out came the multimeter, and at first it looked like the power supply had failed. However, it turned out the mains voltage was way below the supposed 240V - as I recall, something under 200V. I spoke to our local rep and it seems this was far from unusual, as were power cuts etc. This was in a major city. Sure enough, that night when I needed to answer a 'call of nature', there was no power. The next morning, the mains measured 265V. Across the road, they were building another hotel. The scaffolding was all made of bamboo. The hotel was already up to about 10 floors. The workmen were climbing it in bare feet, no hard hats or any safety kit to be seen.
#13
Posted to uk.d-i-y
On Tue, 30 May 2017 08:23:34 +0100, Chris Hogg wrote:
PS: what exactly is a 'power surge' on the grid, and how's it caused? Good question, and not one that seems to have been answered so far.
#14
Posted to uk.d-i-y
On 30/05/2017 12:47, mechanic wrote:
On Tue, 30 May 2017 08:23:34 +0100, Chris Hogg wrote: PS: what exactly is a 'power surge' on the grid, and how's it caused? Good question, and not one that seems to have been answered so far. The ones I have seen were direct lightning strikes to the metal roof or to nearby electricity pylons. In both cases damage was done to kit that was notionally protected by heavy-duty industrial-grade surge arresters. The worst one vaporised the telephone lines in reception, leaving a 4" wide charred mess down the wall and a dazed, inconsolable receptionist. Took days to get the line mended, and the phone switchboard was toast. That was after a strike to the metal roof of the building. It was notable how getting one mainframe back online was made more difficult by multiple system failures in different zones. Normal CPU failures are usually single-point, affecting neighbouring bits, whereas a big lightning strike seems to find perverse and unusual destructive paths. Sacrificial protection bits somehow manage to save themselves at the expense of parts that are much harder to access and replace. However, BA don't seem to be claiming that it was from a lightning strike but a "power surge". Presumably they can back up their claim of a power surge with peak voltage/current spike detection and logging... -- Regards, Martin Brown
#15
Posted to uk.d-i-y
In article, Brian Reay wrote: [...] The equipment itself would normally work over a wide range of input voltages, but it wouldn't power up correctly. Must admit I'd thought the actual hardware would still be in the UK? But under the control of their newly outsourced experts? Who might just have a problem popping down the corridor to press the reset button in the event of a failure? -- *No husband has ever been shot while doing the dishes* Dave Plowman London SW To e-mail, change noise into sound.
#16
Posted to uk.d-i-y
On Tue, 30 May 2017 10:59:57 +0100, charles wrote:
In article, The Natural Philosopher wrote: [...] I suffered a "power surge" at home 2 weeks ago. One of my computers started emitting magic smoke. That's why I have three UPS units. -- My posts are my copyright and if @diy_forums or Home Owners' Hub wish to copy them they can pay me £1 a message. Use the BIG mirror service in the UK: http://www.mirrorservice.org *lightning surge protection* - a w_tom conductor
#17
Posted to uk.d-i-y
"Chris Hogg" wrote in message ... PS: what exactly is a 'power surge' on the grid, and how's it caused? The power surge quote was courtesy of Alex Cruz, BA CEO. The same Alex Cruz who, after a total breakdown of BA's IT systems caused a suspension of all the airline's services worldwide for around 24 hours, was quoted as saying "We would never compromise the integrity and security of our IT systems". Which does make you wonder how much worse things could have been if that had been his intention all along. michael adams ....
#18
Posted to uk.d-i-y
On 30/05/2017 13:20, Tim Streater wrote:
In article, Martin Brown wrote: That was after a strike to the metal roof of the building. No lightning conductors? There were, but they didn't seem to help. There was even a supergrid pylon, much taller and less than 100m away, but it still hit our roof. But the building-to-building potential difference also did a lot of damage, as only one was struck. You can't really do much about that, since all the current has to go somewhere when it reaches the ground. -- Regards, Martin Brown
#19
Posted to uk.d-i-y
In article, Martin Brown writes: [...] Presumably they can back up their claim of a power surge with peak voltage/current spike detection and logging... One cause can be a disconnected neutral, which causes the voltage on the least-loaded phase to rise (and on the most-loaded phase to drop). Worst case, that gets you ~400V. That could destroy the input stage of a switched-mode PSU designed for 240V AC. About 17 years ago, there was a power surge at home which did for the ethernet port on a Sun Ultra 5 workstation. I don't know the nature of it, but it was due to a high-voltage fault and the electricity company instantly offered to get the computer repaired. In the event, my employer just gave me another one and I don't think they bothered claiming back from Southern Electric.
-- Andrew Gabriel [email address is not usable -- followup in the newsgroup]
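The broken-neutral mechanism described above can be checked with a few lines of phasor arithmetic (Millman's theorem gives the floating star-point voltage). The load values here are illustrative, chosen only to show the lightly loaded phase rising well above 230 V:

```python
import cmath

E = 230.0  # nominal phase-to-neutral voltage
# Three phase EMFs as complex phasors, 120 degrees apart
emf = {"L1": cmath.rect(E, 0),
       "L2": cmath.rect(E, -2 * cmath.pi / 3),
       "L3": cmath.rect(E, 2 * cmath.pi / 3)}

# Unbalanced resistive loads in ohms: L1 lightly loaded, L2 heavily loaded
ohms = {"L1": 200.0, "L2": 5.0, "L3": 40.0}
Y = {p: 1.0 / r for p, r in ohms.items()}  # admittances

# With the neutral disconnected, the star point floats to (Millman):
#   Vn = sum(E_p * Y_p) / sum(Y_p)
Vn = sum(emf[p] * Y[p] for p in emf) / sum(Y.values())

for p in sorted(emf):
    print(f"{p}: {abs(emf[p] - Vn):.0f} V across the load")
```

With these figures the lightly loaded phase sees roughly 370 V; in the limiting case (one phase unloaded, another very heavily loaded) it tends towards the full ~400 V line voltage mentioned above.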
#20
Posted to uk.d-i-y
On 5/30/2017 12:15 PM, Dave Liquorice wrote:
On Tue, 30 May 2017 11:08:36 +0100, Nightjar wrote: From what was being said last night, it might have been OK if somebody had thought to check whether the backup system actually worked when needed. [...] A few hours ought to sort the hardware; what really screws you up is bad/missing/mismatched data. My thoughts too.
#21
Posted to uk.d-i-y
On 5/30/2017 3:12 PM, Chris Hogg wrote:
On Tue, 30 May 2017 13:20:46 +0100, Tim Streater wrote: In article, Martin Brown wrote: That was after a strike to the metal roof of the building. No lightning conductors? AIUI a lightning conductor's primary purpose is not to conduct the lightning strike to earth, but to discharge the cloud above it in an attempt to stop the lightning in the first place. I believe there's a significant current flowing in a lightning conductor some time before a strike happens. If there is a lightning flash, it means the conductor has failed in its primary purpose. But I can't believe that's why lightning conductors were used in the beginning; they must have been actually intended to conduct the lightning and prevent damage to the church tower or whatever, even if in those days no-one realised the true action. Two good points.
#22
Posted to uk.d-i-y
On Tue, 30 May 2017 12:26:56 +0100, Tim Streater wrote:
I suffered a "power surge" at home 2 weeks ago. One of my computers started emitting magic smoke. I've had that, though more of a "melt-down": tower case, motherboard on edge, and for some unknown reason the CPU power regulators got so hot they slid down the board... So why haven't you got a UPS? It's not as if a small APC costs very much. Something like this: https://www.amazon.co.uk/APC-Back-UP...-BE700G-UK/dp/B002RXED6A/ I'm reasonably sure that will be an "off-line" UPS, i.e. the mains is normally fed straight through, or via a simple 1:1 transformer with a bit of filtering. Not sure if the Back-UPS models can also boost/buck the mains by altering transformer taps; that might be a Smart-UPS feature. The inverter side only fires up to provide power when the mains fails. An "on-line" UPS powers the kit via the batteries and inverter section all the time. The incoming mains just feeds a hefty charger with enough oomph to charge the batteries while they are being drained by the inverter. Only an on-line UPS truly separates the supported kit from the mains. -- Cheers Dave.
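The practical difference between the two topologies comes down to the transfer gap. A rough sketch with illustrative figures (the ~16 ms hold-up is a common ATX PSU spec at full load and the 8 ms gap a typical standby-UPS figure; neither is taken from an APC datasheet):

```python
def rides_through(transfer_gap_ms: float, psu_holdup_ms: float) -> bool:
    """True if the load's PSU hold-up time covers the UPS transfer gap."""
    return psu_holdup_ms >= transfer_gap_ms

OFFLINE_GAP_MS = 8.0  # standby UPS: brief break while the inverter picks up
ONLINE_GAP_MS = 0.0   # double-conversion: load is on the inverter already

print(rides_through(OFFLINE_GAP_MS, 16.0))  # healthy PSU rides the gap
print(rides_through(OFFLINE_GAP_MS, 5.0))   # tired/overloaded PSU drops out
print(rides_through(ONLINE_GAP_MS, 5.0))    # on-line UPS: no gap to ride
```

Which is why an off-line unit is fine for most desktops but an on-line (double-conversion) unit is what you put in front of kit that must never see the raw mains.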
#23
Posted to uk.d-i-y
On Tuesday, 30 May 2017 12:16:02 UTC+1, Dave Liquorice wrote:
On Tue, 30 May 2017 11:08:36 +0100, Nightjar wrote: [...] It takes a brave man to truly simulate a mains failure by opening the breaker on the incoming supply(s) without warning or any preparation of downstream systems. It's done once a month at UK hospitals.
#24
Posted to uk.d-i-y
On 30/05/17 12:18, Brian Reay wrote:
Across the road, they were building another hotel. The scaffolding was all made of bamboo. The hotel was already up to about 10 floors. The workmen were climbing it in bare feet, no hard hats or any safety kit to be seen. I read somewhere that bamboo makes first-class scaffolding. In Hong Kong they build skyscrapers with it. Agree with the rest of what you say though. Another Dave -- Change nospam to techie
#25
Posted to uk.d-i-y
On 5/30/2017 4:16 PM, Another Dave wrote:
On 30/05/17 12:18, Brian Reay wrote: [...] I read somewhere that bamboo makes first-class scaffolding. In Hong Kong they build skyscrapers with it. Plus: strong, light, tough, flexible (so it shares load well), and remains elastic at high bending strains. Minus: more difficult to join (I believe it is normally tied rather than clamped).
#26
Posted to uk.d-i-y
In article, Tim Streater wrote: [...] I suffered a "power surge" at home 2 weeks ago. One of my computers started emitting magic smoke. So why haven't you got a UPS? I was using a UPS - its output "surged". -- from KT24 in Surrey, England
#27
Posted to uk.d-i-y
In article, Another Dave wrote: [...] I read somewhere that bamboo makes first-class scaffolding. In Hong Kong they build skyscrapers with it. 20 years ago, in Lithuania, they were using larch poles. Any metal scaff tended to "walk" overnight. -- from KT24 in Surrey, England
#28
Posted to uk.d-i-y
In article, Terry Casey scribeth thus: [...] That would make sense to an airline, wouldn't it? After all, you wouldn't have the co-pilot flying in a different plane to the pilot, would you? LOL!.... I often wonder about this dual mains feed in a data centre: after all, where do they manage to get two separate feeds, off differing transformers and differing HV feeds? Unless someone has the idea that different phases are different feeds?.. -- Tony Sayer
#29
Posted to uk.d-i-y
Some years ago, in the early 80s, I was in Pakistan on business, doing some equipment trials. [...] It turned out the mains voltage was way below the supposed 240V, as I recall something under 200V. [...] The next morning, the mains measured 265V. Mind you, some SMPSs do cope with a very wide mains input variation these days.. Across the road, they were building another hotel. The scaffolding was all made of bamboo. [...] The workmen were climbing it in bare feet, no hard hats or any safety kit to be seen. Standard practice there.. -- Tony Sayer
#30
Posted to uk.d-i-y
In article, Martin Brown '''newspam'''@nezumi.demon.co.uk scribeth thus: [...] Presumably they can back up their claim of a power surge with peak voltage/current spike detection and logging... Well, broadcast sites do cope very well, but they are well protected agin Jove's bolts. Wasn't there a lot of lightning around the Channel area on Friday night?.. -- Tony Sayer
#31
Posted to uk.d-i-y
In article, Chris Hogg scribeth thus: [...] If there is a lightning flash, it means the conductor has failed in its primary purpose. [...] The idea is to "shunt" the discharge around whatever it is you're trying to protect. Messrs Furse & Co did a very good online book on the subject once.. -- Tony Sayer
#32
Posted to uk.d-i-y
In the article, Martin Brown '''newspam''' wrote: However, BA don't seem to be claiming that it was from a lightning strike but a "power surge". Presumably they can back up their claim of a power surge with peak voltage/current spike detection and logging... Report on the so-called "power surge" in the Grauniad: https://www.theguardian.com/business...sh-airways-it-failure-experts-doubt-power-surge-claim "Experts have questioned British Airways' claim that this weekend's catastrophic IT failure was down to a 'power surge', as the company's chief executive has claimed" I was sceptical from the word go. Cruz is probably more familiar with cashing in his stock options and awarding himself fat bonuses than with the subtleties of running a data centre. -- (\_/) (='.'=) "Between two evils, I always pick (")_(") the one I never tried before." - Mae West
#33
Posted to uk.d-i-y
On Tue, 30 May 2017 18:31:19 +0100, tony sayer wrote:
I often wonder about this dual mains feed in a data centre: after all, where do they manage to get two separate feeds, off differing transformers and differing HV feeds? Give enough money to your local DNO and they will provide diversity power. Something like BA's data centre I'd expect to have at least two diversely routed 11 kV feeds from different primary substations (33 kV to 11 kV) that don't share a main 33 kV supply, or the backups to those 33 kV (might be 11 kV; our local primary has 11 kV backup). There are documents on the web that describe the power arrangements for football stadiums that may host the larger competitions. Search for "fifa lighting power supply". -- Cheers Dave.
#34
Posted to uk.d-i-y
On 30/05/2017 08:23, Chris Hogg wrote:
On Tue, 30 May 2017 00:03:03 -0700 (PDT), harry wrote: What a load of ********. I expect the sun suddenly came out from behind a cloud, thousands of solar panels suddenly started pumping out squillions of watts and the grid didn't react quickly enough. Expect lots more in the future. ;-) PS: what exactly is a 'power surge' on the grid, and how's it caused? Maybe a toe-rag stealing the surge protection gear from the substation. -- Max Demian
#35
Posted to uk.d-i-y
BA -----"Power Surge" was to blame???
On 5/30/2017 7:58 PM, Chris Hogg wrote:

> On Tue, 30 May 2017 19:38:52 +0100, Max Demian wrote:
>>> PS: what exactly is a 'power surge' on the grid, and how's it caused?
>>
>> Maybe a toe-rag stealing the surge protection gear from the substation.
>
> From the replies, it seems that a power surge is nothing more than a very short duration voltage spike caused by a lightning strike. I had wondered if that was all it was, or whether some massive bit of electrical gear like a carbon-arc furnace in a steel works suddenly shutting down might cause it, but as no-one mentioned anything like that, I guess I'm wrong.

Steel works! We are talking about the home counties :-)
#36
Posted to uk.d-i-y
BA -----"Power Surge" was to blame???
In article , tony sayer wrote:

> In article , Terry Casey scribeth thus
>> In article , says...
>>> However how would it take out other data centres as the network would be dual redundant in geographically remote locations.
>>
>> It's been suggested over at uk.railway that: "BA appear to have both their main data centres on the same industrial site near Heathrow. Presumably both on the same mains distribution feed!" That would make sense to an airline, wouldn't it? After all, you wouldn't have the co-pilot flying in a different plane to the pilot, would you?
>
> LOL!.... I often wonder about this dual mains feed in a data centre, after all where do they manage to get two separate feeds off differing transformers and differing HV feeds?

When BBC TV Centre was built, it was on the West London 33kV ring. It also had a reserve feed direct from Battersea. "It will be fine as long as Battersea doesn't blow up" - which, of course, it did - well, the main power board at any rate (April 1964).

--
from KT24 in Surrey, England
#37
Posted to uk.d-i-y
BA -----"Power Surge" was to blame???
On 30/05/2017 13:31, Dave Plowman (News) wrote:

> In article , Brian Reay wrote:
>> On 30/05/17 11:17, Dave Plowman (News) wrote:
>>> In article , tony sayer wrote:
>>>>> Possibly, or more likely I think that they made redundant all the people who actually knew how things worked and how to maintain it. The system might well have been OK if it had been maintained and rebooted by people who actually knew what they were doing.
>>>> Indeed, but since when has senior "management" ever known anything about IT?
>>> They apparently know enough about it to decide it will be better outsourced to India. Hope the penalties etc. they have to pay after this fiasco show them a headline cost may not always be the true one.
>> Some years ago, in the early 80s, I was in Pakistan on business. I was doing some equipment trials. I decided to test the equipment in the hotel room. I was in a major city. I had a simple linear power supply which 'dropped' the mains voltage to that required for the equipment. The equipment itself would normally work over a wide range of input voltages but it wouldn't power up correctly.
> Must admit I'd thought the actual hardware would still be in the UK? But under the control of their newly outsourced experts? Who might just have a problem popping down the corridor to press the reset button in event of a failure?

You don't need a reset button; proper servers can be powered up or down over a separate, independent network connection. This is independent of the rest of the server, so it is not affected by the system being locked up by a crash. They can even have their operating systems installed from scratch remotely. A person only needs to be there if a physical item such as a hard-disk or fan needs replacing.

The network itself should be designed so that it automatically restarts after power loss, or consists of devices that can be remotely restarted in sequence 'til the full network is up and running.

SteveW
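[Editor's note: the "restarted in sequence 'til the full network is up" idea above amounts to a dependency-ordered power-up - each device waits for the kit it depends on. A minimal sketch in Python using the standard-library graphlib; the device names and dependency map are invented for illustration, not BA's actual topology.]

```python
from graphlib import TopologicalSorter  # Python 3.9+

# Hypothetical example: each device maps to the devices that must
# already be up before it is powered on.
deps = {
    "core-switch": [],                          # nothing before it
    "storage-array": ["core-switch"],
    "db-server": ["core-switch", "storage-array"],
    "app-server": ["db-server"],
}

def power_up_order(deps):
    """Return a safe power-on sequence: dependencies before dependents."""
    return list(TopologicalSorter(deps).static_order())

print(power_up_order(deps))
# -> ['core-switch', 'storage-array', 'db-server', 'app-server']
```

TopologicalSorter also raises CycleError if the dependency map is circular, which is exactly the kind of design fault you would want to catch before an outage rather than during one.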
#38
Posted to uk.d-i-y
BA -----"Power Surge" was to blame???
On 30/05/2017 18:32, tony sayer wrote:

>> Some years ago, in the early 80s, I was in Pakistan on business. I was doing some equipment trials. I decided to test the equipment in the hotel room. I was in a major city. I had a simple linear power supply which 'dropped' the mains voltage to that required for the equipment. The equipment itself would normally work over a wide range of input voltages but it wouldn't power up correctly. Out came the multimeter and at first it looked like the power supply had failed. However, it turned out the mains voltage was way below the supposed 240V, as I recall something under 200V. I spoke to our local rep and it seems this was far from unusual - as were power cuts etc. This was in a major city. Sure enough, that night when I needed to answer a 'call of nature', there was no power. The next morning, the mains measured 265V.
>
> Mind you, some SMPSs do cope with a very wide mains input variation these days.

Some, well designed, linear ones did as well. I remember using a 14-track reel-to-reel tape recorder to record vibrations for later analysis 30-odd years ago. That recorder would run off anything from 24V to 400V, a.c. or d.c.

SteveW
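[Editor's note: for a sense of how far out those Pakistan readings were, here is a quick check against the UK statutory band (nominally 230 V, +10%/-6%). This is only an illustration - the Pakistani nominal at the time was 240 V - and the code is a hypothetical sketch, not any real monitoring tool.]

```python
# UK mains: nominally 230 V, +10% / -6%, i.e. roughly 216.2 V to 253.0 V.
NOMINAL = 230.0
LOW, HIGH = NOMINAL * 0.94, NOMINAL * 1.10

def in_tolerance(volts: float) -> bool:
    """True if a measured voltage sits inside the UK statutory band."""
    return LOW <= volts <= HIGH

# Brian's two readings, plus the old UK nominal for comparison.
for reading in (200.0, 240.0, 265.0):
    verdict = "OK" if in_tolerance(reading) else "out of tolerance"
    print(f"{reading:.0f} V: {verdict}")
```

Both the sub-200 V night reading and the 265 V morning reading fall well outside the band - the sort of swing a simple linear dropper supply cannot be expected to ride out.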
#39
Posted to uk.d-i-y
BA -----"Power Surge" was to blame???
On 30/05/2017 10:55, The Natural Philosopher wrote:

> On 30/05/17 10:50, Terry Casey wrote:
>> In article , says...
>>> However how would it take out other data centres as the network would be dual redundant in geographically remote locations.
>>
>> It's been suggested over at uk.railway that: "BA appear to have both their main data centres on the same industrial site near Heathrow. Presumably both on the same mains distribution feed!" That would make sense to an airline, wouldn't it? After all, you wouldn't have the co-pilot flying in a different plane to the pilot, would you?
>
> Sigh. It seems that 'power surge' = "We power-cycled the kit, like what you do with Windows and Netgear, innit? and it didn't all come up and play nice."

Which shouldn't have mattered. The guy from Ryanair was on the radio yesterday saying that they have a primary and two backup datacentres, with automatic failover. All at completely different locations. Surely BA should have similar? And without the primary coming back up properly, one of the backup datacentres should have kept everything running as normal.

SteveW
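[Editor's note: the primary-plus-backups arrangement described above boils down to "serve from the first healthy site in priority order". A minimal sketch of that selection logic in Python; the site names and health results are invented for illustration, and a real system would use actual health probes and automatic traffic redirection rather than a lookup table.]

```python
def select_site(sites, healthy):
    """Return the first site (in priority order) that reports healthy.

    sites   -- list of site names, primary first.
    healthy -- callable: site name -> bool (stands in for a real probe).
    Raises RuntimeError if every site is down.
    """
    for site in sites:
        if healthy(site):
            return site
    raise RuntimeError("all datacentres unavailable")

# Hypothetical setup: primary down, both backups up.
sites = ["primary-lhr", "backup-dub", "backup-ams"]
status = {"primary-lhr": False, "backup-dub": True, "backup-ams": True}

print(select_site(sites, status.get))  # -> backup-dub
```

The point of the sketch is the failure mode it rules out: with the primary dead, service continues from the first healthy backup instead of stopping entirely, which is what appears not to have happened at BA.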
#40
Posted to uk.d-i-y
BA -----"Power Surge" was to blame???
In article l.net, Dave Liquorice scribeth thus

> On Tue, 30 May 2017 18:31:19 +0100, tony sayer wrote:
>> I often wonder about this dual mains feed in a data centre, after all where do they manage to get two separate feeds off differing transformers and differing HV feeds?
>
> Give enough money to your local DNO and they will provide diversity power.

Yes, if you do have deep and very deep pockets! A backup diseasel is cheaper in most all cases.

> Something like BA's data centre I'd expect to have at least two diversely routed 11 kV feeds from different Primary substations (33 kV to 11 kV) that don't share a main 33 kV supply or the backups to those 33 kV (might be 11 kV, our local Primary has 11 kV backup).

Well, had this debate with a friend who works in data centres and he reckons that not many have that luxury - one that went down near Telehouse in that London a while ago only had the one mains feed, as there wasn't another available for quite some distance. Decent UPS batteries and a well tested and fuelled diesel, and the hope of course that BT's fibre plant is on some reserve power too;! Wouldn't be the first time a BT term box feeding a data centre wasn't on a backed-up supply - just got overlooked..

> There are documents on the web that describe the power arrangements for football stadiums that may host the larger competitions. Search for "fifa lighting power supply"

Yes, mainly to do with lighting for TV, and yes, if you do have two separate mains supplies, nice; and the diesel genny not a lot of money for a footie club - a week's wages for a player perhaps;!

--
Tony Sayer