Tony Evans Posted January 24, 2014 But I'm not promoting a false dichotomy. Before we go down the angels-dancing-on-pinheads route, note that you qualified the above claim twice. Not only students, but serious students. I'd make the claim that your enlightened and ideal serious student of technology is a rarer creature than you imagine. You're confusing recognition of a fact with the existence of that fact. Whether or not a person knows that technology and society are, for all intents and purposes, a monolithic whole, they still are. You're also making a vice out of a mere fact. Relatively few people are in fact students of technology. And the students of technology are, almost by definition, the only people who really and regularly think about it at the depth we're discussing it. So they're the only people who are likely to have an opinion about the technical and the social in relation to each other. In the 1960s a factory roof collapsed on a snowy day. It seemed a technical failure: the components and/or materials couldn't take the weight and so collapsed. But then a researcher looked at it and saw that the design assumed a level covering of snow, with the weight thereby distributed. They also discovered that 20 other roofs of the same design had collapsed that night because drifting snow had built up only on certain parts of the roof. What seemed a technical failure was in fact a fallible decision on the part of the designer. And that fallible decision propagated wildly. You talk of root cause analysis - the whole point is that prior to much research in the 60s and 70s the root cause would have been identified as a technical failure (that was the initial conclusion, actually). But in reality it was a set of fallible decisions that caused the collapse. Source: Barry Turner - Man-Made Disasters. James Reason - Human Error also talks about fallible decisions. You're kidding me, right? What information about snow loading on roofs did the designer have? What was the relevance of that information to his particular design? Could the designer have actually predicted the uneven snow loading from his knowledge of the subject and his knowledge of roof design? If he was ignorant of snow loading, was it culpable negligence on his part, or was the information just unavailable? It's entirely possible that the failures were the result of nothing more than general ignorance that the designer couldn't be held accountable for. If the failures were the result of poor or negligent design practice, what makes you think that wouldn't have been arrived at by a court in a malpractice action? The legal concept of architectural and engineering malpractice certainly predates the 1960s. Some common sense would help here. An organisation with no culpability would lead to an anything-goes culture and would, ironically, reduce the reporting of information, as there'd be nothing to report. You're taking what should be a common-sense position on the level of culpability an act should receive, and taking it to some theoretical and abstract extreme. Sorry, Phil, but you're working overtime to miss the point. Damn near your entire thing here has been about how baaad "blamism" is. Oh, I imagine so. But who is most familiar with a system on a daily operating basis? Who learns about it at the coal-face, and how does that information get to these high-level domain persons?
I am sure they are indeed most familiar with the design - but they are often some of the least informed people regarding how that design interacts and operates on a routine basis. "[O]ften"? Bullshit. Sometimes? Everything happens sometimes. But by and large the operators couldn't tell you the first thing about what goes on under the hood. Yeah, they know all the rattles and squeaks, and maybe even how to patch them up and keep things going. But they don't possess anything like the knowledge it takes to put the thing together from the ground up. But what it all boils down to is you rejecting research and substituting it with nothing. Indeed worse than nothing: you assume a level of common sense which even a cursory knowledge of some accidents and disasters shows is absent in the extreme. I'm not substituting anything for anything else. I'm just doing the most horrible thing in the world -- telling you that all of the academic argle-bargle you're in love with is simply irrelevant. Organizations and disciplines are full of people who can figure out how to fix problems once they know they have them. And the solutions are mostly common sense, your personal imperative to make saviors out of your academic heroes notwithstanding.
Phil Posted January 24, 2014 That's one of the dumbest posts you've ever made. Your bias is obvious. I remember when you were in community college and you'd bang on about how other universities had nothing to teach them, blah blah blah. It isn't irrelevant, Tony. It's only irrelevant to you because you assume the world corresponds to your experiences. Your experiences are completely generalisable - to you. I'm sure if I introduced an example involving software or the Corps you'd stroke out. Anyway. The point about the snow load was that the problem lay with a person. Previously the problem would have been thought technical. So that person could then be re-trained or improved, and his fallible decisions would no longer spread rapidly due to the negentropic properties of modern society. It seems so obvious a point to me and so many others, but you don't seem to be grasping the nettle. Research helped PROVE that many technical failures were actually the result of social acts or decisions or assumptions. Which means that investigations now look more deeply for root causes, beyond simple technical failures and immediate personal culpability. As for my opinion on blamism: I can prove it cuts down information flow. I can show that information is required for an organisation to learn. Thus I can show it affects organisational reliability. You haven't shown anything at all, other than trying to discount an argument I haven't made. As for organisational knowledge: you simply don't know what you're talking about. I'd give you some sources, but they are all irrelevant to you, just like years of psychological, sociological and natural applied science in relation to organisational reliability. If it's so obvious to you, it must be to everyone else and your ilk of technologists, whatever one of those is. So there's no need to explain why ferries sail with bow doors open, why oil refineries explode, why stadium crushes happen, or why space shuttles explode when everyone knew there was a potential problem with components. Honestly, that was a shit post, Tony, that showed nothing more than your obvious bias toward personal experience and projecting your thought process onto the world.
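To put a number on the drift-load point, here is a minimal worked sketch in Python. A simply supported beam stands in for a roof member, and every figure is invented for illustration (this is not the actual 1960s design): the same total snow weight produces a higher peak bending moment when drifted onto one half of the span than when lying as the level blanket the designer assumed.

import numpy as np

# Toy comparison: same total snow weight, uniform blanket vs. drifted
# onto one half of a simply supported span. Invented numbers, not a
# design check of the actual 1960s roofs.
L = 20.0     # span in metres (assumed)
W = 100e3    # total snow weight on the member in newtons (assumed)

def peak_moment(intensity):
    # intensity: function giving the load w(x) in N/m along the span
    x = np.linspace(0.0, L, 2001)
    w = intensity(x)
    total = np.trapz(w, x)
    xbar = np.trapz(w * x, x) / total          # centroid of the load
    Rb = total * xbar / L                      # simple-beam reactions
    Ra = total - Rb
    dx = np.diff(x)
    # shear V(x) = Ra minus the accumulated load; moment M(x) integrates V
    V = Ra - np.concatenate(([0.0], np.cumsum((w[1:] + w[:-1]) / 2 * dx)))
    M = np.concatenate(([0.0], np.cumsum((V[1:] + V[:-1]) / 2 * dx)))
    return M.max()

uniform = peak_moment(lambda x: np.full_like(x, W / L))
drifted = peak_moment(lambda x: np.where(x < L / 2, 2 * W / L, 0.0))

print(f"level blanket  : {uniform / 1e3:6.1f} kN*m")  # ~W*L/8    = 250.0
print(f"drifted to half: {drifted / 1e3:6.1f} kN*m")  # ~9*W*L/64 = 281.2

Even in this idealised case the drifted load is roughly 12 per cent worse with no extra snow at all; real drifts are worse still, since wind imports additional snow from adjacent surfaces, so a margin sized for the "level covering" assumption can be consumed entirely by where the snow happens to sit.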
rmgill Posted January 24, 2014 if your interface sucks, you lose in the marketplace. And it's always been that way. The steering wheel was a better interface than the tiller, and won the market on ships, cars, and planes (except for special cases of the latter, where the stick was better, such as the tight cockpit of the fighter). The idea that technologists have only recently considered what it is humans are doing with their products, and how -- and then only under the tutelage of academic researchers -- is pure garbage. You can hang onto the marketplace for a while, but you must adapt eventually. Witness the clutching hold IBM had in the face of the Mac, which was billed for a long time as not a serious business machine. IBM and Windows machines managed to hold onto the market, at least for a while, due to the game market in the home PC. They got into the business place because of IBM. And I'm not saying that the technologists haven't considered interfaces; it's just that there are enough deep aspects to it that having folks who specialize in it can be useful, as the good ones intimately understand and can explain it properly, just as a good engineer can intimately understand his mechanical/logical systems. Sometimes it's a matter of perspective; I've seen many people who can see one side or the other, but not both, and really understand the best interface. Sometimes it takes a non-engineer to build a better interface by working with the engineer, because the guy designing the guts understands them, and that understanding colors his view of why the interface works the way it does, but not how it is actually used. I've even seen this myself with databases I've thrown together for something I was working on. I don't use it for a while, forget how I did something, and then a wart in the interface's intuitiveness rears up and I think, "Hey, that's not a good way to present that data at all!"
Tony Evans Posted January 25, 2014 (edited) That's one of the dumbest posts you've ever made. Your bias is obvious. I remember when you were in community college and you'd bang on about how other universities had nothing to teach them, blah blah blah. I never said any such stupid thing. What I said was that students at small colleges (not community colleges -- my school had started as a two-year college, but was a four-year institution when I went there) have more and better access to the most highly qualified faculty members. For example, most of my classes were taught by PhDs, and I never sat in the stereotypical big university lecture hall with 100+ other students, where the instructor didn't even know my name. That's not a bias; those are simple facts. It isn't irrelevant, Tony. It's only irrelevant to you because you assume the world corresponds to your experiences. Your experiences are completely generalisable - to you. I'm sure if I introduced an example involving software or the Corps you'd stroke out. Hardly. I just don't subscribe to your prejudices and biases. Anyway. The point about the snow load was that the problem lay with a person. Previously the problem would have been thought technical. So that person could then be re-trained or improved, and his fallible decisions would no longer spread rapidly due to the negentropic properties of modern society. It seems so obvious a point to me and so many others, but you don't seem to be grasping the nettle. Research helped PROVE that many technical failures were actually the result of social acts or decisions or assumptions. Which means that investigations now look more deeply for root causes, beyond simple technical failures and immediate personal culpability. By that logic, in 1912, if a ship runs into an iceberg and sinks, the response should have been "next time, build a ship that won't sink when it runs into a million tons of ice." Or, in 1889, if a dam breaks and kills 2,200 people downstream, the solution is simply "build a better dam." Yet, in reality, in both the cases of the Titanic and the Johnstown Flood, all of the poor human decision making, in all of its sordid detail, did in fact come out, and most of it soon after the incident. Unless you can point to some added value in all of the research you are so in love with -- value that just isn't obvious in what you've said so far -- I don't see how anyone with any sense and a little historical knowledge could possibly credit it with anything particularly new. As for my opinion on blamism: I can prove it cuts down information flow. I can show that information is required for an organisation to learn. Thus I can show it affects organisational reliability. You haven't shown anything at all, other than trying to discount an argument I haven't made. It's not about what you can prove, Phil. It's about your overwhelming emphasis on something that's just a feature of holding people accountable for their culpable acts. IOW, it's not that assessing accountability causes people to hide from accountability. That's well understood. It's the idea that turning misdemeanors and infractions into nothing more than teachable moments is necessarily a good thing, when in fact it leads to tolerance of sloppy behavior, at least for a time. Then you wind up assessing culpability anyway. I know your research won't tell you this, but what happens in practice is that some people accept better procedures and some don't. If you don't discipline the non-compliers, you wind up with chaos.
At best, what your preferred strategy does is shift culpability, and attempts to avoid it, downstream. As for organisational knowledge: you simply don't know what you're talking about. I'd give you some sources, but they are all irrelevant to you, just like years of psychological, sociological and natural applied science in relation to organisational reliability. I'm not talking about corporate knowledge from experience. I'm talking about the knowledge to design and build complex systems. That kind of knowledge is highly concentrated. If you think it isn't, go ask a ditch digger, who may in fact know everything there is to know about digging ditches competently and efficiently, why he's digging his ditch in a particular place, at a particular time. Or ask a power plant operator, who may know everything there is to know about the history of the system in operation, why he's monitoring a particular system through a particular instrument, why the instrument scale is in some set of units, why the scale encompasses a certain range of values, and why the green, yellow, and red zones are what they are. In both cases you'll find that the holders of all of the organizational knowledge really don't know all that much. That's the reason we build complex systems in the first place. Simple systems require highly knowledgeable operators, of which there can never be enough. So complex systems are built that can be operated by routiners of slightly above average intelligence, of whom there are millions. If it's so obvious to you, it must be to everyone else and your ilk of technologists, whatever one of those is. So there's no need to explain why ferries sail with bow doors open, why oil refineries explode, why stadium crushes happen, or why space shuttles explode when everyone knew there was a potential problem with components. There's every need to understand all of those things. The problem here is that your biases, not mine, cause you to refuse to see that organizations and disciplines can and do solve complex problems -- and always have -- without a bunch of academic research backing up their thinking or decision making. Honestly, that was a shit post, Tony, that showed nothing more than your obvious bias toward personal experience and projecting your thought process onto the world. Nope. You just know an awful lot about very little, and view the entire world through the lens of your love for the narrow scope of things that you do know. Edited January 25, 2014 by Tony Evans
Phil Posted January 25, 2014 (edited) Tony, we're never going to agree, and I am not so much of a narcissist that I can't bear the thought of you not thinking like me or holding a different opinion. So I'm calling it a day before we end up in the usual angels-on-a-pinhead death-spiral that characterises debates with you. You keep trusting in common sense, the ubiquitousness of your experiential knowledge and its endless validity and generalisability. And please keep denying the reasons why some of the safest organisations on this planet have confidential reporting systems. I'll continue to keep asking why people and organisations do stupid shit that kills other people, and continue to know more about very little... And I'll leave you with an example of an academic telling a professional organisation why things go wrong: https://www.rcn.org.uk/development/practice/cpd_online_learning/making_sense_of_patient_safety/core_concepts_in_patient_safety Out. Edited January 25, 2014 by Phil
Tony Evans Posted January 27, 2014 (edited) 1. You keep trusting in common sense, the ubiquitousness of your experiential knowledge and its endless validity and generalisability. 2. And please keep denying the reasons why some of the safest organisations on this planet have confidential reporting systems. 3. I'll continue to keep asking why people and organisations do stupid shit that kills other people, and continue to know more about very little... 4. And I'll leave you with an example of an academic telling a professional organisation why things go wrong: https://www.rcn.org.uk/development/practice/cpd_online_learning/making_sense_of_patient_safety/core_concepts_in_patient_safety Out. 1. I don't trust in common sense like it was some kind of god, or even just magic. But I do know that solutions to problems, once the organization is aware of them, are generally in the common-sense category. 2. I'm not denying anything, Phil. I'm pointing out that your mental model of confidential reporting is incomplete. It doesn't take into account the practical consequences of the concept in practice. Specifically, whether or not intended, it can and does lead to accountability measures sooner or later, simply because, in the natural course of things, it uncovers culpable action or inaction. So your intent to avoid "blamism" is fraught with unintended consequences that do in fact include the assessment of blame. It has to, because whether or not the intention is to assign blame, blame will come out when the root cause of almost any selfish or negligent action is discovered. 3. Organizations and the general public at large ask "why" all of the time, Phil. And they generally get very good answers. And pretty much always have -- see the Titanic and the Johnstown Flood for classic cases. 4. Jeepers. That's not even analysis. It's metaphor. And the mental models promoted there aren't even useful to someone actually working the job. A nurse doesn't have the time to think about Swiss cheese or "three buckets". They're unnecessary distractions from the simple thought process of identifying and reporting unsafe conditions. And that's all about interpersonal politics and organizational environment. And a poor management stance towards the reporting of problems doesn't take a research grant to find out; it's right in the organization's face, every day. What you don't get, Phil, is that over the last twenty years, I've spent a lot of time in a lot of meetings, listening to supposedly expert troubleshooters drone on and on about some high-minded approach to organizational excellence. And the answer always boils down to "do something when somebody notices something needs to be done". Who needs 36 years of research to figure that out? (Actually, I know who -- the same kind of people that need scientific studies to tell them that yes, it probably is a good idea to stay out of the sun if you have sensitive skin.) The real problem is not identifying what makes organizations fail. The real problem is in doing the simple things that avert failure. And no amount of study or advice is going to matter if an organization doesn't want to learn, even from its own mistakes. Coda: We have managed to recapitulate here, in large part, a phenomenon of your British life over the last 200 or so years -- the intellectual and social struggle between the doers of the commercial and professional middle class and the thinkers of the academic middle class. I have to say that you have played your part well.
But it's still a good thing that the doers know enough not to let the thinkers' "help" get in their way. Edited January 27, 2014 by Tony Evans
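For readers who haven't met it, the "Swiss cheese" model being waved away above is at least easy to state precisely. A toy Monte Carlo sketch, with all probabilities invented for illustration: each defensive layer independently has a hole, an accident gets through only when the holes in every layer line up, so each added or tightened layer cuts the accident rate multiplicatively.

import random

# Toy Monte Carlo of Reason's "Swiss cheese" accident model. Each
# defensive layer independently fails (has a hole) with some
# probability; an accident occurs only when every layer fails at once.
# All probabilities are invented for illustration.
random.seed(1)

def accident_rate(layer_fail_probs, trials=1_000_000):
    accidents = sum(
        1 for _ in range(trials)
        if all(random.random() < p for p in layer_fail_probs)
    )
    return accidents / trials

print(accident_rate([0.10, 0.10, 0.10]))        # ~0.001  (three layers)
print(accident_rate([0.10, 0.10, 0.10, 0.10]))  # ~0.0001 (four layers)

Whatever its value to a working nurse, the model's substance is just that multiplicative structure: no single fallible decision causes the accident, the alignment of several does.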
Phil Posted January 27, 2014 once the organization is aware of them Which is often too late. Phil. I'm pointing out that your mental model of confidential reporting is incomplete. It's not my mental model. It's a synthesis of several analyses of several confidential reporting systems. Again, I have not argued for the abolition of blame, or that it does not present some thorny issues, but it is an established fact that organisational learning requires information, and that confidential reporting increases that information flow. The sum of the information gathered is more effective than enforcing culpability on all discovered acts. http://asrs.arc.nasa.gov/overview/immunity.html The above is a practical example of how such a system is administered. It's been going a long time. That's not even analysis. It's metaphor. No shit, Tony. It's the distillation of an enormous amount of research conducted since before 1990, for a particular audience. What you don't get, Phil, is that over the last twenty years, I've spent a lot of time in a lot of meetings, listening to supposedly expert troubleshooters drone on and on about some high-minded approach to organizational excellence. I think we all get it, Tony; we've had it ad nauseam from you. I get it, certainly. Your experiences trump all. It doesn't matter if enormous amounts of cross-disciplinary research and real-world analysis and practice say you're wrong about where knowledge lies in organisations, for example. You have a pathological belief that your experiences are infinitely generalisable to the entire gamut of human existence. It's why I've put about as much effort into this post as scratching my arse, because I'm talking to a brick wall. You completely reject an evidence-based approach to reliability because you're anti-academia. A discussion would not even be possible if we were trying to argue about a natural-science problem without drawing on the research and findings of the past, but here you're willing to forgo that because you like to think your experiences are particularly insightful - because you've lived it all and sat in some meetings.
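Phil's central claim here -- blame suppresses reports, and fewer reports mean fewer latent hazards surfaced -- can at least be stated as a toy model. Every parameter below is invented for illustration; the sketch models the claim rather than proving it.

import random

# Toy model of the confidential-reporting argument: near-misses only
# teach the organisation if someone reports them, and fear of blame
# lowers the odds of a report. All parameters invented for illustration.
random.seed(42)

HAZARDS = 500       # latent hazards in the system (assumed)
NEAR_MISSES = 600   # near-miss events over the period (assumed)

def hazards_surfaced(report_prob):
    known = set()
    for _ in range(NEAR_MISSES):
        hazard = random.randrange(HAZARDS)  # each near-miss exposes one hazard
        if random.random() < report_prob:   # ...but only if it gets reported
            known.add(hazard)
    return len(known)

# Assumed reporting rates: punitive regime vs. confidential system.
print("punitive     (p = 0.2):", hazards_surfaced(0.2), "of", HAZARDS)
print("confidential (p = 0.8):", hazards_surfaced(0.8), "of", HAZARDS)

With these invented numbers the punitive regime surfaces roughly a fifth of the hazards and the confidential one roughly three fifths; the real empirical question, which the toy model cannot settle, is what the two reporting probabilities actually are.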
Phil Posted January 27, 2014 (edited) We have managed to recapitulate here, in large part, a phenomenon of your British life over the last 200 or so years -- the intellectual and social struggle between the doers of the commercial and professional middle class and the thinkers of the academic middle class. I have to say that you have played your part well. But it's still a good thing that the doers know enough not to let the thinkers' "help" get in their way. All that bollocks says to me is that you, from several thousand miles away physically, and several trillion intellectually and philosophically, don't really understand the context here, and certainly not the context of how research and practice interact here. Unless that was brought up on one of those agendas you sat through, or perhaps was discussed during those long nights as an SNCO in Kuwait? Don't get me wrong, I am certainly not espousing idealism, merely that things are a bit more complicated than that. Oh, and in case you're wondering, I'm not an academic either. So I've not been playing the "part" you seem to be hinting at. Edited January 27, 2014 by Phil
Tony Evans Posted January 27, 2014 Which is often too late. +10 for pathos, -10 for logos. The consequences of bad design or bad implementation often have to be felt before anyone knows there was a bad design or implementation. Taking one of your examples, how does anyone know -- or even imagine -- that a cabin crew member wouldn't report an engine fire until it actually happened? No amount of reporting, confidential or otherwise, can overcome failure of imagination. It's not my mental model. It's a synthesis of several analyses of several confidential reporting systems. Again, I have not argued for the abolition of blame, or that it does not present some thorny issues, but it is an established fact that organisational learning requires information, and that confidential reporting increases that information flow. The sum of the information gathered is more effective than enforcing culpability on all discovered acts. http://asrs.arc.nasa.gov/overview/immunity.html The above is a practical example of how such a system is administered. It's been going a long time. Yes. And the one big, fat, obvious hole in that document's claims totally evades you. Yes, the system doesn't report the confidential informant's identity to the enforcement arm, but the enforcement arm has its own ways of finding out what happened, and will in fact find out what happened, in detail, as it investigates the report. People who work in aviation are relatively smart. You think somebody's going to become a confidential informant for an action he knows is culpable and which he can pretty easily figure will be found out in the subsequent investigation? Like I said, your mental model is positively blindfolded in this respect. No shit, Tony. It's the distillation of an enormous amount of research conducted since before 1990, for a particular audience. And what it says is not very relevant to that particular audience. It is a nice intellectual framework, but a nurse doing his or her job just needs to be told, "You see something wrong, report it or fix it." And even then, the advice given totally ignores nursing reality. So what if being tired on the job or being less than fully trained for every eventuality are dangerous conditions? They're simply a reality of the work environment. I think we all get it, Tony; we've had it ad nauseam from you. I get it, certainly. Your experiences trump all. It doesn't matter if enormous amounts of cross-disciplinary research and real-world analysis and practice say you're wrong about where knowledge lies in organisations, for example. You have a pathological belief that your experiences are infinitely generalisable to the entire gamut of human existence. It's why I've put about as much effort into this post as scratching my arse, because I'm talking to a brick wall. "[P]athological"? I'm not an ROE whiner, but I suggest that you rethink whether you really meant to say that. And I don't think my experiences are infinitely generalizable. But they are considerable -- perhaps infinitely more so than yours, given your apparent failure to recognize and acknowledge a process that corporate America undergoes routinely, which is the calling in of "experts" to tell people what they already know, in (supposedly) new and exciting ways. Heck, there's a whole long-running and widely syndicated comic strip based on this kind of thing in American business -- Dilbert. How would it gain such traction and recognition if my opinion of this kind of thing were so unique to myself?
You completely reject an evidence-based approach to reliability because you're anti-academia. A discussion would not even be possible if we were trying to argue about a natural-science problem without drawing on the research and findings of the past, but here you're willing to forgo that because you like to think your experiences are particularly insightful - because you've lived it all and sat in some meetings. I'm not anti-academia. I'm anti-nonsense. I highly respect and trust academics working in hard-science research and in the education of the practical professions. I even think history, archeology, and psychology are useful when the researchers and practitioners are honest with themselves and thus honest with their audience. The rest I have no use for, and I seriously doubt we'd miss it if it all vaporized tomorrow. WRT "an evidence-based approach to reliability", like I've already said, you're playing a pair of eights like it was a full house. It's at best descriptive of best practices -- organizations have, after all, done just about everything suggested in the research before the research was ever done. The research doesn't "prove" -- as you put it -- that those organizations were right. It just records that they were. All that bollocks says to me is that you, from several thousand miles away physically, and several trillion intellectually and philosophically, don't really understand the context here, and certainly not the context of how research and practice interact here. Unless that was brought up on one of those agendas you sat through, or perhaps was discussed during those long nights as an SNCO in Kuwait? Don't get me wrong, I am certainly not espousing idealism, merely that things are a bit more complicated than that. Oh, and in case you're wondering, I'm not an academic either. So I've not been playing the "part" you seem to be hinting at. Funny. The concept of the middle class of Great Britain being divided into professional and academic circles came to me from reading a British academic history. (Yes... even though I'm apparently anti-academic.) I forget precisely about what -- probably WWI -- but the model of the British middle class presented was self-evident once presented. It explains too many things. So, if you're not an academic, what exactly are you? If you can't say, I'll be forced to believe you're one of those closet intellectuals who drives a city bus or mans a hotel door, totally misunderstood and dishonored in his own time. BTW, I was never a staff NCO. I was only ever a sergeant.
Phil Posted January 27, 2014 Like I said, your mental model is positively blindfolded in this respect. None of your criticisms of such systems explains why they are used at all. Why build such a system (in fact several such systems) and operate them in the way they are operated unless they added value by contributing to reliability? They are interesting for their rarity - very few organisations have the stomach for operating a confidential system, because of the allure of blame (and thereby the false-attribution fallacy) - but that is all the more reason not to doubt their efficacy. Not that there is really such a need, since research shows the basic thinking behind them works in organisations able to cope with the idea of not always holding individuals culpable. So why are they operated? And where is your evidence that they don't work as intended? which is the calling in of "experts" to tell people what they already know, in (supposedly) new and exciting ways Methinks you're mixing up serious thinking with the enormous amount of "management science" bullshit out there that bangs on about deliverables and outcomes and frameworks and driving change. I am not talking about that less-than-rigorous crap. What I am talking about is the exact type of research and thinking that can show a lot of that rubbish for what it is - a sales pitch. Don't confuse snake-oil salesmen with people who conduct a lot of work into what boils down to root-level analysis. The rest I have no use for, and I seriously doubt we'd miss it if it all vaporized tomorrow. Yes, quite. You have no use for it. It's at best descriptive of best practices But it's not. It has influenced and systematised a method of investigation, so that foresight can be generated from hindsight root-level analysis. It also means that organisations have been studied, so that incorrect assertions such as yours about where institutional knowledge lies can be corrected, and organisational structures built that take this into account and ensure knowledge is more widely shared. No academic paper can change the culture of an organisation, but it can certainly give the like-minded executive or manager the tools to do it, and the evidence to show that the assumptions are valid and generalisable. Either way, it enriches understanding. It just records that they were. And, as I said, it gives others the evidence base to learn from those other organisations. The concept of the middle class of Great Britain being divided into professional and academic circles came to me from reading a British academic history. Well, it must be true then, if you read a paper on it. If you can't say, I'll be forced to believe you're one of those closet intellectuals who drives a city bus or mans a hotel door, totally misunderstood and dishonored in his own time. Not unlike you, Tony: someone who has served, someone who has done some menial jobs, and now someone who works in a highly politicised and complex organisation. On top of all that, I have undertaken some study to help broaden my understanding of various things. I would say I was a pretty well-rounded individual, combining some academic interests with some varied and practical experience. BTW, I was never a staff NCO. I was only ever a sergeant. Which is a Senior NCO (SNCO) in the British Army. Just an example of something that can't be generalised across contexts, I suppose. Your experiences can't be generalised to this country, because I don't believe you have ever set foot here.
So you argue from information gleaned from an academic paper. But then you undermine academic papers from other disciplines (and, as I have said, I am not talking about management-quack bollocks) and claim ascendancy from your experience. You can't have your cake and eat it.
Tony Evans Posted January 28, 2014 They are interesting for their rarity - very few organisations have the stomach for operating a confidential system, because of the allure of blame (and thereby the false-attribution fallacy) - but that is all the more reason not to doubt their efficacy. And with that burst of analytical brilliance, I think we can be done.
Paul in Qatar (Author) Posted January 29, 2014 Today's New York Times had a short article saying the Air Force investigation of cheating on the qual tests (or whatever you call the quizzes given to launch officers) has widened.
Paul in Qatar (Author) Posted January 31, 2014 92 (or maybe 114) "missileers" are now suspended. How many of these people can there even be? Surely a hundred must be close to all of them. http://thehill.com/blogs/defcon-hill/policy-strategy/196978-air-force-suspends-92-nuclear-missile-officers-in-cheating
RETAC21 Posted January 31, 2014 2 per crew, 3 crews per shift, two capsules every 6 or so missiles... lots.
Paul in Qatar (Author) Posted February 1, 2014 Well, the math ought to be straightforward. We have 1,000 land-based "heavy" launchers. So about 333 launch capsules, at two capsules per six launchers. Figuring four two-man crews per capsule, to allow for leave, illness and so on, we get around 2,700 launch crewmen. So about 4% are under a cloud. That would not be so bad if they were tank drivers, but of course they are not.
Marcello Posted February 1, 2014 Well, the math ought to be straightforward. We have 1,000 land-based "heavy" launchers. More like 450 nowadays, if I read right.
Paul in Qatar (Author) Posted February 1, 2014 Oh, well, they have stopped sending me the newsletters since I retired. So about 1,250 launch officers, with perhaps 125, or 10%, under a cloud.
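The back-of-the-envelope numbers in the last few posts are easy to check in code, using the thread's own assumed ratios (two capsules per six missiles, four two-man crews per capsule) rather than official force-structure data:

# Sanity check of the thread's crew arithmetic, using its own assumed
# ratios (two capsules per six missiles, four two-man crews per
# capsule). Not official force-structure data.
def crew_estimate(launchers, suspended):
    capsules = launchers // 3       # two capsules per six missiles
    crewmen = capsules * 4 * 2      # four two-man crews per capsule
    return crewmen, 100.0 * suspended / crewmen

for launchers, suspended in [(1000, 92), (450, 92), (450, 114)]:
    crewmen, pct = crew_estimate(launchers, suspended)
    print(f"{launchers:4d} launchers -> ~{crewmen} crewmen; "
          f"{suspended} suspended = {pct:.1f}%")
# 1000 launchers -> ~2664 crewmen; 92 suspended = 3.5%
#  450 launchers -> ~1200 crewmen; 92 suspended = 7.7%
#  450 launchers -> ~1200 crewmen; 114 suspended = 9.5%

On the 450-launcher figure, the suspensions cover somewhere near a tenth of the whole launch-officer force, whichever suspension count turns out to be right.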
BansheeOne Posted May 29, 2021 Quote: Date 29.05.2021, Author Alex Berry. US soldiers accidentally leak nuclear secrets via study apps — report. Online study aids used by US soldiers stationed at nuclear bases around Europe have been found to contain sensitive details. An investigation by Bellingcat uncovered the leak. Troops on US bases in Europe housing nuclear weapons have been using publicly accessible online flashcard apps to remember long and complex security protocols, the investigative website Bellingcat revealed on Friday. The military personnel turned to sites such as Quizlet, Chegg Prep and Cram to memorize codes, jargon and even the status of nuclear vaults, according to the report. While European governments generally refuse to confirm or deny the specific locations of US nuclear weapons being stored within their borders, leaked documents, photos and comments by retired officials often confirm the presence of the weapons. The latest leaks, however, have gone so far as to identify the exact number and location of the weapons within the bases, including whether the vaults they are stored in are "hot" — with live weapons — or "cold." How did Bellingcat discover the secret information? The author of the investigative piece, Foeke Postma, explained that the researchers were able to discover the flashcards belonging to the active soldiers by searching for certain terms known to be associated with nuclear bases. The result was the unearthing of several sets of flashcards revealing information about several bases around Europe, including in Germany, the Netherlands and Turkey. One set of 70 cards with the title "Study!" disclosed the number of live and non-live nuclear weapons at the Volkel Air Base in the Netherlands, which the Dutch government considers a secret. Other sets revealed how soldiers are supposed to react to various levels of alarm, where security cameras are located on site, and "duress words" that soldiers give over the phone to show that they have, for example, been taken hostage by attackers. How have people reacted to the leak? Bellingcat discovered flashcards dating back as far as 2013 and as recently as April 2021. The site contacted NATO and the US military for comment before publishing the story, after which the cards that had been discovered were removed. [...] https://www.dw.com/en/us-soldiers-accidentally-leak-nuclear-secrets-via-study-apps-report/a-57710899
Stuart Galbraith Posted May 29, 2021 Yes, I thought this was very interesting. I'd love to know what terms to search, but I don't think an orange jumpsuit would suit me much.
Der Zeitgeist Posted May 31, 2021 (edited) On 5/29/2021 at 11:37 AM, Stuart Galbraith said: Yes, I thought this was very interesting. I'd love to know what terms to search, but I don't think an orange jumpsuit would suit me much. A few days ago, it still worked by googling the name of the base plus PAS, WS3, vault, or chegg, and then going into the Google cache on the site. It was pretty wild, because it also contained information about which specific WS3 vaults had live weapons in them and which ones only had crew trainers. Edited May 31, 2021 by Der Zeitgeist
Adam Peter Posted May 31, 2021 Sadly, this speaks volumes about the training, and the IQ, of the people tasked to guard these weapons.