"A critical report earlier this year by the Center for Public Integrity highlighted at a 2011 incident at LANL where eight plutonium rods were placed side-by-side for a celebratory photograph"
I am appalled by this. Professionals shouldn't be doing this
The response to this should be to honestly discuss data about the risk and realistic case studies of previous accidents so that professionals actually want to act safely. I worked for three years at LANL (nothing nuclear), and the actual response will be to force everyone to watch an intellectually insulting video and take a quiz where all the answers are "check with my manager before I do anything", further fomenting a disdain for and distrust of procedure.
Unfortunately, the gut response "I am appalled; this must change" by folks who don't know anything about how the lab operates (i.e., lay citizens and politicians) exerts pressure toward the latter, not the former.
Nuclear material workers have much lower rates of job-related accidents and illness than many (probably most) other jobs. Much lower lifetime radiation doses than flight attendants, for example.
I've seen this happen in another industry (I'm thinking of ore mining).
The evidence was pretty solid that safety standards in the workplace are considerably higher than at home. Even relative to other industries performance was good.
This is only opinion, but I thought:
1) This was a good state to be in
2) It was brought on by the ignorance of lay people and politicians who didn't understand relative risk
3) The situation reflected our political system functioning well without being very logical
It drives me crazy that we hold nuclear to a higher standard again even though the cost/benefit ratio appears to be better than many other industrial and non-industrial activities (eg, driving cars, using coal). People will cheerfully say 'but nuclear ... causes X', but rarely compare that with the benefits that were bought for X.
As an aside: Back in the day, I worked for a nuclear sciences consultancy whose hostnames for machines were Simpsons' characters. They strove to make control systems fail-safe with extensive planning, to make processes as safe as possible, and to make simulations as accurate as computing power allowed (30 million lines of Fortran worth)... it just takes one nitwit thinking "it's okay" or "they 'know' better" to screw up the industry's reputation even more than prevailing NIMBYism.
>I worked for a nuclear sciences consultancy whose hostnames for machines were Simpsons' characters.
I once tried giving hosts Arabic names. It actually worked pretty well since the naming system lets you express inheritance and works better when spoken than "proxy X on testnet Y" or "proxyX dot testnetY" (especially when not everyone speaks English as their first language and some people may be joining the call from a cell phone in a car).
Well the logic certainly isn't supported by data. Most homeowners have no problems allowing cars to drive around their neighborhoods, despite the fact that cars have dangerous emissions that kill thousands yearly, not to mention inattentive drivers who cause thousands of wrecks annually.
That's because it's their own cars. The same people will complain about a burn ban during a dry summer, because it's their own yard waste they want to burn. But you want to put a developmental group home in the neighborhood and they freak out. There's no sense to it, just egoism.
Is this dangerous? Genuine question, it sounds like I should also be appalled but I don't know anything about plutonium.
Edit: Answered my own question:
> keeping bits of plutonium far apart is one of the bedrock rules that those working on the nuclear arsenal are supposed to follow to prevent workplace accidents. It’s Physics 101 for nuclear scientists, but has sometimes been ignored at Los Alamos
Yep! The mathematics is pretty simple and recurs in a bunch of other places, from the theory of evolution by natural selection, to the question of why things go viral on the Internet.
The basic idea is that these sorts of objects are routinely seeing spontaneous nuclear decays -- there might be billions per millisecond of these. What's important is that each nuclear decay is an explosion which releases debris that can cause other nearby nuclei to decay as a result. So it becomes important to know: what is the average number N of nuclei which fall apart as a direct result of getting hit by the shrapnel of one decaying nucleus?
Since nothing is terribly exact in physics there are two regimes to consider, N < 1 and N > 1. In that first regime we can roughly calculate that we need to multiply this baseline billions-per-millisecond rate of spontaneous decays by the number
1 + N + N² + N³ + ... = 1/(1 - N)
The virality analogy is "Given one person shares it, what is the average number N of their friends who share it?" As you can see as this gets closer to N=1 it goes towards infinity.
For N > 1 the series no longer converges: instead, each successive generation takes a small additional time interval, so the number of events grows like e^(k t) for some k, an exponential growth toward a majority of the sample reacting.
For fissile materials like plutonium, an easy way to increase N is to just bring two plutonium rods closer together: free debris from explosions in the one now cause new explosions in the other. This is called forming a "critical mass" hence the language about "criticality". Another way is to bring in these "neutron reflectors" that reflect the debris back into the same sample, it's basically the same principle.
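That subcritical multiplier is easy to sanity-check numerically. Here's a minimal sketch in plain Python, where N is just the abstract branching number from the explanation above (this is an illustration of the series, not a physical calculation):

```python
# Branching-process view of criticality: N is the average number of
# new fissions triggered by the debris of one fission.

def total_multiplier(N, generations=5000):
    """Finite sum 1 + N + N^2 + ... over many generations."""
    total, term = 0.0, 1.0
    for _ in range(generations):
        total += term
        term *= N
    return total

# Subcritical (N < 1): the sum converges to the closed form 1/(1 - N),
# and it blows up as N approaches 1.
for N in (0.5, 0.9, 0.99):
    print(f"N={N}: summed={total_multiplier(N):.4f}  closed-form={1/(1-N):.4f}")

# Supercritical (N > 1): no convergence; each generation multiplies the
# event count by N, so after g generations one decay has spawned ~N**g events.
N, g = 1.1, 100
print(f"N={N}: generation {g} size ~ {N**g:.3e}")
```

The same toy model covers the virality analogy: N < 1 gives a finite cascade, N just over 1 gives exponential takeoff.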
If the critical mass of Pu-239 is 11 kg, can you go below that, say to 5 kg, by reflecting the neutrons? I'm just wondering how far below that 11 kg you can go.
Is there a static substance that reflects loose neutrons? I thought that the choices for a substance were either to let the neutron pass through, or to absorb the neutron
If you're already working with plutonium, whose dust is also very, very toxic (in addition to being radioactive), the additional danger of beryllium doesn't seem so great in comparison.
I routinely handled beryllium for a couple of years as it is present in certain electronic RF power amplifiers. It’s perfectly safe in solid form. If you drill it or smash it, a different matter!
I knew a software engineer who made a lot of money in a startup and bought himself an ultralight bike of titanium and beryllium. They seized the beryllium parts at the Canadian border and I don't think he ever managed to get them released to him.
I think only about 2% of the population is sensitive to beryllium. I was in a beryllium monitoring program (yearly blood tests) at work. A few coworkers are sensitive, one getting really sick.
It's the most dangerous thing that can happen at a nuclear weapons facility. Too much plutonium in one place and it starts going critical. It doesn't explode, but lots of radiation comes out suddenly and people die of radiation poisoning. People have died from such incidents in the past.
And at the time, Los Alamos was essentially the only place in the country -- probably the world -- doing that kind of research. So it's not really surprising that both incidents occurred there.
> On May 21, 1946, physicist Louis Slotin and seven other Los Alamos personnel were in a Los Alamos laboratory conducting another experiment to verify the exact point at which a subcritical mass (core) of fissile material could be made critical by the positioning of neutron reflectors.
> It required the operator to place two half-spheres of beryllium (a neutron reflector) around the core to be tested and manually lower the top reflector over the core via a thumb hole on the top. As the reflectors were manually moved closer and farther away from each other, scintillation counters measured the relative activity from the core. Allowing them to close completely could result in the instantaneous formation of a critical mass and a lethal power excursion
> Under Slotin's unapproved protocol, the only thing preventing this was the blade of a standard straight screwdriver, manipulated by the scientist's other hand. Slotin, who was given to bravado, became the local expert, performing the test on almost a dozen occasions, often in his trademark blue jeans and cowboy boots, in front of a roomful of observers. Enrico Fermi reportedly told Slotin and others they would be "dead within a year" if they continued performing it. Scientists referred to this flirting with the possibility of a nuclear chain reaction as "tickling the dragon's tail", based on a remark by physicist Richard Feynman, who compared the experiments to "tickling the tail of a sleeping dragon".
An aside: It's this kind of shit that scares the crap out of me with regards to nuclear energy. Some - literal - cowboy completely disregards the rules, endangers himself and his colleagues, and then someone gets injured or killed. All for the benefit of showing off.
We should do better than this, but incidents like this give me little faith.
Unsurprisingly, this experiment killed Slotin:
> On the day of the accident, Slotin's screwdriver slipped outward a fraction of an inch while he was lowering the top reflector, allowing the reflector to fall into place around the core.
> [Slotin] received a lethal dose of 1,000 rad (10 Gy) neutron and 114 rad (1.14 Gy) gamma radiation in under a second and died nine days later from acute radiation poisoning.
The picture of this is pretty indicative of the culture there. The room is a mess, they are handling the material with bare hands, and there's an empty glass coke bottle next to the apparatus. Given how toxic beryllium dust is, handling it with bare hands, with unprotected faces, and with a drink beside it all seems to be asking for inhalation or ingestion. As for Slotin and the screwdriver, it beggars belief; why didn't they build a mechanism to raise and lower the hemisphere from a safe distance and within a shielded container? A simple hinge and a wire with a counterweight would have allowed it to be carefully lowered and to "fail safe". It all seems a bit messy and haphazard, rather than the clean and organised environment we might hope such activities take place in.
When I look at the steps I used to go through multiple times a day when working in Cat 2 and Cat 3 biological containment facilities to work with cells and pathogens, including double door airlocks under negative pressure, safety hoods, protective gear, and stringent aseptic practice, LANL seems quite lax in its practices. We all watched ourselves and each other for bad technique and careless infractions to maintain that discipline without the need for dedicated separate safety inspectors (though periodic inspections did occur). That good practice was also tied into the preservation of self and others when working with dangerous stuff; no one wants infection by some horrible pathogen. It seems quite bad to be working with even more lethally hazardous materials with basically zero protection, and the attitude of the management towards safety, both in 2011 and at the present day, seems to be equally cavalier.
In all the academic and industrial environments I've worked in, we had rigorous inventory and tracking of all dangerous materials (biological and radioactive), so it seems odd that LANL fails so badly here. This was the first casting in four years, and they immediately failed: why wasn't there preparation and planning for moving the material before the casting even started? Is Pu randomly stored around the place? Is there no oversight at all? In the lab I currently work in, we have to track every last trace of radioactivity (mainly P and N for biological labelling, I think; I'm not involved directly) and account for it in statutory reporting every quarter, and if you failed to do it properly there would be a massive investigation and you would be banned from working with it; the lab managers make a big deal of it, and rightly so.
It's a little ironic that safety has been compromised in order to meet production targets, but this resulted in a complete shutdown. Had they worked safely and sensibly and avoided the shutdowns, they would have been vastly more productive overall, even if this was slower than the management would have liked. I've seen this pattern several times now in multiple places, from factories to research laboratories to software development. It all comes down to unrealistic management goals from the top which dictate working at a fast pace with attendant quality and safety problems, no matter that a better end result could be realised by working at a slower pace with a little more care and thought. We see the same problem every time software design or implementation is compromised by a tight deadline with no scope for doing it the right way for the longer term, purely to meet some unimportant (in the greater scheme) short-term deliverable.
The lab was created as part of the nation's balls-to-the-wall effort to build a nuclear bomb with which to smite its enemies in war. I imagine that safety was only a concern to the extent that incidents would slow them down. Otherwise, with thousands of people dying each day in the war, it's worth the risk to people at the labs if they can end the war faster.
Slotin got himself killed after the war was over, but it was less than a year after, and I imagine that sort of "get it done at any cost" culture does not change overnight.
I wonder if that attitude is in fact the root of the trouble today. Especially since I bet that the "save the nation and damn the risks" attitude came back for at least a decade or two once the Cold War ramped up.
I think you may well be correct in your assessment. Culture doesn't change overnight, and it's also inherited so may well have persisted to the present day. It also doesn't change without external pressure, and only reluctantly even then.
I've seen different cultures in various labs I've worked in or had contact with, and it can vary wildly from being extremely disciplined to extremely sloppy and both practices can be picked up by new people from the individuals concerned.
It was theorized at the time. Much of what we know about nuclear safety was learned from accidents like this. Slotin was doing something excessively dangerous and downright stupid even for the time, but let's still be careful to judge him only by the standards of the day.
Lead is a high-Z material, which provides a lot of electrons to interact with photons (and charged particles). It’s not good for stopping the neutrons that pose the danger here, because of the disparity in mass between the shield nuclei and the neutrons. For storing fissile materials in proximity, the optimal material is something with a high absorption cross section, like boron. Water and plastic are good at removing energy from neutrons and preventing leakage in typical large shielding scenarios, but in this case they would increase the danger, because they thermalize neutrons efficiently (as does a workers body) and this increases fission interaction probabilities and reduces the margin to criticality. For spent nuclear fuel, something like a boron-carbide plate is used. LANL may use something similar.
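To put rough numbers on why lead is a poor neutron absorber: beam transmission through a slab goes like exp(-n·σ·x). The cross sections and densities below are approximate textbook values for thermal neutrons, plugged in purely for illustration (nothing here is sourced from LANL practice):

```python
import math

BARN = 1e-24      # cm^2
AVOGADRO = 6.022e23

def transmission(sigma_barns, n_per_cm3, thickness_cm):
    """Fraction of a thermal-neutron beam transmitted: exp(-n * sigma * x)."""
    return math.exp(-n_per_cm3 * sigma_barns * BARN * thickness_cm)

# Approximate thermal (~0.025 eV) absorption cross sections:
#   natural boron ~ 760 barns; lead ~ 0.17 barns.
n_boron = 2.37 / 10.81 * AVOGADRO   # atoms/cm^3, solid boron (~2.37 g/cm^3)
n_lead = 11.34 / 207.2 * AVOGADRO   # atoms/cm^3, lead

for name, sigma, n in [("boron", 760.0, n_boron), ("lead", 0.17, n_lead)]:
    frac = transmission(sigma, n, 1.0)
    print(f"1 cm of {name}: {frac:.3e} of thermal neutrons transmitted")
```

With these figures a centimetre of boron absorbs essentially everything, while a centimetre of lead lets nearly all thermal neutrons straight through, which is the point made above.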
Indeed. The 2011 event was made worse by a worker using their hand to separate plutonium rods, which effectively placed a water barrier between them, making the likelihood of criticality higher. The link below describes the incident and how absurdly counterintuitive criticality safety can be.
By itself, per se, no. But I imagine when dealing with weapons grade plutonium you want belts and suspenders turned to eleven. So such breaches of protocols are indeed serious business.
It would make a great Snapchat location filter. The location of nuclear power stations, waste storage facilities and laboratories is basically static and well known. Those can be a little grainy. Three Mile Island, Fukushima and Chernobyl can be increasingly more grainy than the baseline.
I'm sure human stupidity will go away if graciously wrapped with incident statistics and threats of repercussions. Or at least it will become unmeasurable, which, for incident statistics, comes down on the same side as safety.
Add up those dumb things and you still get less health risk or property damage than you get from coal dust or oil spills or gas explosions, which is why they correctly say nuclear is safer.
"Virtually all of the Los Alamos engineers tasked with keeping workers safe from criticality incidents decided to quit, having become frustrated by the sloppy work demonstrated by the 2011 event and what they considered the lab management’s callousness about nuclear risks and its desire to put its own profits above safety."
This is key. It's hard to find criticality safety engineers, and their absence (as well as management not giving a damn) was probably a key factor in the 2016 incident.
Having the wrong management is almost worse than having unqualified engineers.
I worked in a startup where they had put the Systems Operations (as in, Linux datacenter guys) under the Chief Marketing Officer for about a year for (reasons).
Three years later, we are STILL cleaning up after that mess. Wrong management creates problems that aren't so much a result of ineffective management... wrong management seems to always move resources away from things that most engineers would call "normal and customary processes" (like patching software or updating libraries that software is dependent upon) and that's how you end up with Equifax. Or a smoking hole in the desert somewhere in New Mexico.
Indeed! There's mounting evidence that Equifax simply didn't "get" application development in addition to operations and patching. I doubt the developers of their vulnerable mobile app were responsible for patching. As such, we've seen security issues which span the company where it's likely the mobile app developers not only didn't report to the same management chain as those in operations, but likely reported to entirely different organizations. This of course points to management issues all the way to the top of the company.
That's a great point that is often missed here. You can get a lot done with barely qualified individual contributors if there is enough process in place and the people calling the shots understand what they are doing. It's not fun, but possible.
Unqualified people making bad decisions are always fatal.
You can make that exact same argument about security in software development and operations. If management doesn't think it is important it will go downhill pretty quick.
How much profit have Lockheed and other defense contractors made over the past hundred years? I'd expect that it's a significant percentage of defense spending in that time.
Looking at the ID on the one bar, I'm kind of surprised that they'd allow 8s and Bs to be valid characters. Seems dangerous if two rods get mixed up or misidentified.
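This confusable-character concern is exactly why some ID schemes (e.g. Crockford's Base32, which drops I, L, O and U) restrict the alphabet. A hypothetical sketch of a rod-ID generator that excludes lookalike pairs like 8/B and adds a check character; the scheme, alphabet and function names here are all made up for illustration:

```python
import secrets

# Hypothetical rod-ID scheme: drop visually confusable characters
# (0/O, 1/I/L, 8/B, 5/S, 2/Z) and append a checksum character so a
# single misread is detected rather than silently accepted.
ALPHABET = "34679ACDEFGHJKMNPQRTUVWXY"

def make_id(length=6):
    """Random ID body plus one check character (sum of indices mod 25)."""
    body = "".join(secrets.choice(ALPHABET) for _ in range(length))
    check = ALPHABET[sum(ALPHABET.index(c) for c in body) % len(ALPHABET)]
    return body + check

def is_valid(rod_id):
    """Re-derive the check character; any single substitution breaks it."""
    body, check = rod_id[:-1], rod_id[-1]
    if not all(c in ALPHABET for c in rod_id):
        return False
    return ALPHABET[sum(ALPHABET.index(c) for c in body) % len(ALPHABET)] == check

rid = make_id()
print(rid, is_valid(rid))
```

A single misread character always shifts the index sum by less than the alphabet size, so the check character no longer matches, which is the property you'd want before two rods can be mixed up on paper.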
"The exact cost to taxpayers of idling the facility is unclear, but an internal Los Alamos report estimated in 2013 that shutting down the lab where such work is conducted costs the government as much as $1.36 million a day in lost productivity."
> And most remarkably, Los Alamos’s managers still have not figured out a way to fully meet the most elemental nuclear safety standards. When the Energy Department on Feb. 1 released its annual report card reviewing criticality risks at each of its 24 nuclear sites, ranging from research reactors to weapon labs, Los Alamos singularly did “not meet expectations.”
That sounds almost suspicious. It's hard to actually articulate that suspicion, but nuclear material is involved here.
From earlier paragraphs:
> In 2013, [officials in Washington] worked with the lab director to shut down its plutonium handling operations so the workforce could be retrained to meet modern safety standards.
> Those efforts never fully succeeded, however, and so what was anticipated as a brief work stoppage has turned into a nearly four-year shutdown of portions of the huge laboratory building where the plutonium work is located, known as PF-4.
What...?
My first thought is "front for clandestine operation." How, or what, I don't know; I don't even know if something like that would be viable. If a TLA wants to play with nuclear stuff, couldn't they make their own base, or would they need an existing one? And if they did need to use these kinds of facilities, surely they'd be able to keep the media out.
Okay maybe I'm wrong. I'll leave this comment here as a suggestion that maybe this line of thinking isn't correct after all. (I've been figuring this out as I've typed.)
The national labs in general almost certainly do plenty of work for clandestine organizations already. I doubt they need the excuses related to mishandling of dangerous material. They just never discuss those things with the press.
> "What's in section PF-3?"
> "That's classified, I can't discuss it."
> "What's in section PF-4?"
> "That's where we used to work on plutonium for DoE and IAEA but we had a pattern of problems there and it was shut down."
I used to work there right after finishing grad school. It was just when the systematic dismantling of that organization had started taking place. New bosses, new umbrella organizations, new rules (for profit vs non-profit).
Within 12 months of me starting there everyone more senior than me had scattered elsewhere. Soon I was gone too. This happened at every level and in every division. The amount of know-how lost just to forced retirement is incalculable...
So sad to hear things are not getting better there. I was very proud to be a part of it for even a little amount of time.
I had a similar experience at another national lab. It was a dream job up until a few months in. That was when higher positions were swapped out with new people seemingly overnight whose philosophies were completely different from what the lab had subsisted on. In my department, it was as if anyone who protested anything would be found packing their things the next day. It was insane.
I left in less than a year for another job. Part of me wishes I would have been able to pull through, but part of me thinks I wouldn't have had a job in a few months. There's no way to really know. Going there for the first time was a genuinely magical experience. I'll never forget it.
At the end of the day, change is good. But more consideration needs to be put into the complexity you find at national labs.
I heard great things about LANL during my job. I hope things do improve over there.
I think that the sub-contractor culture is causing these problems. You see, the major labs are subbed out to the lowest bidder. These are the plant managers. They are under pressure to produce pits for the country's nuclear arsenal refresh. They have not produced anything for 5 years. Hiring skilled technical workers like molders and machinists is hard these days, as vo-tech died off in a lot of areas. Finding a knowledgeable temp is next to impossible. That is essentially what these workers are. Temps.
In the old days the labs were run by stoic multinationals like DuPont; now we have fly-by-night bidders like Dyncorp and Fluor running our crown jewels.
Race to the bottom if you like.
A rather strange process for something so important.
Most of the US labs were managed by universities back "in the old days". LANL and LLNL were both originally operated by the University of California, with Bechtel joining the fray when UC was forced to compete for the management contract.
Please cite the source that backs up your claim that the workers are (essentially) temps. What you wrote sounds plausible but if the workers actually have secure employment your entire theory falls apart.
I don't know the details of LANL or any other nuclear related organizations, but I am a NASA contractor. We have contractors and civil servants working side by side on the same research projects. The primary reason contractors are used at NASA is that their employment is always directly tied to a specific project, and they will be out of a job if that project ends. That isn't the case for a civil servant, who basically has a job for life once hired. We aren't exactly temps, but we are very much at risk of being fired any time budgets are cut or reorganized.
I was also a NASA contractor for many years. My employer was the prime contractor but I worked with many subcontractors on my teams as well. Some of them had very stable careers that predated and outlasted mine, since work was available to them whenever they wanted due to their network of personal connections with civil servants. On paper, the civil servant was supposed to ask the prime, through the contracting officer, for a body with a certain skill set, and we provided one as we saw fit, but that's never how it worked in any reality except on paper. The government employee would always make an informal request, get assurance from the contractor staff about who would be provided, usually someone they had prior experience with or had interviewed, and then do the paperwork. When I was interviewed it was by my contractor manager and contractor team, but the civil servant was sitting in the room. I guarantee you he had informal veto authority. As prime you would never risk a customer relationship by doing otherwise; that would be a surefire way to lose the contract next time around.
In budget planning contractors disappear at the end of a project, which is very useful for budget planning purposes. But in most cases it’s not like those contractors are suddenly without a job, except in rare cases like shuttle workers. Rather the agency has fixed and flat budget, so when one winds down a new project starts up. And it is no surprise that in nearly all cases the skill set allocation of new positions roughly equals those going away. And the same people fill those roles, generally. (Again, the shut down of shuttle and forced retirement of many workers is the exception to this.)
During my time there was a competitive rebid of the contract had. We won, but during the process the competition approached many of us on the contract individually, letting us know what sort of compensation they would be willing to pay for us to jump ship if they won, to return to the same job wearing a different patch on the sleeve. My civil servant “boss” (air quotes because he wasn’t my manager on the org chart, but in day to day reality he effectively was) pulled me aside to let me know that no matter who won I’d still have a job. It was actually the same thing my employer did when they first won the contract from someone else. I wasn’t there at the time, but many of my coworkers were grandfathered in that way.
While what you are saying is correct, it might paint the wrong picture for someone not having our shared experience.
Nuclear warhead security has always been problematic, not just the facility producing the materials.
Watch [1], because reading the rest just makes everyone dizzy. Our launch protocol relies on very outdated technology. Not that there is anything wrong with old technology in general (as long as the protocol is well-established)...
We don't even want to bring up the recent unfortunate Navy crashes... we ought to really step up.
I toured a capsule decades ago, in the ground below an air force base. The tech was very dated, however that also means it'd had plenty of time to be "debugged" shall we say. No uncertainty as to whether it works right.
Plus, being so dated, far less likely to get hacked. Might've been a joke.
Yeah. I think old technology works. Compare a human soldier vs a robot: sure, robots can shoot better if we get to that stage, but you can't hack a human... or you can use a frying pan to kill a sniper if you happen to sneak up on him... (PUBG insight)
Bad things are going to happen in any organization. The key is not who you fire or blame, it's how you change your processes so that no similar accident can happen again. Make it impossible that this could reoccur.
An example of this might be something like "all plutonium rods are individually stored in locked lead boxes". (I'm no nuclear physicist, this might be a terrible idea, but you get the point).
It sounds to me like the leadership at this facility don't understand that. These same things keep happening. They don't change their processes enough to prevent it.
That's coming. Nuclear generation will only shrink from now on. In the solar/wind/natural gas context, the public refuses to pay more for nuclear. With decreasing need for fuel as existing power plants succumb to age and poor management, eventually these labs will be seen as the hideous dangerous boondoggles they have always been.
I'm anti-nuclear in Taiwan, mostly due to the earthquake risk. I'm apathetic about nuclear in Switzerland.
From reading the ScienceMag article that nerdy linked to, I learned that PF-4 has been unable to make new weapons for 4 years due to their safety staff quitting. I joined some anti-nuclear protests before, but those protests aren't targeting any individuals.
Meanwhile my best-case scenario (shutdown of weapons production) is happening, because the staff quit due to bad management.
I'm curious about the reasoning behind an anti-nuclear power position.
We've seen just about the worst that can happen: TMI, Chernobyl and Fukushima. All three were due to human error: mismanagement of the system and/or mismanagement of the mitigation afterwards. These were terrible things nobody would want in their town, but they are manageable and bounded.
Compared to the global disaster in progress, humans burning things, a nuclear plant mishap seems preferable, no? Globally, we're facing a mass extinction, more intense weather, and loss of low-elevation islands and cities everywhere. What's worse is that the effect is not bounded; it will continue to get worse for centuries as we burn more stuff, for the whole planet.
"terrible things nobody would want in their town".
Where the natural risk is higher (e.g. Fukushima), my answer is "just say no". Where the risk is manageable (deserts, a bunker under the Alps), I don't mind so much.
Nuclear is better than coal or oil, by a long way. Renewables would be great, but only hydro has a real chance to be competitive at this stage. So maybe nuclear is needed in some less-populated, geologically stable regions. I trust location more than any human safety protocol, because of the reasons in the article.
I think part of the problem with nuclear is that the consequences are pretty clear. Burning fossil fuels puts a lot of particulates into the air that overall kill some people, but we don't really think about it. It would be interesting to see what type of power was responsible for the most deaths per kWh.
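Rough estimates of this do exist. The per-TWh figures below follow widely cited analyses (e.g. Markandya & Wilkinson and the Our World in Data summary of them); treat them as orders of magnitude, not authoritative numbers:

```python
# Illustrative mortality rates per unit of electricity generated
# (approximate figures from widely cited analyses; rough orders of
# magnitude only, including air pollution and accidents).
deaths_per_twh = {
    "coal": 24.6,
    "oil": 18.4,
    "natural gas": 2.8,
    "hydro": 1.3,
    "wind": 0.04,
    "nuclear": 0.03,
    "solar": 0.02,
}

# 1 TWh = 1e9 kWh, so divide by 1e9 for a per-kWh rate.
for source, rate in sorted(deaths_per_twh.items(), key=lambda kv: -kv[1]):
    print(f"{source:12s} {rate:6.2f} deaths/TWh  (~{rate * 1e-9:.1e} deaths/kWh)")
```

On these figures coal is roughly three orders of magnitude deadlier per kWh than nuclear, which is the comparison the comment above is asking for.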
The climate related deaths will rise, but humanity is having a hard time thinking long term about it. Just this fall, Irma, Harvey, and Jose took a number of lives and many billions in cleanup. Weather deaths per year might be a useful statistic to follow, including heat and cold deaths.
It’s pretty easy to design nuclear reactors that are earthquake safe. I would think that an island nation like Taiwan with unreliable international relations would benefit from a locally supplied power source like nuclear.
This is a world of trade-offs. It is easy to point to the bad in one approach, but you must also evaluate the bad of the alternatives.
The mass exodus of safety engineers as described in a Science article makes me think that there's a parallel between the concept of exit, voice, loyalty and the concept of brain-drain, which I see as a potential Gresham's Law mechanism.
The staff effectively said "if our work is given so little heed here, we'll go elsewhere where it is" -- a classic brain-drain mechanism (and a frequent response to a declining firm or corporate culture).
That's also effectively a mechanism of the generalised concept of Gresham's Law -- applied not only to money, but to any quality valued (or costed) differentially, whether in one or multiple markets. A parallel that dates to the earliest descriptions of the phenomenon -- Greek playwright Aristophanes in "The Frogs" describes the behaviour as common to both coin and politicians, an observation repeated by American journalist H.L. Mencken in the 20th century, see his "Bayard vs. Lionheart".
A high-quality (and high-cost) team saw low professional rewards at Los Alamos, and decamped for greener pastures. Brain drain has various causes, not just compensation: if the working conditions or rewards in one location are discouraging, or there is outright oppression, the talent will generally go elsewhere.
In WWII, much scientific (and other) talent, including much of that which developed the U.S. nuclear programme, fled Nazi-occupied Europe. From the 1950s through the 1970s, and to an extent still, black artistic, musical, and business talent has left the U.S. for Europe, for much the same reasons: to escape oppression, and to seek greater opportunities.
(Talent flow between subnational regions, industries, academia and business, etc., follows similar patterns.)
So LANL is making weapons now. That used to be done at Pantex. Why is LANL doing this?
The US forgot how to make nuclear weapons. It's been decades since the US built one from scratch. For about 20 years, the US lost the capability to make H-bombs. There's some special material required, and the 1950s factory to make it had worn out. An attempt was made to make it by a cheaper new process, and that didn't work. A plant using the old process wasn't funded for decades.
The US has way too many old nuclear weapons, and has been overhauling them during this period. There's no shortage. The fissionable parts don't wear out, but the tritium has to be replaced every decade or so.
Somewhat related, if you're interested in seeing what could have happened had the samples gone supercritical [1]. It's eerie to read about; it sounds like science fiction, and it's terrifying to think that some of the smartest people in the world made mistakes that could have been avoided by proper procedures.
It is. There's a huge shortage of workers in the weapons research field because of the old experienced guys retiring and the difficulty of training new employees due to today's more stringent restrictions.
Also the stereotypical pay issue. They'll only pay average or lower, despite the annoyances of the job putting it in the top 1% or so of jobs (drug tests, security clearance, unbelievable endless bureaucracy, weird job reqs, etc.).
There are cultural issues too. I remember reading something about "mid-career" mech engineers having five years' experience, which is great if you graduated at age 55 or so, not so good if you graduated at 22 with loans that will take more than 10 years to pay off. Meanwhile, there's a handful of "lifer" boomers who clog up the promotion pipeline until death, at which point the talent pool is empty between the ages of perhaps 30 and 60.
It's a poor-conditions, underpaid, temp job, more or less. Insert surprise that people would rather work anywhere else.
The article doesn’t give the impression that modern standards are higher, but it may be the company culture that is the problem rather than the rules. Having mass resignations is hardly a sign of a good company culture.
It must be hard to keep track of a bunch of pieces of plutonium not ever getting near other pieces. Not something people are used to doing in normal life. Some kind of radio tag that beeps when in proximity to other tags might be a good safety measure.
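As a toy illustration of the tag idea (assuming, optimistically, that each tag can report an approximate position, which real RF beacons often can't do precisely), a controller would just check pairwise distances and alarm on any pair closer than a minimum separation; all the names and the 3 m threshold here are made up:

```python
import itertools
import math

# Hypothetical minimum separation between tagged items, in metres.
MIN_SEPARATION_M = 3.0

def too_close(tags, min_sep=MIN_SEPARATION_M):
    """Return pairs of tag IDs whose reported positions are closer than min_sep.

    `tags` maps a tag ID to an (x, y) position in metres.
    """
    alarms = []
    for (id_a, pos_a), (id_b, pos_b) in itertools.combinations(tags.items(), 2):
        if math.dist(pos_a, pos_b) < min_sep:
            alarms.append((id_a, id_b))
    return alarms

# Example: two rods 1 m apart trigger an alarm; the third is far away.
readings = {"rod-1": (0.0, 0.0), "rod-2": (1.0, 0.0), "rod-3": (10.0, 0.0)}
print(too_close(readings))  # [('rod-1', 'rod-2')]
```

Of course, proximity alone is a crude proxy; real criticality safety also depends on geometry, moderators, and shielding, so a beeping tag could only ever be a backstop, not a substitute for procedure.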
"Criticality Safety Event" isnt the same as a criticality event. These are violations of safety protocols, not nuclear physics run loose. Bad, but not horrific considering the number of industrial accidents happening daily. There was no risk of explosion, only localized radiation. Id be far more worried about the dozens of chemical plants one flood away from exploding lethal clouds into cities.
No, it is horrific. The reason we have safety protocols is to avoid hazardous situations which cause harm. Nothing happened in this case, but that's only because of luck. Safety protocols are about reducing risk (the probability of an accident) to minimal levels, and to violate them is to invite disaster because it significantly increases the probability. They are about taking as much human error out of the equation by formalising safe working practices. On the surface, it might not appear to be a "big deal" because everything was OK this time, but actually it means that safety is not being treated seriously and that it's only a matter of time before that carelessness causes a serious accident. That needs nipping in the bud and sorting out at every level of the organisation.
If you've ever worked in an industrial or laboratory setting, you'll be familiar with risk assessments, hazard levels, and the attendant working practices that accompany them. Sometimes it's taken to bureaucratic extremes, but it's always important to follow them strictly, because as soon as you start ignoring them and taking shortcuts, you're no longer working safely and you're endangering yourself and your co-workers. That's complacency, and it's a bad place to be.

I've seen a co-worker grow complacent about biohazards, and they ended up in hospital with a nasty tropical parasite, all because after several years of strict discipline they grew casual about a danger that's invisible and got sloppy in a well-practised routine (I assume; even they don't know exactly how it happened, but it was almost certainly due to sloppy working practice). The same applies here: this is very dangerous stuff, but it looks innocuous, and working with it daily leads to trivialising the danger and working unsafely.

In a well-managed environment, this should be picked up quickly by co-workers and inspections. Where I've worked, any violation meant a formal report up to line managers and lab managers, with appropriate disciplinary action, and I have made such reports when I encountered violations, for the safety of all of us. Safety culture needs to be ingrained so that it's second nature.
One angle to look at this from is that "material transfer" is actually the primary activity taking place in most industrial and laboratory settings. When I worked in a food/drink industry lab, the logistics of the whole multi-stage process, from input raw materials, processing and production, to packaging materials, packaging, warehousing and distribution, were all carefully planned and controlled (by an AS/400). When I worked in pharma, all the compounds were in a central robotic compound library, everything checked in or out was controlled, and all operations in the laboratories were automated and controlled.

What I'm trying to say is that the logistics of material handling, inventory and transfer have been solved in many industries for decades. There's no question of where inventory lies, because it's all recorded from start to finish. You don't have a hold-up because the warehouse is full, you run out of packaging materials, or there are no empty tanks to fill from the previous step in the pipeline, because you have a total view of the process and can plan all the logistics to manage the process optimally. Random materials are just not lying around to clog up the process. There's a managed process with careful oversight and record keeping, and a safety culture ingrained into all workers and management from the start. LANL seems to be very backward in these respects.
Natural uranium is pretty unremarkable. The greatest danger from it is heavy metal poisoning. It's also common enough that restricting its availability would be rather difficult.
Strictly speaking, if you got enough of it and were sufficiently determined you could create an improvised nuclear reactor with it, something along the lines of a carbon-moderated design. But the risk from that would be fairly minimal apart from some local radioactive contamination.
It's just the ore. Not purified, and not enriched for a specific isotope. So its radioactivity is vastly less, and it's really just a novelty and for playing with your geiger counter.
They moved too much plutonium into the same room, risking a nuclear chain reaction.
Perhaps they should use software to track the movement of materials, which would forbid a move from taking place when the end result would be dangerous?
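A minimal sketch of that idea, with made-up room names and per-room mass limits (real criticality limits depend on geometry, moderation, and isotopic composition, not just total mass): the tracking system simply refuses to record any move that would push a room over its limit.

```python
# Hypothetical per-room limits in grams. Real criticality limits depend on
# geometry, moderation, and isotopics, not just total mass; this is a sketch.
ROOM_LIMIT_G = {"lab-1": 500.0, "vault": 2000.0}

class MoveRejected(Exception):
    pass

class Inventory:
    def __init__(self):
        self.rooms = {room: {} for room in ROOM_LIMIT_G}

    def room_total(self, room):
        return sum(self.rooms[room].values())

    def move(self, item_id, mass_g, src, dst):
        """Record a move only if the destination stays under its limit."""
        if self.room_total(dst) + mass_g > ROOM_LIMIT_G[dst]:
            raise MoveRejected(
                f"moving {item_id} ({mass_g} g) would put {dst} over "
                f"its {ROOM_LIMIT_G[dst]} g limit")
        if src is not None:
            del self.rooms[src][item_id]   # remove from the source room
        self.rooms[dst][item_id] = mass_g  # book into the destination room

inv = Inventory()
inv.move("rod-1", 300.0, None, "lab-1")       # accepted: 300 g < 500 g
try:
    inv.move("rod-2", 300.0, None, "lab-1")   # rejected: would total 600 g
except MoveRejected as e:
    print("blocked:", e)
```

The hard part, as the LANL story shows, isn't the bookkeeping; it's making the physical process actually stop when the software says no.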
Indeed, all problems in the world can be solved by SV software startups staffed by hipster brogrammers. The only real difficulty is which JS frontend framework to choose.
For nuclear weapons material? React (now with the MIT license it's a lot less dangerous, because you can sue Facebook if they start to build an ad-supported missile infringing your IP).
OT, but I've seen this phrase several times, and I have no idea what it means. The two component words seem contradictory, like "preppie burnouts" or "considerate bullies". Is this irony? If so, what are we attempting to indicate about firms staffed by these people? Simply that they employ multiple groups who annoy us?
"[those who] reject the culturally-ignorant attitudes of mainstream consumers ..."
Brogrammers:
"[those who tend] to mimic stereotypical "jock", "bro", or "cool" culture in combination with the egotism, insensitivity, and terrible humor of "nerd" culture"
Combine the two and you get:
Narcissistic jock bros that are also egotistical insensitive nerds
My first thought when reading the article was I could have easily instituted operational change preventing future criticality incidents with maybe 2 days max of JS coding. Hackathon anyone?
Yes, software is a tool, nothing more, and unless it's part of a process which is enforced, it's worthless. The places I've worked at were very strict about it: no accurate data entry, no progress. And they also had physical cards to match, to ensure entry of the written details before material got handed over.

In this specific case, you'd likely want a card to physically accompany every bit of material, so that you have a physical token to document and hand over responsibility, but that also has to be part of an enforced process. It's less easy to ignore or forget a token than to glance over a screen and miss something of significance. If you store the cards on the wall when the material enters a room, they're also a quick visual indicator of the quantity of material in the room, and if you only have space for a certain number, that acts as a simple physical restriction on how much material you can move in. Less sexy than computing technology, but for critically important processes, the movement and handover of physical tokens works very well.
In the factory I worked at, the physical card had to be physically handed over from department to department, and the information also entered into the computer, before the next processing stage could proceed. This was enforced; I was once part of a panic when some of the "tank cards" (the physical tokens representing the contents of storage tanks) went missing. They were found in a lab coat pocket after some frantic searching. Because the rules were enforced, this genuinely blocked the physical process, and any further delay would have meant an internal fine of £10,000, which was another incentive not to lose track of them!
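The double requirement described here (physical card handed over *and* details entered in the computer before the next stage may run) can be sketched as a simple gate; the class and stage names are purely illustrative:

```python
class StageBlocked(Exception):
    pass

class Batch:
    """A unit of work that needs both a physical card hand-over and a
    data-entry record before the next processing stage may start."""

    def __init__(self, batch_id):
        self.batch_id = batch_id
        self.card_handed_over = False
        self.data_entered = False

    def hand_over_card(self):
        self.card_handed_over = True

    def enter_data(self):
        self.data_entered = True

    def start_next_stage(self):
        # Enforce the rule: missing card or missing record stops the line.
        if not (self.card_handed_over and self.data_entered):
            raise StageBlocked(f"batch {self.batch_id}: card or data missing")
        return f"batch {self.batch_id} released to next stage"

b = Batch("tank-42")
b.enter_data()
try:
    b.start_next_stage()   # blocked: the card is still in a lab-coat pocket
except StageBlocked as e:
    print(e)
b.hand_over_card()
print(b.start_next_stage())
```

The point of the AND gate is redundancy: the computer record gives traceability, while the card gives a physical artefact that can't be glanced past on a screen.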
> Click the lock to give your browser permission to send you notifications then refresh the page.
I feel like browser manufacturers need to make "This website would like to embed itself into your browser and send you push notifications indefinitely" pop-up a little bit more scary if websites are getting this scummy.
A friend and I made www.timetaco.com, which gives you a countdown to an event you specify, and can send you a notification a few minutes earlier. I think that's one of the few legit use cases I've seen.
It's not about how close they were, it's how they ignore or don't understand simple safety rules that are supposed to prevent incredibly dangerous accidents.
These people aren't responsible enough to be working in the nuclear industry.
Here's a car analogy: take driving across an intersection. The hazard here is being involved in a collision, possibly fatally, by crossing traffic. The risk is the likelihood of a car colliding with us when we cross. We mitigate the risk by requiring the traffic to stop at give way lines and/or installing traffic lights. This is the safety protocol.
Would you drive across the intersection through a red light, in violation of the rules? Maybe the risk varies during the day as the traffic level varies, but whatever the time, there's always the chance of a collision if you go through a red light, but almost guaranteed safety if you stop at the red and wait for the green.
The same considerations apply to dangerous situations at work, such as this case. The hazard is severe, and the risk is high. You avoid the problem by following the rules and working safely. It doesn't matter that they "weren't even close to critical", the situation should never have had the possibility of occurring in the first place. You don't want to rely on a probably OK, you want absolute certainty or as close to that as you can get, otherwise you're playing the odds and it is only a matter of time until there's a serious accident. That's not how things work in serious settings. Just as you would (I hope) not take the wholly unnecessary risk of running a red light, you wouldn't take unnecessary risks with potentially critical masses of Pu.