The Future of Plutonium

On November 6, 1944, researchers at the Hanford Site in Washington first created weapons-grade plutonium, the radioactive element used less than a year later in Fat Man, the plutonium implosion-type atomic bomb dropped on Nagasaki, Japan, on August 9, 1945, in the final days of World War II.

The element was first discovered in 1941. Before then, it had existed only in chemists' imaginations. As the Science News Letter eloquently put it in August 1945, "the knowledge [of plutonium] had been about that which a man has of a woman whose beautiful face flashes by him as he looks from a train window into the windows of another train going in the opposite direction."

The description is fitting, if dated. Plutonium was incredibly hard to produce, and even once researchers figured out how to make it, producing enough to be useful was another matter. Researchers at the University of California, Berkeley, found only a speck of plutonium after bombarding a block of uranium with particles called deuterons in a particle accelerator. At the Hanford plant, each of the three reactors required at least one ton of uranium to produce just 225 grams of plutonium; by that ratio, the 6.2 kilograms in Fat Man's core implied more than 27 tons of uranium.

But eventually labs and facilities got better at it, and by the end of the Cold War reactors had churned out so much plutonium that the facilities ended up with surpluses. And as hard as plutonium was to produce, it's even harder to get rid of.

In the 1970s, the U.S. rejected proposals to let private companies reprocess nuclear waste into new fuel—radioactive elements produce heat during fission that can be converted into electricity—because of the risk that the material could be lost or diverted without government oversight. As a result, American plutonium is not reprocessed today. Stores of plutonium simply sit and wait.

Hanford's B Reactor building (Department of Energy)

The issue soon became both a political challenge and a technical one. No state or city wants to be known as the site of nuclear waste, but there's no good way to get rid of the plutonium. Burying it means first engineering a space that will last long enough (say, at least 10,000 years) for the waste to decay. Shooting it into space means building rockets with very low failure rates, plus contingency plans to minimize the radiation released if a launch failed. Dumping the waste into the ocean means environmental trouble.

"It's always been a problem," Alex Wellerstein, a historian of science at the Stevens Institute of Technology, tells me. "People assumed during the Cold War it would be figured out in a decade or two."

So far, it hasn't been figured out. But it's not for lack of trying. From 1993 to 2013, the U.S. participated in a program with Russia called Megatons to Megawatts, purchasing uranium from Soviet-era warheads and helping to convert it into low-enriched reactor fuel. In 2010, Bill Gates invested millions in TerraPower, a startup aiming to build small, sustainable nuclear reactors that would run on the U.S. stockpiles of nuclear waste, using them to generate electricity and power homes. In 2013, researchers at Transatomic Power, a startup founded by MIT graduates, designed a safer nuclear reactor meant to avoid meltdowns. But these projects have barely made a dent in the plutonium still sitting around.

Using nuclear energy as an alternative power source will be a steep uphill battle for any company willing to try, but Wellerstein says nuclear energy has the potential to become part of our future without fossil fuels. "I'm not one of those people who thinks nuclear power is the answer to everything, but it probably needs to be a component going forward, because reliance on fossil fuel is terrible," he says. And while some might balk at the idea of creating even more nuclear waste, it's worth remembering the alternative. "Nuclear waste is a small-scale problem compared to climate change," he says.


