Saturday, September 24, 2011

Neutrinos and the future of physics

The scientific community is buzzing, or one would imagine we are, after news of this paper came out last week. Could neutrinos, those mysterious particles which hardly interact with normal matter, really be traveling faster than the speed of light? It was all over the news. The collaboration held a news conference. Hints of another cold fusion fiasco creep into conversations.

For the benefit of those who don't feel like trudging through the 24-page paper, here is a brief summary (with pictures!):

The detector is called OPERA, which stands for "Oscillation Project with Emulsion-tRacking Apparatus" (I know, that should be OPETA, but hey, you decide on the acronym first and then fit the name to it). It was designed to measure neutrinos coming from the Super Proton Synchrotron at CERN. The neutrinos produced (as a byproduct, incidentally) at CERN are mainly muon neutrinos (remember that there are three "flavors" of neutrinos), and OPERA, located about 450 miles (730 km) away in the Italian national lab of Gran Sasso, hopes to see tau neutrinos. Neutrinos can spontaneously change flavor (the theory, actually, is not that they discretely change, but that they exist as superpositions of all three flavors in various mixing ratios which "condense" into one type when interacting with matter), and measuring the number of muon neutrinos which change into tau neutrinos will help to determine all sorts of interesting physical things (like whether or not neutrinos have any mass... photons don't, but electrons do). Here's what OPERA looks like:

As an added bonus, the detector measures the time spectra of the neutrinos as they pass through it, and the neutrinos, just like the protons that produced them originally (back at CERN), come in bunches. One of these bunches looks roughly like this:

The red curve is the proton time spectrum (from CERN). The black data points make up the neutrino time spectrum from Gran Sasso.
So after these two time spectra are collected for a given "extraction" (i.e., a given bunch), they can be compared. Here's the kicker. The curves can be adjusted to fit one another only if the scientists assume the neutrinos observed in the OPERA detector were traveling faster than the speed of light. Not much faster, mind you - only 0.0025% - but over the ~730 kilometers from start to finish, that amounts to 60 nanoseconds (ns). This number they give with a "six sigma" stamp of approval... which, in a nutshell, means that the 60 ns result has only a 0.0000002% chance of being a statistical fluke.
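For the arithmetic-minded, those numbers hang together in a quick back-of-the-envelope check (the ~730 km baseline is the approximate CERN-to-Gran Sasso distance; treat every value here as rounded):

```python
# Back-of-the-envelope check of the OPERA numbers. The ~730 km
# baseline is approximate; everything else follows from it.
c = 299_792_458.0          # speed of light, m/s
baseline = 730_000.0       # CERN -> Gran Sasso distance, m (approximate)

tof_light = baseline / c   # expected time of flight at light speed, s
early = 60e-9              # neutrinos arrive 60 ns early, s

# Fractional speed excess (v - c)/c, to first order
excess = early / (tof_light - early)
print(f"TOF at c: {tof_light*1e3:.3f} ms")
print(f"(v - c)/c ~ {excess:.2e}")   # ~2.5e-5, i.e. 0.0025%
```

A 60 ns shift sounds tiny, and it is - it's six parts in a hundred thousand of the 2.4 ms flight time, which is exactly why the systematics matter so much.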

Ok... so we have a result that seems to indicate the neutrinos are traveling faster than the speed of light. What I'd like to do now is not argue whether or not this result is correct, but instead point out a common fallacy, if you will, with regard to physics results which may or may not alter the paper's conclusions.

Fallacy #1: More statistics = better result
The authors of the paper make a big deal of the fact that around 16,000 events went into their data analysis. This is a large number of events, and naively it is true that more events mean smaller statistical uncertainty. For something that follows a Gaussian distribution (aka "bell curve") - like the exam grades of a typical physics class - the absolute statistical uncertainty grows only as the square root of N, where N is the number of events, so the relative uncertainty shrinks as N grows. This means that for a sample size of 100, the uncertainty is 10 events, or 10%. Increase your sample to 1,000 and the uncertainty is 32 events, or about 3%. With 16,000 events, your statistical uncertainty is 126 events, or a meager 0.8%! So it appears that more stats does mean better results... but this neglects a very important point, and that point is systematic uncertainty.
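That square-root behavior is easy to verify directly:

```python
import math

# Counting statistics: absolute uncertainty ~ sqrt(N),
# relative uncertainty ~ 1/sqrt(N).
for n in (100, 1_000, 16_000):
    abs_unc = math.sqrt(n)
    print(f"N = {n:>6}: +/- {abs_unc:.0f} events ({100*abs_unc/n:.1f}%)")
```

Quadrupling the data only halves the relative statistical error - which is why piling on events does nothing for the other kind of error, discussed next.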
There are two types of uncertainty in any experiment, as any student who's taken a basic science lab will tell you. Statistical uncertainty is the uncertainty inherent in the number of samples, but systematic uncertainty is a totally different beast. Systematic uncertainties are those which you introduce - intentionally or not - to the experiment, just by the way you do it. Is there an uncertainty in the length of the ruler you used to measure your experimental distances? Is there an uncertainty in the way you recorded the time? Sometimes, it can be very difficult to account for all of these experimental biases, because you may not even know they exist. Maybe you have some bacteria in a petri dish, and you've drawn a grid on the glass of the petri dish to allow you to measure how much the bacteria are moving. So you check the locations of your bacteria at noon, and again at three, and again on Tuesday. But did you stand in exactly the same location when you measured the bacteria against the grid? Did you lean over the petri dish in precisely the same way? Light bends through glass (refraction, the same reason a straw looks bent in a glass of water), so the angle of your eyes to the surface of the petri dish will change, very slightly, the way you see the grid and the bacteria together. Catch that? Sneaky. Yet another systematic uncertainty.
Systematic uncertainties do not get better with more data points. Systematic uncertainties are completely independent of the number of events in a given experiment. In fact, the whole data set - be it 3 events or 3 million - can suffer the same systematic uncertainty, which can sometimes cancel out, but sometimes shift the entire thing one direction or another. Consider your petri dish bacteria. If you drew your grid on the inside of the dish, before putting the whole thing together and filling it with bacteria, then what you see from the outside is always slightly off from what really exists on the inside. Even if you always stand in the same place and look in the same way, the entire grid may still be slightly off from reality, and this would offset your entire data collection. This potential offset is why, in my field, we always try to do experiments in as many different ways and at as many different labs as possible. It serves as a check. If we do things here at our lab and someone else does the same things at their lab, and our results are always offset from one another, a systematic uncertainty is said to exist between the two labs. Two methods can also suffer from a systematic uncertainty between them, such as measuring a nuclear reaction "forward" (oxygen+alpha->proton+fluorine) and "backward" (proton+fluorine->oxygen+alpha).
There are ways to estimate systematic uncertainties, and the more often you do something, the better you get at it (the more a piece of lab equipment is used, for instance, the better characterized it is). But OPERA has only been running a few years (this may seem like a long time, but for neutrino experiments, it's not). The amount of data they've collected is still being analyzed. So the potential for as-yet-unknown systematic uncertainties certainly exists (the authors of the paper even admit this fact explicitly, saying "the potentially great impact of the result motivates the continuation of our studies in order to investigate possible still unknown systematic effects that could explain the observed anomaly").

Now, to be fair, it's entirely possible that this result is real. While the observation of supernova SN1987a seemed to preclude the possibility of neutrinos traveling faster than light, an earlier result from the MINOS experiment indicated that the neutrinos they measured might have been going a bit too fast (that experiment, however, had big enough uncertainties that the neutrinos could have been going light speed). String theory allows for faster-than-light travel through fluctuations in the "quantum foam" of spacetime. Special relativity, however, does not.

So here are some more specific notes for the scientifically-minded reader, with regard to the things I feel are likely suspects in the search for systematic uncertainties.
1) The limits set by SN1987a are for a different energy regime and, more importantly, a different neutrino flavor (anti-electron neutrinos were detected during this event, vs muon neutrinos for OPERA and MINOS). There could be a systematic effect between neutrino flavors, as well as between neutrino energies (the OPERA result cannot rule this out).
2) The GPS signals used to determine location and timing had to be taken at the surface, while the laboratories are actually far underground. This requires an extrapolation, which can introduce uncertainties. Was the curvature of the Earth accounted for? The density and type of the rock?
3) Something I feel is most telling - the neutrino time-of-flight (TOF), which is ultimately compared to the time expected if the neutrinos were traveling at the speed of light to get this "delta TOF" of 60 ns, is not actually measured. As I mentioned earlier, the proton time spectra and neutrino time spectra are measured within their respective detectors/labs and timestamped to within a few nanoseconds. In theory, there is no discrepancy between the timestamps (GPS and cesium-clock generated) at the different labs, but the authors themselves emphasize that this is not a t(stop)-t(start) kind of experiment. Nothing actually starts a clock when the protons are produced and stops it when a neutrino is seen, because the whole process is statistical (they can't know exactly when a given proton will create a given neutrino, or where). So they do a "maximum likelihood" fit (a fancy, mathematical way of saying "we moved the two curves until they overlapped") to the two time spectra to determine how far apart they are. What if there's a systematic uncertainty here? It alters the entire result. What if the neutrino bunch just measured corresponds not to the proton bunch you think it does, but to the one before? Then it's not that the neutrinos are traveling slightly faster than the speed of light; they're traveling slower, and you're just off by one 'cycle.' I didn't find in the preprint a good description of how they know which neutrino bunch corresponds to which proton bunch, other than simply expecting the neutrinos to travel at light speed and assuming that anything falling within a small window around that expectation is real.
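To make the "slide the curves until they overlap" idea concrete, here is a toy version of such a maximum-likelihood shift fit. Everything in it - the pulse shape, the widths, the delay - is invented for illustration and has nothing to do with OPERA's actual waveforms:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy fit: a "proton" time profile (template) and "neutrino" arrival
# times drawn from the same shape, offset by some true delay. We
# recover the delay by scanning a shift and maximizing the summed
# log-likelihood - i.e. sliding one curve over the other until they
# overlap best. All numbers here are invented.
edges = np.linspace(0, 10_000, 501)                       # ns bins
template, _ = np.histogram(rng.normal(5_000, 800, 200_000), edges)
pdf = template / template.sum()

true_delay = 60.0                                         # ns (invented)
arrivals = rng.normal(5_000, 800, 16_000) + true_delay

def log_like(shift):
    idx = np.clip(np.digitize(arrivals - shift, edges) - 1, 0, len(pdf) - 1)
    return np.sum(np.log(pdf[idx] + 1e-12))

shifts = np.arange(-200, 201, 5.0)
best = shifts[np.argmax([log_like(s) for s in shifts])]
print(f"best-fit delay: {best:.0f} ns")                   # should land near 60
```

Notice what the fit cannot tell you: if the arrivals were instead matched against the *wrong* bunch of the template, the likelihood machinery would happily converge on a shift that is off by a whole cycle - which is exactly the worry raised above.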

One last humorous note, which I mentioned previously on facebook. Have you ever been working in a spreadsheet program, entering a function into cell B2 that depends on cell D7? Everything is fine unless the content of cell D7 also depends on the value in cell B2... then you get what's known as a circular-reference error. The functions can't be solved because each depends on the other, so you end up stuck inside an infinite loop (B2's value leads to D7's, which leads to B2's, which leads to D7's, which leads to...). The OPERA result depends (rather heavily) on GPS timing and position signals. But GPS depends on relativity, and relativity, in turn, depends on the speed of light being constant for all observers (that means neutrinos, too). But if the OPERA result is correct, then the neutrinos have traveled faster than the speed of light, contradicting relativity. If your result contradicts the possibility of your result, how can it be your result?

I've heard a lot of good scientists weigh in on this result and its potential consequences. One real (rather philosophical) question remains. Does this mean the end of physics is looming? Hardly. This is science - doing experiments, drawing conclusions, testing those conclusions with more experiments. Overturning long-held (and often dearly loved) hypotheses is part of the deal, so long as it's done right. Time will tell if this is one of those instances... and won't it be great to know you were there when it happened?

The OPERA Collaboration: T. Adam, N. Agafonova, A. Aleksandrov, O. Altinok, P. Alvarez Sanchez, S. Aoki, A. Ariga, T. Ariga, D. Autiero, A. Badertscher, A. Ben Dhahbi, A. Bertolin, C. Bozza, T. Brugiére, F. Brunet, G. Brunetti, S. Buontempo, F. Cavanna, A. Cazes, L. Chaussard, M. Chernyavskiy, V. Chiarella, A. Chukanov, G. Colosimo, M. Crespi, N. D'Ambrosios, Y. Déclais, P. del Amo Sanchez, G. De Lellis, M. De Serio, F. Di Capua, F. Cavanna, A. Di Crescenzo, D. Di Ferdinando, N. Di Marco, S. Dmitrievsky, M. Dracos, D. Duchesneau, S. Dusini, J. Ebert, I. Eftimiopolous, O. Egorov, A. Ereditato, L. S. Esposito, J. Favier, T. Ferber, R. A. Fini, T. Fukuda, A. Garfagnini, G. Giacomelli, C. Girerd, M. Giorgini, M. Giovannozzi, J. Goldberga, C. Göllnitz, L. Goncharova, Y. Gornushkin, G. Grella, F. Griantia, E. Gschewentner, C. Guerin, A. M. Guler, C. Gustavino, K. Hamada, T. Hara, M. Hierholzer, A. Hollnagel, M. Ieva, H. Ishida, K. Ishiguro, K. Jakovcic, C. Jollet, M. Jones, F. Juget, M. Kamiscioglu, J. Kawada, S. H. Kim, M. Kimura, N. Kitagawa, B. Klicek, J. Knuesel, K. Kodama, M. Komatsu, U. Kose, I. Kreslo, C. Lazzaro, J. Lenkeit, A. Ljubicic, A. Longhin, A. Malgin, G. Mandrioli, J. Marteau, T. Matsuo, N. Mauri, A. Mazzoni, E. Medinaceli, F. Meisel, A. Meregaglia, P. Migliozzi, S. Mikado, D. Missiaen, K. Morishima, U. Moser, M. T. Muciaccia, N. Naganawa, T. Naka, M. Nakamura, T. Nakano, Y. Nakatsuka, D. Naumov, V. Nikitina, S. Ogawa, N. Okateva, A. Olchevsky, O. Palamara, A. Paoloni, B. D. Park, I. G. Park, A. Pastore, L. Patrizii, E. Pennacchio, H. Pessard, C. Pistillo, N. Polukhina, M. Pozzato, K. Pretzl, F. Pupilli, R. Rescigno, T. Roganova, H. Rokujo, G. Rosa, I. Rostovtseva, A. Rubbia, A. Russo, O. Sato, Y. Sato, A. Schembri, J. Schuler, L. Scotto Lavina, J. Serrano, A. Sheshukov, H. Shibuya, G. Shoziyoev, S. Simone, M. Sioli, C. Sirignano, G. Sirri, J. S. Song, M. Spinetti, N. Starkov, M. Stellacci, M. Stipcevic, T. Strauss, P. Strolin, S. Takahashi, M. Tenti, F. Terranova, I. Tezuka, V. Tioukov, P. Tolun, T. Tran, S. Tufanli, P. Vilain, M. Vladimirov, L. Votano, J.-L. Vuilleumier, G. Wilquet, B. Wonsak, J. Wurtz, C. S. Yoon, J. Yoshida, Y. Zaitsev, S. Zemskova, & A. Zghiche (2011). Measurement of the neutrino velocity with the OPERA detector in the CNGS beam. arXiv: 1109.4897v1

Monday, September 19, 2011

Why a solar flare won't kill you

In response to a comment on the previous post, I thought it would be good to explain why a solar flare won't cause the end of the world (it's going to take more space than just a comment to clear this mess!). Let's see if I can put some sense into this doomsday discussion.

"To date, Fukushima has already released 168 times the total radiation released from the Hiroshima nuclear bomb detonated in 1945, and the Fukushima catastrophe is now undeniably the worst nuclear disaster in the history of human civilization."
According to the International Atomic Energy Agency (IAEA), radiation levels at the Fukushima Daiichi site are constantly monitored and show a "general decreasing trend." As for the claim that the Fukushima plants have released 168 times the total radiation released from the Hiroshima bomb, we have to specify a couple of things. First, radiation, radioactivity, and radioactive material are different things. Radiation is the energetic particles which are emitted, via radioactivity (the process), by radioactive materials (the "parent"). Radiation affects us differently from radioactive materials: radiation is gone in an instant, while radioactive materials can hang around.

So if we're talking about radiation, we're talking about the "instantaneous" dose rate - the dose you'd get from the actual particles of radiation hitting you. At the Fukushima plant, the highest confirmed radiation dose rates recorded (and, as I said, they're steadily improving) were about 80 microsieverts per hour (about 8 mrem per hour in US terminology). This was very near the plant, so no one was actually exposed to it long-term; a nuclear plant worker standing there for an hour would get only about 2% of the annual dose we all receive from natural background radiation.

As for radioactive materials, which are a bit more insidious because they can linger, the ongoing monitoring in Japan has picked up trace amounts of materials like iodine-131 and cesium-137. The IAEA reports that very near the plant (under half a mile), the highest concentrations in air for these two materials were 3 and 9 becquerels per cubic meter, respectively. No ground contamination of iodine was detected; ground contamination levels for cesium varied from very little to almost 100 becquerels per square meter. One becquerel (Bq) is one radioactive decay per second - so 100 Bq is the same as 100 decays per second.
That may sound like a lot, but it isn't - at the Rocky Flats nuclear weapon building facility in Colorado, contaminations of 500,000 Bq were detected during the 1970s. (In case you're wondering, the area is now a nature preserve!)
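Backing up a step, the "about 2% of annual background" figure from the dose-rate discussion is a one-line unit conversion. The ~3.6 mSv/year US average background dose used below is my assumption (a commonly quoted figure), not a number from the IAEA reports:

```python
# Rough dose comparison using the figures quoted above. The annual
# background dose of ~3.6 mSv/year is an assumed typical US figure,
# not from the article.
peak_rate_uSv_per_h = 80.0            # highest confirmed rate near the plant
background_mSv_per_year = 3.6         # assumption

one_hour_dose_mSv = peak_rate_uSv_per_h / 1000.0
fraction = one_hour_dose_mSv / background_mSv_per_year
print(f"one hour at the peak spot: {one_hour_dose_mSv} mSv "
      f"({100*fraction:.0f}% of annual background)")
```

In other words, even standing at the single worst monitored spot for a full hour delivers a small slice of what your own surroundings give you every year anyway.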
Now, to compare these numbers to other instances of nuclear contamination. The United Nations scientific committee which investigates nuclear incidents produced a map of the cesium-137 depositions in Europe following the Chernobyl accident (and I've already touched several times on why Chernobyl was a "freak accident"). Notice the legend: that's a maximum of nearly 1500 kBq/sq m... or 1,500,000 Bq per square meter. In no way is Fukushima worse than this. And yet, even Chernobyl wasn't that bad.

So let's last touch on the comparison to the Hiroshima bomb. The Telegraph ran a story last month claiming that the Fukushima incident was equivalent to 168 nuclear bombs (it's uncertain whether they were the first to do so), without really taking time to clarify what that actually means. The Japanese government has estimated the total amount of cesium-137 released so far (it's been about six months) at 15,000 terabecquerels (TBq) - a 15 followed by fifteen zeros, in becquerels. Governments have been known to overestimate the severity of a disaster in order to receive more international aid, and it is understood that when lives are potentially at stake, underestimating severity is the more dangerous error. So this estimate is, if anything, likely on the high side (pretend, for the sake of argument, that the 100 Bq per square meter measurement quoted earlier were deposited each day - that would amount to a total deposition of ~150 TBq over the whole of Miyagi prefecture since the accident, about a hundred times less than the government estimate). The Telegraph writer then connects this number - the possible upper limit on the total amount of radioactive cesium-137 released in six months' time (notice how specific that is) - with the amount of cesium-137 released by the detonation of Little Boy above Hiroshima in 1945. The report claims that Little Boy released 89 TBq (it doesn't actually specify whether this is just cesium, or total), 168 times less than the Japanese estimate.
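The numbers in that comparison are easy to check. The Miyagi land area (~7,300 km²) is my figure, pulled in for the cross-check; the rest comes from the text:

```python
# Sanity check of the Fukushima/Hiroshima comparison. The Miyagi
# land area (~7,300 km^2) is my assumption; other numbers are from
# the post.
fukushima_cs137_TBq = 15_000      # Japanese government estimate, ~6 months
hiroshima_cs137_TBq = 89          # figure quoted by the Telegraph
ratio = fukushima_cs137_TBq / hiroshima_cs137_TBq
print(f"ratio: {ratio:.1f}")      # ~168, the Telegraph's headline number

# The 'pretend 100 Bq/m^2 deposited every day' cross-check:
miyagi_area_m2 = 7_300e6          # ~7,300 km^2 (assumption)
days = 180                        # roughly six months
total_Bq = 100 * miyagi_area_m2 * days
print(f"hypothetical Miyagi deposition: {total_Bq/1e12:.0f} TBq")
```

The hypothetical deposition comes out around 130 TBq with my assumed area - the same ballpark as the ~150 TBq above, and roughly a hundredth of the government's 15,000 TBq estimate either way.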
Making this kind of inflammatory comparison is nothing new. But, as it even says in the article, "government experts" continue to argue that the comparison is simply not valid, and for good reason. Nuclear weapons produce different radioactive materials than do nuclear reactors (the entire field of nuclear forensics is based on this fact), and the time scales are vastly different. Bombs are dispersive instruments by design, whereas nuclear reactors are made to be contained. Most importantly, from a public relations point of view, people (wrongly) associate the radiation of a nuclear weapon with the deaths the bomb causes (the vast majority of which are due to the explosion itself: heat, pressure, fire), so the inference drawn from the comparison is that release of radioactivity from a nuclear plant is equivalent to detonation of a nuclear weapon - which is about as sound as concluding that pickles are deadly because everyone born before 1865 who ate them has died. And even so, it's worth noting that the radiation and radioactivity released by the nuclear weapons during WWII have - still - had minimal long-term effects on people in the area.
Long story short, I hardly believe that Fukushima represents the "worst nuclear disaster in the history of human civilization." Chernobyl was worse, and in that instance the plant operators didn't have a magnitude 9.0 earthquake and 30-foot-tall tsunami waves to deal with.

"All nuclear power plants are operated in a near-meltdown status. They operate at very high heat, relying on nuclear fission to boil water that produces steam to drive the turbines that generate electricity. Critically, the nuclear fuel is prevented from melting down through the steady circulation of coolants which are pushed through the cooling system using very high powered electric pumps."
The claim that nuclear plants are operated in "near-meltdown status" is absolutely ludicrous, and gives away a complete lack of understanding as to how a nuclear reactor actually operates. Nuclear fuel - typically uranium-235 enriched to a few percent (it makes up under 1% of natural uranium) - is one of a few materials which is fissionable (it can undergo "induced fission"). Left to its own devices, uranium-235 prefers to alpha decay: it emits an alpha particle, which is the same as the nucleus of a helium atom, made up of two protons and two neutrons. Sometimes (less than 0.00000001% of the time, actually), uranium-235 will undergo "spontaneous fission," where instead of emitting an alpha particle, it breaks up into two large chunks, plus a bit of energy and a few neutrons. Inside of a reactor, where a lot of uranium-235 is packed into a small space, these neutrons (after being slowed down by water) can hit other uranium-235 nuclei and cause them to fission also (that's "induced fission"). These fissions also produce a few neutrons, which then hit other uranium nuclei, and the process becomes what's called a chain reaction, or criticality: the nuclear reaction is self-sustaining. All that energy released each time a uranium nucleus fissions is collected in the form of heat to make steam. But this shouldn't be confused with being "near meltdown." Here's why.
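The chain-reaction bookkeeping boils down to a single number: how many neutrons from each fission, on average, go on to cause another fission. A toy sketch (all numbers purely illustrative):

```python
# Toy picture of a chain reaction: each fission yields a couple of
# neutrons, and some fraction of those cause another fission. The
# product k (new fissions per fission) decides everything.
def generations(k, start=1000, steps=5):
    """Fission count over successive neutron generations (toy model)."""
    pop, out = float(start), []
    for _ in range(steps):
        pop *= k
        out.append(round(pop))
    return out

print(generations(1.00))  # critical: population stays flat (a running reactor)
print(generations(1.05))  # supercritical: grows generation by generation
print(generations(0.95))  # subcritical (rods in / moderator lost): dies away
```

The point of the paragraphs that follow is that everything in the reactor - the water, the control rods - is arranged to push that number *down* when anything goes wrong, not up.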
Inside of a reactor, there are several things which impede this criticality. First is the water itself: water is a neutron moderator as well as a coolant. A moderator is something which slows the neutrons down. When they are first kicked out of the fissioning nucleus, the original neutrons are actually too high in energy to efficiently set off another fission reaction. If they travel through water first, they lose some of that energy and become "thermalized," making them far more likely to cause another uranium nucleus to fission. Thus, if the cooling water is lost (in the nuclear industry, this is known as a "LOCA" - loss-of-coolant accident), the moderator is lost as well, and the neutrons stay fast - and this (ironically) makes criticality even more difficult to attain. In addition to the water, control rods are used to (as the name suggests) control the nuclear reactions. Control rods are made out of materials which absorb neutrons (things like boron and cadmium), so they have the effect of removing neutrons from the reactor core, meaning fewer neutrons are available to cause fission.
Now here's an important point which is overlooked by the Natural News author: if the coolant stops, the fuel rods do not necessarily "go critical." This is what happened in Chernobyl, but for a very obvious and unfortunate reason - the operators there, while conducting a test, turned off the safety system interlocks. While a LOCA can cause the uranium fuel to overheat, it does not suddenly go out of control in some sort of nuclear-bomb-like explosion. Nuclear reactor fuel CANNOT EXPLODE like a nuclear bomb. It simply isn't physically possible (water can't burn your skin the way sulfuric acid does - it's not physically possible). Reactors these days are designed with catastrophe in mind - they have to be, given the increasingly strict regulations surrounding them (chemical plants, industrial plants and coal burning plants have incredibly lax regulations by comparison) - so with each possibility of something going wrong, an additional layer of protection is built-in.
So imagine the scenario that the nuclear plant suffers from a loss of power (and nuclear plants, just like other power plants, do supply their own power; the Japanese struggled to reconnect the Fukushima plants to the grid because it would allow them to supply power from plants unaffected by the earthquake and tsunami). All hell breaks loose, right? Nope. In modern reactors (and older reactors in the US are required by law to be retrofitted to meet new safety standards), safety systems don't all depend on getting power. The control rods are gravity fed - meaning if something happens, they will simply drop into place, no power necessary. The cooling water may run on electric pumps, but those pumps have diesel- or battery-powered backups, and those have gravity-fed or thermodynamic backup systems (i.e., systems which run on gravity, like the control rods, or which depend on the natural circulation of air or liquid). Reactor safety systems which don't require power - or sometimes, don't even require an operator! - are called "redundant" and "passive" systems. They operate without power, without diesel, without batteries, without plant access, without people. They just work. Designs for modular nuclear plants exist even now that are completely meltdown-proof.
Without getting too much more in-depth, it's fair to say the scenario isn't quite as dire as originally imagined. But let's take a couple more specific points.

"When the generators fail and the coolant pumps stop pumping, nuclear fuel rods begin to melt through their containment rods, unleashing ungodly amounts of life-destroying radiation directly into the atmosphere."
As we previously discussed, reactors have lots of redundant safety systems built in. So it's not a guarantee that fuel rods will melt through their containment if the cooling water is lost (see above). It's also a horrible crime to claim that when nuclear fuel gets hot, it releases radiation "directly into the atmosphere." Secondary safety systems, such as containment, prevent this from happening. In the unlikely event that a LOCA occurred and the backup systems and backup-backup systems were to fail (with each layer of failure comes an even more decreased probability of it actually happening, like drawing four aces in a row from a deck of cards), the hot nuclear fuel is contained within a steel and concrete containment vessel which can withstand the heat and pressure the fuel creates (in fact, they can withstand earthquakes and airplane crashes and all manner of highly unlikely things). Physically, there are limitations as well: nuclear fuel is heavy, metallic stuff. If you took a chunk of steel, for example, and melted it, you'd be left with a lump of steel, and the same is true of nuclear fuel. Most of the material remains a big, solid lump. Very little becomes gaseous or particulate, so very little even has the potential to become airborne in the first place. So even if the nuclear fuel were to melt, it wouldn't escape directly into the atmosphere. And we've already touched on the fact that nuclear material is not as frighteningly deadly as it's made out to be.

"As any sufficiently informed scientist will readily admit, solar flares have the potential to blow out the transformers throughout the national power grid. That's because solar flares induce geomagnetic currents (powerful electromagnetic impulses) which overload the transformers and cause them to explode.... But the real kicker in all this is that the power grid will be destroyed nearly everywhere."
Well, this is a half-truth. Transformers explode when they are overloaded; a power spike, usually caused by lightning or a sufficient power blip down the line (perhaps caused by another exploding transformer), melts the circuits inside the transformer and secondarily heats the oil used to cool the circuitry, causing an explosion. I doubt, however, that any "sufficiently informed scientist" will admit that regular solar flares would wipe out the entire power grid (a good scientist will always admit that something is possible, but will also maintain that it need not be probable). Solar activity of this sort occurs all the time - the aurora is driven by charged particles streaming from the Sun, and flares and the coronal mass ejections that often accompany them simply intensify the show. Powerful solar events do have the potential to interrupt satellite communications and cause temporary surges in the power grid, as has been seen before, and the Sun is coming up on the peak of its 11-year cycle (due around 2013). But this cycle is predicted to produce fewer sunspots than normal, and fewer sunspots means fewer chances for massive solar flares.
We can't discount the potential for large solar flares, however, at some point in the future. The chances of a flare being a massively disruptive one are low, but not zero. It's not that we haven't known about the potential for years. But because of this, we have systems now to tell us when something is going to happen. NASA satellites which float constantly in the space between us and the Sun are monitoring at every moment, ready to give us hours of notice should a large solar flare occur. If we have hours to know a solar storm is coming, doesn't that mean we have hours to shut down any sensitive systems?

"Did I also mention that half the people who work at nuclear power facilities have no idea what they're doing in the first place? Most of the veterans who really know the facilities inside and out have been forced into retirement due to reaching their lifetime limits of on-the-job radiation exposure, so most of the workers at nuclear facilities right now are newbies who really have no clue what they're doing."
This is a common misunderstanding. While there are lifetime limits for work-related radiation exposure, it doesn't mean the expertise of someone retiring from the nuclear industry is lost (think, how many consultants do you know?). And to claim that new employees have no idea what they're doing is to ignore the years of technical training required by law for any nuclear operator.

"Imagine a world without electricity. Even for just a week. Imagine New York City with no electricity, or Los Angeles, or Sao Paulo. Within 72 hours, most cities around the world will devolve into total chaos, complete with looting, violent crime, and runaway fires."
We don't have to. We've already seen what happens, and it wasn't so bad.

"Now imagine the scenario: You've got a massive solar flare that knocks out the world power grid and destroys the majority of the power grid transformers, thrusting the world into darkness. Cities collapse into chaos and rioting, martial law is quickly declared (but it hardly matters), and every nation in the world is on full emergency. But that doesn't solve the really big problem, which is that you've got 700 nuclear reactors that can't feed power into the grid (because all the transformers are blown up) and yet simultaneously have to be fed a steady stream of emergency fuels to run the generators the keep the coolant pumps functioning."
I've already spent some time explaining why the response need not be so frantic (passive safety systems, etc), and I've also already spent quite a bit of time ranting about worst case scenarios. So imagining this scenario is at once easy and absurd.

"Let's be outrageously optimistic and suppose that a third of those somehow don't go into a total meltdown by some miracle of God, or some bizarre twist in the laws of physics. So we're still left with 115 nuclear power plants that 'go Chernobyl.' Fukushima was one power plant. Imagine the devastation of 100+ nuclear power plants, all going into meltdown all at once across the planet. It's not the loss of electricity that's the real problem; it's the global tidal wave of invisible radiation that blankets the planet, permeates the topsoil, irradiates everything that breathes and delivers the final crushing blow to human civilization as we know it today."
Again, let's be realistic. Even if all 440 of the power-generating nuclear reactors in operation today were to lose power (we can't count the research reactors, because they are purposely designed to be small and harmless rather than to produce power, nor can we count the nuclear navy's ships and submarines, which would be impervious to any problems with the national power grid), that doesn't imply nuclear holocaust. Since the Chernobyl accident, safety standards have been raised such that containment is required; only the old Soviet-style reactors lacked a containment vessel, and those have since been retrofitted. In order to see any radioactivity leak from a nuclear plant, we'd have to have a breach of containment, and we can estimate that potential through Probabilistic Risk Assessment (the mathematical framework for calculating the likelihood of an accident in a highly technical system, like a reactor). A Sandia Labs report estimates that a typical containment vessel might fail at a rate of roughly 1x10^-7 per year (that's 0.0000001 failures per operating year), and that's IF THE CORE HAS ALREADY FAILED (in other words, if all of the nuclear fuel has already melted). So take our 440 reactors, assume they've all lost power, assume further that they've all lost cooling water, assume further still that they've all lost their passive safety systems, and assume on top of that that their fuel has overheated. 440 damaged cores times a containment failure rate of 0.0000001 per year gives an expected rate of about 0.0044% per year that any radioactive material is released into the atmosphere anywhere on the planet. At that rate, you'd expect to wait over twenty thousand years before seeing a single release. And like I said, that's already assuming every reactor core on Earth is damaged.
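For anyone who wants to check the arithmetic, here's a quick back-of-the-envelope sketch. The only input is the Sandia containment-failure estimate quoted above (1x10^-7 per operating year, given a damaged core); the reactor count is the 440 power reactors in operation, and the rest just follows:

```python
# Back-of-the-envelope check of the containment-failure arithmetic.
reactors = 440                    # power-generating reactors in operation worldwide
failure_rate = 1e-7               # containment failures per reactor-year,
                                  # GIVEN the core has already melted (Sandia estimate)

# Expected releases per year if every single core were already damaged
expected_per_year = reactors * failure_rate
print(f"Expected releases per year: {expected_per_year:.6f}")
print(f"As a percentage: {expected_per_year * 100:.4f}%")   # ~0.0044%

# Mean waiting time until one expected release
years_to_one = 1 / expected_per_year
print(f"Years until one expected release: {years_to_one:,.0f}")  # ~22,727
```

Even under the most pessimistic stack of assumptions, the expected waiting time for a single containment breach comes out to more than twenty thousand years.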

"The world's reliance on nuclear power, you see, has doomed us to destroy our own civilization. Of course, this is all preventable if we would only dismantle and shut down ALL nuclear power plants on the planet. But what are the chances of that happening? Zero, of course. There are too many commercial and political interests invested in nuclear power."
I'm quite curious as to where he gets this idea. Nuclear power, because of all the regulation surrounding it, is expensive and difficult to get started. I can't think of a single instance of significant commercial or political investment in nuclear power in recent years. Oil and coal, on the other hand, are a huge lobby, and a huge moneymaker, for politics and commerce alike. We won't even bother to go into that here (though I find it amusing that our author admits in his own article that "most people don't realize it, but petroleum refineries run on electricity" - making the oil and gas infrastructure just as vulnerable, if not more so, to his outlandish doomsday scenario). Besides, it's a stretch to call 14% of the world's power supply "reliance." We're far more reliant on other sources of power (coal alone produces nearly half the electricity in the US).

"What can you do about any of this? Build yourself an underground bunker and prepare to live in it for an extended period of time. (Just a few feet of soil protects you from most radiation.) The good news is that if you survive it all and one day return to the surface to plant your non-hybrid seeds and begin rebuilding human society, real estate will be really, really cheap."
Is it mean to say that I hope this is what he plans to do, so the rest of us can move on to something more productive?

Monday, September 12, 2011


Before this gets out of hand, I have to say something.

Yes, there was an explosion at the Centraco plant in Marcoule, France. It has been described as an "industrial accident."

NO, it was not a nuclear explosion.

NO, it did not involve nuclear materials.

NO, Centraco is not a nuclear reactor. It is a low-level waste processing site.

One person is confirmed dead, and three injured.

Now compare that to the second explosion which took place today:
A leaking gas pipeline in Kenya exploded, killing at least 75 and injuring at least 100.

Was that second incident underreported? ABSOLUTELY. (Google News counts: "Marcoule explosion" in last hour: 1,710; "Kenya explosion" in last hour: 712.)

Friday, September 9, 2011

Science is suffering

I wanted to draw your attention to the latest Washington Dispatch update from the APS Office of Public Affairs. Here I have reproduced a portion of this month's edition. Emphasis mine (because nothing else needs to be said).
ISSUE: Budget and Authorization Environment
Fiscal Year 2012 Appropriations
As of the deadline for APS News, the House of Representatives had passed the Energy and Water Development (E&W) bill that funds DOE and completed full committee action on the Commerce, Justice, and Science (CJS) bill that funds NSF, NIST, and NASA. A summary of key elements of the action follows.

* E&W Appropriations bill (HR 2354): On July 15th the House passed HR 2354 by a vote of 219 (209 R, 10 D) to 196 (21 R, 175 D), providing $24.7B for DOE (-$850M relative to FY11), including $4.8B for the Office of Science (-$43M); $1.3B for Energy Efficiency and Renewable Energy [EERE] (-$491M); $733M for Nuclear Energy [NE] (+$8M); $477M for Fossil Energy (+$32M); $180M for ARPA-E (+$0); $10.6B for National Nuclear Security Administration [NNSA] (+$76M); and $4.9B for Defense Environmental Cleanup (-$42M)....

* The E&W Subcommittee report also contains language of concern: ... (2) It also directs Basic Energy Sciences to create "a performance ranking of all ongoing multi-year research projects... by comparing current performance with original project goals" and directs DOE to eliminate $25M by terminating the lowest ranked grants based solely on that criterion.

* CJS Appropriations bill (No bill number assigned): The House Appropriations Committee passed the CJS bill by voice vote on July 13th, providing $4.5B for NASA Science (-$431M); $701M for NIST (-$49M) and $6.9B for NSF (+$0)....

* Of greatest concern to the science community should be the elimination of funding for the James Webb Space Telescope (JWST), the highest priority for astronomy and astrophysics. Rep. Wolf (R-VA 10th), chair of the House CJS Appropriations Subcommittee, alleged that NASA had "been hiding costs" associated with the telescope... [and] also claimed that NASA had rushed its planning....

Thus far, the Senate has begun debate on only one appropriations bill: Military Construction. It is not expected to address the other eleven bills until after Congress returns from its August recess, virtually assuring a Continuing Resolution to take effect when the new fiscal year begins on October 1st.