Sunday, December 29, 2013

Warring Worldviews - Or Not

So Christmas this year resulted in a pretty significant stack of new books to read, and I'm ready to report on the first of these: the joint venture between Deepak Chopra and Leonard Mlodinow, War of the Worldviews.
Suffice it to say, the only war is the one we artificially construct.
Much of the issue, as is typically the case with such things, is one of ignorance - neither party has sufficient knowledge of the other's subject to discuss it in a meaningful way. Thus Scientist Mlodinow ignores philosophy because he basically doesn't get it (trying to apply the rules of mathematics to philosophy is like trying to use one cake recipe to derive the importance of food), and Spiritualist Chopra makes the dangerous mistake of trying to fit a metaphysics into the currently existing physics. They both discuss the difference between mind and brain without once mentioning the phrase "emergent behavior." Mlodinow compares a spiritual worldview to Victorian seances, cites optical illusions as proof that an immaterial realm cannot exist (surely if one were looking for a connection between the immaterial realm of "mind" and the material realm of "brain," the very brain scans he cites as proof that no such connection exists are exactly that connection), and uses Richard Dawkins' "Methinks it is like a weasel" computer code as though it makes his point (a note to the uninformed: the idea is that randomness plus a process like natural selection can make evolution proceed faster than randomness alone; while this is generally true, Dawkins' example is fundamentally flawed in that he directs the process, making the end result, the sentence "methinks it is like a weasel," the goal of the process - completely unlike natural selection, which has no "goal"). Not to mention his consistent complaining about Chopra's imprecise use of terminology, a standard he promptly abandons when using his own imprecise terminology. There's even an eerily close approach to Godwin's Law on page 62 (who brings up Nazis in a discussion of whether the universe is evolving?).
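For the curious, here is a minimal Python sketch of the weasel program (my own reconstruction, not Dawkins' actual code). Notice that the selection step scores every offspring against the predetermined target sentence - that scoring function is exactly the "goal" that natural selection lacks:

```python
import random
import string

TARGET = "METHINKS IT IS LIKE A WEASEL"
CHARS = string.ascii_uppercase + " "

def score(candidate):
    """Count characters matching the predetermined target sentence."""
    return sum(a == b for a, b in zip(candidate, TARGET))

def mutate(parent, rate=0.05):
    """Copy the parent, randomly replacing each character with probability `rate`."""
    return "".join(random.choice(CHARS) if random.random() < rate else c
                   for c in parent)

parent = "".join(random.choice(CHARS) for _ in range(len(TARGET)))
generation = 0
while parent != TARGET:
    # Keep whichever of parent-plus-offspring best matches the target.
    # The "fitness" function already knows the answer - that's the flaw:
    # selection here is directed toward a goal, unlike natural selection.
    parent = max([parent] + [mutate(parent) for _ in range(100)], key=score)
    generation += 1

print(f"Reached the target in {generation} generations")
```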
Similarly misdirected, Chopra tries to imbue the quantum world with positive meaning, rather than mere indeterminacy. However, he doesn't ignore science or discount it, but embraces it - a supremely important step - and asks us not to change the materialistic worldview so much as expand it; he wants science to be directed by spirituality, if you will, partially in how it approaches the world (not a great idea) but also partially in how it is applied to the world (a great idea). Oddly enough, Mlodinow doesn't discount spirituality straight off, either, but he uses fuzzy terms to talk about how people should be spiritual without being spiritual. The material world is all that exists, but you should still be nice to people. It's no wonder it took me four days to get through this nonsense.
One of the really irritating things about this book, actually, is Scientist Mlodinow's misrepresentation of science. I suppose I should have known already that I wouldn't be very sympathetic after the last debacle, but I wanted to give it a shot. He somehow misses the point entirely when he discusses, toward the end of the book, the fact that our perception of the universe is necessarily - by being filtered through our senses and our brains - subjective, yet clings unapologetically to the claim that science is truly objective and the only real way to understand the universe. In the same breath, he will denounce people's blind acceptance of a spiritual or religious teaching, then bid us trust everything he says about scientific results. (It makes you wonder if he ever actually listens to himself talk.) Similarly, he'll cite correlations as proof of causation, without so much as a single footnote on how this actually falls outside of the scientific process (science is all there is, indeed!). He claims that science does not answer to authority, as these foolish religious traditions do, and yet he continually references Einstein's opinions on religion (on a related note, can I just complain about another weak and useless argument for science and against spirituality: claiming that science is better than the ancient wisdom traditions because it is continually updated is not only blind worship of progress, but it also ignores that the ancient wisdom traditions have had no need to change. A person is as driven by conflicting desires now as he was 6000 years ago. Basic human nature remains constant, and thus our knowledge of it needs no update). And at one point, he makes the horrific mistake (please let it be a mistake!) of saying that in science, we seek to both prove and disprove things. We never, ever, ever can prove something in science. That's fundamental to science's nature. Science wouldn't continue to progress and improve, as Mlodinow so insistently reminds us, if we were able to prove something to be absolutely scientifically true.
Now, I appreciate the difficulty in explaining science, broadly, to a non-scientific audience. This is why I argue that people need to be science-literate just as much as they need to be English-literate, regardless of whether they want a career in STEM. There are things we say to one another as scientists - words that have slightly different meanings from the usual lexicon, phrases that indicate something to us but which mean nothing (or worse, are misinterpreted as meaning something else) to those outside of our field. I get that. When we say "prove," what we really mean is "show to be within 97% certainty" or the like. And when I joke with my fellow physicists that "science is subjective," I mean something slightly different than what the statement appears to mean on the surface; of course science gains some level of objectivity through its focus on repetition and reproduction of results. So when I complain about Mlodinow's misrepresentation of science, it is in this context: he is making science sound like more than it actually is to an audience outside of science. As a scientist, I understand why he makes a point of explaining that the term "evolution" can't be applied equally to biological systems and to physical systems like the universe, but at the same time, I don't understand why he then turns around and uses the term "prove" to describe a sweeping conclusion drawn from a few barely statistically significant measurements of brain activity in patients suffering from damage to their ventromedial prefrontal cortex.
In the end, neither of them fully succeeds in making his point. If there is really a "war of the worldviews," it isn't between science and spirituality, no matter how much the book's reviewers (their war cries printed all over the cover and inside the flaps) froth at the mouth. The real conflict is between subjectivity and objectivity, and in truth, only one of these can win. Science relies on objectivity, which works reasonably well when the view is turned outward, toward objects (it's painfully obvious, I know). But that objectivity will completely break down when turned inward, toward the source of the subjectivity from which the objectivity is trying to break free (it also breaks down at the quantum level). One simply cannot "know thyself" when "thyself" has been purposefully removed. (As Schrödinger wrote in Mind and Matter, "No personal god can form part of a world model that has only become accessible at the cost of removing everything personal from it... I do not find God anywhere in space and time - that is what the honest naturalist tells you. For this he incurs blame from him in whose catechism is written: God is spirit.") Science (just as religion) is the product of minds which are, and always will be, subjective; hence its objectivity will always be tinged with subjectivity. We are part of that which we are observing. We cannot divorce ourselves entirely from the universe, and if we try to operate under the assumption that we can, we will find ourselves running up time and again against tremendous inconsistencies (both scientifically and spiritually). Subjectivity rules.
But the truth is, the ancient wisdom traditions have known this all along. Tat tvam asi - that art thou - we are connected to everything, or, more accurately, we are everything, and everything is us. We are not disconnected, nor can we ever be. Thus science can only be one way of seeing the world, but it is not the way. We don't have to be mystics to understand that. We can tell simply by the fact that science insists that "we" don't factor into science (a scientific worldview implies, but purposely ignores, a viewer).
The really telling portion for me, though, came only a third of the way into the book. Scientist Mlodinow, in his essay on "what is life?," begins his final paragraph with the following:
I spoke to my father while writing this book. For as long as I can remember I have feared for his health. When I spoke to him the other night he reassured me that he is alive and well, in the same way he has reassured me each time I've seen him over the last twenty years - in my dreams. My father died two decades ago but I'd obviously rather not accept it. I'd rather believe that he has rejoined the universe, or gone on living in some other form. Unfortunately, for me the desire is not strong enough to outweigh the skepticism....
I'm sorry, what?!? I'd say your subconscious brain has a $%*^&$%@$#*% lot to tell you about how strong your desire for closure is!
Expletives aside, the candid admission is meant to show that it takes more strength to believe that "we again become one with the dust" than to accept a "reassuring" metaphysical answer. That might be true. But does that make you a better human being? And isn't the betterment of human beings the entire point of discussing which type of worldview to adopt? The Foreword even says that "No one can ignore the question of how to perceive the world.... What else could be more important?" It seems to me that this anecdote, meant to show strength, actually demonstrates a vast - but thankfully curable - weakness. If to adopt the purely scientific worldview means that I cannot properly grieve and come to terms with death, if to adopt the purely scientific worldview means that I become incapable of being a well-rounded and well-adjusted human being, then I reject it. If, on the other hand, the spiritual worldview gives me those things, allows me to accept death and adjust to life, then this is my obvious choice. It is also presumably the choice of every other sensible person who reads this book. You, Scientist Mlodinow, have conceded the argument without even fully presenting your side of the debate, thanks to one overwhelming fact: I would rather not be like you.
Please understand that I do not mean to be harsh. The loss of a loved one is a tremendous psychological (spiritual) burden, and one that not every person bears the same way. But if this scientific worldview, at least as presented here, offers me no solace, then what reason do I have to choose it? Scientist Mlodinow, as if anticipating his debate defeat, leaves us with one more anecdote, this one about a friend who believed in God and the soul.
I expected her to disagree about the absence of evidence, but she didn't.... Can you enjoy a film even if you'd be at a loss to describe its merits [she asked]? Can it speak truth to you even if it is not a cinematic masterpiece? Why is it wrong to believe in a higher power even if you don't have proof? Then she told me of a book published in German, a collection of notes and letters written by people about to be executed for helping Jews survive during World War II. All were written either by people deeply involved in their faith or by children. There was only one exception, she said - a nineteen-year-old secular man who got involved in the resistance movement as a sort of adventure. His letters were different from all the others, she said. He was the only one who feared death.

Thursday, October 31, 2013

What we can learn from Einstein

Everyone knows Albert Einstein - yes, that Einstein, the famous physicist, the man in the Swiss patent office who shattered the Newtonian ideals of fixed space and time. But even as we recite the stories, recalling the now familiar image of a wild-haired, wrinkle-browed old man, there are things that we miss. Here are a few.

1) Einstein won the Nobel Prize, but that's not what he's most famous for.
Einstein's Nobel was for his work on the photoelectric effect, which describes how materials struck by certain types of light can emit electrons. It's worth noting that while most people don't even know what the photoelectric effect is (or Brownian motion, for that matter), they can correctly link Einstein with "the theory of relativity." Both endeavors were important to physics, but only one was able to capture the public's attention. In other words, what review committees might view as "Nobel-worthy" may not be popular (interesting to a wider audience), and vice versa. When funding is based solely on one of these criteria, it can lead to a loss of good science.

2) Einstein's H-index probably wasn't all that high (during his lifetime).
Sure, Einstein published some really seminal papers, especially in 1905. He published lots of papers over his entire career (though not as many as some frighteningly prolific researchers), and many of the papers he did publish weren't necessarily peer-reviewed. But where the H-index is concerned, it's whether people properly cite your work that counts, and despite his brilliance, many of Einstein's papers were greeted with nods of agreement and nothing more. If you only consider the "miracle year," in fact, Einstein's calculated H-index - often used as a means of determining whether you deserve a job - would have been in the measly single digits. Metrics like the H-index may be interesting, but they're not the ultimate indication of how good you are at science.
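For reference, the H-index is the largest number h such that h of your papers each have at least h citations - so a handful of papers, however revolutionary, caps the index at the size of that handful. A quick sketch, with made-up citation counts for illustration:

```python
def h_index(citations):
    """Largest h such that at least h papers have >= h citations each."""
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, count in enumerate(ranked, start=1):
        if count >= rank:
            h = rank
    return h

# Hypothetical citation counts for the five 1905 "miracle year" papers alone:
print(h_index([1500, 900, 600, 300, 200]))  # -> 5, capped by the paper count
```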

3) Einstein struggled to find a faculty job.
Getting tenured (or at least tenure-track) jobs in academia is hard enough when you're not Einstein, so it's always scary to consider that the man himself had trouble finding a job that would allow him to work on theoretical physics. Two frustrating years after he graduated, he managed to get the position at the patent office (with the assistance of his friend's dad); not what he wanted, but at least something to pay the bills and allow him a little free time to pursue his work. Einstein's first lecturer position didn't come until more than three years after the 1905 "miracle year." Imagine how the history of physics might be different had someone in academia recognized Einstein's potential earlier.

4) Einstein was a trouble-maker.
It's a common misconception that Einstein, as a kid, was bad at school - in fact, his grades were exceptional. However, he had a massive distrust of what he considered to be "arbitrary" authority, and he hated the general form of early schooling. Memorization and recitation bored him, and eventually he would take his teacher's advice and leave school (so as not to disrupt the classroom any further). It was only at the Swiss school in Aarau that teachers recognized Einstein's gift and allowed him the freedom to pursue his own studies - a freedom that eventually let him become one of the most prominent physicists of all time. That student in your class, who doesn't do the homework and would rather make snide observations than listen quietly, could be acting out of boredom, or a longing for freedom and purpose; that student could be the next Einstein.

If Einstein really is our role model, our archetypal scientist, then we would do well to learn everything we can from his life.

Sunday, October 13, 2013

Hoyle's Nobel

Fred Hoyle never got a Nobel Prize, but he did win the admiration and respect of his colleagues thanks to his incredible insights into the brand-new discipline of nuclear astrophysics. One such contemporary was George Gamow, who wrote later of Hoyle's theory of the formation of the elements:
In the beginning God created radiation and ylem. And ylem was without shape or number, and the nucleons were rushing madly over the face of the deep. And God said: "Let there be mass two." And there was mass two. And God saw deuterium, and it was good. And God said: "Let there be mass three." And God saw tritium and [helium-3] and they were good. And God continued to call number after number until He came to the transuranium elements. But when He looked back on his work He found that it was not good. In the excitement of counting, He missed calling for mass five and so, naturally, no heavier elements could have been formed. God was very much disappointed, and wanted first to contract the universe again, and to start all over from the beginning. But it would be much too simple. Thus being almighty, God decided to correct His mistake in a most impossible way.
And God said: "Let there be Hoyle." And there was Hoyle. And God looked at Hoyle... and told him to make heavy elements in any way he pleased. And Hoyle decided to make heavy elements in stars, and to spread them around by supernovae explosions. But in doing so he had to obtain the same abundance curve which would have resulted from nucleosynthesis in ylem, if God would not have forgotten to call for mass five. And so, with the help of God, Hoyle made heavy elements in this way, but it was so complicated that nowadays neither Hoyle, nor God, nor anybody else can figure out exactly how it was done.

Tuesday, October 8, 2013

Furloughs and Nobels

As anyone who has been following me on facebook knows, the government shutdown is having a tremendously negative effect on science: the evidence is all around us [1,2,3,4]. Furloughs and temporary lab closures may be coming my way, too, if Congress doesn't get its act together.
At the same time, from across the Atlantic comes news of this year's Nobel Prize in Physics. One should not ignore the unfortunate fact that the "discovery" of the Higgs boson came from CERN and not from Fermilab - in other words, the US decision to shut down the Tevatron had unforeseen effects.
The Nobel announcement should be a clarion call to those whose job it is to regulate/fund/otherwise facilitate US science. If we keep playing games with our scientists, as the US is doing now, we will lose out on future discoveries. We will fall behind in technological advances. We will slowly but surely drain ourselves of expertise, as existing scientists move on to other places and the future generations of scientists are left with no training, no laboratories, and no funding.
Of course, there are more dire situations that have been caused by the government shutdown (cessation of WIC, etc), which I don't mean to minimize. But science is suffering, too.

Tuesday, August 6, 2013

The hierarchy

I thought I should share this.


Wednesday, July 3, 2013

Bipolar

The Impostor Syndrome is well-known in science, or in fact in any intellectual field. The belief that you're not good enough, that you're just fooling everybody into thinking you're smart, and at any moment someone will call your bluff and out you as the faker and failure that you really are. Some people are more prone to it than others: people (like myself) who have a hard time "owning" successes and accomplishments.
It doesn't help when nobody congratulates you on those successes or rewards you for them (or does so only rarely). I know scientists who will reward major achievements from their students or postdocs with dinners out, pizza parties, bottles of whiskey. I know others, however, who think success should be its own reward (likely because that tactic works for them personally), and who will never say a word of thanks to a subordinate for accomplishing something important.
But in science as in everything else, sometimes we know we're right. Sometimes, I've accomplished something and I know it is a good result - a great result, even - and that it deserves recognition. That, by extension, I deserve recognition. Perhaps that is enough motivation to fight for recognition, to demand an appropriate "reward" for our work. But that's no guarantee we will actually receive recognition, and if we don't, we feel slighted, oppressed, wronged.
So the situation swings wildly from pole to pole, from feeling like an impostor to feeling like an underdog. Is it any surprise that scientists are depicted as cool, level-headed and unemotional? If we were depicted in a realistic way, we'd be raging, weeping, shouting caricatures of human beings. At least, that's how I feel I would appear.
Maybe the problem is internal. Each of us needs to learn that we don't need recognition for our achievements to mean something. We need to become detached from our efforts. We need to be that unemotional scientist.
But maybe the problem is more systemic. Two things come immediately to mind: first, I don't see why I should have to give up "ownership" of my accomplishments, and second, the academic science environment only rewards major accomplishments that have been suitably recognized. Someone who publishes regularly in Nature will get more grant money than someone who publishes in Il Nuovo Cimento, and someone at MIT will be more renowned for the same work than someone at New Mexico Tech. If we can't get funding to do our research unless the wider scientific audience recognizes our efforts and successes in that research, if we can't get a job unless we have the right number of publications in already-recognized journals, then the idea that we should be willing to go without recognition is the same as saying we should be willing to not succeed. If we want the system - and those in it - to be mentally healthier, we should work to disconnect this link between recognition and success. Recognition should be the reward for success, not the other way around.
One last thought, though, and it's a mea culpa: perhaps it's all just me. Perhaps I personally desire recognition far more than other scientists, and so the error originates when my brain invents a reality which assumes I am the average and not the outlier. I'd still like to see a change for the better, one where there is enough funding for everyone and people are not judged based on how often they publish, but such pipe dreams are still long in coming.

Monday, June 3, 2013

The usual

I wanted to share with all of you a bit of the troubles I've encountered lately, in case you think science is some glamorous thing wherein everybody has spotless laboratories filled with beakers of colorful bubbling liquids.
The system I'm building involves vacuum pumps... a lot of vacuum pumps. Six multistage Roots blower pumps, seven single-stage Roots blower pumps, and nine turbomolecular pumps, to be precise (plus an industrial compressor). I won't explain the details of how it all goes together, but I'll point out that the really important bit is those six multistage Roots (MSR) pumps. They back up the entire system, so if they don't work, the entire system doesn't work.
Now here's my problem.
The pumps we bought originally don't work the way they were meant to. They pump nitrogen gas just fine, but they die when pumping helium. And we need them to pump helium.
Now, we've managed to find a pump that does work the way we need it to, and after long negotiations we've set up an agreement with the original vendor to swap the pumps we have for the pumps we need. But each of these pumps weighs nearly 500 pounds, and it is not a simple matter to package them up and ship them across the country. Nor is it a simple matter to get the paperwork completed for a swap of thousands of dollars' worth of equipment. But that's what I've had to do, because that's part of my job.
The occupation "scientist," it turns out, includes a lot of things. Sometimes, like recently, my job is Purchasing Department or Shipping/Receiving. Sometimes it's Statistician or Data Analyst. Sometimes it's Electrician, Plumber, Machinist, Historian, Writer, Programmer, Debugger, Sys Admin, Benchwarmer, Babysitter, Mathematician, Engineer, Translator, Teacher, Mentor, and occasionally even Futures Forecaster (aka Mind-reader).
Being a scientist isn't always great. Heck, it isn't always fun. But it's rewarding.

Tuesday, May 7, 2013

Miss Atomic Bomb goes to Washington

Yesterday, I participated in what's known as a "fly in" - people linked by a common goal all come together in DC for one day to blitz as many public offices as possible with their message. Our common goal was funding for nuclear physics (we were, incidentally, all nuclear physicists); we came together for a day to let our representatives know that we supported the President's proposed budget for DOE Office of Science and that they, too, should support it.
One important thing to note about such fly-ins is that the people flying in don't often meet with the actual representative. Your Senator or Congressperson is too busy to meet with everyone who would like to share a story or voice an opinion, so instead you meet with a staffer - a legislative assistant who has been assigned to a specific topic (such as budget, or science & tech, or immigration policy). These people range in experience from fresh-faced political science majors just out of college to PhD scientists on AAAS Congressional Fellowships (I had the good fortune of encountering one such fellow in the office of Colorado's Senator Michael Bennet). One should not, however, assume that just because the meeting is with a staffer, the meeting is a wasted effort. These assistants bend the ear of their respective representatives, and can have a tremendous amount of influence - a Senator or Member relies upon the input of their staffers for making important decisions (because, as I said, they're busy). Just like you would trust the opinion of your butcher when buying meat or your mechanic when fixing your car, the staffers provide educated opinions on their assigned topics to the office where they work.
Because we're only in Washington for one day, schedules are tight. I had meetings with six different offices, in both the House and Senate, between 10:30am and 3pm (with a short break for lunch in the House cafeteria!). There were others with me, usually one or two, and the day's organizers made sure to provide us with materials that we could leave with people (such as pamphlets on how nuclear physics is important to national security, medical research, isotope production, and the like). All of the offices I went to were full of interested and supportive people, people who took time out of their day to listen to what we had to say. By the time I was finished yesterday afternoon, I was exhausted but pleasantly surprised at how the day had gone.
Today I'm back in my regular office, back to my regular work, and the whirlwind of yesterday already seems that much further away. But asking for something once isn't generally enough, so I'm sure I'll be back in Washington again to make sure we have funding, not just now, but for the future.

Friday, May 3, 2013

What's it worth?

Being a scientist is hard, not least because we're constantly struggling to find funding.

Our jobs are difficult, require more than average dedication, and yet are often only temporary. We bounce from one project to the next, hoping to find a permanent academic job (so that we can then fight for funding and tenure) or giving up and moving into less demanding occupations. We get paid little in the grand scheme of things, certainly much less than what we are worth. But we love what we do, and so we put up with it.

It took me roughly a decade of education past high school to become a scientist. Then, just like in the medical profession, I embarked upon a "residency" - postdoctoral positions, all temporary, where you are meant to learn even more than what your degree taught you. You have to go through these positions, often many of them, and often for many years (each lasts 1-3 years, depending on field of study, funding, etc), before you can even think about applying for a permanent job. So I'm now at the end of my third postdoc and finally applying for permanent jobs. Such is the nature of the beast, if you will. My effort, my hard work and dedication, all goes toward science - toward the furthering of the knowledge base of humankind - and for the most part, my satisfaction in this pursuit is enough reward.
Now let's consider a different story. A man, who starts with nothing but a desire, works hard, goes to college, designs a gadget that acts as both a phone and a radio (think iPhone), manages to sell the idea to a big firm and ends up rich. Our cultural zeitgeist says he earned it, through his hard work and dedication, and we should let him have it. Don't penalize the successful people, right?
Here's where we expose the lie. Is the story really all that different, at least at the start? I'm successful, too - or, I would be, if the "product" that I have worked to create were something other than intelligence. All of my hard work and dedication goes toward making something that we (as capitalists) have a difficult time understanding, much less assigning a monetary value to. If I had spent that last decade-plus of my life designing iPads or "special" assets, I'd be rich. As it is, however, my hard work and dedication are not rewarded. I don't earn money based upon my level of effort. In fact, sometimes, no matter how hard I try, funding dries up and I don't earn money at all.
Capitalism doesn't have to be this way - we can assign a monetary value to intelligence, or to protection of natural resources, or any of those other things which we know intrinsically have worth but which we never bother to quantify. We can acknowledge that my effort is worth money, just like the effort of someone who invents Windows software is worth money.
There is also the matter of where the money comes from. In the case of iPhones and Microsoft and Bank of America, the money for the thing comes from people who are less well off than the people who designed the thing. In other words, people who purchase iPads are, on the whole, not people who make nearly as much income as Steve Jobs did, and the people underwater in their BoA mortgages will never be as rich as the bank's president. The money that the rich make in a capitalistic society comes from the poor (or the "poorer"). But the money I get comes from the government, which means it comes from everyone. Everyone puts in a share, a little or a lot, and that is where my "reward" comes from. It used to be that the rich gave back to society by personally funding things like science experiments and symphonies and social welfare projects, but we don't have that anymore.

So the question is - how much is it worth? How much is intelligence worth, and why don't we reward it the same way we reward the capitalistic creation of crap? Why do I have to struggle for money, sometimes even for a job, when I worked just as hard to get to this point as someone who makes ten times what I make in a year?

Science is hard enough as it is.

Friday, March 15, 2013

A lesson on precision and accuracy from the highway patrol

On a recent trip down a long and lonely highway, I found myself being issued a warning from a courteous highway patrol officer for traveling 77 mph in a 75 mph zone.
But this got me thinking. With his radar gun (regularly calibrated and checked), he was able to determine my speed to probably quite a good precision - perhaps fractions of a mile per hour. So he knew how fast I was going. However, that doesn't mean that I knew how fast I was going.
There is a well-known rule about measurement uncertainty: the precision of any measurement can't be better than half of the smallest increment of the measuring device. Picture a ruler. Maybe it's a good one, with increments marked on it down to every 16th of an inch. If you were to measure the length of something with this ruler, you wouldn't be able to claim a precision better than half of 1/16 inch. The same is true for your liquid measuring cup: if the divisions on the cup are ounces, then your volume is only known to half an ounce. Even though you may be able to estimate the measurement more finely than that, it is invariably (and incurably) subject to that imprecision. Consider a meter stick, which has 1 mm divisions. If you measure the length of a piece of metal with that meter stick to be 179.3 mm, you would still have to report the length as 179.3 +/- 0.5 mm, because the precision of the meter stick is only half the smallest division (half of 1 mm).
The speedometer in my car has 5 mph increments. So even if, in practice, I can estimate from the location of the needle that I was going 77, I must report that number as 77 +/- 2.5 mph. Which means that it's entirely possible that I was, in fact, going the speed limit, or even slightly under it (77 - 2.5 = 74.5). Because of the inherent imprecision of my speedometer, I simply can't know my speed to better than 2.5 mph.
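In code form, the half-of-smallest-division rule is almost embarrassingly simple (a toy sketch using the numbers from above):

```python
def report(estimate, smallest_division):
    """Report a reading with the half-of-smallest-division precision."""
    return f"{estimate} +/- {smallest_division / 2}"

print(report(179.3, 1.0))  # meter stick, 1 mm divisions -> 179.3 +/- 0.5
print(report(77, 5))       # speedometer, 5 mph ticks    -> 77 +/- 2.5
```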
This, additionally, isn't the only problem I encounter in wishing to know my speed. While the highway patrolman's radar was probably calibrated recently, my speedometer may never have been calibrated (more likely, it was calibrated once, on the factory floor when the car was brand new). All sorts of things can affect the overall calibration of a car's speedometer, including the size and shape of the tires, the car's age, and the device originally used to perform the calibration. This kind of uncertainty (an inaccuracy, as opposed to an imprecision) is referred to as systematic (recall my discussion of the faster-than-light neutrinos), and it's the hardest kind to find and quantify. Even if my speedometer says I'm going 75, I might be going 73, or 77. I might even be going 80. The only way I can tell is to compare my result (my speedometer reading) with one or more simultaneous external results (like the patrolman's radar measurement).
In other words, I need an external reference to check my speedometer's accuracy, while the speedometer's precision is determined through the speedometer itself; and both are needed to fully understand the speedo's functionality and operation.
Fortunately, in this particular instance, I just happened to have a GPS unit with me. And the GPS unit agreed with both the radar and my speedometer: I was driving 77 mph. (In my defense, I was coasting down a small hill. If he'd seen me going up the hill instead, he'd likely have clocked me going 72.)
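Putting the two checks together (a toy sketch with this trip's numbers; the precision comes from the speedometer itself, the accuracy from the external references):

```python
speedometer = 77.0          # my estimate from the needle position, mph
precision = 5.0 / 2         # half the 5 mph tick spacing
references = [77.0, 77.0]   # the radar and GPS readings

offset = speedometer - sum(references) / len(references)
print(f"Reading: {speedometer} +/- {precision} mph (precision)")
print(f"Offset from external references: {offset:+.1f} mph (accuracy)")
```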
So from a simple traffic warning, I was able to learn that my car's speedometer is accurate to better than 1 mph and precise to 2.5 mph. My conclusion: don't let nerds go on road trips!

Sunday, February 17, 2013

Oh, God(win)

Godwin's Law, not really a law as such but more an idiom in the Murphy's Law sense, states that "as an online discussion grows longer, the probability of a comparison involving Nazis or Hitler approaches 1."
More plainly, Godwin's Law asserts that, given a long enough time, any online discussion (regardless of topic, scope or participants) will eventually contain someone making a comparative reference to Hitler or the Nazis.*
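For fun, a toy model (mine, not part of Godwin's formulation): if each new comment independently has some small probability p of invoking the comparison, the chance that a thread of n comments contains at least one tends toward 1 as n grows.

```python
# P(at least one Nazi comparison in n comments) = 1 - (1 - p)**n
p = 0.01  # assumed per-comment probability, made up for illustration
for n in (10, 100, 1000):
    print(n, round(1 - (1 - p) ** n, 4))
# -> 10 0.0956, 100 0.634, 1000 1.0 (to four decimal places)
```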
I had never seen this happen firsthand, however, no matter how well acquainted I was with the idea. It was the stuff of legend, something that happened to famous people or internet trolls.

That is, until a week ago.

An acquaintance on facebook, known to purposely provoke debate, posted a statement about gun control really being about our Second Amendment rights to bear arms against our government if necessary. Of course, knowing the person behind this particular argument, several people (myself included) responded with snarky comments: how can a government's own constitution condone treason against said government; or that if the point is really to arm ourselves against the government, why can I not have some submarines and an Air Force and a few tactical nuclear weapons to make the fight more even?

And then, so swiftly that no one saw it coming, enter The Thread Killer.

Thread Killer jumped right into the heretofore good-natured banter with an epic description of how he knew people who owned tanks and anti-aircraft guns and that they were prepared to use those weapons against the government if necessary. Without time for a breath, he stated additionally that it appeared it would be necessary because the "Socialist-in-Chief" Obama was trampling all over our constitutional rights.
I hoped, naively perhaps, that the levity of the original mood could be regained. I didn't know the person, but he was friends with the original poster, as I was... so surely it couldn't be as bad as the dark and dangerous internet, where comments are a free-for-all. "Obama a socialist? That's funny," I said. Especially (I thought it wise to point out) if one goes to Europe, where the real socialists can't stop rolling in the aisles every time they hear such ridiculous accusations.
Oh, no. Nope. It's not funny at all. Thread Killer pounced: it's not funny, it's horrifying, Obama is going to take away our guns and turn our country into Nazi Germany, and little me, the obviously brainless and acquiescing sheep that I am, well, well, well, I was just going to let it happen just like I would have let all those Jews waltz right into the gas chambers**. Luckily, Thread Killer was on the side of Israel, unlike the rest of us socialist Nazi wannabes who would rather do useless things like vote. (Ah, yes, and how dare I assume that he'd never been to Europe?)

I was taken aback. I tried to keep a lighter conversation going with the original poster, but I simply could not let such hateful and harassing statements go without reply. So I was matter-of-fact and brief: I should hope he would take back what he said, as it was unacceptable and shameful.

Well, that didn't work. Instead, Thread Killer demanded to know why he should take back his statements when they were true and Obama was a fascist, or, I assume, whatever Obama's currently thought to be on Fox News.

Because, I said. You just called me a Nazi. And you don't even know me.

After this, Thread Killer's comments in the thread disappeared, as if by magic. I thought that perhaps reason and civility had won the day, that he had seen the unnecessary cruelty and inappropriateness of his statements and thought it better to rescind them from public view. I continued a conversation with the original poster, but something seemed off about our back-and-forth. It seemed, well, not precisely back-and-forth. And a couple of days later, I discovered why.

Thread Killer hadn't deleted his posts - he'd blocked me.

Someone else, someone capable of seeing the entire thread, finally told me so. The conversation had continued without me. What I found the most discouraging was that Thread Killer didn't even want a debate to air his opinions. He didn't want me saying contradictory things or accusatory things in relation to his comments. He preferred not knowing whether anyone was listening at all to the certainty that I would argue with him.
The night of that post, I happened to be home alone, and let me tell you the sleep I got that night was fleeting and troubled. I heard every creak and snap of the house, every rustle and whisper of the trees outside, my brain convinced that someone willing to accuse me of destroying the country might also be willing to take whatever steps he deemed necessary to prevent me from doing so. A day or so later, I finally got up the courage to write one last post, explaining to anyone else who could see the thread that I had been blocked and thus could not read or respond to any of Thread Killer's comments. I should think, I said, that ownership of opinion is one thing that we can agree upon in this country. We take the right to free speech just as seriously as anything, or, at least, I thought that was true. So let it be known, world, I concluded, that I was disappointed in Thread Killer's behavior, despite the fact that the discussion itself had been essentially worthless. At least, while it was a discussion, it had the potential to get somewhere, but now that potential was lost completely. Two monologues do not a dialogue make.

It does sadden me that this happened; not just that it happened to me, but that it could happen at all. At what point did we give up on trying to compromise? At what point did we decide it's ok to hurl accusations at people we don't know, simply because they disagree with us on something? When did we begin to allow ourselves to believe that a person could be categorized so completely that the words we use to describe them don't even have to make sense anymore - how can someone be a socialist and a fascist dictator at the same time? - or, in another sense, when did we become so angry that it didn't matter what words we chose? Why does it seem to be so easy for an online thread to escalate from discussion to accusations and threats of violence? And how do we know if such violence isn't (or is) really intended?
Is it the nature of modern communication? We are so easily able to remain anonymous and hidden in a world of internet memes and text messages that perhaps we don't bother with civility anymore, the same way we're more likely to pick our nose in the car than elsewhere in public. Or is it the media, which blasts us with so much information that we're unable to process anything but whatever simple mantras it feeds us, without even knowing what they mean: socialist, Second Amendment, constitutional rights. Is it just human nature to feel that it's always "us versus them"?
I worry, mainly, because whatever the cause, this behavior is indicative of a dangerous current: the hidden undertow of willfully ignorant tribalism that threatens to destroy the very democratic social contract on which our country was, at least in theory, founded. We are losing our educated and informed public opinions, and we are losing our desire to even engage in the discussion and debate necessary to fairly and communally apply those educated and informed opinions. And that is a very frightening thought. Far more frightening than whether Obama is trying to take away our guns.


*It should be noted that Godwin's Law does not apply in the circumstances that the discussion is actually about something to which a comparison with or reference to the Nazis is relevant, such as WWII.

**I do not wish to make light of the tragedy that happened, but I also do not wish to mince words. The relevant portion of the exact quote in question: "Apathy like yours was the same apathy that allowed millions of Jews to walk into ovens." This was a hateful and cruel statement directed straight at me, and it is not acceptable to let such behavior go unchecked.

Sunday, January 13, 2013

On Holst, and the benefit of uniformity

This afternoon, I took in a chamber orchestra performance of four pieces by British composers. Before the concert began, the man seated next to me mentioned to his companion that the orchestra always reminded him of funerals - everyone dressed in black, with dark, heavy curtains draped around the stage as if at a wake*. I was surprised by this admission, actually, because to me, the pageantry is obvious; the purpose of the dark clothing and darkened stage boundaries is to allow the performers to fade into the background, and the music to take center stage instead.
The conformity - or uniformity - of the orchestra serves this purpose. Not only must they all dress alike (so much so that entire clothing companies exist solely to provide "concert black" - imagine if one cellist were wearing really dark blue instead), but they must even behave alike, bowing together and in time. One or two musicians in a section who are out of sync with the rest cause us to wonder if they are capable players; if everyone in the orchestra "did their own thing," it would be horribly distracting, and we would be unable to appreciate the music itself.
So there are, in fact, circumstances wherein conformity to a prescribed system is beneficial; there are instances where we must be willing to give up our individualism for the "greater good." In the case of the orchestra, it is the music; in the case of society, it is the betterment of human life. Of course, as soon as that system ceases to provide a net positive, it and its defined limits of conformity can be discarded. But we should not be disparaging of the system simply because it demands uniformity of its members.
Of course, it also helps when the "system" in question is Holst's St. Paul's Suite.


*The man seated next to me also indicated, rather matter-of-factly, to his companion that watching "a lot" of BBC America - Downton Abbey, presumably - makes one "an anglophile." So we mustn't take his opinions too seriously.

Sunday, January 6, 2013

The Scientific Stereotype

Paul Dirac was once famously (though perhaps anecdotally) quoted as saying, "In science one tries to tell people, in such a way as to be understood by everyone, something that no one ever knew before. But in the case of poetry it's the exact opposite!"
And we laugh, and perhaps picture Sheldon on Big Bang Theory saying something inanely similar.
Of course, we have a right to laugh, and I do not wish to sound as though I am a spoil-sport.

A little context to begin. The anecdote involving Dirac comes from the book Brighter Than a Thousand Suns: A Personal History of the Atomic Scientists by R. Jungk. The full story goes thusly:
Nearly all of the Americans who became well known later on for the development of atomic energy had been at Göttingen at various times between 1924 and 1932. They included Condon, who complained in lively fashion of the lack of comfort in the Göttingen lodgings; the lightning-brained Norbert Wiener; Brode, always deep in thought; the modest Richtmyer; the cheerful Pauling - one of Sommerfeld's pupils, who often came over from Munich; and the amazing "Oppie," who managed to pursue in Göttingen not only his physical studies but also his philosophical, philological and literary hobbies. He was particularly deep into Dante's Inferno and in long evening walks along the railway tracks leading from the freight station would discuss with colleagues the reason why Dante had located the eternal quest in hell instead of in paradise.
One evening Paul Dirac, who was usually so silent, took Oppenheimer aside and gently reproached him. "I hear," he said, "that you write poetry as well as working at physics. How on earth can you do two such things at once? In science one tries to tell people, in such a way as to be understood by everyone, something that no one ever knew before. But in the case of poetry it's the exact opposite!"
The difference between the two men could not be more pronounced.
At the Trinity test, which demonstrated the success of the first nuclear weapon in history and capped the years of the secretive Manhattan Project, the well-read Oppenheimer would quote the Bhagavad Gita: "Now I am become Death, the destroyer of worlds." He is often cited as having additionally thought of the verse: "If the radiance of a thousand suns were to burst at once into the sky, that would be like the splendor of the mighty one." (Incidentally, this second verse is where the Jungk book derives its name.)
Dirac, on the other hand, was "pathologically reticent, strangely literal-minded and almost completely unable to communicate or empathise."
Which of these two men should we wish to emulate? Which of these two has stayed true to the calling of humanity, that is, to be human?
And yet, Dirac often spoke of beauty, especially the beauty of mathematics. He went so far as to say that "getting beauty in one's equations" was a "sure line of progress," and that in fact he preferred the mathematical beauty to the "physical concepts" he "learnt to distrust." So he obviously understood the poetic impulse, if only on an unconscious level.

My point in relating this story is that, while we find the renowned social ineptitude and narrow focus of the scientist (specifically physicist) humorous, we must not be taken in by the lie. We must not be content to fall into that stereotype. We must not continue propagating this myth. On becoming scientists, we do not resign our titles as human beings.
It is to the detriment of both ourselves and the world if we, as scientists, fail to engage in other spheres of life. We have so much to gain from literature, music, art, nature, food, politics, economics, spirituality, philosophy... and life has much to gain from our involvement. We are all the more hypocrites if we focus only on science and yet demand that the rest of humankind accept our scientific views in addition to their own. If there is never any quid pro quo, then we will never be considered trustworthy. I do not mean to say that we should allow religious doctrine to steer scientific inquiry or that politics has a right to direct the topic of scientific study, but instead that we, as scientists and human beings, should at least understand that these different views exist. We should try our best to see the merit in other points of view, not degrade them simply because they differ from ours, and we should be able to take a step back - out of science, if you will - to understand the context of our own view. Science is not all there is, and we scientists should not live as though it is.
All of us, scientist or not, will benefit from the poetic impulse as well as the scientific one.

Humorous as it is, I have to side with Oppie on this one.