Monday, December 14, 2015

Keeping the Mass in Christmas

It's that time of year again, that festive season when people become outraged over red cups and shout angrily about the War on Christmas.
I'd like to address a couple of these controversies, if I may, from the point of view of someone who cries a little bit every time a headline in The Onion hits a little too close to the truth.
First: that all-too-familiar battle, "Happy Holidays" vs. "Merry Christmas." Let's get one thing straight: if someone came up to me on the street right now and said "Merry Christmas," he or she would be factually incorrect. It's not Christmas. It's Advent. Advent is one of those periods on the church calendar that most churchgoers (Catholics and Episcopalians excepted) have completely forgotten exists. It is the solemn and reflective period before the joy-of-hope-fulfilled that is Christmas, which officially begins on December 25th and lasts for 12 days (hence the song). It is most definitely not Christmas right now. (How would you feel if I wished you a happy Independence Day in April?)
From a more secular point of view, we currently exist in that nebulous and uncertain period of the year between Thanksgiving, Christmas, and the New Year - all of which, I might mention, are known as "holidays" - and thus a broader greeting such as "happy holidays" or "happy holiday season" is perfectly acceptable, as well as factually true. We don't say "happy holidays" because we hate Christmas. We say it because we love Thanksgiving and New Year's Eve.
Second: keeping the Christ in Christmas. I wonder why no one ever seems upset that we don't keep the Mass in Christmas (a good long Mass, starting early, with lots of standing and sitting and mumbling). But emphasizing the Christian tradition over the pagan one isn't historically accurate. Christmas is a mash-up of pagan traditions (cf. Saturnalia) and Christian values. We celebrate the birth of Christ as the actualizing of hope in a time of darkness (midwinter), but the idea of light vs. dark is far older. The tradition of bringing a tree indoors and bedecking it with candles (later, a string of electric lights) is a pagan ritual, not a Christian one. The same goes for having a huge feast with your extended family, a demonstration of the wealth of love and the hope that winter will someday end and your depleted stores will be replenished. In fact, many early Christians didn't celebrate Christmas at all. Anyone who wants to keep Christ in Christmas to the detriment of everything else that has become part of the Christmas tradition had better avoid getting a tree, cooking a ham, decorating the house with lights, or even giving gifts - a tradition once tied to end-of-the-year blessings. The same goes for American Christmas carols, nearly all of which are secular in content.
Third: anyone who fights over aspects of the Christmas season is completely missing the point of the Christmas season. Go have a hot chocolate in a red cup and relax.

Monday, June 8, 2015

Shooting yourself in the foot, science policy style

Last week, the White House's Office of Science and Technology Policy posted a note on the value of basic research (read it here). Of course, the letter is from people who do value basic research, people who understand that it is important and deserves to be funded. The thing they miss - the enormous, glaring, incomprehensible point - is that the argument itself is all wrong.
I appreciate that people like to know science is doing something for them, making their lives better. But why has that become the sole criterion for whether science is worth funding? Basic research, by definition, seeks only to know something that isn't yet known. It is knowledge for its own sake. Yes, sometimes (and the OSTP letter gives many excellent examples) the knowledge we gain can be applied to something that makes our lives better. Knowing the mating habits of the screwworm allowed us to eradicate it from our cattle ranches, saving billions of dollars. But this shouldn't be the reason we give for having funded the research in the first place.
We scientists get upset when someone comes up with a new criterion for funding, saying that we have to prove that the science we're doing will come to some eventual use. But we're the ones who have allowed this sort of criterion to evolve. We apply it to the past - we take examples of basic research and show how they eventually came to some public fruition - but we complain when that same logic is applied to the future. Why can't we just fund basic research, no strings attached (if you will), the end result being only knowledge? Not "let's fund basic science because maybe sometime in the future it will be useful to you," but "let's fund basic science because it is important that we know."
Knowledge, and its pursuit, have intrinsic value. We are not humans because we can make tools. We are humans because we care enough to understand them.

Friday, May 8, 2015

Lamar Smith and the science funding legislation

There's been a lot of angry press lately regarding the proposed science legislation. Don't get me wrong - cutting funding for climate science is bad, and denying that climate change is real is worse. But there's a silver lining here that people are ignoring.
The legislation would increase funding for basic research. It even goes so far as to explicitly state that the government should fund basic research rather than applied science. There is a tremendous truth in this - the private sector does not, and generally cannot, fund basic "pie in the sky" science: things that have no obvious application; things that may never have any useful application other than increasing humanity's store of knowledge; things which, fifty years from now, will surprise us by coming out of left field to create the next amazing thing. This is precisely the kind of science the government, as the largest single source of research funding, should be funding. Let the private sector use the knowledge we gain from basic research to develop applications and sell them. That's what they do.
So the climate research folks are up in arms about having their funding cut, but isn't it only fair that solar cell companies fund the continued development of more efficient solar cells? Isn't it only fair that continued research into renewable energy be supported by those corporations, individuals, and institutes for which renewable energy itself, and not basic knowledge, is the goal? On the other hand, is it fair to assume that Elon Musk and Bill Gates are going to fund research into fusion or biology?
Of course, all of our problems would be solved if only there were more funding overall - if the government could fund renewable energy research and basic fusion research, biology and climate science, mathematics and engineering. But we don't have that luxury, so we have to make tough choices. I don't often agree with the Republicans, and even now I don't agree with Smith's motivation, but I don't think it's necessarily a bad thing that he wants the government to fund basic science.

Wednesday, April 1, 2015

Movie Madness

So lately, I've been on a B-movie kick. Mostly low-budget monster and disaster films, though there was also a DVD set of police drama/action movies that I'm amazed anyone was ever given funding to make (picture this: Randy Quaid as a serious precinct detective hunting down a serial killer, or a 20-minute montage of Chuck Norris and Eddie Cibrian doing roundhouse kicks). I must admit, I've become a rather big (and ironic) fan of the folks at The Asylum, those brilliant minds behind Sharknado and Sharknado 2: The Second One. (NEWS FLASH: Sharknado 3 airs July 22nd!) It turns out, these people have made a hundred movies. Yep. One hundred. Most of them are blatant ripoffs of contemporary big-budget films, but some are real gems.
One noticeable flaw in every bad monster film or disaster flick is how inappropriately the characters respond to what's happening around them. Somehow, the best course of action always seems to be to run into the woods alone, or go skinny-dipping in that swamp, or try to fly through the storm, or have a gala fundraiser, or stand nearby holding a blowtorch and shouting theatrically while you watch your friend get eaten by something which is decidedly not fireproof. Now, I understand that in such... unique circumstances you may not be thinking straight, but even in these movies the main characters will eventually - after about 87 of the movie's 90 minutes - figure out the right course of action. So why not sooner? Why wait until most of the characters have been chewed to bits or crushed under rubble or lost to the vagaries of unimaginable death? It would save money on hiring extras.
It seems that there's a very fine line between killing off characters at random, based sometimes on the consequences of their poor decisions, and making characters make poor decisions which get them killed off. The former is realistic, while the latter is delicious fodder for MST3K. We might react stupidly out of fear or incredulity, and that reaction might lead to our demise. That's pretty human. But to formulate a plan, maybe not even a bad plan, execute it, then be so shocked that it failed completely that we stand still and let the monster eat us out of some weird refusal to believe in our own fallibility... I dunno. Maybe that is realistic.
Does the Sharknado phenomenon exist because we love the campiness of ridiculous premises and learned-this-on-the-internet CGI? Or do we connect with the characters somehow, letting them play out on screen all of the stupid mistakes that we hope never to make in life? I'm not sure.
What I am sure about is that little feeling of glee I get when the giant snake's dismembered head eats Debbie Gibson at the last moment.

Wednesday, August 13, 2014

Twitterpated, The Sequel

I've been neglecting you, I know.
Part of it is these new media - Twitter and, in a way, Facebook - and their demands. I make a point of posting something every day, which, it turns out, is difficult to sustain. My Facebook page now has over 300 followers, and I feel that I owe them something (this is not necessarily a bad thing), so each day I find news stories about science and link them. My Twitter handle has only 33 followers, but finding content to pass along there is somewhat easier. The difficulty with Twitter is finding something unique to say, instead of merely forwarding what others have already posted. Each tweet must stand alone, be self-sufficient and self-explanatory (at least cursorily), and be interesting enough that someone might care to read it or share it with others. My blog, on the other hand - and you know who you are - has about half a dozen readers.
Now, I'm not fooling myself into thinking that 300+ people like my Facebook page because they really like it and care about what I have to share. I think the majority of those people either agreed to follow my page because I know them personally and harassed them into it, or else are strangers who accidentally liked my page because it's the name of that song by The Killers (that song knocked me from #2 in a Google search to pages behind a nearly infinite number of YouTube videos and lyrics sites... grumble grumble). But maybe I can do something positive given those circumstances - maybe if I share enough stories about cool science facts and discoveries, things that we've learned over the years (evolution, climate change, and neurology come to mind), some of it will get through. Maybe someone out there will be chatting with her friends over coffee and recall seeing some news story about how scientists built a self-folding robot, and they'll discuss how neat that is, and maybe she'll take some engineering courses in college. OK, a bit optimistic perhaps, but the point is valid. Maybe I can make a positive difference, however small.
There is one other reason why I have been neglecting my blog, however, and it answers the question you probably have right now: "if you do want to make a difference, why should that involve Twitter and Facebook, but not Blogger?" It's a valid question. The answer is simple: health. (There is one additional but less important answer: summers are always busier, with interns and conferences and so forth.) Blogging requires a lot more effort than retweeting photos from NASA, and lately I haven't been well enough to have the energy for it. Without going into detail, this health issue has been a real struggle for me. I want to write - desperately - but I very often find that I can't.
Does it matter to anyone but me? Probably not. I write mostly for my own benefit, so there's very little (if any) global impact if I stop. This is why I've concentrated what energy I do have on the upkeep of MAB on Facebook instead of here. It (potentially) matters to more people. (NB: this is not an underhanded ploy to garner empty compliments!)
So that's that. I've let myself become twitterpated because it's less work than being bloggerpated. I am a limited resource (even more so with the health issue), so I must take care in delegating my effort. But for all that, I enjoy blogging far more than posting news stories on Facebook. So I won't completely give it up.

Monday, April 21, 2014

Twitterpated

As Thumper explained to Bambi, "he's twitterpated."
In the real world, Twitter has taken on a new meaning. Having recently joined the twitterverse - with considerable trepidation, I might add - I wanted to share my experience.
I have to admit, the Twitter format seemed impossible before I started. But the 140-character limit turns out to be a boon for creatively sharing information. The strict format fosters creativity, much as a highly structured poetic form encourages more imaginative use of language, or a small studio apartment motivates clever uses of space. I have to think about what I'm going to say and how I'm going to say it, to a greater extent than when blogging or posting to Facebook (and to say it without using txt spk is an even more difficult undertaking!). That isn't to say that I am sloppy with other forms of media, but rather that the format affects the content in a particular (and interesting) way.
Because of this, I think Twitter would be a good exercise for any scientist who wants to get better at communicating with the public. If it takes more than 140 characters to explain, in basic terms, what you're researching, then perhaps the research isn't so compelling after all... or (more likely) you're not doing a good enough job of explaining! (A decade or so ago, we used to call this kind of communication "talking points.")
So call me twitterpated. I don't mind.

Tuesday, April 8, 2014

The Many Worlds of Leo Szilard

I've just returned from the APS April meeting, which hosted a special session entitled The Many Worlds of Leo Szilard (yes, Neil deGrasse Tyson was also there....).
Sadly, I overheard two students in the auditorium behind me commenting that they had no idea who this "Sizlard? Zilard? Lizard?" guy was.
Dr. Szilard, a physicist born in Budapest in what was then Austria-Hungary, was instrumental in the development of the first nuclear reactor and patented early concepts for the electron microscope, the linear accelerator, and the cyclotron. He is even anecdotally credited with designing his own radiation therapy for cancer. My work would not be possible without the solid foundation he provided.
Despite his work on the nuclear bombs of the Manhattan Project, he had a tremendous respect for human life and hoped desperately that the United States would not actually use the weapons. He drafted a petition, collected signatures, and addressed it to President Harry Truman, but to no avail - the bombs were dropped on Hiroshima and Nagasaki, ending the war in the Pacific theater. Szilard was eventually dismissed from the Manhattan Project by General Leslie Groves for his suspected communist sympathies.
Today, few people outside of those interested in our nuclear history know his name (let alone how to pronounce it). In spite of this, he is forever memorialized by a crater named after him on the far side of the Moon.
I can only hope those students learned something. I'd rather not repeat the history that Dr. Szilard tried so valiantly to prevent.