I'm a fan of science. I use it every day. I'm using it right now. And with a smile. But that doesn't mean, in my view, that science should poke its nose into everything. Particularly when the science under consideration is the kind that scientizes something just because it can. I'm talking here about two related phenomena: The first is for researchers to study and then quantify something that was more fun when it was explained via non-science, and the second is for news organizations to report such pseudo-science as, well, news. Not only does the scientization suck the fun out of these events, it also cheapens the idea of scientific journalism.
I bring this up because today, on three major (online) news outlets, I came across headlines for three articles detailing such scientization. These were: an ABCNews.com story on the mortality rates of rock stars, a CNN.com report on a study that "confirms" that men are attracted to attractive women, and a stop-the-presses from MSNBC.com about how scientists have (finally!) located the gene which controls skinny. My issue is not just that these studies seem a little unfocused (look at the sample sizes, for instance), or that this was headline news (which it was on two of the three sites), but rather that I can't figure out the motive behind the stories and their newsworthiness in the first place. Why is it worth scientizing things that most people knew anyway? Who benefits from this research and reporting? What am I supposed to do with this information?
Now, I should be upfront about my standards: the only two magazines that I read cover-to-cover each week are Entertainment Weekly and Newsweek. So, I certainly admit to having a lot of flex in what I'm willing to take as "news." But I do spend at least an hour each morning reading over my local and national online news sources. So I have some standards. And I've spent a good deal of time in my professional life looking at how science and popular culture intersect, so I think I'm coming at this from a fairly reasonable perspective.
Despite all of that, I am absolutely flummoxed by these three articles. If you read them, you'll see that they are very descriptive, but not at all prescriptive. By that I mean, they say a lot about what the studies report (Shocker! A rock 'n' roll lifestyle is not conducive to longevity!), but say very little about what they mean or what you or I, or even a rock 'n' roller, should do about any of this. You're either genetically skinny or you're not. If you're a heterosexual female, you're also either hot enough or you're not (interesting that these two stories came out on the same day, no?). And it turns out that, in both cases, being skinny and "hot" may not be the best thing, except the articles don't go into too much detail about what you could or could not do about it if you were or were not. You are whatever you are, and then that's that. Frankly, it's neither science nor reporting so much as it is just charting. They put the diagram up on the board, and then you stand next to it and see how you measure up. What comes next ... well, apparently that isn't important. To my mind, it isn't too much to ask of science journalism that, when reporting a study, it take a few steps out on the limb and answer the notorious "So what?" question.
But maybe I'm missing the point. Maybe the "So what?" question is presumed (or is it assumed?). Maybe it's enough these days simply to know how one measures up. But is that the most we should expect, even from pseudo-science? That it provides people with a scapegoat? "I'm not fat; I'm genetically predisposed to surviving famine." This is America's new, brilliant strategy! Eat all the food so that when the rest of the world starves, we'll be able to have it all. But if this logic is true, then it may explain the skinny and hot articles, but not the dead rockers. What is the point of studying, and then reporting, the mortality rates of rock stars? I suppose, and I'm stretching here, that such a study would be beneficial to actuaries. But I'd also suppose that, in the general risk assessment, giveaways like a fiendish drug habit and an inability to blink in unison -- not to mention checking the "Yes" box for the "Are You Keith Richards?" question -- would make such research redundant.
As I said, I'm not just irked by the fact that this seems to be both dead-end pseudo-science and dead-end journalism -- marginally informative yet woefully under-instructive. I'm also upset that scientists feel they need to apply their methodologies to things that are more fun, even more useful, without quantitative data. The morbid mythology of rockers was running along just fine before someone drew a graph across Kurt Cobain's tombstone. I don't think most people need a study confirming for them whom they are attracted to (maybe a study telling them why, but certainly not whom). And I'd wager that, just like the skinny gene study, seeing the answers to those questions quantified isn't going to do a whole lot for one's outlook on life. And I think that's a real concern. It's not just un-fun, it's depressing to see things like rock 'n' roll, dating, and even body types reduced to numbers without a sense of what comes next or even what to make of it.
There are plenty of things out there that need the efforts and attention of real science. I love the science that lights up the imagination, and science journalism that puts those ideas within reach. I need that kind of science, and I want to read about that kind of science. I'm not as sure that I need to read about a study that proves that Cobain, Lennon, and Elvis all died before their time, that they probably were attracted to "hot" women, and that two out of the three probably had the skinny gene. I kinda sorta had that stuff figured out by myself already. I read it in Entertainment Weekly, where that kind of journalism happily belongs.