Tuesday, November 24, 2009

Global warmi - I mean "climate change"

Megan McArdle, writing here in The Atlantic, criticizes the "scientists" (scare quotes rendered necessary by their alleged reprehensible behavior) involved in the CRU anthropogenic-global-warming email collusion/data fudging/destroying-fest. That's good; she should. She says that

...I have yet to see the makings of a grand conspiracy, rather than the petty bullying of the powerful over the weak, the insider of the outsider. I'll take the statements of this particular group of scientists with a little more salt in the future. But as far as I can tell, the weight of the evidence--and what we know about the history of the planet, and carbon dioxide--still seems to be on their side.


But here's the problem: what is the "weight of the evidence"? This group's original data set was destroyed. Inadvertently? Possibly... but considering that they were discussing destroying it in the event of a freedom-of-information request, that too is questionable. As one of McArdle's commenters said, "One notes that, in both law and common sense, when a person destroys evidence, you are allowed to presume that it supported his opponent's case."

Sure, it's not the only evidence. But there have been problems with every data set I've heard of: new urban heat-sinks where temperature stations used to be rural, cherry-picked tree ring data, glaciers growing where not measured for an AGW paper, shrinking where the AGW boosters measured, inability to explain the mechanism behind either an ice age or a warming period, pre-1960s temperature data that dramatically smooths out temperature variance graphs, Mars warming up in the absence of coal-fired power plants and SUVs... With all this uncertainty out there - uncertainty noticed only by "skeptics" (another McArdle commenter noted that skepticism used to be considered a virtue in science, not a vice) - one could be forgiven for believing that the really bad acting on the part of these CRU folks calls their findings into question.

I used to be a geologist. In the summer before I graduated, I worked for a gold mining company, collecting rock samples from a very steep hunk of the Sierra Nevada. Here's how you collect a rock sample: you locate yourself on a map, find a rock where you're standing, whack it with a rock hammer until you can collect enough pieces of it to fill a bag about the size of a quart of milk. Mark the location on your map with a unique identifier, mark the bag the same way, stick the bag in your backpack, take it to a lab. The first step - locate yourself on a map - is VITALLY important, because if you don't know where your sample came from, you can't draw any conclusions about where the most gold is. (Samples A, B, and C, in an east-west line, show increasing gold concentrations to the west; perhaps you could put a mine toward the western extent of your sampling area; you certainly wouldn't put it to the east. But say you mislabel your bag - I did that ONCE - or you can't find yourself on the map; where do you dig?)
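If it helps to see the logic spelled out, here's a minimal sketch in Python - every number is invented for illustration, not my actual assay results - of why the locate-yourself step matters: a trend fit only tells you where to dig if the positions attached to the concentrations are right.

```python
# Toy example: three samples along an east-west line (positions in feet east
# of a reference point) with assayed gold concentrations in parts per million.
samples = {
    "A": (1000.0, 0.8),  # (position_east_ft, gold_ppm) - easternmost, leanest
    "B": (500.0, 1.5),
    "C": (0.0, 2.4),     # westernmost sample, richest
}

def concentration_trend(samples):
    """Least-squares slope of gold concentration vs. easting."""
    xs = [pos for pos, _ in samples.values()]
    ys = [ppm for _, ppm in samples.values()]
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    return (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
            / sum((x - mean_x) ** 2 for x in xs))

slope = concentration_trend(samples)
print("dig toward the", "east" if slope > 0 else "west")

# Mislabel one bag - record sample C at the eastern end instead of the
# western - and the slope flips sign, sending the mine to the wrong side.
```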

My map, the best one available for my field area, had 40-foot contours. Think of contour lines as the "bathtub ring" around an object - put a big irregular rock in a tub, fill the tub exactly an inch deep, and look down on it from above. The "bathtub ring" around the rock would be exactly level at one inch, and would show the shape of that rock at a height above ground (the bottom of the tub) of one inch. Now fill the tub two inches high and look down on THAT bathtub ring to see the shape of the rock at two inches above ground. Et cetera. Draw those bathtub rings and you have a contour map of the rock, with bathtub rings closer together where the rock's side is steep, farther apart where it's more gently slanted down. Now scale up: a field area five miles square or so, with the "bathtub rings" at 40-foot intervals. It's mountainous - but most of the outcrops of rock, ridges, cliffs, are shorter than 40 feet. So the map, friends, is nearly useless: you can't tell where you are from it. I could be standing next to a thirty-foot ridge of rock as steep as the side of a building, and it'd be nowhere on my map if it happened to fall between, say, the 5,000- and 5,040-foot contours.
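For the computationally inclined, here's a minimal sketch (Python, with an invented elevation profile) of why a 40-foot contour interval swallows a 30-foot cliff whole: if a feature rises and falls without ever reaching a contour level, it never draws a single line on the map.

```python
# Toy elevation profile along a walking line, in feet. The bump from 5,005 up
# to 5,035 and back is a 30-foot ridge; the numbers are invented.
profile = [5005, 5010, 5020, 5035, 5020, 5010, 5005]

def contour_crossings(profile, interval):
    """Count how many contour levels the profile reaches or crosses."""
    crossings = 0
    for a, b in zip(profile, profile[1:]):
        lo, hi = sorted((a, b))
        # multiples of `interval` passed while going from lo up to hi
        crossings += hi // interval - lo // interval
    return crossings

print(contour_crossings(profile, 40))  # 0 -> the ridge never appears on the map
print(contour_crossings(profile, 10))  # 6 -> a 10-foot map would catch it
```

The whole ridge lives between the 5,000- and 5,040-foot contours, so a 40-foot map records nothing at all.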

So I had to start from a known point (say, the end of the high-center dirt road I drove to get to the area), tie off a 200-foot nylon tape measure, take a bearing with my trusty compass, and walk to the end of my tape. Mark that spot, on the ground and on the map. Go back and untie my measure, return to new known spot. Tie off measure, take a bearing, walk (or sometimes, essentially rappel, using my tape measure as a line - though the steep slopes naturally mess up the horizontal measurement)... until I could get to the area from which I was to collect a sample. Oh, and sketch in those 20- and 30-foot cliffs and ridges, for the next schmuck. Took a long time.
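What I was doing, in effect, was dead reckoning on foot. Here's a minimal sketch of the bookkeeping in Python - the bearings, distances, and slope angle are invented for illustration - including the correction you'd want when the tape is stretched down a steep slope instead of being held level:

```python
import math

def next_station(easting, northing, bearing_deg, taped_ft, slope_deg=0.0):
    """Advance one traverse leg from a known point.

    bearing_deg: compass bearing, degrees clockwise from north
    taped_ft:    distance read off the tape (measured along the slope)
    slope_deg:   slope angle; a distance taped down a slope overstates the
                 horizontal distance, so scale it by cos(slope)
    """
    horizontal = taped_ft * math.cos(math.radians(slope_deg))
    b = math.radians(bearing_deg)
    return (easting + horizontal * math.sin(b),   # east component
            northing + horizontal * math.cos(b))  # north component

# Start at the end of the road (call it the origin) and run two legs:
# 200 ft at bearing 045, then 150 ft at bearing 120 down a 30-degree slope.
x, y = 0.0, 0.0
x, y = next_station(x, y, 45.0, 200.0)
x, y = next_station(x, y, 120.0, 150.0, slope_deg=30.0)
print(f"station: {x:.0f} ft east, {y:.0f} ft north of the end of the road")
```

Every station's position depends on the one before it, so an error early in the chain drags every later station along with it.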

The point is, my model - my map - was missing a LOT of data, and in fact bore next to no resemblance to reality. I had to fill in those data as best I could, and I had to do it with an eye to the next baby geologist who would be sampling there - he or she would need to know where things were, just as I had. So I had to be (a) careful and (b) transparent. But even so, my map at the end of summer was only a little better than the one I started with, because it only improved where I was sampling; to look at my map, the next baby geologist might think, "Wow, this area here where Jamie was working sure had a lot of cliffs and ridges; luckily it evens out over here in the part she didn't get to. I'll start there!" And that baby geologist would find him- or herself rappelling down scree slopes hanging onto a 200-foot tape for dear life, just as I had.

The dangers of relying on a model, that's what I'm talking about. And when you throw in incomplete reality checks, reality checks in only the convenient or hypothesis-confirming places, and deletion of reality checks that either do not or might not agree with your model, well, the best face you can put on that is that, poor you, you're going to get all wrapped up in the model and lose the ground truth, ending up looking like a fool; the worst face is that you're trying deliberately to deceive.

Sunday, November 22, 2009

And just to clarify...

I've spent too much time today, as I did after seeing Twilight, reading movie reviews. The reviewers, almost to a wo/man, hate New Moon* - and who can blame them? It's not a movie for movie fans. It's a movie adaptation of - not even a book for book fans, but a book for fans of a specific mythos.

I am not a fool (she insists uneasily). At no time have I ever had even the sneakiest secret thought that Stephenie Meyer is a good writer**. I've had a free sample of her post-Twilight-series novel The Host on my Kindle since I finished the T-series, but haven't even been able to slog my way through its few pages; Meyer simply writes like a halfway decent fanfic "author." (I've read too much fanfiction; most writers in the genre really suck wind.) And of course I've dealt with Bella as Mary Sue, last year after reading Twilight.

In other words, it's not the writing.

As for plot, well, I wouldn't be the first to point out that there isn't much that makes sense, has any kind of inevitability, or appears to be moving forward in any of the four Twilight books. It isn't the plot.

Characterization: in short, all characters in Meyer's books appear to have come out of either a Harlequin romance or an iconography - a bad one. I remember a quote from the VERY funny Jean Kerr in Penny Candy, in which she describes her lengthy convalescence after a bad cold; she talks about how she wants to use this enforced bedrest to read a "good" book, and keeps trying to pick up some giant worthy tome - but reverts again and again to a trashy novel, even though she knows it's trashy. I don't have Penny Candy, to my sorrow, but she says something like this: "What is it with the way these characters are portrayed? If he has 'crisp black hair,' then she simply can't have 'moist red lips'; things like that just don't happen in real life. She should have 'dry pale lips,' or he should have 'limp gray hair.'" And so, no, it's not the characterization.

As I've said too many times now, it's seventeen, that's what it is: for someone like me, much older than seventeen, it's a grasping for what once seemed like a possibility, now revealed to be a fever dream - not just impossible but not even really desirable once the fever cools. For the kids of the right approximate age, I have to speak from memory, but it seems to me to be an affirmation that that bruised and battered sense of possibility, so alive in the ultra-romantic anti-cynics who think they're world-weary, could indeed be real... somewhere.

* I do find it interesting that they hate it for such varying reasons: the split is particularly clear between those who think Stewart is brilliant but hampered by the screenplay and those who think she's a one-trick pony being asked, in this movie, to do a trick she's just not up to. Lautner gets kudos for being the "only" member of the young cast who can act, or emote, or smile naturally, or make you care what happens to him; he also gets slammed for posing, reading lines, being utterly unbelievable, etc., etc. Pattinson is a minor character, really, but because he was so important in the first movie, HE gets roundly slammed for being all emo and stuff. Dakota Fanning gets oddly enthusiastic reviews considering how little she had to do - and honestly, put me in red contact lenses and I could smile enigmatically as well as she does. Across the board, they all seem to love Michael Sheen, whom I've never even heard of but who is evidently a "serious" actor - apparently you can be totally embarrassingly over-the-top and all the reviewers around will say you're being "deliciously campy" if you were a "serious" actor before. Some loved the soundtrack; some hated it. (I hated it.) Many, but not all, liked the cinematography. Weitz's direction was generally noted as a step up from Hardwicke's - but not in every case; some reviewers thought Hardwicke, as opposed to Weitz, really understood her subject matter, whereas Weitz is just out to make a buck. Whatever.

** But at least the Twilight books do explicitly encourage the reading of good stuff. Bella's a big booster of the classic romances.

As the kids say, zomg!

For a both hilarious and oogy take on the whole Twilight series, go here. But brace yourself: while everybody who's paid the slightest attention to the books knows that Stephenie Meyer, a Mormon, pays a certain homage to her faith throughout the series, the link seeks to explore JUST HOW MUCH homage.

F'rinstance: Edward Cullen, from the books' descriptions, looks a whoooole lot like Joseph Smith. Yeesh!

Friday, November 20, 2009

Seeing New Moon

It's November, and that means it's Twilight time! The breathlessly-awaited Part Deux, New Moon, came out today; I saw it with my oldest kid at 12:30 in the morning, all the midnight showings at our local theater being sold out.

So I'm tired. But here's the thing: I'd see it again this minute if I could, and not just because I find Robert Pattinson to be what the fangirlz call "a hot mess." (Though I do.) I went in Team Edward; I remain Team Edward, but only because it's fantasy, where it costs nothing to say, "You love who you love": in reality, who couldn't see that Jacob is better for Bella? Even my son was whispering that undeniable truth to me at almost three in the morning. I'd see it again for the same reason that I saw Twilight as many times as I could get to a theater, then squee'ed like those fangirlz again when my children got me the DVD for Christmas: because there's nothing like an utterly unreal romance.

The premise, set up by Stephenie Meyer, is perfect: vampires live forever, or near enough, because they simply no longer change physically; similarly, they change mentally or emotionally only with great difficulty, and any change of that sort that they undergo is for all intents and purposes permanent. So when Edward, a 109-year-old vampire, falls in love - for whatever reason! though I appreciate the bootlegged Midnight Sun's Edward-voiced explanation, on which maybe more later - with human Bella, it's a true endless love. And Bella, who's set up in the books more effectively than in the movies as a sort of vampire-lite even as a human - pale though she's grown up in Phoenix, standoffish, super-constant, readily accepting of the vampires' world - reciprocates that love in every measure including its permanence.

Of course, anyone who fell in love at seventeen knows that the love of a seventeen-year-old is like an old-fashioned sparkler: white-hot and exciting, quickly fading, and suddenly gone. Some few seventeen-year-olds find that their loves evolve into something deeper and longer-lasting; a couple of friends of mine who started dating at that age are happily and solidly married now, twenty years later, on that account. But the dastardly appeal of Edward and Bella's romance is that the white-hot excitement never has to fade. And who, as they put yet another load of laundry into the washing machine and read yet another story to the children who have resulted from a different species of love, wouldn't want to believe in that possibility?

My favorite moments: the collective gasp through the theater when cute little TOTALLY hunky (and recently legal) Taylor Lautner gratuitously whips off his shirt to stanch Bella's bleeding head wound; even though we'd all seen bazillions of pictures of Lautner's buffing-up, it was jolly good fun to see it all together with our (mostly) comadres on the big screen. The latter third of the movie, wherein they finally let Edward ditch the red lipstick and look absolutely haggard and awful in his grief. And, even though it didn't have the *whoo-ee* of the first kiss in Twilight, the few kisses in New Moon focused less on Edward's giddy triumph at managing to kiss Bella without killing her and more on his pain and difficulty in kissing her; one kiss in particular, I can't recall which, stood out because he gives a little whimper at the end. And I'd be lying if I said that the scene in which Edward leaves Bella in the woods, when he's trying to convince her that he's leaving because he doesn't want her any more rather than because he's desperately afraid that he'll end up either killing her or not being able to protect her from his own family, didn't make me turn cold all over.

Pattinson was, I thought, spot-on as a man with no more will to live, and then, finally, after Edward and Bella's reunion, a man who's decided to live again but is terrified of the price; I've seen some reviews call him wooden or mopey, and I disagree wholeheartedly. He struck me as hopeless, which is exactly what Edward's supposed to be. Stewart's sometimes near-suicidal, sometimes inappropriate-affect Bella is harder for me to feel sorry for - probably because I actually was an eighteen-year-old human girl in love with the wrong guy once, and I lived not only to tell the tale, but to love the right guy and build a life with him. And so we get to Lautner: the right guy. He did a fantastic job making me, die-hard Team Edward as I am, wish that there were some Star Trek-esque alternative reality scenario in which he gets the girl. I bled for him in a way that I couldn't for Bella or Edward - who, after all, were going to end up together; all poor Jacob gets, in the end, is an awkward imprinting on Bella and Edward's baby daughter, a way to heal a mythic breach but hardly more than a consolation prize for a guy whose devotion never flagged.

Wonder when I can get away to see it again...

Monday, November 09, 2009

A useful illustration

Ross Douthat, writing in the New York Times, made the case that the anniversary of the fall of the Berlin Wall (today!) should be more noteworthy than it is. He said that "For most of the last century, the West faced real enemies: totalitarian, aggressive, armed to the teeth. Between 1918 and 1989, it was possible to believe that liberal democracy was a parenthesis in history, destined to be undone by revolution, ground under by jackboots, or burned like chaff in the fire of the atom bomb....Twenty years ago today, this threat disappeared."

One commenter, a popular guy (his comment recommended by 252 readers as I write) who likes the word "specious," responded thusly:

That is utter nonsense. More like the chaff of fear that Mr. Douthat's ilk uses to obfuscate the truth. Douthat needs to go back to school and study history. It was Mikail Gorbachov that in 1988 announced that the Soviet Union would abandon the Brezhnev Doctrine and allow the Eastern bloc nations to freely determine their own internal affairs.

It was Gorbachev that ended the cold war. Not Günther Schabowski and not Ronald Reagan. It was the insightful courage of Mikail Gorbachev that ended it.

After a not so veiled dig at President Obama for not attending a 9 November ceremony in Germany, Mr. Douthat delivers this false paean:

"Never has liberation come to so many people all at once — to Eastern Europe’s millions, released from decades of bondage; to the world, freed from the shadow of nuclear Armageddon; and to the democratic West, victorious after a century of ideological struggle."

Balderdash, Mr. Douthat. Why is it that it was Mikail Gorbachev who was awarded the Nobel Prize for Peace and not Günther Schabowski (and never Reagn)? Why does Mr. Douthat ignore the facts of history? Because he cannot make his specious point if he adheres to the truth.

The final nail in the coffin of Mr. Douthat's specious treatise comes in a statement from his last paragraph: "Maybe we miss living with the possibility of real defeat." The problem, Mr. Douthat, is that the rise and fall of a great nation has its lesson even today. Great nations don't fall from forces arrayed against them from without. They fall from the corrupt forces that rot them from within.

Yes, America is in danger of defeat but not from external enemies. We are on the road to defeat because of the naysayers in our Congress and the hatemongers who cannot abide Barack Obama's Presidency.

And that, as Edith Ann was wont to say, is the truth.

I've reproduced virtually the entire comment for two reasons. First, the commenter offers Gorbachev's Nobel Peace Prize as evidence of some kind (please see my prior post - oh please, if you're one of the few who really do believe that Pres. Obama "earned" that prize! - for how I and many others feel about the anointing of somebody or other by a few Scandihoovians - of whom, two generations removed, I'm one in part). Sheesh.

And second, gosh, I agree with his third-to-last paragraph - the one where he says a grave danger to the United States is rot from within. But in the penultimate paragraph, wherein he says that the rot emanates from "hatemongers who cannot abide Barack Obama's Presidency" - that's where, obviously, I disagree. I believe, and I believe that I have actual history on my side, that it's the push to increase government control of individuals that is the "rot from within" we should dread. Not the people who are against increased government control; the people actually fighting to bring it about.

The commenter doesn't have history to back him up; he doesn't even have white-sheeted midnight bonfires to back him up. He has his feelings. And he signs himself "Cmdr" - that is, "Commander." Of what?