Going to see a rock
On the morning of my birthday, we drove down to the Evergreen Aviation Museum outside McMinnville, OR. Evergreen is the home of the Spruce Goose, which looms over the display area, making all the other planes look like toys. A Titan rocket and an SR-71 are tucked under one wing, while displays on the early history of aviation are tucked under the other.
The museum sits in the middle of a vineyard and produces its own label of wine. After touring the displays, we went to the wine shop for a tasting. The pourer did a great job of chatting up the visitors and was soon asking us about our trip.
"Where are you headed next?" she asked.
"Well," my clever wife started, "we're going to see a rock."
"Oh. Are you going out to the coast to see Haystack Rock?"
"No. This is a big rock by the road. There was this glacier..." She paused. "He better explain it."
It's called the Belleview Erratic (also spelled Bellevue), and technically it's a large chunk of metamorphic argillite. It's a big, dark rock, about the size of a mid-size sedan. It's flat and broken into three pieces. It sits on the top of a hill. To the broadly imaginative, a big flat rock sitting on the top of a hill might look vaguely ceremonial. To others, it probably just looks like a good place to stand while taking in a view of the valley. It's not a ceremonial altar. It is, in fact, just a rock. What makes it interesting is how it got onto this hilltop in western Oregon.
Those who took a geology class and remember their lessons will know that an erratic is a rock that is out of place. Most famously, erratics are rocks that glaciers carry down from the mountains and deposit in the plains. It was the discovery of erratics that allowed early geologists to conceive of the ice ages and to trace the outlines of the lost ice sheets and glaciers. But the Willamette valley was never glaciated, and the Belleview Erratic doesn't come from the surrounding mountains. It is a piece of southern British Columbia, and it floated to that hilltop in Oregon.
Toward the end of the last ice age, the Cordilleran Ice Sheet extended across the Canadian border and covered a sliver of northern Washington, Idaho, and Montana. The western part of Montana, between the Continental Divide on the crest of the Rockies and the Idaho state line on the crest of the Bitterroots, is drained by the Clark Fork of the Columbia River. Unlike most rivers in the United States, the Clark Fork flows north. It loops across the Idaho panhandle and joins the Columbia in the extreme northeastern corner of Washington. During the ice age, this loop through Idaho was blocked by the southern tip of the Cordilleran Ice Sheet, a glacier called the Purcell Lobe. Behind the Purcell Lobe, all of the drainage of the Clark Fork was dammed up, eventually forming a Great Lakes-sized body of water called Lake Missoula. When the lake was full, the future location of the town of Missoula would have been a thousand feet underwater.
The Belleview Erratic probably began its journey in southern British Columbia under the uphill end of the Purcell Lobe. Glaciers are made of snow, normal snow, which fell and covered the ground. When the snow gets deep enough, its own weight compresses it into ice. When a big enough, heavy enough mass of this ice gathers on a hillside, it will start to slowly slide down the hill. At this point, it becomes a glacier.
Hillsides are not perfectly smooth. They have irregularities, such as trees and rocks. As a glacier forms, these irregularities are encased in the ice. They become part of the glacier. When the glacier moves, they move with it. As the front end of the glacier melts, the debris that it has carried downhill is deposited in a disorderly pile. In time this pile forms a ridge in front of the glacier called a moraine. When the glacier retreats, the moraine remains. By following these ridges of erratics--these terminal moraines--geologists can map where the ancient front edges of glaciers were.
At least that's how it usually works.
One day, about fifteen thousand years ago, the Belleview Erratic was nearing the end of its glacial journey. It had been carried from the mountains by one glacier. That glacier joined with others to form the Purcell Lobe, which carried the Belleview Erratic south into Idaho. As it neared the front of the glacier, ready to be deposited in the moraine, something new happened.
The water that had been gathering behind the glacier reached a critical depth. At that end of the lake, the water was almost two thousand feet deep. The pressure at the bottom was enough to force its way under the glacier. In a few hours' time, the water floated the southern tip of the Purcell Lobe and began to rush out under it. The glacial dam collapsed and was torn apart by the rushing water. One piece of glacier, carrying the Belleview Erratic, became an iceberg and floated away on the flood.
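The flotation mechanism is easy to check with a rough back-of-the-envelope calculation. The sketch below is a minimal illustration of my own, not something from the books cited at the end of this post: the 2,000-foot ice-dam thickness is an assumed round number, and the densities are standard textbook values for glacier ice and fresh water.

```python
# Minimal sketch of the ice-dam flotation condition described above.
# Assumptions: glacier ice ~917 kg/m^3, fresh water ~1000 kg/m^3, and an
# ice dam roughly 2,000 feet thick (an illustrative figure, not a measurement).

RHO_ICE = 917.0     # kg/m^3
RHO_WATER = 1000.0  # kg/m^3

def flotation_depth(ice_thickness):
    """Water depth at which hydrostatic pressure at the base of the dam
    matches the weight of the overlying ice column, so the ice starts to lift.
    Works in any length unit, since only the density ratio matters."""
    return (RHO_ICE / RHO_WATER) * ice_thickness

dam_thickness_ft = 2000.0
print(f"A {dam_thickness_ft:.0f}-foot ice dam floats once the lake behind it "
      f"is about {flotation_depth(dam_thickness_ft):.0f} feet deep.")
# About 1,830 feet: roughly nine-tenths of the ice thickness, which is why a
# lake 'almost two thousand feet deep' could lift the tip of the Purcell Lobe.
```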
The Lake Missoula Flood is the biggest flood ever discovered. In eastern Washington, it tore up entire counties' worth of topsoil and carried it away. In parts of the Columbia valley it passed as a five-hundred-foot-high wall of muddy water traveling up to one hundred miles per hour. It pushed hurricane-force winds ahead of it. The water itself was a churning brown mass the color and texture of a runny chocolate milkshake. It carried with it topsoil, rocks, trees, entire herds of mammoths, and the icebergs that had formerly been part of the Purcell Lobe. For two weeks, the Columbia carried several times more water than all the rivers in the world combined.
Where the flood ran into narrows, especially on the Lower Columbia, it backed up to form gigantic temporary lakes. These lakes backed up into side valleys, and there, as the water slowed, it deposited some of its load of soil and rocks. That topsoil from eastern Washington has made the valleys of the Yakima, Walla Walla, and Willamette some of the most productive agricultural areas in North America, especially for wine.
The Belleview Erratic was carried into Lake Allison, the temporary lake that filled the Willamette valley. South of Portland, it would have been out of the most active part of the flood. The waters would have grown calmer as the iceberg drifted south and finally ran aground on a submerged hill. After a few days, the waters would have receded, leaving the iceberg stranded. Then the iceberg melted, and there was just a big, dark rock on a hilltop.
The Lake Missoula flood wasn't a singular event. After the waters rushed out, the Purcell Lobe pushed forward and blocked the Clark Fork valley again. The river backed up and formed a new lake. After forty or fifty years, it was deep enough to float the glacier again and flood the Columbia again. This cycle dominated the Columbia valley for over two thousand years; there is sedimentary evidence of between forty-one and eighty-nine floods, although at least two came from lakes other than Lake Missoula. No one knows which flood brought the Belleview Erratic to its current home.
After we left the Evergreen Aviation Museum, my clever wife and I traveled west on Highway 18. A geology book I had brought along said the rock was on the north side of the highway, just before Sheridan. We scanned the hilltops. We had a picture that showed the rock above and behind a small white farmhouse. We hoped to recognize the farmhouse.
The Oregon Department of Highways has put up nice blue signs pointing out all the vineyards, especially the ones with some kind of tour or gift shop. On the signpost for Oldsville Road was a sign for Yamhill Valley Vineyards. Beneath it was a small, brown, older style sign that said simply "Glacial Rock."
We passed it doing fifty. I turned around about two miles down the road and came back. Oldsville Road immediately forked, with one fork paralleling the highway and one heading into the hills. The highway was visible in our picture, so I took the road closest to it. I saw a brown sign behind some trees and thought, "Aha, that must be a Park Service sign." It wasn't; it was the winery's. We followed the road a little further. It approached the highway and then swerved away. In the crook of the curve was another little brown sign. This one said "Glacial Erratic Rock" and it marked a trail. The road had been widened to create two parking spaces across from the sign. We looped around and parked.
Beneath the sign was a Park Service plaque explaining about the Lake Missoula flood. The trail followed a fence line. On one side the owners had planted Pinot grapes. There were ripe grapes on the vines. Blackberries grew along the fence. They're a noxious invasive species in the Northwest, but the berries were ripe and we picked a few as we climbed. It was a hot day and the trail was steep, but it was also short and there was a breeze. The trail went under a tree, rounded a bend, and approached the hilltop. And there it was--a big, dark rock. It's not as impressive in real life as in the pictures.
The Bellevue Erratic is the largest erratic in the Willamette valley. Unfortunately, since it was identified in the 1950s, tourists have been plucking pieces off it for souvenirs. When it was first measured, scientists estimated it at one hundred sixty tons. In 1980, they estimated it at ninety tons. Today, about forty tons are left. For fifteen thousand years it sat unmolested in anonymity. Fifty years of marginal notoriety are all it has taken to nearly destroy it.
In a few years I suppose the last significant chunk will be carted off to a museum, where it will just be another rock with a label. Although the best museums might provide the rock with an exciting display--a mural background, a diorama, or a sophisticated animation showing how it got there--they will never be able to capture the experience of sitting on the rock in the very place where it was deposited fifteen thousand years ago, feeling the sun and the air, and trying to strip away the effects of time and man on the valley.
I sat on the rock for a few minutes and tried to imagine its journey. I looked across the valley at the Cascades trying to pick out our altitude on the other side, trying to picture what the valley looked like with all that water. I felt the heat that the morning sun had imparted to the rock. I thought of the other people who had done the same thing. Then we were on our way.
When we left, we only took pictures and blackberries with us. I already had all I needed from the rock.
Update: There are several books that at least mention the floods, but these are the two I read before the trip.
David Alt. Glacial Lake Missoula and Its Humongous Floods. Mountain Press Publishing Company, 2003.
John Eliot Allen, et al. Cataclysms on the Columbia. Timber Press, 1986.
Thursday, August 31, 2006
Wednesday, August 30, 2006
Keith Olbermann is my new hero
Ever since 9/11 the Bush administration has followed a contemptible strategy of besmirching the patriotism of any who dare dissent from or even question their policies. It has been a calculated policy designed to extract the maximum partisan advantage from a national crisis. At a time when we should all pull together for our common security, they have chosen to ignore the real dangers and divide us by demonizing a large part of the American population. Now that they are losing support, they are lashing out like a wounded and cornered animal. Their accusations are getting cruder and more hateful. Usually, they allow supporters outside the administration itself to do their dirty work. However, as that support has been slipping away, they now find it necessary to do their own dirty work.
Fascism is the theme of the moment for the administration. It is a theme we'll hear a lot about for at least the next two weeks--until the anniversary of 9/11--and possibly right up to the November elections. Earlier this month, Bush tried out a variation of Christopher Hitchens' vile phrase "Islamo-fascism." It went over well with the faithful and so it has become an official talking point for the GOP and its supporters. It is a new low for an administration that has been characterized by low behavior.
Tomorrow, Bush will begin a three-week series of campaign-style speeches to remind us how much danger we are in if we question his policies. Yesterday, Defense Secretary Donald Rumsfeld accused critics of the Bush administration's Iraq policies of "moral and intellectual confusion" and of lacking the courage to fight terror, and he drew explicit parallels between those critics and those who appeased Hitler in the 1930s. Bush says he hopes no one will politicize his clearly political marketing campaign.
In a moment like this, we need a public person who can combine eloquence, clear sight, and historical perspective to call this contemptible behavior for what it is. Keith Olbermann has stepped up to try to fill that role. His words deserve to be spread as far as possible.
The man who sees absolutes, where all other men see nuances and shades of meaning, is either a prophet, or a quack.
Donald H. Rumsfeld is not a prophet.
Mr. Rumsfeld’s remarkable speech to the American Legion yesterday demands the deep analysis—and the sober contemplation—of every American.
For it did not merely serve to impugn the morality or intelligence -- indeed, the loyalty -- of the majority of Americans who oppose the transient occupants of the highest offices in the land. Worse, still, it credits those same transient occupants -- our employees -- with a total omniscience; a total omniscience which neither common sense, nor this administration’s track record at home or abroad, suggests they deserve.
Dissent and disagreement with government is the life’s blood of human freedom; and not merely because it is the first roadblock against the kind of tyranny the men Mr. Rumsfeld likes to think of as “his” troops still fight, this very evening, in Iraq.
It is also essential. Because just every once in awhile it is right and the power to which it speaks, is wrong.
In a small irony, however, Mr. Rumsfeld’s speechwriter was adroit in invoking the memory of the appeasement of the Nazis. For in their time, there was another government faced with true peril—with a growing evil—powerful and remorseless.
That government, like Mr. Rumsfeld’s, had a monopoly on all the facts. It, too, had the “secret information.” It alone had the true picture of the threat. It too dismissed and insulted its critics in terms like Mr. Rumsfeld’s -- questioning their intellect and their morality.
That government was England’s, in the 1930’s.
It knew Hitler posed no true threat to Europe, let alone England.
It knew Germany was not re-arming, in violation of all treaties and accords.
It knew that the hard evidence it received, which contradicted its own policies, its own conclusions — its own omniscience -- needed to be dismissed.
The English government of Neville Chamberlain already knew the truth.
Most relevant of all — it “knew” that its staunchest critics needed to be marginalized and isolated. In fact, it portrayed the foremost of them as a blood-thirsty war-monger who was, if not truly senile, at best morally or intellectually confused.
That critic’s name was Winston Churchill.
Sadly, we have no Winston Churchills evident among us this evening. We have only Donald Rumsfelds, demonizing disagreement, the way Neville Chamberlain demonized Winston Churchill.
History — and 163 million pounds of Luftwaffe bombs over England — have taught us that all Mr. Chamberlain had was his certainty — and his own confusion. A confusion that suggested that the office can not only make the man, but that the office can also make the facts.
Thus, did Mr. Rumsfeld make an apt historical analogy.
Excepting the fact, that he has the battery plugged in backwards.
His government, absolute -- and exclusive -- in its knowledge, is not the modern version of the one which stood up to the Nazis.
It is the modern version of the government of Neville Chamberlain.
But back to today’s Omniscient ones.
That, about which Mr. Rumsfeld is confused is simply this: This is a Democracy. Still. Sometimes just barely.
And, as such, all voices count -- not just his.
Had he or his president perhaps proven any of their prior claims of omniscience — about Osama Bin Laden’s plans five years ago, about Saddam Hussein’s weapons four years ago, about Hurricane Katrina’s impact one year ago — we all might be able to swallow hard, and accept their “omniscience” as a bearable, even useful recipe, of fact, plus ego.
But, to date, this government has proved little besides its own arrogance, and its own hubris.
Mr. Rumsfeld is also personally confused, morally or intellectually, about his own standing in this matter. From Iraq to Katrina, to the entire “Fog of Fear” which continues to envelop this nation, he, Mr. Bush, Mr. Cheney, and their cronies have — inadvertently or intentionally — profited and benefited, both personally, and politically.
And yet he can stand up, in public, and question the morality and the intellect of those of us who dare ask just for the receipt for the Emperor's New Clothes?
In what country was Mr. Rumsfeld raised? As a child, of whose heroism did he read? On what side of the battle for freedom did he dream one day to fight? With what country has he confused the United States of America?
The confusion we -- as its citizens— must now address, is stark and forbidding.
But variations of it have faced our forefathers, when men like Nixon and McCarthy and Curtis LeMay have darkened our skies and obscured our flag. Note -- with hope in your heart — that those earlier Americans always found their way to the light, and we can, too.
The confusion is about whether this Secretary of Defense, and this administration, are in fact now accomplishing what they claim the terrorists seek: The destruction of our freedoms, the very ones for which the same veterans Mr. Rumsfeld addressed yesterday in Salt Lake City, so valiantly fought.
And about Mr. Rumsfeld’s other main assertion, that this country faces a “new type of fascism.”
As he was correct to remind us how a government that knew everything could get everything wrong, so too was he right when he said that -- though probably not in the way he thought he meant it.
This country faces a new type of fascism - indeed.
Although I presumptuously use his sign-off each night, in feeble tribute, I have utterly no claim to the words of the exemplary journalist Edward R. Murrow.
But never in the trial of a thousand years of writing could I come close to matching how he phrased a warning to an earlier generation of us, at a time when other politicians thought they (and they alone) knew everything, and branded those who disagreed: “confused” or “immoral.”
Thus, forgive me, for reading Murrow, in full:
“We must not confuse dissent with disloyalty,” he said, in 1954. “We must remember always that accusation is not proof, and that conviction depends upon evidence and due process of law.
“We will not walk in fear, one of another. We will not be driven by fear into an age of unreason, if we dig deep in our history and our doctrine, and remember that we are not descended from fearful men, not from men who feared to write, to speak, to associate, and to defend causes that were for the moment unpopular.”
And so good night, and good luck.
Go watch the video.
Dr. Strangerobbins*
Every so often, Seed magazine asks its community of science bloggers to comment on a question. It helps make the place more of an interactive community and probably encourages readers to try out some of the blogs that they might not regularly follow. At least, that's how it works for me. They're only asking the questions of the blogs that they host, but there is nothing to stop other bloggers from throwing in their two cents worth. This week's question caught my interest, so here I am.
The question is:
I read this article in the NRO, and the author actually made some interesting arguments. "Basically," he said, "I am questioning the premise that [global warming] is a problem rather than an opportunity." Does he have a point?
The article in question, by James S. Robbins, is called "Hooray for Global Warming," and it is a real horror. This is his thesis:
Global warming is great. Granted, maybe it isn’t really happening, and if it is there are strong reasons to doubt that humans have anything to do with it. But if the world is warming, I say “bravo.” People in most parts of the globe should have no objection to a warmer, wetter climate. If the aliens were watching they’d conclude we were making our planet more habitable on purpose.
In short: he's all in favor of it, even though it isn't happening. He flips back and forth between those two points and gets in the expected slaps at Al Gore and the Left. He brings up the by-now quite tired argument that we'll be able to farm arctic Canada. He implies that the whole thing is a socialist plot to encourage big-government seizures of wealth, and he writes one paragraph of tone-perfect, blind, conservative ranting.
Granted, there will be some negative impacts in marginal areas. Some rare plant and animal species, hyper-adapted to highly specific climate conditions or micobiotic zones, are already unable to cope with the change. Many may go extinct; some already have. That’s tough, but chalk it up to bad evolutionary choices. When those rigidly specialist species bet everything on a small part of the world in hopes it would never change, they made a very bad bargain. For our part, we have air conditioners, lightweight fabrics, and sunscreen. Why infinitely adaptable humanity has to pay the price for the evolutionary shortsightedness of other life forms is beyond me.
It displays an arrogant and contemptuous ignorance of science. Carl Zimmer points out that "micobiotic" isn't even a word and that "microbiotic," if that's what he meant, means something completely different (it's a type of dirt). His cold-hearted lecture on responsibility in making evolutionary choices is so over the top that I would think it was a parody if it appeared in any other source.
So, besides being a complete doofus, mouthing conservative talking points without thinking about them, does Robbins have a point? Are there opportunities to be had in global warming? Of course there are.
- Pharmaceutical companies have made a fortune off of AIDS. As tropical diseases like malaria begin to spread in wealthy, formerly temperate countries, those same companies will make billions selling medicines that formerly went to countries that couldn't pay premium prices.
- Construction companies have made big bucks "rebuilding" Iraq and New Orleans. Imagine how much work they'll have when every coastal city in the world needs to be protected from tidal surges or rebuilt inland. Again, this won't be the usual mass destruction happening to dirt poor people in the tropics; it will be happening to people who can afford to pay premium prices.
- Energy companies have wanted to drill for oil on the Arctic coastal shelf for decades but have been prevented by their inability to work among moving ice floes. No ice floes, no problem.
- Do I need to explain what a couple billion people forced to move will do for the real estate market?
Global warming presents major opportunities for certain corporations to make enormous profits.** The companies in line to make those profits just happen to be among the most dependable supporters of the Republican Party. For a conservative, what's not to like about global warming?
* Or: How I Learned to Stop Worrying and Love Global Warming
** Before civilization completely collapses, that is.
Maybe the Bad Astronomy blog should tackle this one
Joe Lieberman's communications director, Dan Gerstein, claims the sun rises in the West. Really.
The latest television ad by the Lieberman campaign shows a red sun hovering over the ocean in what is supposed to be the sunrise of a better day. The video they used is stock footage sold by Getty Images of a sunset on a beach in Santa Barbara, CA. When confronted with this fact, Gerstein said, "It's actually a sunrise. It's very much a sunrise." The sun rising over the ocean, when viewed from the West Coast, would be big news because it would mean the Earth had reversed its rotation overnight. And, if the Earth can reverse its rotation in so short a time without destroying civilization and most life other than a few types of thermophilic bacteria, it would mean Immanuel Velikovsky was right and the science guys all owe him a big apology.
Update: One of Lieberman's media consultants, Josh Isay, has decided to fall on his sword and say he purchased a video of a sunset, reversed it, and inserted it into the ad, all by accident. It could happen to anyone.
What have we become
Having insulted Rush Limbaugh with a cheap shot pointing out that he's a fat, drug-addicted hypocrite, I'll show some liberal even-handedness by giving him something he really wants. Rush thinks it's terrible and suspicious that the Democrats and liberal media are talking about Katrina this week instead of 9/11. Don't try to tell him that it's because the anniversary of Katrina is this week and the anniversary of 9/11 is still two weeks away; he knows talking about Katrina this week is a cynical political ploy to make Bush look bad. Okay, just to show what a nice guy I am, I'll talk about 9/11, just for Rush.
Five years of non-stop fear-mongering by the Bush administration and its allies have turned us into a nation of gun-shy cowards who are ready to cast aside our most sacred values for the illusion of a few moments' security. In evidence, I give you this:
An architect of Iraqi descent has said he was forced to remove a T-shirt that bore the words "We will not be silent" before boarding a flight at New York.
Raed Jarrar said security officials warned him his clothing was offensive after he checked in for a JetBlue flight to California on 12 August.
[...]
He said he had cleared security at John F Kennedy airport for a flight back to his home in California when he was approached by two men who wanted to check his ID and boarding pass.
Mr Jarrar said he was told a number of passengers had complained about his T-shirt - apparently concerned at what the Arabic phrase meant - and asked him to remove it.
He refused, arguing that the slogan was not offensive and citing his constitutional rights to free expression.
Mr Jarrar later told a New York radio station: "I grew up and spent all my life living under authoritarian regimes and I know that these things happen.
"But I'm shocked that they happened to me here, in the US."
[...]
"We Will Not Be Silent" is a slogan adopted by opponents of the war in Iraq and other conflicts in the Middle East.
It is said to derive from the White Rose dissident group which opposed Nazi rule in Germany.
The key point to note is that at no point did they suggest that he was suspected of being a danger to the security of the flight. It was the mere presence of words in Arabic script that sent people into a panic. And, by simply covering up the scary words, Mr. Jarrar was decreed an acceptable passenger for that flight. Rush Limbaugh, Dick Cheney, and the whole right-wing information machine want us to think about 9/11 and stay fearful every moment of every day. They have made us into a nation of pants-wetting cry-babies.
Carnival of the Liberals #20
The latest issue of Carnival of the Liberals is up over at The Greenbelt. Our host, The Ridger, takes us on a gentle walking tour of liberal country. It's a pleasant and stimulating walk, but nothing so strenuous that we AARP types can't keep up.
I want all of you to be thinking good liberal thoughts, especially you bloggers, because I'm hosting the next edition of CotL here at archy on Wednesday September 13. If you have a recent post that you're proud of that deals with liberal politics or commentary on almost anything from a liberal perspective, send me a link by dinnertime (or suppertime if you live in a farm state) on the 12th and I'll include it. Naturally, I'll think highly of you if you can get to me earlier than that.
Personal responsibility
The Republican Party and conservatives in general have always been big on personal responsibility. If people are suffering, it is because of bad choices that they made (like being born poor) and they need to stop whining and take responsibility for their situation. Nothing annoys a true conservative more than other people blaming their problems on society or looking for government aid.
Yesterday, Rush Limbaugh claimed that liberals and the UN were responsible for the epidemic of obesity. Clearly, the big, fat, drug addict has fallen off the wagon again.
I love these kinds of stories, 'cause we're just getting them all over the place: Waistlines continue to grow in the United States. Another crisis story here, ladies and gentlemen, from our old buddies at the Associated Press. "The gravy train -- make that the sausage, biscuits, and gravy train -- just keep [sic] on rolling in most of America last year." Thirty-one states showing an increase in obesity. Mississippi continued to lead the way; an estimated 30 percent of adults there are considered obese, an increase of 1.1 percentage points when compared with last year's report. Indeed, "the five states with the highest obesity rates are Mississippi, Alabama, West Virginia, Louisiana, and Kentucky -- exhibit much higher rates of poverty than the national norm. Meanwhile, the five states with the lowest obesity have less poverty. They are Colorado, Hawaii, Massachusetts, Rhode Island, and Vermont."
[...]
I think you might then say that the obesity crisis could be the fault of government, liberal government.
[...]
This is what happens when you let the left run things. We've been beat about the head. There are hungry people everywhere. UNICEF got it all started. We've seen the babies with the extended tummies, the walking skeletons, told that kids can't learn unless they're fed. We've been guilted into pouring resources on the problem. And now, now, the latest crisis is that there is obesity among those who are impoverished. Because we are sympathetic, we are compassionate people, we have responded by letting our government literally feed these people to the point of obesity.
Rush must be thankful he's independently wealthy and not in danger of becoming fat like those poor people who the liberals prey on.
Limbaugh also demonstrated his keen grasp of agricultural issues:
At least here in America, didn't teach them how to fish, we gave them the fish. Didn't teach them how to butcher a -- slaughter a cow to get the butter, we gave them the butter.
Rush, killing cows is not an effective way to get butter.
Tuesday, August 29, 2006
Population shifts
Immediately after Katrina, I looked at the refugee situation and thought that the Republicans, the wealthy, and the whites would find a way to keep most of the poor, Democratic, and black refugees from returning. It would give the Republicans another seat in the House and probably also give them Landrieu's Senate seat when she comes up for reelection.
Digby has some numbers up today that make me want to qualify that prediction. According to the US Census, in June southern Louisiana had 344,781 fewer people than before Katrina. Many will probably never return. While I still think that this will lead to the Republicans gaining a House seat when they stop letting the refugees vote absentee as Louisiana residents, it might be that they only hold on to that seat for one or two terms. A permanent loss of over a quarter million people from the state will probably cost Louisiana a House seat during the reapportionment following the 2010 census. Sure, Utah will probably gain that seat, but the West has been turning a bluer shade of purple lately.
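For readers who want to see how a population loss turns into a lost House seat, here is a minimal sketch of the Huntington-Hill method the Census Bureau uses for apportionment. The three states and their populations are made-up round numbers chosen only to show the effect; the real calculation divides 435 seats among actual census counts.

```python
import math

def apportion(populations, total_seats):
    """Huntington-Hill apportionment: every state gets one seat, then the
    remaining seats go, one at a time, to the state with the highest
    priority value pop / sqrt(n * (n + 1)), where n is its current seats."""
    seats = {state: 1 for state in populations}
    for _ in range(total_seats - len(populations)):
        next_state = max(
            populations,
            key=lambda s: populations[s] / math.sqrt(seats[s] * (seats[s] + 1)),
        )
        seats[next_state] += 1
    return seats

# Hypothetical states with made-up populations, dividing 15 seats.
before = {"A": 4_500_000, "B": 6_200_000, "C": 3_300_000}
after = dict(before, A=before["A"] - 300_000)  # state A loses about 300,000 people

print(apportion(before, 15))  # {'A': 5, 'B': 7, 'C': 3}
print(apportion(after, 15))   # {'A': 4, 'B': 7, 'C': 4} -- the loss costs A a seat
```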
Monday, August 28, 2006
A stupid word
Last week, while I was wandering around the west coast looking at old airplanes and big rocks, Katherine Harris was hard at work trying to lose the Florida Senate race. Her latest misadventure began when she gave a pandering interview to a Baptist newsletter and never imagined that someone who wasn't a far-right Christian fundamentalist might also read her words. In the course of the interview she called separation of church and state "that lie we have been told," claimed the founding fathers did not intend America to be a nation of secular laws, and warned "if you’re not electing Christians then in essence you are going to legislate sin." Mustang Bobby has the highlights here.
Naturally a few people who are not far-right Christian fundamentalists did read her words--some of them members of the liberal press and even unhinged, leftist bloggers--and the poo hit the fan. Harris' explanation/non-apology amounts to, "I was telling a special interest group what they wanted to hear, and some of my best friends are Jewish." To make the second point crystal clear, her statement made sure to throw the term "Judeo-Christian" around a lot.
"Judeo-Christian" is one of the more vile bits of double-speak in our political lexicon. For years that term was only used by academics in the humanities and social sciences to make some vague generalizations about Mediterranean and European culture. At some point in the near past it was adopted by right-wing culture warriors in an effort to pry Jewish voters away from the Democratic Party.
Historically, the modern religious right has been a Protestant movement. As they have become a real political power, their leaders realized that they needed to expand their coalition in order to be large enough to rule. Leaders of the Republican Party, in their complicated dance of captive and captor with the religious right, saw an opportunity to poach on some traditionally Democratic voter groups by adopting a modified version of the language of the religious right.
The key was to get the religious right to disguise some of its purely Protestant associations and distance itself from its overt anti-Catholic and anti-Semitic history. American Protestantism, in general, was already headed in this direction, but the rise of Christian Zionism among fundamentalists and their apocalyptic fascination with Israel greatly aided this effort. The phrase "Judeo-Christian values" was the perfect marketing phrase to tell conservative Catholics and Jews that they had something in common with religious right Protestants and that they should all vote for the same slate of candidates.
So far, this marketing campaign has been more successful among Catholics than it has been among Jews. Too often, leaders of the religious right let slip old attitudes. In 1999 Jerry Falwell stated of the Antichrist, "Of course he'll be Jewish." That same year the Southern Baptist Convention reaffirmed converting the Jews as a major goal. And many Jews can't help but notice that the central apocalyptic narrative of Christian Zionism, as luridly told in books like the Left Behind series, is that the Jews will all gather into Israel where they can be conveniently killed to bring about the second coming of Christ. Only a tiny number of converts will survive.
In political discourse coming from the religious right, "Judeo-Christian" is essentially a code phrase that means "you should vote for us, you greedy Christ killers." Harris shows the true colors of the religious right when, speaking to a Christian audience, she conveniently forgets to include Jews among the exclusive possessors of morality who should rule this country. There might be some legitimate reasons why a Florida Jew might vote for such a person, but religion and morality should not be among them.
In the next installment of "Stupid Words I Never Want to Hear You Using" I'll explain why drunken lout Christopher Hitchens deserves to be thrown in a volcano for coining the neologism "Islamo-Fascist."
Saturday, August 26, 2006
You guys are the greatest
Last night at 9:37 Pacific Time, I logged my seventy-five thousandth visitor. The magic visit came from UC, Irvine. Thanks to all the people who linked to give me traffic, to all the people who visited, and to all who stopped to say happy birthday. Thanks especially to Coturnix, whose almost daily links prevented the traffic from ever slowing down and allowed me to meet my goal right on schedule.
Friday, August 25, 2006
This day in history
In 1944 Paris was liberated by the French 2nd Armored Division and the U.S. 4th Infantry Division.
In 1900 Friedrich Nietzsche died.
In 1814 the British burned Washington DC.
It is Independence Day in Uruguay.
Today is the 237th day of the year. It is the 2063rd day of the century (which began on Jan. 1, 2001, not 2000, dammit); a quick check of that arithmetic follows the list.
In 325 the Council of Nicea ended, establishing an official doctrine for Christianity, which was promptly ignored by Nestorians, Syrians, Donatists, Armenians, Copts, Arians, Manicheans, and assorted Gnostics.
In most states, there are only 100 shopping days left until Christmas. In some states there are even fewer.
In 1914 a hurricane destroyed Galveston, Texas.
It is the saint's day for Genesius, patron saint of clowns. He was beheaded by Diocletian, who did not have much of a sense of humor.
In 1981 Voyager 2 passed Saturn on its way out of the solar system.
In 1989 Voyager 2 passed Neptune on its way out of the solar system.
Today is the birthday of Walt Kelly, Elvis Costello, Mad Ludwig of Bavaria, Clara Bow, George Wallace, Ivan the Terrible, Martin Amis, Sean Connery, and me.
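For the skeptical, here is a quick check of the day counting above. It is a sketch of my own, assuming the date in question is August 25, 2006 and, as the post insists, counting the century from January 1, 2001.

```python
from datetime import date

today = date(2006, 8, 25)                             # the birthday in question
day_of_year = today.timetuple().tm_yday               # 237
day_of_century = (today - date(2001, 1, 1)).days + 1  # 2063

print(day_of_year, day_of_century)  # 237 2063
```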
PS - Clever Wife and I are having too much fun on the road. I don't think we're coming back until we run out of money or break the car.
Wednesday, August 23, 2006
Gone fishing
The cats have been fed. The plants have been watered. The dishes have been washed. The car has a full tank of gas. The engine fluids have all been topped off. The back seat is full of field guides, paperback mysteries, and snacks. The catsitter has all the emergency numbers. This can only mean one thing: John and his Clever Wife are going on a road trip. I'll be back on Friday, my birthday. Till then, read some of the nice people over on the left, play nice with each other, and do whatever you can to annoy our leaders.
Tuesday, August 22, 2006
He does come from the gas industry*
This has already been spread around the internets, but I think it deserves a few more comments.
He loves to cuss, gets a jolly when a mountain biker wipes out trying to keep up with him, and now we're learning that the first frat boy loves flatulence jokes. A top insider let that slip when explaining why President Bush is paranoid around women, always worried about his behavior. But he's still a funny, earthy guy who, for example, can't get enough of fart jokes. He's also known to cut a few for laughs, especially when greeting new young aides, but forget about getting people to gas about that.
We all recall that Bush is all about bringing dignity and respect to the White House. Most bloggers are hooting at the hypocrisy of it all and at the confirmation that Bush is the white trash yahoo that we always thought he was. What's really significant is that this appeared in U.S. News & World Report, the most conservative of the newsweeklies. Is this a sign that Bush is toast, that he is losing the support of his base? Or is it a sign that he's toast, and the conservatives are distancing themselves from him in order to prevent a complete electoral disaster from sweeping the movement into the dustbin of history? Or both?
* I stole this joke from Digby.
Oh boy, a list
Coturnix has a repeat of one of his best posts. It's his essential Science Fiction list. This is a list that most nerds can go nuts over. As with any good list, this one allows you to go in any of a number of possible directions. Here are Coturnix's criteria:
I think the mixture of best-of and my-favorites is what most people would come up with. Science-Fiction-for-Biologists, on the other hand, is pure Coturnix. Myself, I would go for a history of Science Fiction list. Back in the seventies I put together a syllabus for such a course. I was never allowed to teach it, not having any teaching credentials, but I did give a one-hour guest-lecture version of the course to English classes a couple of times. A lot of Science Fiction has been written since then, and I've only done a partial job of keeping up with it.
As to the start of Science Fiction, I fall between the two possible extremes. The early-origins school practices a sort of literary imperialism, working its way backwards through time, planting its flag on anything that even vaguely resembles a Science Fiction theme, eventually claiming many mythologies because they feature people flying and fantastic beasts. The late-origins school refuses to consider anything that isn't fully recognizable as Science Fiction and usually begins with Jules Verne.
I taught that you had to consider what Science Fiction developed from. I reached back into the eighteenth century and pointed to Enlightenment social satires. Books like "Gulliver's Travels," "Candide," and "Baron Munchausen" exaggerated trends in contemporary life to a fantastic degree, often placing them in fictional locales, to make telling social points and avoid the hand of the censor. The original impulse to Science Fiction was the question, "What if this trend continues?" In "Frankenstein," Mary Shelly wondered about the Faustian implications of pursuing scientific knowledge too far. By the nineteenth century, one of the dominant trends in society was technological change.
The transition to recognizable Science Fiction in the 1860s as Verne and others began to speculate about military technology. The arms races in Europe before World War One led many to speculate about what the next war would bring. At the same time others looked at the closing frontier around the world, the advance of archaeology, and wondered about lost civilizations that lay waiting to be discovered. Finally, mass literacy and cheap publishing had created a seemingly insatiable market for breathlessly told, light stories of the fantastic.
Science Fiction really came together as Science Fiction in the last few years before World War One. The literature that finally took the name, was pulp magazine literature written for pure entertainment. Verne and Wells still had some socially redeeming value. H. Rider Haggard didn't really have any science. I call Edgar Rice Burroughs the first great writer of real Science Fiction. Burroughs gave his readers lost civilizations, space travel, fantastic beasts, and eccentric inventors. His stories were read by millions and he inspired an entire generation of imitators.
Science Fiction was dominated by flashy, meaningless pulp stories from just before World War One till soon after World War Two. The decline of magazine fiction and the rise of the paperback book at the beginning of the Cold War changed things a bit. Although many of the themes looked the same--lost civilizations, space travel, fantastic beasts, and eccentric inventors--the tone began to change. A generation of authors had grown up reading Science Fiction who wrote books that used the themes, but began to sneak some socially significant content back in. Decades of psychology, genocide, and the modern totalitarian state had changed both writers and readers, and it showed. Voltaire would have been right at home with a book like "1984."
The second post-war generation of writers, the New Wave of the sixties and seventies, wasn't nearly as big a break with their predecessors as was believed at the time. They continued to absorb the social issues and anxieties of their day. They were more outspoken about expressing these things in their writing and more concerned about gaining literary respect for their work.
At the time I wrote my syllabus in the seventies, the war in Vietnam had ended, Watergate was over, and the short national nightmare of Disco was upon us. It was clear the New Wave had just about run its course. While better writing and social consciousness were here to stay, Science Fiction was hit with the same "oh, lighten up" mood that had hit popular music. While real science fiction groped for the next thing (Cyberpunk, which wouldn't quite bloom for a few years), really bad Fantasy came forward to fill the gap. Really bad Fantasy was a leap back into the pulp era. Every character was no more than his or her race or occupation. The plots were rarely more than a game of Dungeons and Dragons written out.
Aside from the fact that a generation had changed, there was really very little in the goals that made Cyberpunk different from New Wave. The writers were concerned about a different set of social issues, but they still cared about their literary legacy. A major change in the eighties came on the publishing end. As baby boomers aged, many of those who had read science fiction as teens continued to read as adults. In reacting to this, the publishing industry began to release more Science Fiction books in hardback, began to allow more sex, and, encouraged by the sales of giant fantasy epics, let books get longer. All of these gave the writers a bigger palate to work with.
We’re moving into the post-Cyberpunk era now; I’m not sure what it will entail. In my seventies syllabus, the only predictions I ventured were that the fantasy explosion would continue and that there would be more women writers and readers. I was right on both of those, but not bold enough to predict any further. Today I expect to see the border between Science Fiction and mainstream fiction continue to blur, while at the same time Science Fiction continues to fragment into sub-genres, again following the example of the music industry.
There, I hardly made a list at all, but told you what my criteria would be for making one. Why don’t you tell me what some of the best exemplars are for the periods I just out-lined. If you could get a class to read a book a week for nine weeks, could you create a list of nine books that would get them through that history?
Coturnix has a repeat of one of his best posts. It's his essential Science Fiction list. This is a list that most nerds can go nuts over. As with any good list, this one allows you to go in any of a number of possible directions. Here's Coturnix's criteria:
In some ways, this is a "Best of" list, in others it is a "My favourites" list.
The way I made it was to think what books I would buy for a young person (let's say a niece or nephew going off to college) as an introduction to SF - in other words: where to start when entering this genre. Another way I thought was to think of a long list of SF works that can be used (once pared down to a manageable size) in teaching a course "Science Fiction for Biologists."
I think the mixture of best-of and my-favorites is what most people would come up with. Science Fiction-for-Biologists, on the other hand, is pure Coturnix. Myself, I would go for a history of Science Fiction list. Back in the seventies I put together a syllabus for such a course. I was never allowed to teach it, not having any teaching credentials, but I did give a one-hour guest-lecture version of the course to English classes a couple of times. A lot of Science Fiction has been written since then and I've only done a partial job of keeping up with it.
As to the start of Science Fiction, I fall in between the two possible extremes. The early origins school practices a sort of literary imperialism, working their way backwards through time, planting their flag on anything that even vaguely resembles a Science Fiction theme, eventually claiming many mythologies because they feature people flying and fantastic beasts. The late origins school refuses to consider anything that isn't fully recognizable as Science Fiction and usually begins with Jules Verne.
I taught that you had to consider what Science Fiction developed from. I reached back into the eighteenth century and pointed to Enlightenment social satires. Books like "Gulliver's Travels," "Candide," and "Baron Munchausen" exaggerated trends in contemporary life to a fantastic degree, often placing them in fictional locales, to make telling social points and avoid the hand of the censor. The original impulse to Science Fiction was the question, "What if this trend continues?" In "Frankenstein," Mary Shelley wondered about the Faustian implications of pursuing scientific knowledge too far. By the nineteenth century, one of the dominant trends in society was technological change.
The transition to recognizable Science Fiction came in the 1860s as Verne and others began to speculate about military technology. The arms races in Europe before World War One led many to speculate about what the next war would bring. At the same time others looked at the closing frontier around the world and the advance of archaeology, and wondered about lost civilizations that lay waiting to be discovered. Finally, mass literacy and cheap publishing had created a seemingly insatiable market for breathlessly told, light stories of the fantastic.
Science Fiction really came together as Science Fiction in the last few years before World War One. The literature that finally took the name was pulp magazine literature written for pure entertainment. Verne and Wells still had some socially redeeming value. H. Rider Haggard didn't really have any science. I call Edgar Rice Burroughs the first great writer of real Science Fiction. Burroughs gave his readers lost civilizations, space travel, fantastic beasts, and eccentric inventors. His stories were read by millions and he inspired an entire generation of imitators.
Science Fiction was dominated by flashy, meaningless pulp stories from just before World War One till soon after World War Two. The decline of magazine fiction and the rise of the paperback book at the beginning of the Cold War changed things a bit. Although many of the themes looked the same--lost civilizations, space travel, fantastic beasts, and eccentric inventors--the tone began to change. A generation of authors who had grown up reading Science Fiction wrote books that used the themes but began to sneak some socially significant content back in. Decades of psychology, genocide, and the modern totalitarian state had changed both writers and readers, and it showed. Voltaire would have been right at home with a book like "1984."
The second post-war generation of writers, the New Wave of the sixties and seventies, wasn't nearly as big a break with their predecessors as was believed at the time. They continued to absorb the social issues and anxieties of their day. They were more outspoken about expressing these things in their writing and more concerned about gaining literary respect for their work.
At the time I wrote my syllabus in the seventies, the war in Vietnam had ended, Watergate was over, and the short national nightmare of Disco was upon us. It was clear the New Wave had just about run its course. While better writing and social consciousness were here to stay, Science Fiction was hit with the same "oh, lighten up" mood that had hit popular music. While real science fiction groped for the next thing (Cyberpunk, which wouldn't quite bloom for a few years), really bad Fantasy came forward to fill the gap. Really bad Fantasy was a leap back into the pulp era. Every character was no more than his or her race or occupation. The plots were rarely more than a game of Dungeons and Dragons written out.
Aside from the fact that a generation had changed, there was really very little in the goals that made Cyberpunk different from New Wave. The writers were concerned about a different set of social issues, but they still cared about their literary legacy. A major change in the eighties came on the publishing end. As baby boomers aged, many of those who had read science fiction as teens continued to read as adults. In reacting to this, the publishing industry began to release more Science Fiction books in hardback, began to allow more sex, and, encouraged by the sales of giant fantasy epics, let books get longer. All of these gave the writers a bigger palette to work with.
We’re moving into the post-Cyberpunk era now; I’m not sure what it will entail. In my seventies syllabus, the only predictions I ventured were that the fantasy explosion would continue and that there would be more women writers and readers. I was right on both of those, but not bold enough to predict any further. Today I expect to see the border between Science Fiction and mainstream fiction continue to blur, while at the same time Science Fiction continues to fragment into sub-genres, again following the example of the music industry.
There, I hardly made a list at all, but told you what my criteria would be for making one. Why don't you tell me what some of the best exemplars are for the periods I just outlined? If you could get a class to read a book a week for nine weeks, could you create a list of nine books that would get them through that history?
What he said
I think Bobby speaks for us all on this one.
President Bush, right-wing pundits, and cranky old wingnut gaffers on C-SPAN's call-in shows insist on referring to the opposition party as the "Democrat Party." They know that's not the name of the party, but like a six-year-old on a sugar jag, they keep doing it because they know it pisses some people off. It also pretty much sums up the level of Neener-Neener political discourse coming out of the GOP. I suppose we could counter by referring to the "Republic Party," which has a slightly fascist tone to it, but it would only be playing their game. Besides, the only name that I want to attach to them in November is "Loser."
More Pluto
Last week's compromise at the International Astronomical Union meeting in Prague over the definition of "planet" has been roundly mocked around the world by science writers, teachers, and bloggers alike. It has all the charm of a committee written product and that's exactly what it was. The vote on a definition is still two days away and several competing definitions are being discussed.
Traditionally, planets were lights in the sky that moved differently than the stars. With the introduction of the telescope and the heliocentric theory of the solar system, scientists figured out that the planets not only moved differently than stars, they were actually more like the Earth while the stars were like the Sun. With planets now thought of as Earth-like objects orbiting a star, the definition of planet had changed from one describing its movement to one describing its nature.
In 1781 William Herschel discovered a new planet and named it George after his king. Other astronomers ignored his name and called it Uranus, and started looking for planets of their own to name.
In 1801 Giuseppe Piazzi discovered an object orbiting between Mars and Jupiter, announced that he had found a tiny planet, and named it Ceres Ferdinandea. The name seemed to cover all the bases: it had an element from classical mythology and it sucked up to his king. Unfortunately, Ferdinand of Sicily had recently been overthrown by Napoleon and no one went along with naming a celestial object after a powerless refugee. As astronomers began looking at the region in which Ceres had been found, they promptly found three more tiny planets. These were named Pallas, Juno, and Vesta. Naming planets after kings had proved to be a non-starter, so the astronomers stuck with classical mythology.
These tiny planets bothered astronomers. They were smaller than any of the known moons. Even with the best telescopes astronomers could see no details on them. They were just four tiny dots of light. Herschel, who by now had discovered four moons to go with his planet (he named them after characters in "A Midsummer Night's Dream"), suggested not letting these insignificant objects into the august club of planets. He suggested a new word, asteroid (star like), to describe them. The little planets remained in limbo until the 1840s when a new generation of more powerful telescopes led to the discovery of more tiny bodies between Mars and Jupiter. Facing the prospect of dozens or more new planets, the international astronomical community adopted Herschel's suggestion and demoted the asteroids into their own category apart from the planets.
At about this time, the search was on for another planet beyond Uranus. Based on a half century of observing Uranus' orbit some astronomers had come to believe that the gravity of another large body must be affecting it. By the 1840s they had an idea where to look. In 1846 Urbain Le Verrier calculated the exact location and observers had no problem finding the planet. They named it Neptune.
The same orbital mechanics that led to the discovery of Neptune led astronomers to expect another large planet further out. In 1930 Clyde Tombaugh discovered Pluto. Astronomers immediately knew that Pluto was too small to be the expected gravity source. Pluto, however, had an advantage that Ceres never did in becoming accepted as a planet: mass communication and mass literacy. The discovery of the new planet was announced in newspapers and newsreels. The name was suggested by an eleven-year-old girl in Oxford. Walt Disney introduced a character named Pluto into his Mickey Mouse cartoons later that year. For three quarters of a century, children have learned that Pluto is a planet.
But astronomers never liked it. Pluto was too small to be a real planet. Its orbit is highly eccentric and at an angle to the plane of all the other planets' orbits. In the very same year that Tombaugh discovered Pluto, Frederick C. Leonard predicted that there was a whole belt of tiny objects beyond Neptune. Sooner or later we would have good enough telescopes to find them, and the astronomical community would be faced with the same problem that they had faced with the asteroids: too many and too small to be planets. That day came about fifteen years ago. Today, about 800 of these Kuiper objects, as they are called, have been discovered.
This brings us back to the present and the debate at the IAU meeting. Last week's proposed definition is that a planet must be big enough for its gravity to pull it into a spherical shape. This definition would include the inner planets, the outer planets, Pluto, about four of the asteroids, and about four other Kuiper objects. It also requires a planet to orbit a star and not another planet; that excludes all of the moons in the solar system. That definition gives us between 12 and 17 planets and the list will be sure to grow as other large Kuiper objects are discovered.
On Friday, a group of astronomers introduced a second definition at the IAU. This definition requires a planet to dominate its neighborhood and would eliminate all Kuiper objects, including Pluto, leaving us with only eight planets. Rather than send the two definitions up against each other, the executive committee of the IAU yesterday decided to hold separate votes on each element of the definitions. This has the possibility of going horribly wrong.
In today's New York Times, Dennis Overbye says that another definition will be proposed today. It's not clear from his mention whether this is an entirely new definition or an attempt at compromise between the various factions at the IAU.
Personally, I think the IAU should leave it alone. Planet is not an especially useful category. Based on size, composition, orbital characteristics, and probable origin there are at least five types of objects orbiting the sun (not counting moons, human made satellites, and that gigantic space cruiser full of super intelligent squid hiding behind Titan). Based on historical tradition, nine objects in three of those categories are called planets. Geologists have not been hampered by the fact that the traditional continents are a useless category. They don't insist that we give up on continent and start memorizing all twenty or so tectonic plates. Astronomers should learn from that. Let tradition call things what tradition will, and let science go its own way. Nerdy kids know that India is a better candidate for continent than Europe. The same kids will learn the differences between what most people call planets and the categories that matter to scientists.
By spending so much time on this, the IAU just makes itself look silly.
Sunday, August 20, 2006
The opinions I want to hear
I entered graduate school at the University of Washington in the fall of 1988. My field was history and my specialty was the modern Balkans, specifically Yugoslavia. History PhDs are one of the slowest degrees to pursue, and financial reasons slowed mine more than usual. As 1988 turned into 1989 Eastern Europe began to dissolve. By 1991 my country of specialty started its dramatic collapse. At that time I was probably one of the two or three dozen leading authorities on the causes of that collapse then living in the United States.
As the slow motion Yugoslav civil war moved into Bosnia, I went to the best map store in Seattle and bought the largest scale map of Yugoslavia that I could find. I tacked the map up in the hallway of our apartment and covered it with a sheet of acetate. Every day I came home from school or work and looked up the latest news on the war in Bosnia. I carefully marked every change of position between the three sides. I followed every battle over a hillside, village, or roadway and marked it on my map. Consequently, none of the major changes of fortune were a surprise to me. I had followed each army as it negotiated for position around every objective and knew when they were ready to move.
I have not followed the slow motion civil war in Iraq as closely. I can't predict who will move where next. What I can tell you is that there is a very real civil war going on. As the Sunni and Shia Muslims negotiate for position, the Slovenes--I mean the Kurds are hoping to quietly slip out of the Iraqi state without anyone noticing.
No native born American has experienced civil war in his or her own country. We can only base our opinions on historical analogies or on parallels with other modern countries. The most valuable insights into the collapse of Iraq are those that come from places like Bosnia, Tajikistan, or the Transcaucasus. These are the opinions I want to hear. Can anyone point me in their direction?
Oh, ick
Is it really time for more bad history? It is. David T. Beito at the Liberty & Power group blog has the latest Carnival of Bad History. Where else do Ayn Rand, K-Y Jelly, and Holocaust denial come together in one post? Okay, if you have an answer to that question, I do not want to know.
There are no foxes in atheist holes*
According to Newsweek (via Coturnix), atheist veterans are getting a little tired of being ignored.
"There are no atheists in foxholes," the old saw goes. The line, attributed to a WWII chaplain, has since been uttered countless times by grunts, chaplains and news anchors. But an increasingly vocal group of activists and soldiers--atheist soldiers--disagrees. "It's a denial of our contributions," says Master Sgt. Kathleen Johnson, who founded the Military Association of Atheists and Freethinkers and who will be deployed to Iraq this fall. "A lot of people manage to serve without having to call on a higher power."
It's an ongoing battle. Just last month Lt. Gen. H. Steven Blum, chief of the National Guard Bureau, said, "Agnostics, atheists and bigots suddenly lose all that when their life is on the line."
Let's stop here before the obvious next line. According to the standards of he-said-she-said journalism, we all know the author is going to quote a limp statement from an atheist group that says: "we are also good soldiers." Let's go to the tape:
Atheist groups reacted swiftly, releasing a statement that "Nonbelievers are serving, and have served, in our nation's military with distinction!"
Boy, that sure put Gen. Blum in his place. A severely chastised National Guard attempted to save face:
The National Guard said it received about 20 letters objecting to Blum's statement, and said his comments were "intended to clearly illustrate the positive spirit of camaraderie, human understanding and inclusion of our fine men and women in the National Guard."
If you put the two statements together, it almost appears that the unnamed atheist groups are supporting Gen. Blum. Let's look at what Gen. Blum really said. To me, it appears as if he is drawing a moral equivalence between agnostics, atheists, and bigots. And it's not a flattering equivalence. To me, Gen. Blum seems to be saying that even the scum of the earth will pull together when under fire. I'm insulted and a lot of very good and honorable people I know will be insulted to be lumped in with bigots. Imagine if he had said, "Niggers, Chinks, and Klansmen suddenly lose all that when their life is on the line." Do you think we would hear from an official spokesman for the Guard that his comments were "intended to clearly illustrate the positive spirit of camaraderie, human understanding and inclusion of our fine men and women in the National Guard"?
There actually is a certain truth to what he says. If someone tries to kill a group of people, even if that group does not feel that they share anything in common, they will probably pull together in order to survive. However, that is not the point Gen. Blum was trying to make. He should only get credit for what he was really saying. That credit is really blame, and he deserves to be condemned as the bigot he accused agnostics and atheists of being.
* I stole that title from Coturnix.
I write letters
As long as I'm on the subject of Atrios, there's something I've been meaning to ask him.
Dear Atrios,
I see that you named Glenn Reynolds the Wanker of the Day today, again. How many times does that make so far this year? I think your relentless focus on the inanities of Glenn Reynolds is getting to be a bit unfair. The world is full of bad people. How do you think they feel? They get up each morning and do their worst, only to find that, at the end of the day, Glenn Reynolds has been named Wanker of the Day again. Don't they deserve a chance to get their fifteen minutes of infamy?
Here's what I propose: establish a Wanker Hall of Infamy. Glenn Reynolds, Rick Santorum, and Pat Robertson can be the initial inductees. White House press secretaries are entitled to automatic inclusion. This would give other bad people their own shot at the daily title.
If maintaining a Hall of Infamy sounds like too much work, maybe you could just rename your daily award. Instead of the Wanker of the Day, bad people could be the proud winners of the Golden Glenn.
Think about it.
I remain, because I have no choice,
John McKay
PS - I made a similar suggestion to Keith Olbermann in the spring and received no response. I'm hoping you'll be the first one to act on this really great idea.
PPS - Even if you don't think it's such a great idea, maybe you'll do it just once because Friday's my birthday.
PPPS - Did I mention that I like your blog much more than any of the other major blogs?
PPPPS - If you are repelled by sycophancy, ignore the previous PS.
Not as dumb as we think
We bloggers, as we unhealthily obsess over the news, often catch politicians and talking heads sagely announcing certain "truths" that are completely at odds with verifiable reality. We usually treat such statements as evidence of how out of touch the inside-the-Beltway crowd is. Such statements on our part include no small amount of self-congratulation on being in touch ourselves. While we may deserve a certain amount of credit most of the time for paying attention, in many cases we are only noticing part of the story. Let me use an Atrios post as an example.
Atrios quotes John McCain on Meet the Press this morning saying the following: "Most Americans, when they're asked if they want to set a date for withdrawal [from Iraq], say no." Atrios pulls out the latest polls from the major national polling firms. CNN on 8/2-3/06 polled 57 percent in favor of setting a date for withdrawal. CBS polled 56 percent. Gallup for USA Today polled 51 percent. Fox News polled 58 percent. Atrios' only comment is a dry, "So much for the straight talk express." Atrios is not calling McCain stupid or out of touch; he's implying that McCain has some other game up his sleeve. What is that game?
McCain is trying to create conventional wisdom. People in the opinion business--as all politicians are--know that conventional wisdom is more valuable than measurable facts. A small amount of conventional wisdom can balance out and replace a far larger amount of truth.
In the mid-nineties, my Clever Wife observed a particularly graphic example of this principle in action. Downtown Seattle was undergoing a major round of urban renewal. Several major stores were moving to new locations, several new stores were coming into town, and the city was spending hundreds of millions to revitalize a corridor connecting two healthy areas. This was at the beginning of the tech boom in which Seattle was an especially vibrant player. Despite all of the evidence that downtown was on an upswing, the conventional wisdom was that downtown was dying and drastic action needed to be taken before it was too late.
One rather pleasant relic of the previous round of saving downtown from imminent doom was a medium-sized pedestrian plaza, called Westlake Park, in front of the Nordstrom's department store. Just as the pieces of the new urban renewal project were falling into place, Nordstrom's announced that it would quit the project and move out of downtown unless the city punched a street through Westlake Park. They announced no good reason for this demand, which wouldn't even have a strong effect on Nordstrom's since they were already scheduled to move into a new space a block off the park.
The city panicked. After all, everyone knew downtown was dying (despite the evidence of their own eyes to the contrary). Without Nordstrom's, all efforts to save downtown were surely doomed. The demand was put up to a vote with the city fully backing Nordstrom's. The Seattle suburbs--people who rarely come downtown, but can't imagine life without Nordstrom's--voted overwhelmingly to give Nordstrom's their street.*
A new conventional wisdom arose to help people justify getting rid of a park in the middle of a dense urban core. At that time, my Clever Wife worked on the edge of downtown, close enough that she and her co-workers could go to Westlake on their lunch breaks to shop or to eat. A few days before the election they were discussing the park. All agreed that they liked the park and that it would be a shame to have a street cut it in half. Having established their own personal experience of the park, one of her co-workers added, "I've heard that only homeless people hang out there." They all agreed that they had heard that, too. The conventional wisdom that only unsavory people used the park and that, therefore, ruining the park would be no loss outweighed the fact that they all used the park and would miss it. What they had heard was more important than what they actually experienced.
This is clearly the game that McCain is playing. Even though most people want to get out of Iraq, most of them will acquiesce to staying if the makers of opinion can convince them that most other people want to stay. McCain obviously has a stake in wanting to form opinion in this direction. He is not a stupid man and probably knows exactly what he is doing. The question deserving of examination is why the talking heads of the news industry go along with his efforts to manufacture a conventional wisdom at odds with real public opinion. Some of them have their own stakes in this game. Some have their own reasons for supporting the war or supporting the war's supporters.
Of course, we shouldn't eliminate the possibility that some of them really are just as stupid as we always thought.
* Clever Wife and I have boycotted Nordstrom's ever since.
Friday, August 18, 2006
Not doing their job
The founding fathers felt that a free press was vital to a functioning democracy, because only a free press could provide voters with the information they need to make informed choices. Over the decades, the press has done a mixed job of living up to that responsibility and the voters have only marginally taken advantage of the information when it has been available. One of the complaints about the centralization of the news media into a limited number of hands is that diverse corporations are more likely to concentrate on the profitability of the news and ignore its information-providing role. Most television news has been transformed into something called infotainment and no longer fulfills its information function at all. To anyone who doubts that, I offer this:
Yesterday, a federal judge in Michigan issued "a sweeping rebuke of the once-secret domestic-surveillance effort the White House authorized following the terrorist attacks of Sept. 11, 2001." The ruling was "a significant blow to Bush’s attempts to expand presidential powers," but you wouldn’t know that by watching last evening’s network newscasts.
All three major TV networks led their evening news with stories on JonBenet Ramsey’s death and the comments made by arrested teacher John Mark Karr. The networks offered multiple segments and numerous expert analyses to provide in-depth coverage on the legal case. The NSA decision received only a passing mention from two of the newscasts, while ABC devoted a full segment to it.
Still, ABC devoted twice as much time to Ramsey as it did to the NSA story. More egregiously, CBS offered seven times as much airtime to Ramsey as it did to the NSA story, while NBC devoted 15 times more airtime. Below is a comparison of the allocation of time made by each network:
NETWORK    RAMSEY SEGMENT    NSA SEGMENT
NBC        7:39              0:27
CBS        3:23              0:25
ABC        4:03              2:00
As CBS host Bob Schieffer wrapped up the Ramsey segment, he reassured the audience, "We'll stay on this case. That’s for sure."
PS Does everyone else's browser show a big empty space above the table in this post? Can anyone tell me how to fix my HTML so it won't do that?
PPS Problem solved.
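(For anyone who runs into the same blank space: I can't say for certain what fixed it here, but the usual culprit on Blogger-era templates is the option that converts every line break in the post body into a <br /> tag, so a table typed across many lines picks up a stack of empty breaks above it. A minimal sketch of the workaround, assuming that is the cause, is simply to collapse the table markup onto a single line, like this hypothetical version of the table above:

<!-- Hypothetical single-line version of the table; with no newlines inside the markup, there is nothing for the line-break conversion to turn into <br /> tags. -->
<table border="1"><tr><th>Network</th><th>Ramsey segment</th><th>NSA segment</th></tr><tr><td>NBC</td><td>7:39</td><td>0:27</td></tr><tr><td>CBS</td><td>3:23</td><td>0:25</td></tr><tr><td>ABC</td><td>4:03</td><td>2:00</td></tr></table>

Turning off the line-break conversion in the blog settings would probably work just as well, at the cost of having to write explicit paragraph tags everywhere.)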
Thursday, August 17, 2006
The manly art of hunting
If true, this is completely contemptible.
Troy Lee Gentry, half of the country music duo Montgomery Gentry, has been charged with killing a tame bear and then making it look as if it was a hunting trophy, authorities said on Wednesday.
[...]
According to an indictment unsealed this week, in October 2004 Gentry paid $4,650 to shoot the "trophy-caliber" bear named "Cubby" at the Minnesota Wildlife Connection in northern Minnesota, which advertises itself as a place where animals can be photographed in the wild.
After using a bow and arrow to kill the animal inside its pen, Gentry and the owner of the preserve tagged the bear and registered it with the state as if it had been killed in the wild. A videotape was edited to make it appear that Gentry had hunted down the bear.
Gentry's lawyers are using the classic defense for people caught doing unethical things: "My client did nothing illegal." This is complete misdirection and changing of the subject. The crime that Gentry is accused of is a technical one dealing with how he tagged and reported the kill; I have no opinion on the legal question. The way in which he killed the bear is not against the law and I have strong opinions about that. He bought a tame bear from a photographer and staged a fake hunt in a fenced area just to get a trophy and a home movie. That is cowardly and unsportsmanlike.
I explained my opinions on hunting last winter when Cheney was shooting major contributors to the Republican Party.
I've spent my entire life in the Northwest among hunting people. Most of my male relatives hunted and many of my friends do. Though I don't hunt myself, I did learn safe gun handling at a very early age. My father was a cowboy and was very serious about such things. Every so often one of my friends takes me out target shooting, just to make sure I remember which end of the gun to point away from myself (despite their best efforts, I remain a terrible shot. I blame my vision; they blame my attention span). Consequently, I'm not especially impressed with the tales of Cheney's "hunting trips."
Every hunter I know hunts for food or as an excuse to spend a day in the woods. A large percentage of the meat I ate growing up was game: deer, elk, moose, caribou, and, once, bear. With that background, I find trophy hunting a little creepy, and what Cheney does, not hunting at all. As the Humane Society noted after a 2003 "hunting" trip of Cheney's:
Monday's hunting trip to Pennsylvania by Vice President Dick Cheney in which he reportedly shot more than 70 stocked pheasants and an unknown number of mallard ducks at an exclusive private club places a spotlight on an increasingly popular and deplorable form of hunting, in which birds are pen-reared and released to be shot in large numbers by patrons. The ethics of these hunts are called into question by rank-and-file sportsmen, who hunt animals in their native habitat and do not shoot confined or pen-raised animals that cannot escape.
The Pittsburgh Post-Gazette reported today that 500 farm-raised pheasants were released yesterday morning at the Rolling Rock Club in Ligonier Township for the benefit of Cheney's 10-person hunting party. The group killed at least 417 of the birds, illustrating the unsporting nature of canned hunts. The party also shot an unknown number of captive mallards in the afternoon.
"This wasn't a hunting ground. It was an open-air abattoir, and the vice president should be ashamed to have patronized this operation and then slaughtered so many animals," states Wayne Pacelle, a senior vice president of The Humane Society of the United States.
This is not hunting. It's not really sport unless you call it something like "organic skeet shooting." It's blood sport. Cheney killed "70 stocked pheasants and an unknown number of mallard ducks" in one afternoon. There's no way he was planning to eat all of those birds. The object is simply to rack up an impressive kill score. Real hunting involves--well--hunting for things, actually walking around and looking for game, not loitering around the buffet, making small talk, and waiting for the help to hold something in front of your gun so you can shoot it.
This kind of mass game killing was popular among Europe's royals before the First World War. It forms a fairly convincing data point for any argument comparing the current United States to a decadent imperial power. The Bushes are our Habsburgs and the Cheneys are our Bathorys.
Shooting domesticated, captive animals is a vile blood sport. It is not real hunting. It is cowardly, unsportsmanlike, and should be illegal. Whether or not he did anything illegal, if Gentry really killed the bear under the circumstances described, he is a completely contemptible excuse for a human being and does not deserve anyone's support. Some country music fans are justly outraged by this. I'll be interested to see how Gentry's post-scandal career compares to the Dixie Chicks'.
Real news
While the arrest of a suspect in the ten-year-old killing of JonBenét Ramsey is dominating the headlines, something far more important just happened.
A federal judge on Thursday ruled that the U.S. government's domestic eavesdropping program is unconstitutional and ordered it ended immediately.
In a 44-page memorandum and order, U.S. District Judge Anna Diggs Taylor--who is based in Detroit, Michigan--struck down the National Security Agency's program, which she said violates the rights to free speech and privacy.
According to The Associated Press, Taylor is the first judge to rule the eavesdropping program unconstitutional.
The administration will, of course, appeal the decision, and Right Blogistan will, of course, condemn the judge as a terrorist-encouraging activist judge. Still, this is good news and a sign that the separation of powers and constitutional rule aren't quite dead in this country.
Update: Highlight from the decision:
We must first note that the Office of the Chief Executive has itself been created, with its powers, by the Constitution. There are no hereditary Kings in America and no power not created by the Constitution. So all "inherent power" must derive from that Constitution.
Wednesday, August 16, 2006
No justice
Men's Health magazine just released its list of the 100 angriest cities in the U.S. Orlando, FL scored number one. I can see that: crying kids, standing in line for tickets in ninety-plus-degree weather, overpriced rides. Of course those smartasses down in Portland (number 96) scored calmer than Seattle (46). They think they're so mellow. I hate people like that. What's really insane is that Spokane (68) scored calmer than Seattle. Who are they kidding!?! SPO-FRIKKING-KANE!!! No way!! SEATTLE IS SO MUCH CALMER THAN SPOKANE THAT IT ISN'T EVEN FUNNY. ARRRRGHH!!!!!
"A bit insensitive"
Northwest Airlines is advising the employees it is about to lay off that they should take up dumpster diving.
The No. 5 U.S. carrier, which has slashed most employees' pay and is looking to cut jobs as it prepares to exit bankruptcy, put the tips in a booklet handed out to about 50 workers and posted for a time on its employee Web site.
[...]
Northwest spokesman Roman Blahoski said some employees who received the handbook had taken issue with a couple of the items. "We agree that some of these suggestions and tips ... were a bit insensitive," Blahoski told Reuters.
The four-page booklet, "Preparing for a Financial Setback" contained suggestions such as shopping in thrift stores, taking "a date for a walk along the beach or in the woods" and not being "shy about pulling something you like out of the trash."
The booklet was part of a 150-page packet to ground workers, such as baggage handlers, whose jobs will likely be cut after their union agreed to allow the airline to outsource some of their work, Blahoski said.
I can't think of anything nasty or sarcastic enough to do justice to this. I'm stunned.
Reprieve for Pluto
Revisiting one of its favorite debates, the International Astronomical Union, meeting in Prague, is trying to resolve whether or not Pluto is a planet. The problem is that there has never been an internationally accepted definition of "planet." Until now, astronomers defined planets using Justice Potter Stewart's definition of pornography: "I know it when I see it." The nine planets that we all learned in school do not fit easily into a single category that is very meaningful to science.
The word "planet" comes from an old Greek word for wanderer--"planeten." It described the way certain bodies appeared to move across the night sky. The stars all appear to stay in the same place, compared to each other, with the whole starry canopy circling the pole. The five known planets appeared detached from the night sky moving in paths that were independent and mysterious. The name planet has nothing to do with the origin or structure of the planets and isn't that useful for modern astronomy. The inner planets are small, rocky, and have very few moons. The outer planets are gigantic, gaseous, and have elaborate systems of moons. And then there is Pluto.
Ever since Clyde Tombaugh discovered Pluto in 1930, it has been an astronomical oddball. Astronomers at the time believed that a large ninth planet was needed to account for certain irregularities in the orbits of Uranus and Neptune. While they were excited about the discovery of Pluto, it was clear from the beginning that it was too small to be the longed-for Planet X. As time went by, better observations showed that Pluto was even smaller than first believed--smaller than the Earth's Moon--and that it had an irregular orbit far different from that of any other planet. That didn't stop generations of little kids from memorizing Pluto as one of the planets.
In the early nineties, astronomers began to confirm the existence of Kuiper Belt objects beyond the orbit of Neptune. These are icy bodies of various sizes that orbit in a disc extending from the orbit of Neptune to a distance about twice as far. While they might be the source of some of the moons of the outer planets and of short-period comets, they are separate from the Oort cloud, which extends much farther out and is the source of most comets. There was an immediate move by some astronomers to downgrade Pluto from planet to Kuiper object. My own feeling at the time was that Pluto probably is a Kuiper object, but that it was extremely insensitive to tackle the subject while Clyde Tombaugh was still alive (he died in 1997 at the age of 91).
The International Astronomical Union proposed a definition of "planet" yesterday, but has yet to vote on it. The new definition is that a planet must be big enough for its own gravity to pull it into a spherical shape. This would include the inner planets, the outer planets, Pluto, about four of the asteroids between Mars and Jupiter, and about four other Kuiper objects, and the list is sure to grow as more large Kuiper objects are discovered. The definition also requires a planet to orbit a star and not another planet; that excludes all of the moons in the solar system.
This looks to be a compromise that will satisfy nobody. The definition, covering four distinct types of objects, is too broad to be of much scientific use. And by doubling the number of planets, it's too abrupt a change to be embraced by popular culture. Basically, it's the kind of decision that will mostly appeal to junior high nerds, who will now get to tell their family and friends that they are wrong about a piece of common knowledge. "Ha ha, you can't name the planets." The family and friends will then knock the nerds down and steal their pants, as they have done to generations of nerds in the past. And thus the cycle of life continues.
Tuesday, August 15, 2006
This is amusing
According to a British psychiatrist, younger children are funnier than their older siblings. And sometimes in a ha-ha way. As the third member in my litter, I already knew that. I suppose we could set up a point-counterpoint debate on this subject between younger siblings Curly Howard and Zeppo Marx.
Bring back the mammoth
Stories about Japanese or Russian geneticists who think they might be able to bring back the woolly mammoth are becoming something of a staple of science journalism. Every couple of months we get a new story about it. Basically, there are only two stories. The first is the scientist who hopes to recover intact mammoth DNA from a frozen mammoth to use in cloning experiments. So far no one has recovered any intact mammoth DNA. The second is about a scientist who wants to recover sperm cells (with intact DNA) from a frozen mammoth to use in artificial insemination or in vitro fertilization. Both approaches then use an Indian elephant as a surrogate mother to carry the mammoth fetus to term. Today's story is one of the latter type.
Descendants of extinct mammals like the giant woolly mammoth might one day walk the Earth again.
It isn't exactly Jurassic Park, but Japanese researchers are looking at the possibility of using sperm from frozen animals to inseminate living relatives.
So far they've succeeded with mice--some frozen as long as 15 years--and lead researcher Dr. Atsuo Ogura says he would like to try experiments in larger animals.
"In this study, the rates of success with sperm from 15 year-frozen bodies were much higher than we expected. So the likelihood of mammoths revival would be higher than we expected before," Ogura said in an interview via e-mail.
There is a big difference between mouse testes stored fifteen years under laboratory conditions and mammoth testes left outside for fifteen thousand years.
Less enthusiastic was Dr. Peter Mazur, a biologist at the University of Tennessee who has worked with frozen eggs and sperm and is a past president of the Society for Cryobiology.
[...]
"The storage temperature of frozen mammoths is not nearly low enough to prevent the chemical degradation of their DNA over hundreds of thousands of years," he commented. And "even if the temperature were low enough to prevent chemical degradation, that would not prevent serious damage over those time periods from background radiation, which includes cosmic rays."
While Ogura and Mazur might be great biologists in their own fields, neither is much of a paleontologist. A spokesman for Ogura's team refers to frozen mammoths as millions of years old and Mazur refers to them as hundreds of thousands of years old. All of the frozen mammoths ever dated fall into an age range of fifty thousand to twelve thousand years old--an order of magnitude younger than Ogura and Mazur's references.
Both methods for bringing mammoths back face monstrous obstacles. While the cloning method is less discriminating in its source of DNA, it depends on implanting ancient DNA from one species into another. Simple cloning with the best materials from the same species is still quite difficult; the mammoth plan adds additional levels of complexity and additional barriers to success. While artificial insemination and in vitro fertilization have a far higher rate of success than cloning, the odds of finding mammoth sperm cells with intact DNA are much smaller than those of finding any old cell with intact DNA.
Assuming they do find intact DNA and accomplish fertilization, both methods need to get an Indian elephant to carry the baby mammoth (or mammoth hybrid) to term--another difficult task. Finally, if we can overcome all of these obstacles, the baby mammoth is going to be an orphan like no orphan has been--the only member of its species. There will be no adult mammoth available to teach it how to be a mammoth; it will have to learn to be an elephant. In time some uniquely mammoth behaviors might emerge. If we produce a whole herd, they might get more mammothy over the course of a couple of generations. Since we don't have any mammoth to compare them to, we wouldn't know if this was authentic mammoth behavior or if our orphans were inventing new behavior in response to the new environment in which we have placed them.
So many questions. So many problems. As much as I'd like to see a mammoth stomping around on a healthy mammoth steppe, it might be better if I held out for time travel to show me one.
Monday, August 14, 2006
Birthday story
A few months before I turned four years old, my family moved into a new house in a new neighborhood. Most of the families in the neighborhood were about the same: WWII vets with young families. The dads all worked for the same employer, the Atomic Energy Commission. The moms were all stay-at-home moms. The kids formed into packs according to age and sex, owned all of the yards, and went to the same school.
As we were moving in, I was excited to discover that the family moving in across the street had a boy my age. His name was Billy Curran. I was amazed to discover that Billy had a birthday coming up just a few days before my own. At four, it's easy to be amazed and I had never met anyone else with a birthday in August (August itself being a new concept to me).
As I mentioned, the neighborhood was a new one. All around us were vacant lots, houses under construction, and holes in the ground where there would soon be houses under construction. Our mothers sternly warned us to stay away from the dangerous construction sites and Billy and I wasted no time in heading over to those same really cool construction sites.
Billy and I grew up together. The next year we went to the same kindergarten at the Methodist church and our mothers took turns driving us. We went to the same school and walked together most days. We were in the same Cub Scout troop (my mom was the den mother). We went trick-or-treating together on Halloween every year. Billy was a lovable goofball. He was an occasional class clown. He was the kid who crossed his eyes and stuck out his tongue for the class picture.
Billy was also a tough little guy who occasionally had to knock a bully off of me. He had a heart. When I had a concussion from a tricycle accident, he gave me his Etch-a-Sketch. When a friend cut his foot wading in an irrigation canal--another place we were forbidden to play--Billy picked him up and carried him a block to his home and patiently explained to his mother what had happened.
Me and Billy in the third grade
Billy was never my absolute best friend, but he was never my enemy. We never fought; he was my most constant friend. That might have changed as we got older, but I moved to Alaska after the seventh grade. The changes in interests and new social structures that inevitably come with adolescence and junior high didn't have a chance to work on us. The only Billy I ever knew was the unchanging Billy of grade school. I last saw him when I passed through town for a wedding three years later. We were sixteen.
Thirty years later, I became reacquainted with David Neiwert, another veteran of the old neighborhood. David lived a block over from Billy and me and didn't arrive in the neighborhood till we were all in the fifth grade. Since David mostly remembered Billy from high school, he knew him as Bill. When I got back together with David, he told me that Bill had died the day before. Little Billy grew up to be an alcoholic who managed to drink himself to death at age forty-five.
Something strange happens when we hear that someone from our past has died. Suddenly, we want to see them, even though we might not have given them a thought for years--even thirty years. As long as we suspect someone might still be out there, they remain in an eternal state of hold. We haven't seen them and we might never see them, but we theoretically could see them if we really wanted to. They are available to us; we still have the opportunity to see them (with an unknown amount of effort). That's often enough to satisfy us. But when we find out they're gone, the opportunity is forever closed to us. That's an intolerable state.
Today would have been Billy’s fiftieth birthday. The eternal Billy of my memory should still be out there waiting for me to call him tonight and say, "fifty, eh? How did that ever happen?" We should be able to have a laugh and make an insincere promise to stay in touch.
Billy, dammit, why did you ever stop being the goofy kid of my memory and become a grown-up Bill faced with grown-up temptations and grown-up frustrations? Why aren't you still sitting by the little canal, watching the water-skippers, and hoping we don't get caught by our mothers? Why aren’t you waiting for my phone call?
Meanwhile in Florida...
A new poll has Katherine Harris forty-five points behind Bill Nelson in the race for Nelson's Senate seat. I'm sure you've all been following Harris' highly entertaining race, with her weekly staff replacements and legal problems. The party has begged her to drop out of the race. The only real question now is whether or not she will manage to beat Alan Keyes' disastrous showing in 2004. As of this poll she is on target to tie with Keyes.
Bad ideas in medical research
A new report by the Institute of Medicine of the National Academies of Science recommends relaxing the rules for medical research on prisoners. Looking at that story in the paper yesterday, my Clever Wife read me the headline, "Panel Suggests Using Inmates in Drug Trials." My immediate reaction was, "reword it to 'Panel Suggests Using Prisoners in Medical Experimentation' and see what reaction you get."
If my rewording brings to mind thoughts of Mengele or the Tuskegee experiment, that's good. It should. Medical research on captive and powerless groups was horribly abused in the last century. I would much rather we be overcautious on this particular ethical question than allow those kinds of crimes to continue.
There is no philosophical reason why using prisoners should be different from using any other group, say grad students or soldiers. From the researchers' point of view there are some special advantages to using prisoners, all of which relate to control and observation. With prisoners it is easy to know what they ate, when they slept, and how much they exercised. A prisoner isn't going to complicate things by taking a vacation and suddenly changing his habits for ten days. However, the very things that make prisoners ideal practical subjects also make them an unacceptable ethical risk.
The ethical problem rests on the question of coercion. In any kind of human medical trial, the experimental subjects must be fully informed going in and free to back out at any time. I can see no way to guarantee that the choice made by someone in a totally coercive environment, like a prison, is ever completely free. Lindsay Beyerstein explains the issues in detail:
Even if there are no explicit incentives to participate, inmates may still interpret a request to participate as an implied order. It is unethical to create a situation in which prisoners may sign on to a risky research project out of fear of reprisal (well-founded, or not). Prisoners often lack access to basic medical care and may be pressured into accepting experimental treatments because they can't obtain standard medical care.
Then there's the thorny issue of payment. On the one hand, free research subjects are frequently paid for their participation. If free subjects are getting paid, prisoners deserve the going rate. However, paying prisoners raises its own ethical complications. Risky medical experiments might be disproportionately attractive to people who have no other opportunities to make money.
Unlike children, prisoners are social outcasts. The institutional review board is not a loving parent who weighs the costs and benefits for individuals. Even the most thorough and conscientious committee would be making calculations on behalf [of] large numbers of strangers from a marginalized group. The decisions of the review boards may be colored by society's disdain for prisoners. I doubt these committees will be as solicitous of the well-being of inmates as individual parents are about the welfare of their sick children.
Currently, prisoners do participate in low-risk drug trials. Until we can completely guarantee a system of ethical safeguards for the prisoners' basic human rights and individual interests, we should not move into riskier areas of research. And, to repeat myself, I don't see how we can make that guarantee.
Other blogs are also discussing this.