


What Historians Do (And Why I Want to Be One)

This fall, I changed my long-term academic goal from an English Literature and Film Studies major with a minor in History to a History major with a minor in Anthropology/Sociology. There were a lot of practical reasons for this, in addition to the fact that I just plain like History more and was getting bored with the repetitive patterns of Literature as an academic discipline. The entire English major seems to consist of the following: read a book. Analyze it. Write your analysis. Rinse and repeat.

Well frankly, that’s just boring. I still love writing, and of course I love reading, but the patterns in fiction aren’t remarkably different from one another, or at least not enough for me to feel like I’m accomplishing something if I systematically study thousands of books and find the same patterns in most of them. I don’t want to be dismissive of writing, of its place as an art form, or of its significance in academic study. But I now know that I’ll be bored and unhappy for the rest of my college career, if not the rest of my life, if I get an English degree. Hence, the switch.

Sunday, I had the family over for dinner, and when I disclosed this change to them, I got a wall of sort of bored, blank stares. There wasn’t quite any eye-rolling, but we were a hair’s breadth from it, and I could feel the body language asking me “Why? History is dead. It’s booooorrrrrring.”

I’m sorry, but it really isn’t. Not to me, anyway. If it’s boring to you, you probably had terrible history teachers. But as Faulkner said in Requiem for a Nun (ahoy! my three years as an English major weren’t entirely wasteful), “The past is never dead. It’s not even past.” So give me a moment to explain what history really is, why it’s still relevant, and why I find it interesting.

At its simplest, history as an academic discipline is the study of the events of human lives. Though that statement may seem fairly self-evident, the practice of studying history draws from and influences many fields—some obvious, others obscure—and touches several academic disciplines, notably sociology, anthropology, and political science, which often overlap with it. History involves not only the recording and study of events, but also their analysis and interpretation: both how a particular event influenced other events of its own or subsequent ages, and how events of past ages influence the current age. Besides studying individual events or persons, historians also study broad movements in human history.

So then, what distinguishes history from sociology, anthropology, or political science? These social science disciplines have areas of overlap, but the focus of each is distinct. Though sociology studies the influence of events on human behavior and social constructs, it limits itself to social behavior. Political science does much the same, but limits itself to political behavior. Anthropology examines trends and changes in all aspects of human existence, from biology to culture.

History may encompass aspects of these other disciplines, and in fact, those three can be loosely viewed as specialized historical studies, but history is distinct because of its breadth and its depth. Historians may study anything that can be classified as history—a particular event or a broad movement, a massive shift in cultural thought or a single human life. History isn’t even limited to studying the past, as historians may be called upon to offer an analysis or critical opinion of the possible impact of contemporary events based on their knowledge of similar events in history. Historians accomplish their studies through diverse methods, mostly centered on research, analysis, and interpretation of historical evidence.

We analyze facts, not suppositions. A historian examines any evidence available, as long as it is credible. There are two broad categories of evidence we use: primary sources and secondary sources. Primary sources are first-degree-of-contact sources that are contemporary to the event, individual, or period being studied. Some common examples are census records, tax rolls, written or recorded transcriptions of eye-witness testimony, and inscriptions on physical objects and structures such as grave markers, standing stones, monuments, and clay or stone tablets. In order to be acceptable as primary sources, these documents and inscriptions must be dated to the period they refer to. The other kind of source, a secondary source, is any second-degree or further removed account of an event, generally written or compiled from primary source documents after that event’s time. Examples of secondary source documents include scholarly reviews of primary documents, analysis and interpretation of a historical event, journal articles, and any other non-first-degree source that uses primary sources as a basis for analysis and interpretation.

Sometimes, the line between primary and secondary documents becomes difficult to distinguish, as in an old document that refers to even older documents, or a contemporary biography. In the first case, take the example of the writings of Plato and Aristotle. Aristotle’s works may refer to certain works of Plato of which no copies are extant, and may also include Aristotle’s commentary and his own exposition on Plato’s concepts. Such a work is considered a primary source for Aristotelian thought, but only a secondary source for Platonic thought, though both would likely be considered credible, since the relationship between Plato and Aristotle is well-established. In the second case, consider the biography Steve Jobs by Walter Isaacson. Though the book is contemporary to the life of Jobs himself, only Isaacson’s original notes and tapes from interviews with Jobs and the others who provided source material would be considered primary sources; the book itself is a secondary source because it includes the author’s own analysis of Jobs’ life. An autobiography of verifiable provenance, such as Narrative of the Life of Frederick Douglass, however, is considered primary because Douglass recorded events of his own life and times.

Further complicating the question of sources are the issues of corroboration and peer review. If a source stands alone—that is, if no other records from the period corroborate it—it is considered an uncorroborated source. However, merely being uncorroborated does not immediately render a source suspect. Many early primary source documents are the only records of a particular event but are accepted as historically valid nonetheless—for example, a single church parish birth record where no other evidence of an individual’s life is extant, but where the parish record itself has been verified as credible and authentic to the period. All sources should be corroborated when possible. For all secondary sources, and for uncorroborated or newly discovered primary sources, peer review is of paramount importance. Any historical writing based on non-peer-reviewed secondary sources will not be considered scholarly. Though new primary sources can be introduced when discovered, they should be submitted for peer review, both to establish their veracity and to share them with other historians.

Beyond traditional sources, historians may occasionally use supposition and other non-evidential leads as clues to look for more evidence, but we will never base conclusions on those kinds of sources. Basing a conclusion or analysis on anything other than established or verifiable evidence does not stand up to the rigor of academic scrutiny. History, when done right, is as complicated and exacting as engineering or any other profession that requires a high level of training.

Now I’ve defined what history is, what historians do, and how they do it. But none of this answers the most important question of all: why study history?

The purpose of history is not, as many schoolchildren probably believe, to memorize a list of names and dates. That’s not history; it’s just a list of historical information. Where history becomes both relevant and interesting is in its analysis and interpretation. Historians are not just trained to dredge up long-forgotten facts. They are trained to look at historical evidence and use it to tell a compelling story of humanity. Historical study lends understanding to the human condition by helping us understand the patterns of events, whether over a single life, or over generations. At its most abstract, then, history is not the study of human life—that’s anthropology. History, instead, is the study of human change. It encompasses how we change ourselves as well as how we change the world around us. History offers us insight into how the world we live in came to be. Just as literature, poetry, and art show us what human beings can dream, history shows us what we have been, what we can be, and sometimes, what we should not be.

Once we have examined all the available, credible, and verifiable evidence, we use it to stitch together a narrative. We are a little bit like Sherlock Holmes, except it’s “The Case of the Civil War Battle, and What It Taught Us About Southern Politics” instead of “A Scandal in Bohemia.”

What about the old adage that history repeats itself, and that those who do not learn from history are doomed to repeat it? Is that accurate, or just a cliché? Here’s an experiment: I’ll tell you a true story, and you can judge that old saw for yourself.

Once upon a time, there was a town at the edge of a massive American river. A dike protected the town from the rising and falling waters, and for a long time, the town was safe. Then one day, the dike broke. The town flooded, killing residents and destroying millions of dollars’ worth of homes, businesses, and infrastructure. The government had been aware of the danger of the dike breaking, but had issued only a mild warning the night before, followed the next morning by another warning not to panic. It issued no order to evacuate, and sent no pre-flood help to the residents. When the dike did break, the government took its time, considered what to do, and eventually sent aid—but too little, and too late. While the rest of the world looked on in rubbernecking fascination, residents prayed for rescue while sheltering in trees and on rooftops. Analysts who examined the situation after the fact determined that a main factor in the government’s slow reaction may have been racially and socially motivated, because the town was mainly populated with poor and working-class people, nearly half of them African-American.

By now, you probably think I’m talking about New Orleans and Hurricane Katrina. You think this is a story of that city’s tragic destruction in 2005. You’re wrong.

On the afternoon of May 30, 1948, a 200-foot-long section of the dike holding back the rushing waters of the Columbia River—our nation’s fourth largest—collapsed, flooding the boom town of Vanport City, Oregon, despite the government’s assurances that morning that the dike was safe and that there was no reason to panic. The flood killed 15 people and leveled almost the entire city, once home to 40,000 residents. Vanport, established as transient labor housing for the World War II and post-war manufacturing boom in Portland, Oregon, and Vancouver, Washington, housed mostly working-class residents—many of them workers at Kaiser Shipyards and other riverside manufacturing operations. Around 40% of the population was African-American, the highest concentration of non-whites in Portland at that time.

Though the Vanport Flood is not on the same scale as Hurricane Katrina and the tragic loss of life seen in New Orleans, the basic pattern is the same—and in both cases, racial tensions and class prejudice likely contributed to the nonchalant attitude of government officials who ignored the warning signs of impending doom. In Vanport, the government even issued a warning not to “get excited” over the prospect of flooding just six hours before the dike broke. Officials did issue one warning, a very mildly worded missive, the night before the flood: the previous winter’s high snowfall—135% above normal—meant a greater water volume in the river and a possibility of some flooding in riverside areas. Because of that warning, many Vanport residents moved belongings to the upper floors of their residences, but no one thought to evacuate the town, and those who might have were reassured by the message sent out the next morning. Despite what amounted to almost no precautions, only 15 people died in the flooding, several of them swept away and drowned in the initial 10-foot wave of water that engulfed the northernmost portion of the city when the S&P railroad dike broke at 4:17 pm.

Several factors contributed to the low loss of life. First, Monday, May 31—the day following the flood—was Memorial Day, and thousands of Vanport residents were out of town for the holiday weekend. Second, the area around Vanport contains numerous sloughs and backwaters, which were able to accommodate some of the extra water volume released when the dike broke. Third, the population of Vanport at the time of the flood was down to around 18,500, less than half of its wartime peak. This decline was due both to a fall-off in manufacturing volume after the war’s end and to racial prejudice among Portland residents—stoked largely by Ku Klux Klan propaganda—that had turned many Portlanders against Vanport’s somewhat egalitarian social setting of mixed black and white housing and schools, a concept almost unthinkable in some regions of a pre-Brown v. Board society, and had driven many of Vanport’s residents of color out of the city and into more accommodating areas.

Despite the low death toll, the entire city was wiped away—housing, businesses, personal belongings, infrastructure, automobiles, and even trees. The loss of an estimated 5,300 houses left almost 20,000 people homeless, and the area once known as Vanport City was left a reeking swamp littered with the broken trappings of its former incarnation as an urban center. Instead of rebuilding immediately, the government allowed the area to sit vacant until 1959, when reconstruction began, eventually forming what is now Delta Park, Portland International Raceway, and the neighborhood of Bridgeton.

The pattern of the Vanport flood is similar to the Katrina flood and the destruction of New Orleans in so many ways that it has an undeniable and eerie resonance for historians. Our failure, as a nation, to learn the lesson of the Vanport flood may or may not have impacted the situation in New Orleans in 2005. But the episode certainly could have taught us something about disaster preparation and management, had we only listened.

Knowing where we come from is a major step in discovering where we’re going. History is a time machine that can take us to our past, our present, and our future. And that is why I changed my major.


Independent...and Ignorant

I have a friend at work who grew up in Poland. We were talking about education in America and although we both agree that education is broken, we don’t agree on how to fix it. He thinks the problem with American schools is that they don’t provide a good enough learning environment for students. He says that since American education is compulsory to age 16, American schools are responsible for providing a sound education for American students. He was also perplexed as to why there is no huge public outcry that our level of “basic” education is so far behind where it should be.  


I argued that the onus is on the student to bring their A game and learn, because schools can’t possibly be expected to support each student in the way he was suggesting. I don’t want to just teach to the kids who want to be there, and damn the rest, but my fear is that mandates like No Child Left Behind and its ilk force teachers into grade inflation and teaching to the test, rather than teaching knowledge and problem-solving skills. If educators feel bound to make sure every student passes, it puts them into an untenable situation where self-motivated students basically educate themselves while the educator spends a majority of his or her time trying to motivate the students who have no internal sense of the value of the education they’re squandering.


Ever since we had that conversation, I’ve been thinking about what the real roots of this problem are. Here’s what I’ve come up with: 


First of all, Americans don’t value education. We aren’t willing to pay for it, and we don’t think it’s necessary or valuable. Second, American kids growing up in a culture that undervalues education seldom develop any sense that education is valuable, so like their parents, they grow up not caring about what they don’t know. Third, because they don’t have a base of knowledge and a grasp of critical thinking, they fail to evaluate statements made by media, politicians, and their peers for veracity, validity, and reason—which leads them to apathetic participation, or straight-up nonparticipation, in the political process. They don’t vote. They don’t think. And they don’t even know what they don’t know. Americans simply don’t know enough to realize how undereducated they are, so they don’t think it’s a big deal.


What do I mean that we don’t know enough? Well, a common line I heard growing up was that things like sports, music, art, and other electives weren’t important, because schools should focus on the inappropriately named “three ‘Rs’” of Reading, Writing, and Arithmetic. So for a moment, just for the sake of argument, let’s take the position that it’s a good idea to cut school funding and just focus on reading, writing, and math. If we do, we can see that we’re even failing in those areas.


But what do I mean?! Surely Americans can read, write, and do basic math, right? No, we really can’t.


Can we even read? Well, some of us. But let’s look at the numbers. According to census figures, the population of the United States is approximately 313 million people, and roughly 250 million of them are adults. 14% of American adults are functionally illiterate. That’s 35 million people—nearly the entire population (men, women, and children) of Poland. According to the literacy advocacy group First Book (www.firstbook.org), “Studies confirm that the number of books in the home directly predicts reading achievement. Children who grew up with books in their homes reached a higher level of education than those who did not. One study found that in middle income neighborhoods the ratio is 13 books per child; in low-income neighborhoods, the ratio is one book for every 300 children.”


Poor kids don’t have access to books. Middle-class kids have access to only a limited number of books. And these are the kids who later go to public school, because free school is all they can afford. They have no basis for learning. In many cases, their parents also grew up poor or middle-class and had limited or nonexistent access to books, too. We have generations of people who have no access to the basic building block of knowledge—books. And you can try to argue that they have the Internet, but as most of us know, the Internet is filled with myriad statements, some true, some half-true, and some absolutely false. Without a basic education in critical thinking and in how to evaluate the veracity of a statement found on the Internet, how do Americans know whether what they’re reading is fact or opinion? How do they know what they even think about an idea unless they have been shown how critical thinking is done?


Next up: writing. If you think our writing, spelling, and grammar are fantastic, think again. Rather than citing statistics at you, I’ll just say this: how many blogs, books, and websites exist predicated entirely on the premise of pointing out—usually with photographic evidence—the grammatical foibles of Americans? The answer: a great number. And the reason this counts as “funny” material is that the only people who get uptight about it are the ones, like me and all the other English majors out there, who actually care whether people know the difference between things like “assure,” “ensure,” and “insure.” Everyone else just writes like cave people and gets upset if you correct them.


Well, who cares? English is a difficult language, sewn together from at least two others, so our grammar rules don’t really make sense anyway, right? And spelling, hell, that’s only 400 years old. We’ve got time to grow into modern English still. But math…that’s intrinsic to daily life! We have to pay for things, count change, figure out how many cantaloupes we can buy for $10. These are things we do daily or weekly. So surely we can do math. Or can we? 


One problem we have at the community college level is that a large number of students who come to us as first-time freshmen, straight from local area high schools, test into math, reading, and writing below college level. The high school education they are receiving—for whatever reason—fails to prepare them adequately for college. I don’t have exact figures for you, but I’m the one who conducts these placement tests. I see all the scores. And I can tell you, the two classes most of our students place into in mathematics are MTH 20 and MTH 60: Basic Arithmetic and Elementary Algebra. After those two, students must also take MTH 65 (the second half of Elementary Algebra) and MTH 95 (Intermediate Algebra) to even be eligible for the lowest college-level mathematics course we offer, MTH 111: College Algebra. That means that at a two-year college, even taking classes in Summer term, a student who scores into MTH 20 will attend college for an entire academic year before becoming eligible to take college-level math.

And many of them never bother. They achieve the minimum mathematics required for their degree (often MTH 65) and stop there. In some years, we haven’t had enough students to offer higher math classes like Trigonometry and Calculus at all, because only 4 or 5 people signed up and we cancelled the classes—we can’t afford to subsidize a course that doesn’t even pay for the teacher’s salary. So people who want to take higher math, and qualify for it, don’t get to do so at our college, because we spend an inordinate amount of time, money, and energy bringing most of our students up to the level they should have reached when they came out of high school.


Well, what about other subjects? What about science?

A New York Times article from a few years ago stated that “American adults in general do not understand what molecules are (other than that they are really small). Fewer than a third can identify DNA as a key to heredity. Only about 10 percent know what radiation is. One adult American in five thinks the Sun revolves around the Earth, an idea science had abandoned by the 17th century.” http://www.nytimes.com/2005/08/30/science/30profile.html?pagewanted=all


What about history? The National Assessment of Educational Progress, a periodic evaluation of fourth-, eighth-, and twelfth-grade students’ knowledge of a particular subject, assessed American students’ understanding of American history for 2011 and found it greatly lacking. But surely we knew more in the past, right? No way. A New Yorker article from last year about the NAEP study pointed out that in a similar study conducted in 1915, a majority of students could not distinguish between Thomas Jefferson and Jefferson Davis. http://www.newyorker.com/talk/2011/06/27/110627ta_talk_paumgarten


So where should we look for “American knowledge?” We must know something! What about politics and religion? Those two topics are always on the airwaves. That must be where Americans have invested all of our learning capacity.


A 2011 Newsweek poll of Americans’ knowledge of material covered in the U.S. Citizenship Test—that’s the test we give people from other countries before we let them become Americans—found that 80% of those surveyed didn’t know who was president during WWI, only 27% knew why America was involved in the Cold War, and 44% could not define the Bill of Rights. 6% (now remember, these are AMERICANS BORN HERE) could not remember which day and month Independence Day falls on. Today. July 4. They did not know when it was.


Well, that leaves religion. If there is one area where Americans are likely soaring forward in basic “knowledge,” it must be religion, right? Wrong. A 2010 survey by the Pew Forum on Religion and Public Life found that of the 32 survey questions asked, the overall average was 16 correct answers, and atheists and agnostics scored an average of 20.9—the highest of any group. In fact, the survey found that “atheists and agnostics, Jews and Mormons are among the highest-scoring groups” on its survey of religious knowledge, “outperforming evangelical Protestants, mainline Protestants and Catholics on questions about the core teachings, history and leading figures of major world religions”—including religions the survey participants claimed as their own. Though 89% knew it is illegal for public school teachers to lead a class in prayer, only 55% knew the Golden Rule is not one of the Ten Commandments, 47% knew the Dalai Lama is a Buddhist, and only 27% knew that the majority religion in Indonesia is Islam.



We can’t read. We can’t do math. We don’t understand science. We don’t know history. We don’t care to know civics. We don’t know anything about other people’s religions, or even our own.


The problem isn’t American schools—the problem is Americans. 


We are so entrenched in a culture of ignorance that we don’t even know how stupid we are. And that, my Polish friend, is the problem.


Eleventh Time’s a Charm

“David Tennant and [Christopher] Eccleston were good actors but Matt Smith is the epitome of Dr. Who in the tradition of Tom Baker & Peter Davison.”
~ @Aprillian posted to Twitter, 5 December, 2010

When a friend of mine posted this to Twitter recently, I had to agree. If you have no idea what we’re talking about, it’s because you’ve never seen Doctor Who, the BBC science-fiction television series that has been a staple of geek love more or less constantly since its debut in 1963.

The show features a character called the Doctor (it’s a name, not a title) who travels around Earth and other places in the Universe in a time-space machine called a TARDIS (an acronym for Time And Relative Dimension In Space), which, due to a malfunctioning chameleon circuit, is permanently stuck looking like a 1960s London police box. This gives it the appearance of a blue phone booth that is, like a Bag of Holding, much bigger on the inside than it is on the outside.

TARDIS prop used in the Doctor Who series

The original Doctor Who series ran from 1963 to 1989 and featured seven different actors playing the titular character. Instead of employing the James Bond method (ignoring the issue), the show uses a bit of science-fiction magic to explain this. The Doctor, you see, is from the planet Gallifrey, and when a Gallifreyan’s body becomes too damaged to heal, it will attempt to regenerate. An unsuccessful regeneration results in a dead Gallifreyan, but a successful one results in a new actor standing up and saying “Hullo. Where am I? Who am I? Oh, right. I’m me. I’m hungry. But for what, I’m not sure…”

It’s a bit of a corny device to keep the show going when the main player bows out, but like an Apple product, it just works. During the original run, the Doctor was played by William Hartnell, Patrick Troughton, Jon Pertwee, Tom Baker, Peter Davison, Colin Baker, and Sylvester McCoy; Paul McGann took over in the 1996 TV film.

The Eleven Doctors (L-R Top: William Hartnell, Patrick Troughton, Jon Pertwee, Tom Baker; Middle: Peter Davison, Colin Baker, Sylvester McCoy, Paul McGann; Bottom: Christopher Eccleston, David Tennant, Matt Smith)

In 1996, an attempt to restart the series after its 1989 cancellation featured actor Sylvester McCoy, the last-known Doctor, regenerating into Paul McGann in the TV film Doctor Who. The film was not as commercially successful as producers might have hoped, so the idea of using it as a springboard for a renewal of the Doctor Who TV series was killed.

Then, in 2005, writer/producer Russell T. Davies convinced the BBC to let him have a go at rebooting the series. Christopher Eccleston was cast as the Doctor’s ninth incarnation, and the melancholy master of time and space found a niche with fans—both in the UK and in the US when the show was broadcast on cable’s BBC America station.

But Eccleston left after just one year, and David Tennant was cast to replace him as the Tenth Doctor. In the years since, Tennant has been voted “Best Doctor” by fans on numerous occasions, and seems to have even replaced the former favorite, Fourth Doctor Tom Baker, in most fans’ hearts.

After three wonderful years, Tennant decided to leave Doctor Who. And since he’d been cast as the Royal Shakespeare Company’s official Hamlet, who could honestly blame him? That’s not a chance everyone gets, and for Tennant to say “No thanks. I’d rather be the Doctor than Hamlet” would be neither expected nor reasonable.

David Tennant as Hamlet. Photo: Royal Shakespeare Company, Ellie Kurttz/AP.

Since producer Russell T. Davies was leaving the show along with Tennant, it looked to fans like this might be the end (again) of Doctor Who.

The show has been saved, in my opinion, by two things.

First, when Davies left, his duties as Producer, Show-Runner, and Head Writer were filled by Steven Moffat. Moffat’s previous credits include the amazing ‘relationships’ show Coupling, the TV drama series updates of Victoriana staples Jekyll (based on The Strange Case of Dr. Jekyll and Mr. Hyde by Robert Louis Stevenson) and Sherlock (based on the collected works of Sir Arthur Conan Doyle featuring Sherlock Holmes), as well as several of the most popular episodes of Davies’ rebooted Doctor Who series.

And then, there’s Matt Smith.

Matt Smith as the Doctor

Smith is a baby. At only 27, he’s the youngest actor ever to play the Doctor. He has a mop of mouse-fur-colored hair that resembles a horse’s forelock. He’s gangly. He’s not particularly good-looking. But he may be the best Doctor, well, ever.

Why is Smith so good? And why is he more “my” Doctor than Tennant, Davison, or even Tom Baker?

Well, I’ll admit that when I first saw Tennant as the Doctor, I actually laughed out loud. I initially viewed the casting of a heart-stoppingly handsome man as my favorite Time Lord as yet another example of the warped Hollywood aesthetic infringing on my beloved Britain. “British people,” [says a woman who’s never been there] “aren’t drop-dead gorgeous as a rule of thumb. They look like normal people.”

Ahem. Maybe a bit naive of me, I’ll admit. And before I get a slew of angry emails, I’m not trying to slander the beauty of Britain. There are undoubtedly many attractive people in England, Scotland, and Wales (Christian Bale and Prince William immediately come to mind) but I just didn’t imagine anyone cast as the Doctor would, or should, be one of them.

My first Doctor was Tom Baker. He was not the most attractive man on television, even by the late-1970s standards that made Telly Savalas into a sex symbol. Baker had a biggish nose and a gap in his front teeth. He wore strange clothes. In fact, with his huge coat and looooooong scarf, and his ’70s curly ’fro, à la Roger Daltrey, he looked rather…alien.

Which is, of course, the point. The Doctor is NOT a human being. 


“But, you look human!” ~ Lady Christina
“Well, we came first, so actually, you look Time Lord.” ~ The [Tenth] Doctor 
Planet of the Dead, 2009

I think the key to Matt Smith’s amazing portrayal of the Doctor lies in the fact that the Doctor isn’t supposed to be normal. He’s a bit of an oddball. Some of the Doctors have achieved this otherworldliness better than others. Some have been more personable than others. Some have been fun, others broody. But until Smith, none had managed to pull off all of the traits you’d expect in a 900-year-old alien.

The First Doctor, William Hartnell, achieved the proper degree of alienness, but was a bit too stiff and formal to ever be considered fun.

The Second Doctor, Patrick Troughton, was a bit too ‘evil-eyed-mad-scientist.’ And he had a Three Stooges haircut. Not his fault, but still…

The Third Doctor, Jon Pertwee, was a little more fun, and almost achieved the well-roundedness of character I wanted to find, but was hindered by his time period. Early 1970’s production values and some truly awful scripts probably hurt Pertwee more than his acting or interpretation of the character ever could have.

The Fourth Doctor, Tom Baker, was my favorite for many years, because he managed to bring the seriousness needed in dramatic and life-threatening script situations, but followed them up by offering a Jelly Baby with a cheerful sincerity that made you think “Hell, the world may end, but at least we have candy.”

The Fifth Doctor, Peter Davison, was the youngest Doctor yet, and his vanilla ice cream suit and cheery outlook modernized the Doctor quite a bit. He was kind of dim though, and not as quick-witted as some other Doctors.

The Sixth Doctor, Colin Baker, seems to have been paralyzed by coming in the wake of what were, at the time, the two most popular Doctors ever. His portrayal is a mashup of the other Baker and Davison. Unfortunately, he was never able to find his own footing in the character, and his Doctor is a bit lackluster as a result.

The Seventh Doctor, Sylvester McCoy, was a Doctor losing his mind. He started as a kind of buffoon, but that humor quickly gave way to a dark, introspective character who seemed on the verge of a breakdown.

At that point, the series was cancelled until the 1996 TV movie.

The Eighth Doctor, Paul McGann, is positively VICTORIAN. He would have been much more at home in a Sherlock Holmes adaptation than in a desperate drive to jump-start a failing science fiction series.

And after Davies’ reboot in 2005,

The Ninth Doctor, Christopher Eccleston, is depressed. Not just moody, but full-on, ‘you-need-whatever-Gallifreyans-take-instead-of-Prozac’ sad. Granted, he’s the last of his kind, and his planet has been destroyed and time-locked, but he’s just…not much fun to watch. It’s like watching Morrissey do Macbeth.

The Tenth Doctor, David Tennant, is beautiful. I tried desperately to NOT watch the show when Tennant came, because, as I said before, I thought he was too pretty and was probably a bit of fluff. Tennant surprised me by being, not just a good actor, but actually very interesting as the Doctor. His journey begins as a madcap jaunt across the universe and ends as a Shakespearean tragedy. Beautiful, moving, and breathtaking.

And Smith. Eleventh Doctor, fun when he needs to be, silly when called for, smart, devilishly clever, kind, personable, and young. This Doctor is at home doing stage magic, playing “futbol,” or saving the universe. He’s a little ugly, a little weird, and very, very charming. 

In short, he’s a lovable alien, and more the Doctor than all of his predecessors.

Best. Doctor. Ever.


Dylan, Poe, and Me

So one early morning I was driving along downtown.

The Dalles is your standard medium-sized rural Oregon town. We have 12,000 people, one zip code, and 79 churches.

Occasionally, we have famous people in town. Kurt Russell once took a limo through the drive-through at the 6th Street Coffee Company. Harrison Ford has been known to land his plane at the Dallesport Airstrip, which usually services crop dusters and the like. Wa spotted Kevin Costner at Rite-Aid buying a cheap watch while on his way to go fishing on the Deschutes River. Billy Idol supposedly stopped here for French onion soup from the Baldwin Saloon, but that story came from a less-than-reliable source, who “saw the whole thing” from a block and a half away.

But Bob Dylan was really here. I saw him. In fact, I almost ran him over.

Dylan was in town a few years ago when he played at the Maryhill Winery just east of here. But there really aren’t any hotels of quality out there. When it comes right down to it, there aren’t any hotels of quality here either, but Bob Dylan cut his teeth in New York, and he wasn’t always the most famous person in the world to be mistaken for a homeless man. He was once a young struggling songwriter in New York City, and has no doubt stayed in some dubious shitholes.

As I said, it was early morning, I was driving downtown, and this little old man stepped off the curb in front of me. I slammed on the brakes and the guy walked on.

My brain has this quality. Maybe it’s unusual, or maybe everyone does this. I don’t know. But I usually think about 3 or 4 or 5 things at once. The thoughts don’t seem to get confused unless I try to explain them, but if I’m just thinking it’s perfectly clear. So here’s what my brain was thinking in that moment:


Got to turn left at the next street to get to where I’m going on time.

Is that guy going to step off the curb? I’m not sure he even sees me. Hit the brake…

That guy sort of looks a little bit like Bob Dylan. Didn’t he play at Maryhill last with some other…

Is that a tour bus parked in the parking lot at The Dalles Best Western? I wonder why it’s…

OH MY FUCKING GOD THAT’S BOB DYLAN!!!!!

Accustomed as I’m sure he is to heavy New York traffic, he took no notice of me and kept on crossing the street. I continued where I was going.

Now, you have to understand that Bob Dylan is not the first rock star I’ve tried to kill accidentally. I almost killed Poe too.

In 2001, Depeche Mode was touring in support of their Exciter album, and they invited Poe to open for them on their North American tour. She was promoting her album Haunted. That August, Wa and I drove to the Gorge Amphitheater in George, Washington to see them. It’s a long trip, and we were hungry, so we stopped at a restaurant that was ominously named “The Golden Harvest.” It sounded to me like the diner where everybody gets killed in a Stephen King novel, and that should have tipped me off.

I had blueberry pancakes, hash browns (with ketchup), a glass of orange juice, and one egg over hard. I know exactly what I ate because I saw it all again.

Take my word on this one: the only thing worse than being sick is being sick in public.

I started feeling bad at about 6 that evening, and as Poe was playing her songs, I became more and more nauseated. Eventually, although I didn’t want to leave, I felt I had to go find somewhere to throw up, because I knew it was on the way. I tried to make it to the bathrooms.

I failed.

I failed on the amphitheater’s stairs. I failed all over my brand new Dragonflies t-shirt. I failed in front of about 100,000 people.

The worst part wasn’t even getting sick. It was the embarrassment of hearing people around me ask if I’d had too much to drink. For a lifetime teetotaler, just about the worst thing that can happen is for people to think you’re drunk, and to not be believed when you tell them you have food poisoning. “OH!!! She has ‘food poisoning!’” One guy actually said that, and even made little finger quotes in the air around the words “food poisoning” to show that he didn’t believe me. For a person who has spent their entire life trying to stop other people from drinking alcohol, being mistaken for a drunk is like being a Christian accused of attending a Satanic ritual. It’s pure anathema, and it hurt me more than the pains shooting through my stomach.

I drank as much water as I could and threw up again in the bathrooms. Throwing up in public toilets is an experience I hope to never repeat.

I felt a little better, so I returned to my seat. By this point, Poe was singing her last song, the title track from “Haunted.” She decided to perform her first stage dive. The crowd caught her, and lowered her to the floor. She began running up and down the aisles with her cordless microphone, singing the end of “Haunted” over and over again while the band on stage played on.

“Do do do do. Do do do do.”

Up and down the aisles she went, performing impromptu duets with enthusiastic audience members. She came up our aisle. She moved toward the stairs.

Yes. Those stairs. I saw it in my mind in slow motion before it happened. Poe, all 6 foot tall, beautiful green-eyed, slender 120 pounds of her was going to slip on my puke and die. She was going to step in my blueberry pancakes. And my hashbrowns. And my orange juice. And my one egg over hard. And she was going to skid backward, fall, and crack her lovely blonde head on the pavement. Poe would die because I had dared to eat at the Golden Harvest.

“Do do do do. Do do do Aaaah!”

She barely skipped a beat. She caught herself with her free hand, grasping at the hand rail and spinning around as if it was all intentional.

“Do do do do. Do do do do.”

She turned around.

“Do do do do. Do do do do.”

Back to the stage she went.

“You’ve all been fantastic,” she said. “Except the drunk guy who puked on the stairs.”


Star Trek

I actually went to see this on the first day, but just haven’t had time to post my review until now.


“Star Trek”
Directed by:  J.J. Abrams
Written by:  Roberto Orci, Alex Kurtzman
Based on:  Star Trek original series by Gene Roddenberry
Bad Robot, Paramount Pictures, Spyglass Entertainment


Stardate: 5.07.09

I am receiving a distress signal. It is originating from the vicinity of my heart.

This is not my mother’s Star Trek. It’s not even *my* Star Trek. But somehow, I think that might be okay.

J.J. Abrams, creator of Lost and Alias, has taken the characters from the original Star Trek series and pulled them into a rebooted universe that has the flavor of the original and the flair of some of Abrams’ showier work. Through clever use of the usual sci-fi suspects, time travel and alternate universes, Abrams gives himself the freedom to work within the framework of the classic television and film franchise without being too constricted. And the result is a film that is finely crafted, honest to the original work from which it is drawn, and fun.

The details are impeccable, from the hideous pea-green uniforms to the 1960s retro-future ship’s bridge. If we were living in the future 40 years ago, this is definitely what it looked like. Most importantly, the film stays on track by keeping the characterization of our beloved sci-fi heroes intact and human. There are no caricatures here, only carefully crafted impressions of the originals.

The relatively unknown Chris Pine (Just My Luck, Smokin’ Aces) is a superb James T. Kirk. Pine’s young Captain is a boyish, brash, charming, and irascible brat with a chip on his shoulder and fire in his belly. He also manages to pull off Kirk’s fabled womanizing without looking like a fickle jerk—no small feat.

Zachary Quinto (Heroes) is so astounding as the half-human, half-Vulcan Spock that it’s hard to believe this is his first feature film. His characterization is so spot-on that even in a scene the young actor shares with the venerable Leonard Nimoy, where the young and elderly Spocks come face-to-face, the audience is left breathless and impressed.

All the actors manage to imbue their characters with subtle traits created by the original actors, but still make the performances enough their own that we don’t feel we’re watching someone imitate another actor. An excellent supporting cast, including Bruce Greenwood as Captain Christopher Pike, John Cho as Sulu, and Simon Pegg as Scotty, rounds out the set of heroes.

Still, for all its red shirts, dilithium crystals, and warp drives, Abrams’ Trek bears the same resemblance to Roddenberry’s original series that “Eleanor Rigby” bears to “I Want To Hold Your Hand”—it’s the same band, but a non-fan might not know it if you didn’t tell them. Abrams has pulled off the remarkable feat of staying true to the original work but making an entirely new, unknown story.

The film’s one truly weak link is Eric Bana as the tepid villain Nero. Bana is no Khan. Hell, he’s not even *Chaka* Khan. Thankfully, Bana’s scenes are few and short, so audiences are mercifully spared the long-winded statements of terrible purpose usually imposed by such tyrants. And because of this, the film is spared from what is usually Bana’s crowning achievement: the single-handed destruction of every film he’s ever made.

And that distress signal? It stems from a sense of loyalty that seems to be wavering. The original Star Trek series is, and always has been, my favorite. But I feel myself being pulled towards Abrams’ rebooted universe like a spaceship drawn into a black hole. For now, I’ll say I’m keeping my options open. But let’s hope that the newest “old generation” of Star Trek lives long, and prospers.