Saturday, July 30, 2011

Whoever owns the rights to these posters could make a fortune...

No long post here, just a comment about a couple of images I stumbled across during some aimless surfing for vintage movie posters (something I do from time to time).


Having spent time at college, I just have a gut feeling that if this poster were made widely available it would become a ubiquitous fixture in every stoner's dorm room across the country. I imagine it would have been especially popular during the 1960's and 70's when everybody would have been steeped in the sort of conservative cultural milieu that John Wayne represented and appreciated the juxtaposition of this icon of Americana and the hippie drug of choice. It has the same sort of appeal as the already popular "Nixon Bowling" and "Grow Hemp For the War" posters.

What makes it for me is the ambiguous nature of John Wayne's expression and body language. He could be fighting the evil dope peddlers and fiends, or all hopped up on the wacky weed, or, as I prefer to imagine, giving an arm-swingingly enthusiastic endorsement along the lines of: "Gee, golly! Marijuana sure is swell!!!"

Likewise, for those collegians who already possess refrigerator magnets of kitschy 1950's women with ironic captions (this demographic includes everyone from party girls to nerds who want to assert that, yes, they too have a naughty side, thank you very much), many may enjoy this vintage movie poster:


I'm telling you, if these posters were sold wherever Scarface, John-Belushi-wearing-a-"College"-sweatshirt, Pink-Floyd-album-covers-on-the-backs-of-naked-ladies or Le Chat Noir posters are, then they would sell like hotcakes.

Thursday, July 7, 2011

Luddite? Depends on your definition.

In my last post, wherein I ranted about why I'm not a fan of e-books, I offhandedly referred to myself as a 'Luddite.' ["As e-readers become more ubiquitous (as I'm sure they will, despite the grumblings of Luddites like myself)..."] This led to my wife questioning me on whether that was a hypocritical thing to call myself, being as I had just posted a link to Project Gutenberg, which she viewed as me promoting the very thing I claimed to be against. So for the record, let me state two things:

1) If I link to something, I do not necessarily approve of it, nor am I trying to promote it. I consider hyperlinks to be the internet equivalent of a footnote. I use them primarily to allow curious readers to find out a little more information, while less curious readers may just continue reading. (Although in this particular case, I do happen to support Project Gutenberg.)

2) I do consider myself a Luddite in the sense that I am often resistant to new technology that changes things in a way that I perceive to be negative (and, likewise, I happily embrace technology that I view as improving things). To use one example, I steadfastly maintain that 35mm film produces a superior image to digital capture. As a professional videographer, the realities of the market have forced me to work mostly in HD since I left film school, but my preference is still to shoot film whenever possible. I have also sometimes used the term 'Luddite' to describe myself due to the fact that, despite working in a technology-dependent field, I am often the last to adopt new technologies. I tend to be stubbornly old-school in my tech habits.

This second point got me thinking about the various ways in which the term 'Luddite' is used and abused. There are several legitimate definitions for 'Luddite' which differ from each other, as well as many ways that the term is frequently misapplied. Language, of course, is infinitely malleable and constantly in flux, and so varying uses and definitions of a word are to be expected. That being said, although new uses for words are constantly being found, there are times when a word is used incorrectly. In order to clearly communicate, there must be some standard definitions. So...

What is a Luddite?

First, let's ask the Internet.

One of the first images that Google brings up for 'Luddite.'

Of the more respectable online dictionaries, Merriam-Webster offers a typical definition:
: one of a group of early 19th century English workmen destroying laborsaving machinery as a protest; broadly : one who is opposed to especially technological change
This is the standard dictionary definition of the term, but this hardly encompasses the range of meanings the word has assumed. Other common uses of the word are summed up by whatis.com:

1. A Luddite is a person who dislikes technology, especially technological devices that threaten existing jobs or interfere with personal privacy.
2. A Luddite is someone who is incompetent when using new technology.
By the first definition, an author who still prefers a manual typewriter can be called a Luddite; so can a projectionist who sees his job threatened by "foolproof" digital projectors. By the second definition, your grandmother who can't program her VCR is a Luddite (for that matter, the fact that she still uses a VCR also makes her a Luddite).

So aside from the historical Luddites, a Luddite can be someone who is opposed to technological change, someone who is against technology that threatens people's jobs, someone who is technologically incompetent, or merely someone who dislikes technology. But wait, there's more...

From the user-contributed Urban Dictionary:
1. One who fears technology (or new technology, as they seem pleased with how things currently are...why can't everything just be the same?)

2. A group led by Mr. Luddite durring [sic] the industrial revolution who beleived [sic] machines would cause workers [sic] wages to be decreased and ended up burning a number of factories in protest
 A luddite [sic] generally claims things were "just fine" back in the day, and refuses to replace/update failing equipment/software/computers on the basis that they were just fine 10 years ago.
So according to this person, Luddite is synonymous with technophobe, as well as referring to the type of stick-in-the-mud who still runs Windows 95 or listens to 8-track tapes (depending on the age of the "Luddite" in question).

This assortment of definitions encompasses such a broad range as to render the word fairly meaningless. This is evident in some of the usage that Google turned up. For example...

A British politician commenting on Prince Charles' concerns about genetically modified food:
It's an entirely Luddite attitude to simply reject them out of hand.
An American politician commenting on George W. Bush vetoing a stem-cell research bill:
This will be remembered as a Luddite moment in American history, where fear triumphed over hope and ideology triumphed over science.
The bloggers Bottom of the Glass have offered their own definition, accompanied by the cartoon below:


 Luddites are the Amish. They are anyone who, at any point in time, drew a line and determined that all technology and modernization up to said point was acceptable while all of it beyond said point was evil, deplorable, of the devil, whatever. They are people who bury their head in the sand and wish new things would just go away...The goal of the Luddite is merely to freeze time, freeze assumptions, freeze change. And they seethe and growl at those who attempt to move things forward.

So which of these definitions and usages is the most accurate? And what does any of this have to do with the folks who smashed and burned textile machinery in England two hundred years ago? Perhaps we would do well to learn a little more about...

The Historical Luddites

[Note: The following information comes from a smattering of websites I perused, some of which can be found here, as well as the old standbys of Wikipedia and Brittanica Online, but my chief source was historian Kevin Binfield's site Luddites and Luddism: History, Texts, and Interpretation.]

Frame breaking (1812)
On March 11th, 1811, a bunch of Nottinghamshire stockingers (skilled artisans who knitted stockings) gathered to protest lowered wages, rising food prices, widespread unemployment and the hiring of unskilled laborers, as they had (peacefully) many times over the previous month. This particular evening's meeting ended with a bunch of the stockingers smashing the wide-frame looms (a semi-automated knitting machine that could be used by unskilled laborers) at a local shop which epitomized their grievances. This sparked a frame-smashing fever in the area, and the Nottingham Journal reported several weeks of nightly frame-smashing throughout villages in Nottingham County, as well as arson and other forms of vandalism directed at textile mills. A typical account from the time relates that, "2000 men, many of them armed, were riotously traversing the County of Nottingham."
King Lud.

These stockingers were not, as the misinformed user of Urban Dictionary claims, "led by Mr. Luddite;" but they soon began to be known as Luddites because the frame-smashings were usually attributed pseudonymously to one Ned Lud. According to the Oxford English Dictionary, the real Ned Lud was a fellow from Leicestershire who, in 1779, had destroyed a stocking loom "in a fit of insane rage." We don't know much more about the guy than that, but people had apparently remembered his act of senseless destruction enough that "Lud must have been here" had become a catchphrase used whenever machines broke down. By late 1811, letters full of demands were being sent to textile mill owners and local papers signed by "Ned Lud," "General Lud," "Captain Lud," or, most famously, "King Lud." The character began to evolve into a Robin Hood-like folkloric figure. Protest leaders began dressing in outlandish getups and claiming to be King Lud, while the crowds shouted treasonous chants such as, "No King but Lud!"

Luddite protests, riots, and acts of sabotage spread from Nottingham County throughout the English Midlands, as well as spreading to related industries such as cotton manufacturing. This continued in sporadic bursts through 1816. A few of the Luddite demands were met (certain trade restrictions were rescinded, moderate wage increases were gained for some and food prices lowered slightly), but for the most part they met with stiff government reprisals. The Army was dispatched to quell Luddite riots, and for a time there were more British soldiers engaged with the Luddites than there were fighting Napoleon. In 1812, Parliament passed an act which made all industrial sabotage or "machine breaking" a capital offense. In 1813, seventeen Luddites were executed under this act and several hundred more were transported to the penal colonies in Australia and Tasmania. Three more men were executed for the murder of a mill owner later that year. The movement began to falter quickly thereafter.

The conventional telling of these events holds that the machines destroyed by the Luddites were new technology and thus perceived as newly threatening. The notion that the Luddites rebelled against a new innovation is central to the way the word is typically used today. This, however, was not the case. The stocking frames which the Luddites were smashing had been invented way back in 1589. Fearing that the invention would cost English stockinger jobs, Elizabeth I and James I had banned the machines in England, but from the 1660's onwards they had been playing an increasingly large role in the British textile industry. By the time the Luddites began smashing them there were over 25,000 of the machines in place in Nottingham County, and most of them had been there for many years.

So what provoked the Luddites in 1811? Essentially, there had never been a worse time to be a trained stocking-knitter in England owing to a whole host of factors. The mechanized stocking-frames had been putting skilled artisans out of work and driving down pay steadily for 200 years, but in the first decade of the 19th century several more blows came all at once. The first was this guy:


The Napoleonic Wars forced Great Britain into a massive depression from 1808 onwards. Hardest hit were those that produced commodities. Since Parliament had relented to pressure from the nation's wealthiest to eliminate the income tax, the only way to pay for the war was with high commodity tariffs and taxes as well as increased fees and sales taxes. The burden of all this hit the working classes the hardest. The government also attempted to get back at France by banning trade with them and all countries friendly to France (much of Europe, especially as Napoleon conquered more). This meant no foreign markets to sell textiles in.

As the wars dragged on, Britain began to recall its Army and focus its efforts on its navy and on subsidizing foreign allies, adding to the already sky-high unemployment when hundreds of thousands of soldiers returned to England in need of work. Wartime food rationing, several years of crop failures, and price-support laws designed to protect farmers led to food prices being at an all-time high just as wages and employment were at an all-time low.

The hardships brought by the Napoleonic Wars also came at the culmination of several centuries worth of economic and social change as the English economy shifted from Mercantilism to Free Market Capitalism, and as the Industrial Revolution continued to gather steam (no pun intended). Once upon a time, knitting had been a prime example of a cottage industry, where skilled artisans worked from home. Stockingers, like all skilled artisans, typically served an apprenticeship of seven years (as required by law). The introduction of automated machinery in the mid-1600's meant that mill owners could get away with hiring unskilled workers for dirt cheap, rather than paying apprenticed artisans the wages they demanded; however, for just this reason trade guilds successfully petitioned politicians to write laws requiring workers in traditional trades to serve their full apprenticeship and requiring mill owners to hire apprenticed artisans. With the expansion of colonialism (and thus the demand for goods and availability of resources), the textile industry shifted slowly from a cottage industry to the factory system in the 18th century. However, even though textiles were now made in factories using machines such as the stocking frames, it was still mostly skilled artisans who were doing the work, thanks to government protection and legislation as well as the influence of the trade guilds. Additionally, the economy was still largely a controlled Mercantilist economy with prices stiffly regulated and wages protected by orders of Parliament and the Privy Council.

At the turn of the 19th century this old economic system was breaking down and the tightly controlled economy of years past was giving way to modern free market capitalism. By the time of the Luddites, laws concerning price and wage regulation had been revoked, and mandatory seven-year apprenticeships and required hiring of artisans were also no longer in place. The invisible hand of supply and demand now ran everything, and Britain's labor force shifted to being largely composed of hired unskilled laborers. With the economic hardships brought by the Napoleonic Wars, mill owners were forced to cut costs wherever they could, typically by slashing wages and purchasing more machines so that they could hire unskilled laborers rather than skilled ones.

The Luddites were skilled artisans who had gone through seven years of apprenticeship only to find that they were competing for the same jobs as thousands of unskilled day laborers. These jobs had wages so low that, with rising food prices and widespread famine, it was increasingly hard to avoid the workhouse. This is what they were protesting. They were not technophobes, but labor organizers. The industry was scattered enough to make something like a general strike difficult and impractical if not impossible; destroying machines was one way to make their demands heard and assert some leverage over their employers. It was an example of what historian E. J. Hobsbawm calls "collective bargaining by riot." This is well-evidenced by the various texts left by the Luddites: machine-smashings were always preceded by letters making specific demands about wages and hiring practices, and the mill owners who met those demands (and a fair number did) did not have their machines destroyed.

It is especially worth noting that the Luddites were hardly the first to smash machines for this reason. Industrial sabotage has a long and illustrious history. As Kevin Binfield points out:
For example, in 1675 Spitalfields narrow weavers destroyed "engines," power machines that could each do the work of several people, and in 1710 a London hosier employing too many apprentices in violation of the Framework Knitters Charter had his machines broken by angry stockingers. Even parliamentary action in 1727, making the destruction of machines a capital felony, did little to stop the activity. In 1768 London sawyers attacked a mechanized sawmill. Following the failure in 1778 of the stockingers' petitions to Parliament to enact a law regulating "the Art and Mystery of Framework Knitting," Nottingham workers rioted, flinging machines into the streets. In 1792 Manchester weavers destroyed two dozen Cartwright steam looms owned by George Grimshaw. Sporadic attacks on machines (wide knitting frames, gig mills, shearing frames, and steam-powered looms and spinning jennies) continued, especially from 1799 to 1802 and through the period of economic distress after 1808.
There were many similar revolts after the Luddites as well, including the Pentrich Uprising of 1817 (a demonstration of several hundred unemployed stockingers, quarrymen and iron workers) and, most notably, the Swing Riots of 1830-31. The Swing Rioters were farm laborers struggling against cripplingly low wages who destroyed threshing machines and threatened those who had them in order to make their demands heard. Like the Luddites, the Swing Rioters sent pseudonymous letters signed by a mythical "leader" of the revolt, in this case the entirely fictional Captain Swing. Some samples from the letters:
Sir, Your name is down amongst the Black hearts in the Black Book and this is to advise you and the like of you, who are Parson Justasses, to make your wills. Ye have been the Blackguard Enemies of the People on all occasions, Ye have not yet done as ye ought,.... Swing
And...
Sir, This is to acquaint you that if your thrashing machines are not destroyed by you directly we shall commence our labours. Signed on behalf of the whole ... Swing
On an unrelated note, Captain Swing has recently been reborn as a steampunk master criminal who can spontaneously generate electricity and fights space pirates (or something) in a comic book by Warren Ellis. He also lent his name to a bizarre 1960's Italian comic book (fumetti) and an even more bizarre Turkish film adaptation.
 
I'm not making this up.

Anyways, returning to my original query, the question remains: How did we get from a 19th century labor revolt to the modern usage of the word?

How the term 'Luddite' was transformed

I believe the main reason why the Luddites, out of the many industrial saboteurs and machine-breakers throughout history, have been singled out as an example is that there is a specific word to refer to them. In the grand scheme of history the Luddite riots were not all that unique, but talking about the "Swing rioters" or "the 15th century Dutch workers who protested new automated looms by throwing their wooden shoes into the machinery" just isn't as direct as saying 'Luddite.'

As near as I can tell, the endurance of the term is due mostly to neoclassical development economists, who seized upon the Luddites as a rhetorical device that could be used to illustrate something they termed the "Luddite Fallacy."  According to these economists, the "Luddite Fallacy" is the mistaken belief that labor-saving technology will lead to fewer jobs. I believe this seriously oversimplifies the actual history of the real Luddites, but it has been an effective enough rhetorical example that anyone from the early-20th century onwards who has taken an Econ 101 class has encountered the term. This kept the word 'Luddite' circulating in the language and perpetuated the oversimplified story of the Luddites as people opposed to new technology.

From the economists the term Luddite was transmitted to the college radicals of the 1960's, some of whom began to identify themselves as Neo-Luddites. The sixties were a time when many leftist and anti-establishment groups happily claimed to be modern-day heirs to the Luddites. Neo-Marxists came closest to the original Luddites' aims, praising the workers for seizing the means of production, and Neo-Leninists and Anarchists praised the Luddite vandalism as 'propaganda of the deed.' However, those that most successfully adopted the term (or had it derogatorily applied to them) were those with objections to technology.

It was a great time to be anti-tech: the hippie counter-culture was in full swing, the conformism of the 1950's was seemingly embodied by new technology such as television and home appliances, and the Cold War threat of nuclear holocaust and the napalm bombings in Vietnam were on everyone's mind. This was the era that saw the flowering of such schools of thought as anarcho-primitivism and environmentalism. The admiration Thoreau and Emerson felt a century earlier for "nature unadorned" was being echoed throughout popular culture; it was a time to "unplug" and "get back to the land." 'Luddite' became firmly cemented in the English language as centrally relating to technology. The economic and social hardships being fought by the original Luddites were all but forgotten. The important thing to remember was that they smashed machines.

Neo-Luddite.

The Next Frontier for Luddism

And so now we have the word as used today, with about eight different varying definitions, all in agreement that a Luddite hates new technology, something that was not necessarily true of the original frame-smashers. With this ambiguity in language, I think a new type of Luddite may currently be arising: the Language Luddite.

The English language used to be (at least, from the 1600's on) in the hands of the few and the highly trained, like the skilled and apprenticed textile artisans of old. These were the English professors, professional grammarians and the dictionary editors. It used to be a big deal when the staff of the Oxford English Dictionary added a new word due to common usage, or changed a definition owing to officially recognized shifts in language. It might still be a big deal, if anyone still consulted the Oxford English Dictionary for reasons other than trying to sound smart. But they don't.

The OED: Helping people sound smart and needlessly lengthen research papers since 1895.

Just as the economic climate of the textile industry had been shifting for some time before the Luddites, language snobs have always been fighting a losing battle against constant shifts in "proper" language use (as chronicled in Jack Lynch's bestselling The Lexicographer's Dilemma). But language police today face something as catastrophic to them as the Napoleonic Wars and repeal of protective legislation were for skilled stockingers: the Internet.

Dictionary, thesaurus and reference book sales have been plummeting in recent years, thanks to the Internet. No dictionary, not even the "Unabridged" varieties, could ever hold every word, but the Internet comes as close as anything yet. Besides, why would you need to own a dictionary when you can just search for the word and find dozens of definitions from dozens of sites? You don't even have to spell the word correctly; Google will still know what you mean.

The old standbys such as the OED, Merriam-Webster's and The American Heritage Dictionary are all online for free. But like the 19th century artisans, these reputable old institutions are competing with hundreds of non-professional, user-contributed sites such as Wiktionary, Whatis.com or Urban Dictionary. Also due to the Internet, the print business is in a slump, publishers and newspapers are cutting staff, especially editors (they'll always need people to produce content, but any computer can do a spellcheck), and serious journalists compete with bloggers. Those who would seek to maintain an iron grip on the purity of language are in dire straits. Just Google "all right or alright" and have a look at today's intense lack of consensus over how to properly speak and write the English language.

There must certainly be many language purists who are attempting to resist the changes being wrought by the Internet. The widespread confusion over a term like 'Luddite' must irritate them to no end. Surely, if anyone deserves to apply the term Luddite to themselves, it would be these "Language Luddites." Of course, they never will. They have too much respect for the term in its original and specific context.

Wednesday, June 29, 2011

Are e-books and unread, leatherbound classics the way of the future?

Anyone who knows me knows that I have two homes: the house I own and the local Barnes & Noble. It has comfy chairs, books to peruse, coffee, and is a place with clean enough floors that my one-year-old daughter can crawl around and explore without too much cause for worry.

Over the last few years, more and more of the space inside the store has been devoted to selling the Nook, Barnes & Noble's e-reader. Now, to get to the books or the coffee, one must pass through a phalanx of Apple Store-esque displays and proselytizing employees urging you to join the e-reader revolution. At first, I thought this was odd; it seemed like a bookstore would want people to buy books, after all. It seemed akin to attending a movie theater where employees at the box office urge you to go home and start a Netflix account instead of purchasing a ticket.


Of course, Barnes & Noble would not be pushing the Nook if it didn't see big profits in the future. Selling physical books at physical locations costs much more than selling a digital download, and with e-book prices not significantly lower than those of physical books, that translates to much more profit. As far as Barnes & Noble is concerned, they would love it if the stores merely served as showrooms where visitors could browse and select which e-books to download, and that is the direction they are trying to steer their customers.

Although I understand their logic, I think the chain is shooting itself in the foot. Barnes & Noble hopes that visitors will wander the aisles, Nook in hand, and download whatever catches their eye using the complimentary Wi-Fi. The problem is that people can just as easily wander the aisles, Kindle in hand, and download the same product, often for less, from Amazon.com or other competitors. Amazon, unlike Barnes & Noble, maintains no physical retail locations and can usually cut costs a little more and still reap huge profits. And for public domain books, free PDFs proliferate on the Internet. How many copies of "Pride and Prejudice" or "The Adventures of Huckleberry Finn" do Barnes & Noble stores sell each year? Probably a lot. How many will they sell after everyone moves to e-readers and can find a free download at Project Gutenberg?

The other potential downside of e-readers, from the sellers' standpoint, is the increased potential for piracy. As e-readers become more ubiquitous (as I'm sure they will, despite the grumblings of Luddites like myself), the publishing industry will have to face more and more of the problems that have plagued the music and entertainment industry ever since Napster came along. When the last Dan Brown novel came out a couple of years ago, its publisher proudly proclaimed that a new era of publishing had been entered because it sold more e-books than hardcover copies. What they weren't keen to publicize was the fact that within 24 hours of the book's release over 100,000 pirated copies had been downloaded for free. Despite whatever encryption they offer or how many people are prosecuted, there will always be ways to download a free copy of the latest bestseller.

Personally, I'm not going to be purchasing an e-reader anytime soon for reasons that are hardly unique. Like many bibliophiles out there, I love the feel of a book, the tactile quality of turning the pages and feeling the weight shift as I get closer to the end. I love the ability to scan ahead a few pages and know how much of a chapter I have left. For non-fiction books, I am an obsessive underliner and note-taker.

Most of all, I like having books on my shelf. This may be shallow of me, but I love having a collection of physical objects on display representing the various texts I have consumed. I love being able to scan across the titles and fondly recall my favorite books. I love being able to take an old favorite down from the shelf and flip through it, briefly reliving the experience of reading it before putting it back. And the OCD part of me loves organizing the books. Filmmaker and bibliophile Guillermo del Toro, in an absolutely fantastic lecture posted to YouTube, summed up this impulse best: "We are animalistic creatures," he said. "We need talismans." Del Toro actually built a separate house just for his books (going into debt to do so), with seven libraries in seven rooms. "I'm a very, very organized hoarder," he explains.

Del Toro's library. As much as I love Barnes & Noble, I would rather hang out here.

Barnes & Noble, I think, is attempting to appease the talismanic nature of their e-book customers by ramping up their selection of "collectible" books. These include table after table of coffee table books, all on sale at bargain prices, as well as the Barnes & Noble Leatherbound Classics Series. These leatherbound classics are all very attractive and look the way a 'classic' should, and that's what they're designed to do. And that's all they're designed to do.


I've paged through some of these books. The bindings, attractive as they are, actually make for rather difficult reading. Most of these are fairly thick books (which makes them feel more important), and the bindings are not very flexible which makes reading pages towards the middle quite uncomfortable. The pages are stiff with gilt edges and do not turn very easily. To get a comparison with other leatherbound books, I walked over to the Bible section and paged through some of the deluxe leatherbound Bibles. The Bibles were all quite comfortable to page through. They were meant to be read and used.

The Leatherbound Classics, on the other hand, are not intended for reading (or so I've concluded). They are intended to look good on a shelf. These editions call to mind Mark Twain's saying, "A classic is a book which everyone wants to have read and nobody wants to read."

A true Jane Austen fan would rather have their own, easily readable copy of Pride and Prejudice than have to find it wedged in the middle of a stiffly-bound leather collection of seven Austen novels. However, someone who just read Pride and Prejudice on their e-reader and loved it so much that they want something to show for it might be interested in a leatherbound classic. They don't need to ever read it (that's what the e-reader is for) but there it is, on the shelf in its leatherbound glory, proudly proclaiming, "Why yes, I do enjoy reading Jane Austen. So much so, in fact, that some dog-eared paperback from high school English class just won't do. I admire Austen enough to purchase a collection bound in handsome Italian bonded-leather adorned with decorative endpapers, a ribbon marker, and other features which make this collectible edition a perfect gift or addition to any home library."

"I'm important. I have many leatherbound books, and my apartment smells of rich mahogany."

Similarly, for anyone interested in reading Shakespeare, Barnes & Noble has all of the plays available in very readable editions. According to their website, "Each edition provides new scholarship with an introduction, essays on Shakespeare's England and language, unusually full and informative notes, essays on Shakespearean theaters and significant performances, an interdisciplinary look at the work's influence on other arts, and an annotated bibliography for further reading." As fancy as that sounds, these are hardly the footnote-fattened tomes of the scholarly Arden editions; just basic editions of the plays with the sort of notes and preface you'd expect to find in any edition for modern readers. After all, it's nice to have a footnote or two when reading a 400 year-old play, being as words like "fardel" and "bodkin" aren't heard as often these days.

However, if you are less interested in reading Shakespeare than in having Shakespeare attractively displayed on a shelf, then the Barnes & Noble Leatherbound Classic Series Complete Works of William Shakespeare is for you. This handsome edition features no footnotes, essays or prefaces, but that's no matter, as it's unlikely anyone will be reading it. That's what the paperback and e-book editions are for. The leatherbound edition is there to proudly proclaim, "Yes, I value Shakespeare enough to have a suitably noble edition on my shelf. Not just copies of his more popular plays, mind you, but his complete works, for those times when I might want to read The Phoenix and the Turtle, Timon of Athens or all three parts of Henry VI."

If I had to guess which of the leatherbound books were the least read, I would put money on either Gray's Anatomy or The Assassination of Abraham Lincoln: A Tribute of the Nations. Gray's Anatomy has been through 40 different revisions and editions since its initial publication, but Barnes & Noble's leatherbound version is the original 1858 text. Therefore, it is definitely not for medical study, as the surgical field has advanced quite a bit in the last 150 years. It is, however, a book of interest to students of medical history as well as artists (owing to the evocative quality of its creepy engravings). I'm guessing, though, that most who buy this particular edition just want to appear well-educated. Likewise, The Assassination of Abraham Lincoln, an 1867 compilation of speeches given by various world leaders commemorating the life of Lincoln, may be a must-own for the most die-hard Lincoln fans but is probably low on the list of Lincoln books that might interest a casual reader. But it sure looks good on a shelf, especially beside Dante's The Divine Comedy or Homer's Iliad (both without footnotes, of course).

Of course, stacking your shelves purely with this sort of book makes it a little obvious that you're just trying to impress. For those who want to start a leatherbound classics collection but want to avoid appearing pretentious, Barnes & Noble has recently started incorporating more populist titles in the series. Some examples include The Ultimate Hitchhiker's Guide to the Galaxy by Douglas Adams, Jurassic Park/The Lost World by Michael Crichton and Wicked/Son of a Witch by Gregory Maguire.

Mark my words, this is next.

Having a leatherbound edition of an airplane read like Jurassic Park makes the idea that one actually reads the gilt tomes on display seem even more probable. And I'm sure anyone who buys it has read the book, years ago, in a $6.99 paperback. The collectible Jurassic Park/Lost World exists to justify the unopened copy of Wuthering Heights sitting next to it. "Look how well-rounded my tastes are!" it proclaims.

Lest I come off as sounding too negative about the Leatherbound series, let me be the first to say that I fully understand the appeal. I have often been tempted to purchase the leatherbound editions of The Complete Works of Edgar Allan Poe, since it seems so perfectly Gothic; and Grimm's Fairy Tales, since it reminds me of the books that you see being opened at the beginning of old Disney films. And I understand the desire to own a collectible edition of a favorite book, even if you never actually read that particular copy. For example, my wife and I are huge Agatha Christie fans, and have purchased several antique first editions as tokens of our fandom, which sit proudly on a display shelf, quite separate from our tower of well-worn Christie paperbacks.

I am also fully aware that this is hardly a new phenomenon. It is a tradition as old as books themselves. Books were expensive to produce, and with education reserved for the upper classes, they were initially quite the status symbol. I'm sure early book-buying involved a great deal of showing off.

The illuminated manuscript was the medieval way of saying, "My Bible is classier than yours."

As we move into the modern era, books became cheaper just as social classes became more malleable, and affordable yet classy-looking leatherbound volumes became a great way of asserting one's societal aspirations. This perhaps reached its zenith with book clubs such as The Franklin Library, where subscribers were sent a new book every month from a series such as The 100 Greatest Books Ever Written.

The Franklin Library: the only reason your Grandma owns a deluxe edition of Rabelais' Gargantua and Pantagruel. 

I'm not against leatherbound books, nor even e-books. I am merely pointing out the trend I have noticed: that the more big booksellers hype e-readers, the more they push "collectible" editions of books as well. The consumer is being increasingly pushed towards the dichotomy of the convenient-yet-intangible (e-books) and the tangible-yet-inconvenient (collectible books). I don't think that either will ever entirely replace the sort of tangible-and-convenient book that lines my shelves, but I do worry that the sort of book I love will become less and less available. The more e-books that are sold, the fewer cheap paperbacks will be printed. And the more collectible tomes that are printed, the fewer modest hardcovers will be available. As a stick-in-the-mud old-fashioned bibliophile, I find this cause for concern.

Friday, June 17, 2011

Childhood Filmmaking Attempts


My wife and I just saw Super 8 (which was fantastic, by the way; I would highly recommend it), and being as the protagonists are a bunch of kids committed to making a movie, that naturally got me thinking about my own childhood attempts at filmmaking. I would bet that most people who eventually end up pursuing film and video work have stories about the films they made as children; this is mine.

My Dad purchased our family's first camcorder, a cumbersome VHS machine, when I was five years old. I distinctly remember immediately insisting that we make a movie. Somehow, we ended up filming a short video of "The Three Little Pigs" starring various stuffed animals (none of whom were pigs) and featuring magnificent sets of Lincoln Logs (for the 'pig' that made his house of sticks) and wood blocks (for the house of brick; I don't recall what the house of straw was made of). I'm guessing it was one of my parents' ideas to do "The Three Little Pigs," as my story suggestions would have most likely involved dinosaurs, trains, or both. But I was thrilled to be making a movie, any movie. Over the next few years, my Dad and I would occasionally play around with the camera, usually having fun with crude, in-camera special effects ("trick photography" as my Mom quaintly referred to it) such as making things disappear and reappear. Georges Méliès would have been proud.

My childhood career as an auteur began in earnest as I entered the third grade, which coincided with my Dad's upgrade to a Hi8 camcorder, a hand-held machine that seemed impossibly small for the time (1992). Naturally, it was too expensive a piece of equipment to hand over to a child with a propensity towards accidentally breaking things, plus I was to be the star of my films as well as producer, writer and director, and so my Dad served as de facto cinematographer; the Gregg Toland to my Orson Welles. That year saw the birth of my first masterpiece: King Kong Returns.

After the hand-drawn opening credits, we see a newspaper boy (my five-year-old brother, of course wearing one of my Grandpa's "Newsie" hats) proclaim the latest news: "Extra! Extra! Read all about it! King Kong is alive again!" And that's all the exposition we need. A pause for suspense (and for my brother to remember his lines). The Newsie points offscreen, "Look, there he is!"

The mighty Kong (a ten-inch plastic toy) arises from behind painstakingly crafted HO-scale model buildings, and proceeds to smash things. As he can't really move his arms, he must do this by thrusting his whole body so that his outstretched arms smash the buildings, all the while wailing and howling. My nine-year-old voice had yet to attain the proper level of growl for Kong, and so this Kong has a rather high-pitched shriek. As he moves about the city, an HO train approaches, and the camera strategically pans to the sky to hide the cut; when it pans back down Kong has wrecked the train in a carefully laid-out scene of destruction. Kong then moves to knock over a suspension bridge of wooden blocks, before a Pteranodon, apparently also escaped from Skull Island, appears. Hovering helicopter-like somehow, without moving its wings, the flying beast knocks Kong off a 500-foot cliff (which we know about because of the big sign that reads: "Danger: 500 Foot Cliff!"). This is the end for Kong; in case we had any doubts we now cut to an average citizen (me) reading a newspaper account of all that has transpired to provide the proper closure.

Having conquered the special-effects driven event movie, I was ready to craft some Hitchcockian suspense with my next film that summer: Cliffhanger! The film, shot on location at my grandparents' cabin in northern Minnesota, opens as the Villain (me), dressed in dark glasses, one of my Grandpa's straw fedoras and a few other mismatched articles of clothing, explains directly to the audience all of his evil plots. Not very subtle, but hey, Shakespeare got away with it in Richard III. The Villain (which is the character's name) has stolen a bunch of explosives and needs to destroy the evidence for some convoluted reason, and so sets the bomb on a timer and plants it by the cabin of an unsuspecting family.

This is witnessed by Alex (played by me) from the second-story window of the building. He tries to escape, but for reasons of plot convenience the door is jammed shut and cannot be opened, and our protagonist concludes that the only way out is to climb through the second-story window.

Hitchcock would often have an idea for a scene and build a movie around it; North by Northwest, for example, began when Hitchcock asked screenwriter Ernest Lehman to write a movie with a murder at the U.N. and that ended with a chase across Mount Rushmore. Cliffhanger! was made solely because I realized that with the right camera angles and (in-camera) editing, I could fall out of a first-story window and make it appear as if I had fallen out of a second-story window. This scene was the centerpiece of the film.

After miraculously surviving what, to my nine-year-old mind, was an impossibly high fall, a limping Alex grabs the bomb, and throws it in the lake. He's just in time; my Dad provides some magnificent sound design and special effects by making explosion noises while shaking the camera. Meanwhile, I do my best Star Trek collision acting and pretend to be tossed about by the lake-dampened explosion. The End.

This left much to be desired, so my next project was a gritty reboot of Cliffhanger! In the redux version, the Villain has a slightly more defined motivation: he wants to dig a mine on the land owned by our protagonist's family, who of course won't sell, so he plants a bomb to fake a gas explosion. This is still explained via soliloquy. The ante is upped by the presence of Alex's five-year-old brother, Eric (played by my five-year-old brother, Eric), who also must escape the mysteriously locked house. After escaping and heaving the bomb into the lake, Eric decides that he will go and catch the villain and gets away before Alex can stop him. The Villain is standing by the road, with a revolver in a holster at his side (like any good villain), when Eric runs past him. Then, remembering what he was supposed to do in the shot, Eric runs past again and snatches the gun away from the Villain. "Stick 'em up!" Eric yells as Alex arrives. Alex calls the police, and we cut to the Mayor (my Mom) awarding certificates of bravery and declaring it to be Alex and Eric Day. The End.

Later that summer, I made the sequel, the awkwardly-punctuated and aptly-titled Cliffhanger! II: The Sequel!. Eric is being babysat by his cousin Adam (played by our cousin, Adam). Meanwhile, the Villain breaks out of prison. Next he shows up and kidnaps Eric while Adam is distracted watching television. Seeing them leave, Adam chases after them, leading to a confrontation during which Adam is knocked off a cliff and left for dead. The Villain ties Eric to a bomb and is about to place a ransom call when Adam bursts in and punches the Villain, escaping with Eric. The Villain chases after them when the bomb explodes (via more camera shaking). We are informed by the Mayor, who is again handing out awards, that the Villain perished in the explosion. The End.

I would make one more movie that summer, before moving on to a project which grew exponentially in size and ambition and thus never got made. The completed film, Dino Days!, concerns two time-traveling scientists, Professor Alex and Professor Michael (myself and my best friend Mike). First they travel to the Jurassic Period, where they are chased off by a Dilophosaurus (a clay model that appears in a somewhat failed attempt at forced perspective). The Dilophosaurus eats Professor Alex and Professor Mike flees. This was achieved by some very crude stop-action, which was severely limited by the fact that we did not have a camera capable of shooting single frames (instead we just hit 'record' on and off as quickly as possible).

Jurassic Park had not yet been released at the time of filming, but the promotional tie-ins were everywhere and had already made an impact on me, as evidenced by my decision to depict the Dilophosaurus with a frill that appears around the neck. I knew full well that the frill was not present in the fossil record and had been made up for the movie, but it looked cool so I chose to include it. Clearly, Alex-the-movie-lover was struggling with Alex-the-dinosaur-nerd-with-a-strong-need-for-scientific-accuracy.

Not scientifically accurate.

In the movie, I also wanted the Dilophosaurus to fight a Tyrannosaurus Rex, but knew that they lived about 100 million years apart. Alex-the-dinosaur-nerd won this battle, as I then had Professor Mike flee to the time machine while the clay Dilophosaurus stows away on board, and then they all travel to the Cretaceous Period (when T. Rex lived).

Prof. Mike gawks at all the dinosaurs he sees (all of my prized Carnegie Collection dinosaur toys), and the T. Rex eats the Dilophosaurus before attacking the Time Machine (more crude stop-action). The Time Machine explodes (shaky-cam, which was beginning to be my hallmark), and Professor Mike proclaims, "Oh no! The T. Rex destroyed my Time Machine. Now I'll be stuck in the Cretaceous Period forever!" The End.

Dino Days! was the last film I completed until college, but not for lack of trying. Not satisfied with how Dino Days! had turned out, Mike and I began planning a remake which grew and grew in ambition. Eventually it turned into a feature-length script and we spent years trying to teach ourselves special effects and filmmaking techniques in preparation for this epic (which was soon re-titled Mesozoic Mayhem; I had a thing for alliteration). By the time we entered middle school, we'd spent years preparing for this thing but still hadn't shot any of it except some special-effects test footage. Around seventh grade we decided the whole thing was rather silly and the project was abandoned for good. I may sometime do a post recalling the years of work we put into the movie that was never made, but for now this post is long enough and I am up past my bedtime. And so, adieu.

Tuesday, June 14, 2011

If at first you don't succeed, blog and blog again...

It's been over eight months since I've written anything on this blog, and prior to that I'd only written five posts. Perhaps that's par for the course for the personal blog of someone not all that committed to blogging, and yet, for reasons outlined in my original post, I believe I could benefit from actually writing on this thing from time to time. And so, brushing the dust of failure off of myself, I am now going to give this a second go. I'll try to post at least once a month for starters, then every couple of weeks, and then, if we're lucky, once a week. I feel that's an achievable goal.

Sunday, October 3, 2010

Classics: The Sting (1973)



Like the thesis statement in the opening paragraph of a paper written for a seventh grade English class assignment, the opening credits of 1973's The Sting announce what's in store for the audience. The 1930's Universal logo, hand-drawn title cards, the introduction of the cast as "The Players," and most importantly, the Scott Joplin ragtime score set the mood for two things: fun and nostalgia.

This was a dose of what Americans needed desperately in 1973. War raged on in Vietnam, society was still reeling from the social upheaval of the late 60's, and cynicism was at an all-time high (although not nearly as high as it would be, with Watergate just around the corner). Many of the great films of the era were daring and experimental, taking advantage of the demise of the Production Code and exploring dark themes with gritty realism. When The Sting won the Oscar for Best Picture, it followed the Oscar wins of darker fare such as The Godfather (1972), The French Connection (1971), and Midnight Cowboy (1969).

Many critics have asserted that the unprecedented success of Jaws (1975) and Star Wars (1977) can be explained by a cultural zeitgeist which was ready for good old-fashioned fun and escapism. The real world had gotten quite serious and depressing, and the movie-going public could only take so many Taxi Drivers and Dog Day Afternoons. I believe that the success of The Sting can be attributed to similar factors.

The film takes place in 1936, only 37 years prior to its release. Exactly the same number of years have passed since The Sting opened in theaters. However, the film is deliberately crafted to give an aura of the much more distant past. The ragtime soundtrack, hand-drawn title cards, and sepia-tinged color palette all create an ambiance more suitable for a film set in 1906. The costuming is slightly exaggerated from real 30's fashions, and dialogue is gratuitously peppered with antiquated slang. This is not the real world or the remembered past; this is the imagined past. The characters inhabit a world of nostalgia and exaggeration, like a Norman Rockwell painting.

Like most heist films, the plot takes twists and turns but is, at its core, incredibly simple. A small-time grifter (Robert Redford) sees his partner (Robert Earl Jones) killed at the hands of thugs working for crime boss and general a-hole Robert Shaw. Redford then exacts revenge by enlisting the help of experienced but down-and-out con-man Paul Newman and his various all-star hustler contacts to systematically rob Shaw of $500,000. A corrupt police officer played by Charles Durning appears to nearly foil the scheme, providing the requisite suspense, but through cunning and chicanery the boys pull off the Big Con.

The characters are well-rounded and believable, but this is not a character piece, nor is the movie interested in the moral complexities that other early 70's films were exploring. Like other heist movies such as Ocean's 11, we are invited to identify with the criminals and live out our secret criminal fantasies through them, all the while with a clean conscience because we know that the victim really deserved it. We know that Robert Redford and Paul Newman are the Good Guys because they are the protagonists and are charming. We know Robert Shaw is the Bad Guy because he had Redford's buddy killed and acts like a jerk. We know that the death of Robert Earl Jones is meant to be tragic because we met his family and they seemed like nice people.

There are movies where people talk like real people, and movies where people talk like movie people. This is the latter. Dialogue such as this doesn't occur in the real world, but sounds great on film:
Loretta: And you expect me to come out, just like that?
Hooker: If I expected somethin', I wouldn't be still standin' out here in the hall.
Loretta: I don't even know you.
Hooker: You know me. I'm just like you. It's two in the morning and I don't know nobody.
Naturally, Loretta then sleeps with him, because that's how things work in the movies. 'Reality' is not a part of the equation. We are here to be entertained.

These sorts of movies are akin to watching a master magician perform a trick you already know the secret to. You know how it's done, you know what to expect, but the trick wows nonetheless due to the sheer skill and slickness of the performer. If The Sting is anything, it is a wholly satisfying movie experience, enhanced by the fun and nostalgic mood it creates. The film is timeless escapism, and remains as powerful 37 years after its release as ever.

GRADE: A

Tuesday, September 21, 2010

Confessions of a Basher (trying real hard to be Swooper)

The most recent installment of the weekly IFC Podcast has reminded me of one of the observations contained in Kurt Vonnegut, Jr.'s Timequake, concerning two different styles of writers:

"Tellers of stories with ink on paper, not that they matter any more, have been either swoopers or bashers. Swoopers write a story quickly, higgledy-piggledy, crinkum-crankum, any which way. Then they go over it again painstakingly, fixing everything that is just plain awful or doesn't work. Bashers go one sentence at a time, getting it exactly right before they go on to the next one. When they're done they're done."

As someone who is usually a Basher, I appreciate the evocative terminology that Mr. Vonnegut came up with. Writing, for me, has usually felt like repeatedly bashing my head against the keyboard to make the words come out. I enjoy writing immensely, but it has always been a labor of love with no shortage of painful, grueling labor. I have always felt a deep jealousy for the abilities of the Swoopers.

This has crippled me as a writer. It comes from a reprehensible perfectionism on my part. When I begin a writing project, be it fiction or non, I typically feel unprepared to write a single word before the finished product is entirely mapped out in my mind. I must have every plot point/argument laid out perfectly, every snippet of dialogue rehearsed before I touch my fingers to keyboard or pen. This involves a lot of pacing back and forth, and when I've paced all that I can in my home, a walk or two. 

Finally, a feeling of relief floods through me as I reach the conclusion that, "Yes, I have this thing all figured out. Now the hard part's done. All I have to do now is write the darn thing."

In truth, the fun part is done and the painful part has begun. As soon as I type the beginning of my mental masterpiece, I see what absolute crap it is, beginning with sentence one. I painstakingly edit, sentence by sentence as I write, proceeding at a snail's pace. I am usually utterly faithful to my mental outline as far as plot points/arguments etc., but the language that sounded so good in my head reeks of drivel when I see it spelled out, and so I Bash away. But, staying true to Vonnegut's archetype, I hardly revise much aside from giving it a quick once-over for typos, especially looking for errors of the dyslexic variety that spellcheckers miss (I am constantly writing form instead of from, for example).

Vonnegut was a Basher, reportedly writing one page at a time, refusing to go on to the next page until each was perfect, often revising a page many times. After finishing a page, he would place it in a drawer and not return to it; when he reached the last page the novel was ready to be sent to the publisher. Vonnegut being perhaps my favorite author, I was quite pleased to learn that this was his style of writing and this information served me well as a justification for my own idiosyncratic habits.

However, I am not Vonnegut. He produced many fine novels and stories using his methods. I have yet to produce anything, despite dozens of ideas and false starts over the years, a few of which I feel may have even ended up being decent had I seen the project through. My writing style has not been conducive to me actually writing anything, and I want to break this trend.

Enter this blog. Everything I've written so far has been typed as fast as possible, without really thinking out what I will write in any detail, moving quickly, higgledy-piggledy, crinkum-crankum, any which way. Admittedly, rather than carefully revising as a true Swooper would, I then usually immediately hit the "Publish Post" button without so much as a read-through. This mostly has to do with the fact that, having a three-month-old daughter, I have precious little free time these days and I only allot myself so much time to waste on this thing. But if I were writing something more serious and close to my heart than a frivolous blog, I would revise carefully.

The point is, I'm trying real hard to develop some of the habits of Swoopers. When it comes to non-blog writing, I will probably always be a Basher to a degree, but I am trying to use this blog as an exercise in generating words quickly. If I desire to write, which I do, then I need to write and not just think about writing. This means perhaps beginning to write without as clear a mental picture as I would prefer, and not laboring over each sentence to the point that I never finish anything (my hard drive is filled with the opening pages of aborted writing projects). I need to be less of a Basher. I have much to learn from the Swoopers of the world.

And so, patient reader, forgive me when these blog posts lack a certain polish or perfection. I am viewing them as an ongoing exercise in Swooperism, and little more.

Monday, September 13, 2010

Photoplasty

The comedy website Cracked has a weekly photoshop contest wherein they toss out a topic and readers post humorously manipulated images. Having a few minutes to kill (my three-month-old daughter is asleep in my lap and I dare not move), I decided to enter this week. The topic is "If Historical Figures Got Gritty Reboots."

Posting a photo requires being able to link to the image, which I can do after posting it on blogger, hence the reason behind this post.

Sunday, September 12, 2010

Real People vs. Reel People

As an avid library/reading-at-Barnes-and-Noble-without-buying-anything fan, I sadly do not own a copy of many of my favorite books. So when I saw a copy of Bryan Burrough's excellent history book, Public Enemies, on clearance for a couple of bucks, I naturally snatched it up. Upon re-reading the book and enjoying it just as much as I had the first two times, I can say the purchase was definitely worth it. However, there's one thing about the copy I bought that bugs me:

Johnny Depp is on the cover.


Now, I have nothing against Johnny Depp. What I find odd about his presence on the cover is that Public Enemies is a thoroughly-researched non-fiction book about real criminals and lawmen during the early 30's, including John Dillinger (whom Depp played in the movie). If this were a novel adapted into a film, it wouldn't bother me. However, since the book is a factual account of real people who, like Dillinger, were amply photographed, wouldn't it be more appropriate to feature the real John Dillinger on the cover?

Only one of these men robbed banks for a living.

[And as long as I'm being nitpicky, I'll also point out that Dillinger (or anyone else recognizable) wasn't on the original cover, since the book is about the 1933-34 crime wave in general, weaving together the stories not just of Dillinger and his gang but of Pretty Boy Floyd, The Barker-Karpis Gang, Machine Gun Kelly, Bonnie and Clyde, Babyface Nelson, J. Edgar Hoover, Melvin Purvis and dozens of others.]

Now of course, the Dillinger strand of Mr. Burrough's book was adapted into a film directed by Michael Mann and starring Depp in 2009 (in my opinion, an incredibly disappointing film, but that's for another post), which is why the book has the cover it does. Of course, movie studios need to promote their products. I get that. They could use the font and imagery associated with the film (a 1930's car, a close-up on a tommy gun, e.g.), and slap a large "Soon to be a Major Motion Picture Starring Johnny Depp!" label on it, but do we need to see Depp's face?

Maybe a Lego guy instead?
This wouldn't bother me if it weren't such a common practice. Take, for example, this book about the life of brilliant economist/schizophrenic John Nash:


That's not John Nash. That's Russell Crowe. This is John Nash.

Similarly, some high school student will someday do a paper on John Adams, check out a library copy of David McCullough's Pulitzer Prize-winning biography, and wonder what the hell Paul Giamatti is doing on the cover:


When it comes to this sort of thing, Johnny Depp is a repeat offender. He has appeared on non-fiction books not only as Dillinger but as drug lord George Jung and as Gonzo journalist Hunter S. Thompson.

It gets even weirder when we move from non-fiction to autobiography. Temple Grandin doesn't get to appear on the cover of the book she wrote about her own life; she's replaced by Claire Danes:

When they make a film based on the bestselling memoir I will someday write, I hope they get somebody totally badass to play me:

It could happen.
All kidding aside, I guess what I find bothersome about this practice is that it ultimately seems disrespectful to the real person. Am I supposed to admire John Nash's ability to cope with mental illness, or Russell Crowe's brilliant performance? If Temple Grandin's life story is worth reading, do I need to see Claire Danes staring out at me whenever I reach for the book? Was John Adams a significant figure in American history and a complicated human being, or the protagonist of an HBO miniseries?

In addition to the desire to cross-market books and movies to reach a broader audience, I believe this practice stems from a (not unreasonable) fear that Americans are terrified of reading in general, and specifically afraid of reading anything that is not entertaining. We love movies, and so the highest honor that can be bestowed upon a book is a film adaptation (hence the ubiquitous "Soon to be a Major Motion Picture!"). We also love fiction, and so many book-reviewers save "reads like a novel" as their highest compliment for works of non-fiction. And marketing folks assume that the highest honor that can be bestowed upon an individual is to be portrayed by a (usually) better-looking celebrity.

This phenomenon relates to the fact that, when marketing movies, star power is everything. Independent films can hardly scrape together financing without a star of some kind, and getting distribution and even admittance to most festivals depends on having at least a B-lister attached to your project. And when it comes to Hollywood marketing, the preference is to create movie posters which consist of a bunch of floating celebrity heads. You may not get a sense of what the movie is about, but if Brad Pitt is in it, people will see it.


I imagine the same marketing logic dictates that more people will be interested in learning about John Dillinger if they see Johnny Depp on the cover. I really hope this isn't the case. It certainly doesn't have to be, even for movies. After all, countries like Poland have often successfully sold movies with no celebrities in the marketing at all.

The rather death metal-ish Polish poster for "Raiders of the Lost Ark." Not pictured: Harrison Ford.
In my work as a guide at two different historic sites, I often tell myself, "If people can give a crap about what Brad Pitt said that angered Angelina Jolie last night, I can get them excited about history. We're all wired to care about human stories, and that's what history is full of." People read both Pulitzer Prize-winning biographies and grocery aisle tabloids because we have a fundamental craving for stories about other people. We are a social animal. But I seriously hope that we don't ever reach the point where people care about historical figures only because they had a TV miniseries based on them, or read about the lives of brilliant but troubled mathematicians only because we're fans of Russell Crowe. Let's give real people the respect they deserve and let them appear on the covers of their own books, and not actors.

It's the sort of basic respect for human dignity that Founding Fathers such as Paul Giamatti fought to preserve.





Monday, September 6, 2010

Review: The Final Destination

I want my money back.

Usually, I say that after paying $10 to see a bad movie in theaters. This time, however, I rented The Final Destination from a Redbox. Still, I want my dollar back.

Perhaps I'm being overly harsh. After all, what should I expect from the fourth film in a series devoted to needlessly complicated, over-the-top death scenes? The first film (I'll admit, I've not seen the second and third installments) was not exactly high art.

It was, however, good at what it did. That's all I really ask of a movie. To recap: the first Final Destination (which the fourth follows plot point for plot point) involved a character who received a premonition of a horrible disaster about to occur. This character saves himself and a few friends, who narrowly miss the fate that should have been theirs. Since they cheated Death, the Grim Reaper has decided they will all die in the order that they originally would have. Nobody has a stroke or anything quick and simple; each death scene is a Rube Goldberg contraption of deadly terror.

The film was stupid, but it was at least fun. It came up with ghoulishly ridiculous, but undeniably interesting and creative, methods of death, and knew how to milk the "ewww" factor for all it was worth. Most importantly, there was effective suspense. The characters were surrounded by potential causes of death, and the original film got much mileage out of fake-out scenes where everything turns out to be fine. When the peril finally set in for real, the scenario was elevated to such extremes that our expectations of imminent death were constantly being teased as the elaborate situation continued to crescendo toward its gruesome conclusion.

The Final Destination gets off on the wrong foot, however, with an opening scene in which almost no buildup whatsoever precedes the onslaught of dismemberment. A car crash at a racetrack leads to many audience members dying in an orgy of CGI that would look disappointingly cheap in a SyFy Channel original movie. Everything flies straight at the camera, because this film was originally shown in 3D and the filmmakers don't realize how quickly that gag loses its novelty. We of course learn that this was just a vision of things to come, and our Protagonist (the characterization never really explores depths beyond that) manages to escape the real disaster along with Protagonist's Girlfriend, Female Friend, Douchebag, Black Guy, Soccer Mom and Racist. The Racist (who's actually listed as such in the credits) is sadly the most developed of these characters; we know he's a racist because he says "There goes the neighborhood" when Black Guy approaches, whistles "Dixie," and has a swastika tattoo on his arm. Get it? Later, we see him about to burn a cross in Black Guy's yard. Because he's racist, and that's what racists do, right?

The opening sequence fails on just about every level, in ways that will be repeated throughout the film. None of the scenes are suspenseful or surprising, the deaths are not creative or inventive, the fake-outs are too obvious, and the "ewww" factor the rest of the franchise exploited so well is squandered on a lot of obviously CGI splatter with no real horrors.

The film is gratuitously bloody, but gore enthusiasts are likely to be as disappointed as fans of good filmmaking. The filmmakers apparently think the human body behaves like a water balloon filled with blood, ready to burst at the slightest provocation. We see [SPOILER ALERT] people explode and spray blood everywhere after being hit with flying debris, after being hit by a car (in a moment stolen from the first film), a man sucked into a drainage pipe in a swimming pool (the pipes of course then burst out blood and guts), and a man pushed through a chain-link fence (he comes out the other side like spaghetti). [END SPOILER] The human body and the laws of physics do not work this way. An audience can only suspend disbelief so far, and by being less faithful to reality than most Looney Tunes, The Final Destination lacks the visceral thrills that horror fans crave. Instead we have all the realism of a violent early-90's video game (and I refer to the plausibility as well as the level of special effects).

All in all, if there's a silver lining in this film, it is that it forced me to appreciate what had been done well by a film which, prior to seeing the fourth installment, I would only have admitted to slightly enjoying. In retrospect, Final Destination comes off as a relative masterpiece of cinematic art.

Grade: F

Sunday, September 5, 2010

To Blog or Not to Blog?

I have an irritating habit.

Every night, right around bedtime, I begin rambling to my wife about whatever random topic happened to be going through my head that day. It is as if my brain must purge itself of any unheard inner monologues in order to be in a condition ready for sleep. This means my wife gets to suffer through unsolicited rants on whatever movies I've seen recently, history topics I'm currently obsessed with, politicians that irk me, and any number of eclectic subjects.

Her response is usually a polite, "That's interesting, honey. Maybe you should start a blog."

Aside from sparing my wife from having to sit through late-night ramblings, perhaps writing a blog would give me an ideal outlet for these random thoughts that need to escape my skull. However, I've always been skeptical of the Personal Blog. Blogs on specific topics by someone who actually has something worthwhile to say (for example, a blog on history by a published historian) seem to make sense, but a personal blog always struck me as rather egotistical.

After all, if I type something and publish it, even if only via the internet, isn't there an implicit assumption that I expect someone to read it? And who am I to assume that my late night ramblings are worthy of being read by someone?

And so, I've decided to start this blog off first by offering a disclaimer: I don't assume anyone will/should read this, or that I will necessarily have anything noteworthy to say. Of course, I'd be lying if I didn't admit that I hope to occasionally have something worthwhile to say, and perhaps even a follower or two. But that is not why I've chosen to blog.

My reasons are thus:

1) I need an outlet for the ramblings of my overactive mind (or, to quote H.L. Mencken, "I write in order to attain that feeling of tension relieved and function achieved which a cow enjoys on giving milk.").

2) The process of writing will hopefully help me organize my thoughts on topics I have been pondering.

3) I don't write very often since leaving school. Written communication being a necessary and valuable skill, and skills being something which improve with practice, it follows that the more I practice writing (of any sort), the better.

4) Material that first appears in rough, unpolished form on this blog may perhaps be refined into something usable elsewhere. Good writing, after all, is good rewriting.

5) I used to keep a LiveJournal a number of years ago, in the pre-Facebook era when such sites were much in vogue, and I actually found it quite cathartic and enjoyable. And no, I won't link to it; the posts were made by a younger, less mature, angrier and surprisingly profane version of myself, and they are rather embarrassing to read now (as I'm sure this post will someday seem).

6) Perhaps, occasionally and accidentally, I may have something worthwhile to say.

And so, without further ado, I present the debut of my personal blog, North By NorthWeston. (All the good blog titles were taken, and so I had to go with a pun. I seriously spent about forty minutes just thinking of great names, only to find them all unavailable. At least North By Northwest is one of my favorite movies, so it fits.)