Wednesday, August 17, 2011

Marketing 101 from Starbucks

Starbucks has a bunch of new ads around town. All of them have the same slogan at the bottom:

"It's not just coffee. It's Starbucks."

A few things come to mind. First of all, can HBO sue them for stealing their slogan formula, "It's not [widely available product]. It's [brand name]."?

Ah, but HBO's slogan doesn't contain the word "just."

Secondly, the slogan seems to be an encapsulation of the entire purpose of marketing. After all, isn't the whole point to get people to favor a particular brand of [product, in this case coffee] over just any old type? The goal of the advertisement is to convince the consumer that one brand stands out as special above all the rest. This isn't just any old coffee, they want you to believe. This coffee is special. Or, put more succinctly, "It's not just coffee. It's Starbucks." Any more straightforward and the ad would read, "Buy our product. Starbucks."

But of course, it is just coffee. I know plenty of people who love the taste of Starbucks coffee (I've ordered plenty myself), but most people would just as gladly drink a brew from Caribou, Dunn Brothers, Seattle's Best, or any local independent coffee shop. Sure, all these brands of coffee taste slightly different, but not drastically so. There are really only two kinds of coffee: decent coffee and bad coffee. Bad coffee may be found at gas stations, church basements, McDonald's (yes, even after their much-hyped switch to espresso drinks and "gourmet" coffee), etc. Decent coffee can usually be found at any of the coffee shop chains or any independent coffee shop.

If there is more than one laptop present, chances are the coffee will be decent.

When it comes down to nitpicking, I slightly prefer the taste of Dunn Brothers Coffee over Caribou Coffee, and likewise prefer Caribou over Starbucks. Hypothetically, if all three were at the same intersection (and I'm sure there are at least a couple intersections like this around), this would inform my purchasing. But the main factor in my coffee buying decision is, "Which store is closest?" If all I need is a cup of coffee, I'll go to a Starbucks any day if it's closer than the Dunn Brothers. If I'm looking to do some work, my main considerations are access to wifi, chairs, tables and outlets. The coffee is not even a factor, because, really, they're not that different as long as they're decent.

And so the job of the marketer is to convince you that, no, the coffees are that different, and their brand is of course the best. This calls to mind a pearl of wisdom from humorist Dave Barry:

The value of advertising is that it tells you the exact opposite of what the advertiser actually thinks. For example:
  • If the advertisement says “This is not your father’s Oldsmobile,” the advertiser is desperately concerned that this Oldsmobile, like all other Oldsmobiles, appeals primarily to old farts like your father.
  • If Coke and Pepsi spend billions of dollars to convince you that there are significant differences between these two products, both companies realize that Pepsi and Coke are virtually identical.
  • If the advertisement strongly suggests that Nike shoes enable athletes to perform amazing feats, Nike wants you to disregard the fact that shoe brand is unrelated to athletic ability.
  • If Budweiser runs an elaborate advertising campaign stressing the critical importance of a beer’s “born-on” date, Budweiser knows this factor has virtually nothing to do with how good a beer tastes.
Keeping this truism (which never fails) in mind, let's look at some more of Starbucks' new ads:

In other words, "We're afraid you'll realize decent coffee can be had slightly cheaper across the street." And also, "Even though you may be unemployed, $4.00 for a venti is still worth it."

Or, "Please ignore the dozen other coffee places within a block of here. Starbucks or nothing!"

Or, "We can see why you'd think we're just a massive, faceless corporation which churns out millions of cups of coffee the same way fast food joints crank out hamburgers. And we understand that you'd rather support the local independent shop down the street. We're here to assure you that we still have the personal touch (i.e. some underpaid person still manually hits the button on the fully automated espresso machine)."

Which is just another way of saying, "We admit it. We burn our beans, giving our coffee a slightly bitter taste which some of you may find disgusting."

Cynicism from people like me aside, I'm sure the campaigns will be a success. Ad people know what they're doing.

As Don Draper said in the first episode of Mad Men, “We have six identical companies making six identical products; we can say anything we want.”

Sounds kind of familiar.

Saturday, August 13, 2011

When dialogue reigned supreme?

Recently Dan Bloom posted a piece on The Wrap titled "When Movies Once Mattered -- And Dialogue Reigned Supreme," which laments the current state of screenwriting and yearns for the "good old days" of the 30's, 40's and 50's. The movies Bloom waxes nostalgic for were the ones he'd seen while working at the American Film Institute Theater at the Kennedy Center in Washington. I'd like to offer a counterpoint to his opinion, but first a representative sample of Bloom's piece for those who don't feel like clicking the link and reading the whole thing:
Movies aren't just the same anymore. Well, of course not, and we all know that. But one thing that has changed much more than the movie technology or the digitalization wizardry or the special effects is the way movies were written.
Repeat: written.
In the old movies, dialog reigned supreme. Words mattered. Sure, plot was important, and character development and story arc and act three and the denouement, but what stood out in almost every old movie I saw at the AFI in those days was the writing, the dialog.
The conversations always sizzled. Whoever the writers were -- and some were famous novelists, others unknowns that toiled in the early days of Hollywood -- they knew how to create movie dialog.
He finishes his piece in top crotchety-old-man form, something he at least acknowledges self-deprecatingly:
Words winged off the screen! Conversations conversed! Wit reigned! Movies mattered.
I'm almost 100 years old now. It's getting late, and it's getting dark outside.
Sure, I'm kidding, but in a way, I'm not. Movies once mattered, and they still do, sure, but now they have other considerations. It's a different ballgame now, and the writers have long retired to the clubhouse to gloat and to gripe.
Cinema paradise has been replaced by movie muggles. Am I getting old or did the world just pass me by?
The problem with his logic is that he's comparing the average movies of today (presumably; he doesn't actually offer any examples of the movies that bother him) with the classics of the 30's, 40's and 50's. If a film was being screened at the AFI, it had already been selected and remembered as a great movie, not necessarily because it was a representative sample of movies from that era. The movies Bloom was watching were the acknowledged cream of the crop. Of course they had great writing.

I would argue that there was just as much poor writing in movies then as now. The old movies that still get watched are the good ones (or the ones so bad they developed cult status). In the era of Netflix, with thousands of titles just a click away, it's easy to forget that the majority of old films are out of print, have never been released on video and are unavailable. Many of them were even blockbusters in their day, but were mediocre at best and have not stood the test of time (there are of course a few great films that have slipped down the memory hole as well, but I'd be willing to bet they are far outnumbered by middling-to-poor-quality fare). Most of these movies are long forgotten, and so their loss is rarely mourned. Fifty years from now, I bet no one will remember the existence of Transformers: Dark of the Moon or lament its unavailability. The great movies being produced now, however, will stick around.

Even many bona fide classics can still have clunky dialogue. Here's a gem of a scene from the "golden" 30's:

                         Yes. I'm - I'm awfully excited.
                         It's all so strange, and I've never
                         been on a ship before.

                         And I've never been on a ship with
                         a woman before.

                         I guess you don't think much of
                         women on ships, do you?

                         No. They're a nuisance.

                         I'll try not to be.

                         You got in the way already. Better
                         stay below.

                         What! The whole voyage!

                         Say, I didn't apologize very good
                         for hitting you. That was an awful
                         sock in the jaw.

               Driscoll stares at her.

                         Well, we're off.

                         We're off.

Can't you just sense the will-they-or-won't-they chemistry between these two? Because after just a few more scenes (and some more great clunkers, like "Aw, you're swell. Women can't help being a bother. I guess they're made that way."), here's where they've arrived in their relationship:

                         When I think what might have
                         happened today -- if anything
                         happened to you.

                         Why then you wouldn't be bothered
                         with a woman on board.

                             (very staccato)
                         Don't laugh. I'm scared for you.
                         I'm sort of - I'm scared of you,
                         too. Ann, I -- I guess I love you.

               They look at each other, both startled by this conclusion.

                         Jack! You hate women!

                         You aren't -- women. I love you.
                         Ann, I don't suppose -- you don't
                         feel like that about me -- do you?

               Ann looks at him soberly for a moment, then takes a step

The movie in question went on to become the highest-grossing film of 1933, was critically acclaimed then and now, and was placed on the AFI's list of 100 Greatest Movies. The film is, of course, King Kong. It's a classic (one of my all-time favorite movies, actually), but that's in spite of its dialogue rather than because of it (the one exception being the great final line, "No, it wasn't the aeroplanes. It was beauty killed the beast."). Without the amazing special effects of Willis O'Brien (and the humanity he was able to give the animated Kong), King Kong would be just another long-forgotten bad movie.

This is a movie where a giant ape beats up dinosaurs. It doesn't need to be Shakespeare.

My point is not to pick on Kong but to point out that it is unfair to expect movies to be something they're not. There are some films that depend on witty banter and well-written dialogue. And then there are popcorn films. To let Kong stand as representative of 30's dialogue would be absurd. It would be equally absurd to judge the quality of today's writing by looking only at the glut of superhero CGI-fests. If one wants well-written dialogue and smart characterization, one won't expect to find it in a ridiculous movie about aliens battling the navy that is allegedly based on a plotless board game. But, contrary to Bloom's opinion, there are plenty of well-written movies being made today.

Everything about this makes my head hurt.

Just look at the top two Oscar front-runners last year: The Social Network and The King's Speech. These are two movies that consist of hardly anything but people talking. And in both the dialogue is fantastic. The opening scene of The Social Network, to name just one great scene out of many, is a screenwriting concerto with enough wit to rank alongside anything written in the era "when dialogue reigned supreme."

These two films (both of which were critical and commercial successes) are hardly alone. Of course anything with CGI robots will make a gazillion dollars, but if you look past the popcorn films to the many critically acclaimed and award-winning movies released each year, you'll find one common denominator: they're all well-written. In my humble opinion, some of the greatest screenwriters of all time are currently practicing their craft and near the top of their game. Anyone who thinks the age of great movie dialogue is over needs to look long and hard at any of the movies written in the last ten years by screenwriters such as the Coen brothers, Aaron Sorkin, Paul Thomas Anderson, Woody Allen, Michael Arndt, Wes Anderson, Alexander Payne & Jim Taylor, Diablo Cody or Charlie Kaufman.

The field of master dialogue craftsmen opens up considerably when you consider television as well. TV, it is often proclaimed, is much more of a writer's medium. A few years ago one could get a great dialogue fix by watching The West Wing, The Office (let's face it, the early seasons were better), Deadwood, or Gilmore Girls; today the torch of excellent writing is being carried by shows like Mad Men, 30 Rock, Community, the BBC's Sherlock and [insert your favorite dialogue-heavy television show here; I'd be able to name more examples except I hardly watch any current TV shows].

Movies never ceased to matter and dialogue still reigns supreme. Watching old classics, it only seems that something has been lost since time has filtered all but the most memorable movies. For every His Girl Friday or Sweet Smell of Success that came out during dialogue's supposed "golden age," there were dozens of poorly written films, many of which have been long forgotten.

Decades from now, when the AFI screens The Social Network, someone will say, "Boy, they just don't write movies like they used to."

Sunday, August 7, 2011

Telling instead of Showing: Exceptional Cinematic Infodumps

A cardinal rule of good screenwriting is "Show, don't tell." My several semesters of screenwriting in college basically consisted of the professor trying to restate "Show, don't tell" in as many ways as possible (that and indoctrinating us into the three-act structure). Film is, after all, a visual medium, and I tend to agree with Alfred Hitchcock's assessment that silent movies represented the most "pure cinema." There's an old saw that with a movie, you should be able to turn the sound off and still follow the story, whereas with a play you should be able to close your eyes and still follow the story. I think that mostly holds true. I have fond childhood memories of sneaking up past my bedtime to watch movies on cable without the sound, because I was afraid the sound would wake my parents. Most movies (especially the action films I was fond of as a boy) still made a surprising amount of sense sans audio. Likewise, listening to an audio performance of a Shakespeare play is as easy to follow as watching a performance, because everything is contained in the dialogue. Shakespeare knew he was writing for a theater packed with up to 3,000 spectators, many of whom probably couldn't see over the silly Elizabethan headwear in front of them. If a character gets stabbed, he'd better cry out, "I am slain!" just so nobody misses it. My screenwriting professor, on the other hand, would have crossed out "I am slain!" and written "UNNECESSARY. LESS DIALOGUE. SHOW, DON'T TELL" in angry red pen in the margin.

The extension of this rule is that in a movie, any important plot point or crucial piece of information must be seen by the audience. In a play, it's perfectly acceptable for a minor character to report that Hamlet's ship has been attacked by pirates; in a movie we'd better have a scene where we see the pirates attack. Or if there's some important event in the lead character's past, don't just have people talk about it: show a flashback.

That being said, every rule has its exception. The sci-fi site io9 had some articles a few months back, "5 situations where its better to tell than show in your fiction" and "20 great infodumps from science-fiction novels", which focus on notable "infodumps" in sci-fi novels. Lately, I've been thinking about instances where movies have successfully broken the cardinal "Show, Don't Tell" rule.

And so, although most good movies subtly parse out exposition through action and dialogue, here are six examples of movies that effectively broke the rules and quickly spelled everything out in massive infodumps:

The Lord of the Rings: The Fellowship of the Ring (2001): Opening Sequence

The Infodump: If there's one thing J.R.R. Tolkien excelled at, it was world building. So how do you condense a 1000-page epic and an entire Silmarillion's worth of backstory into three movies? Well, Peter Jackson begins with the simplest and most direct route to backstory: have a narrator explain everything you need over an opening montage.

The first of about 9,000 gratuitous close-ups of rings.

Why It Works: For most movies, this would be a clumsy and terrible way to open the film, but the mythopoetic nature of Tolkien's novels lends itself to this type of storytelling. In a few minutes, we learn about Middle Earth and all its races, of Sauron, the One Ring to Rule Them All, and how it ended up in Gollum's hands. Thanks to the Norse-saga-inspired tone of Tolkien's story, Peter Jackson's iconic visuals and Cate Blanchett's eerie (yet strangely alluring) voice-over, this sequence wonderfully establishes the world so we can get on with the story.

Would an opening voice-over packed with backstory work in a more contemporary setting than Middle Earth? Well, it did in...

Goodfellas (1990): Ray Liotta tells us his life story

The Infodump: "As far back as I can remember, I always wanted to be a gangster," Henry Hill (Ray Liotta) informs us after a brief opening teaser. He then spends the next ten minutes narrating his whole life story and his motivations, and introducing us to the main characters.

Why It Works: In Adaptation, Charlie Kaufman lets screenwriting guru Robert McKee dispense some conventional screenwriting wisdom: "God help you if you use voice-over in your work, my friends. God help you. That's flaccid, sloppy writing. Any idiot can write a voice-over narration to explain the thoughts of a character." However, Martin Scorsese has made a whole career out of successfully breaking this rule, most notably in his magnum opus Goodfellas.

Scorsese incorporates the voice-over as part of the film's aesthetic, as crucial to the feel of the whole as the pop-tune filled soundtrack or the contrasty cinematography. Without the narration, Goodfellas wouldn't be Goodfellas. The narration also serves to help us identify with the character of Henry Hill, despite all the despicable things we see him do. After all, he's talking straight to us, treating us as confidants.

The fact that the opening narration covers a lifetime of events establishes an engaging "storytelling" feel, while cluing us in to the fact that the events depicted in this movie will span many years. This is a crime epic, not a carefully observed analysis of a moment.

Ultimately, we see the world through Henry Hill's eyes, and so the direct line to his thoughts and feelings does not feel like the cheat it would be if the movie were exclusively about him. The movie is as much about the various criminals Henry meets, and by spelling out Henry's backstory we can devote more attention to them. Besides, in the world of hardened gangsters, one is not likely to share much, emotionally or otherwise, and so the voice-over gives us access to facets of Henry's character that would otherwise be unobservable.

This is why many of the great voice-overs in film history come from introverted or otherwise closed-off characters. This is most true of Travis Bickle's voice-over in Taxi Driver. Travis Bickle is a man of few words, with trouble connecting to other people (to say the least). His voice-over gives the audience a chilling glimpse into the mental state of the character in a way his dialogue and actions alone never would.

Now, movies also get away with voice-over because we are still seeing things as we listen. Even if the story is primarily told in an auditory fashion, we are still having a visual experience and thus it feels like a movie and not just a story someone is telling. But could a movie get away with just a voice, or even more minimally, just text?

Star Wars (1977): Opening Crawl

Text at the beginning of (or even during) movies is, of course, a grand tradition going back far into the silent era. Filmmaking pioneers like D.W. Griffith often employed title cards stuffed with text not only to clarify and advance the story, but also to help overcome the common prejudice that film was a mere novelty, hoping to elevate film to a novelistic, literary "high" art.

Back in the silent era, what was considered "high art" was more likely to be explicitly racist.

But even in the silent era, filmmakers limited how much text would appear in one sitting. The occasional D.W. Griffith paragraph notwithstanding, most title cards kept the text as short as possible.
Star Wars opens with a typically short title card:

I personally would have added a comma after 'ago.'

This, however, is followed by a full minute and a half of onscreen text letting us know what has transpired up to this point. How does Lucas get away with it?

Why It Works: Two words: John Williams. Sure, opening crawls were a staple of the 30's serials that Lucas is paying homage to. But I think the real reason the opening crawl doesn't bore us to tears is John Williams' stupendous score. It hits the right tone of fantastic, swashbuckling adventure, letting us know that what's in store for us is going to be epic and it's going to be fun.

Psycho (1960): The Psychologist Explains Everything

So far I've given examples of massive infodumps at the beginning of a story. But what of the movie with so many loose ends that quick, direct exposition is needed at the end to wrap things up? Alfred Hitchcock's Psycho provides the best example. After all is said and done, a psychologist comes out and explains exactly what the hell was going on.

Hitchcock's famous delineation between surprise and suspense has been repeated so often as to become cliché, but I think in this case it's worth mentioning. Since it's frequently paraphrased and bastardized, we might as well see the original quote itself:
There is a distinct difference between "suspense" and "surprise," and yet many pictures continually confuse the two. I'll explain what I mean.

We are now having a very innocent little chat. Let's suppose that there is a bomb underneath this table between us. Nothing happens, and then all of a sudden, "Boom!" There is an explosion. The public is surprised, but prior to this surprise, it has seen an absolutely ordinary scene, of no special consequence. Now, let us take a suspense situation. The bomb is underneath the table and the public knows it, probably because they have seen the anarchist place it there. The public is aware the bomb is going to explode at one o'clock and there is a clock in the decor. The public can see that it is a quarter to one. In these conditions, the same innocuous conversation becomes fascinating because the public is participating in the scene. The audience is longing to warn the characters on the screen: "You shouldn't be talking about such trivial matters. There is a bomb beneath you and it is about to explode!" 

In the first case we have given the public fifteen seconds of surprise at the moment of the explosion. In the second we have provided them with fifteen minutes of suspense. The conclusion is that whenever possible the public must be informed. Except when the surprise is a twist, that is, when the unexpected ending is, in itself, the highlight of the story. [emphasis added]
Hitchcock's Psycho is a prolonged experiment in blending suspense and surprise. We are in suspense because we think there's a bomb under the table; suddenly, we're surprised to find out that it's not a bomb at all but something much more dangerous we weren't expecting. All the great sequences start out as conventional Hitchcockian suspense sequences, and then a surprise pulls the rug out from under our feet and shocks us. [SPOILERS FOLLOW. If you haven't seen Psycho, do yourself a favor and go rent it.]

Neither the title nor the brilliant ad campaigns promoting the film gave anything of the plot away, and that was intentional. It's unfortunate that everyone today, even those who haven't seen the movie, knows that Janet Leigh gets murdered in the shower, because that is truly one of the greatest surprises in film history.

At the beginning of the movie, audiences of 1960 would have assumed that this is a movie about Marion Crane (Janet Leigh) stealing $40,000 from her employer. The suspense is ratcheted up as she is followed by a cop and continues to flee. She gets in the shower, leaving the money sitting on the bedside dresser and leaving herself vulnerable to either the cop following her or the voyeuristic manager of the motel, who has a peep-hole he uses to spy on Marion as she undresses. And then all of a sudden-

Our protagonist is shockingly murdered by a knife-wielding old lady. This is a surprise.

No longer the protagonist.

Then something extraordinary happens. With the cry of, "Mother! Oh God, mother! Blood! Blood!" coming from the Bates house, Norman Bates becomes the protagonist. We realize that Marion was murdered by Norman's domineering mother. She must be the titular psycho. As we watch him painstakingly clean up his mother's murder (throwing out the $40,000 as he does), we realize that we've been following a massive red herring. The movie is not about a woman on the run who stole some money. It's about a man with maternal attachment issues who realizes his mother is a psychopathic killer. How will he deal with this situation? We want to see him wrestle with this issue, eventually reaching the peak of his character arc when he overcomes his need to protect Mother and turns her in. He's flawed, but at this point in the film he's the one who has our sympathy. We even think "Oh no!" when Marion's car, containing her body and all the evidence, stops sinking into the swamp, and we breathe a sigh of relief along with Norman when it resumes sinking.

The story continues as a suspense film as Detective Arbogast heads into the house and up the stairs. No, not up the stairs, the audience thinks. That's where Mother lives! And, as is to be expected, Mother murders the poor detective.

Next comes Lila Crane (Vera Miles), Marion's sister, along with Marion's boyfriend Sam Loomis. As Lila investigates the house alone (No! Not the house!) and Loomis interrogates a fidgety Norman Bates, we once again sense our sympathies realigning. Oh, I get it, we think. Norman's never going to betray his mother. He's not the protagonist, but an antagonist. Lila is the protagonist. And she's headed right for where Mother is! The suspense reaches its peak when she slowly approaches the old lady sitting in a basement chair. Surely, she'll spin around, knife in hand. This is classic suspense. And then-


Followed by:


So Mother is a corpse and Norman Bates is a cross-dressing murderer. Perhaps there's some 'splainin' to do. Which brings us, at last, to:

The Infodump: All is made clear and all loose ends are wrapped up by the police psychologist, who explains everything using a bunch of bogus psychology that would inform Hollywood's (mis)understanding of split-personality disorder for decades.
Dr. Fred Richmond: His mother was a clinging, demanding woman, and for years the two of them lived as if there was no one else in the world. Then she met a man... and it seemed to Norman that she 'threw him over' for this man. Now that pushed him over the line and he killed 'em both. Matricide is probably the most unbearable crime of all... most unbearable to the son who commits it. So he had to erase the crime, at least in his own mind. He stole her corpse. A weighted coffin was buried. He hid the body in the fruit cellar. Even treated it to keep it as well as it would keep. And that still wasn't enough. She was there! But she was a corpse. So he began to think and speak for her, give her half his time, so to speak. At times he could be both personalities, carry on conversations. At other times, the mother half took over completely. Now he was never all Norman, but he was often only mother. And because he was so pathologically jealous of her, he assumed that she was jealous of him. Therefore, if he felt a strong attraction to any other woman, the mother side of him would go wild. [Points finger at Lila Crane] When he met your sister, he was touched by her... aroused by her. He wanted her. That set off the 'jealous mother' and 'mother killed the girl'! Now after the murder, Norman returned as if from a deep sleep. And like a dutiful son, covered up all traces of the crime he was convinced his mother had committed!
He goes on to clear up every other point of potential audience confusion, including who got the $40,000 ("The swamp. These were crimes of passion, not profit.") Incidentally, this scene contains the first use of the word 'transvestite' in a Hollywood movie. The MPAA originally objected, until screenwriter Joseph Stefano argued that it was a clinical term.

Why It Works: Normally, cramming all that crucial information into a speech by a newly introduced character whose sole purpose is to explain everything would be considered terrible writing. But in this case, no one cares. We go to see a movie like Psycho for the suspense and surprise, and Hitchcock has more than delivered the goods on both counts. In my opinion, it's still the best blend of suspense and surprise that Hollywood has ever come up with, and the ending remains the quintessential twist ending, one M. Night Shyamalan wishes he could come up with.

Right now, he probably just wishes he could be the M. Night Shyamalan of ten years ago.

The suspenseful and surprising portion of the story has ended, so all that remains is to wrap up the loose ends as quickly as possible and take a bow. Anything more would be unnecessary. We've already gotten our money's worth. And so, after we see Mother's corpse and Norman in a dress, all we need is a brief psycho-babble explanation, a final unsettling moment with Norman, a subliminal dissolve into Mother's face and THE END.

Jurassic Park (1993): The Mr. DNA Film

If you've ever read a Michael Crichton novel, then you know that he loves to have his scientist characters talk for pages and pages just so he can include a bunch of interesting stuff he came across during his research (and sneak in a few whoppers along with the facts). And if you're a fan of Michael Crichton (as I consider myself to be), this is one of the things you love about him.

Readers of Jurassic Park will get to learn lots about dinosaurs, computers (circa 1989), Chaos Theory, and genetics. The section that explains how the dinosaurs were cloned lasts well over thirty pages in the book. Obviously, a movie has a more limited amount of time in which to convey the same information.

Screenwriter David Koepp uses the fact that Jurassic Park is a tourist attraction to his advantage. He lets a film-within-the-film designed for the tourists tell us all we need to know, courtesy of the friendly animated "Mr. DNA."

Why It Works: A quick, oversimplified explanation in layman's terms is exactly what you'd expect to be given to tourists- and that's all the audience needs before we're ready to move on to seeing dinosaurs. Sure, the science is bogus, but it sounds reasonable enough as explained in a spot-on parody of the educational films of yesteryear. Again, credit John Williams for a score perfectly suited to the material.

The idea of a film-within-a-film conveying essential exposition was nothing new in 1993. This is something you can trace back all the way to...

Citizen Kane (1941): News On the March

A slow approach to a gloomy old mansion where an old man lies dying. He holds a snow globe, which rolls out of his hand as he utters his last word, "Rosebud."

These moments are followed by ten minutes of fake newsreel footage, the "News on the March," summarizing the life of Charles Foster Kane.

We are then in a screening room with a bunch of reporters who have just watched the same film we did. They are challenged to find the meaning of "Rosebud," which they will attempt to do by interviewing various people who knew the Hearst-esque newspaper magnate.

God bless you, Gregg Toland.

Why It Works: As with Jurassic Park's Mr. DNA film, it helps that Welles absolutely nails the cheesy tone of 1940s March of Time newsreels (the radio counterpart of which had actually been narrated by Welles for a time). He even went to the trouble of dragging the film stock in the dirt in order to give the footage a suitably scratched appearance.

The film turns out to be just one of several competing versions of Kane's life, the others being told by his former friends and associates. All of them are somewhat unreliable narrators, and the blandness of the newsreel is a good place to start before moving to the biases of Kane's contacts. Welles and Mankiewicz's innovative script skips around through the events of Kane's life, so having a grounding in the basic story helps.

Enough has been written on Citizen Kane to fill several small libraries, so I won't say too much more, except to note that the film-within-a-film-as-exposition technique pioneered here has been copied too often to list. This is the granddaddy of all cinematic infodumps, and perhaps the greatest of them all.

Saturday, August 6, 2011

Irritating Trailer Cliche Du Jour: Incessant Fade In/Fade Out

In a summer where Hollywood has given us a heavy dose of sequels, remakes, reboots and adaptations of already existing material (i.e. every summer...), there have been many people voicing the tired complaint that Hollywood is out of ideas. And as anyone familiar with how Hollywood works knows, the problem is not so much that they're out of ideas (the thousands of un-produced original scripts that get sold to studios every year negate that myth) but that they're worried that original works won't make money. A script might be great, but if it doesn't have built-in name-brand recognition, how will they know how to sell the movie? In short, Hollywood is timid about marketing.

This being the case, it should come as no surprise that movie marketing is perhaps the most cliche and unoriginal facet of the Hollywood machine. Even movies that are all quite different from one another will appear to be the exact same movie in promotional materials. For example, take note of some annoying poster trends that others have pointed out, such as "Floating Celebrity Heads", "Just Add Sparks", "Everything Must Be Diagonal" or any other trend on the list of "16 Movie Poster Traditions That Need To Die In A Fire." And if you haven't seen the "Trailer For Every Academy Award-Winning Movie Ever" yet, you really ought to click on that link.

Similarly, within a particular genre there are certain things every movie trailer does. All comedy trailers have to hit the same beats and feature the same style of editing, and the same goes for every other genre. And there are some trailer trends that are popular across genres. One such trend in movie trailers right now (actually, it's been around a while but is definitely on the rise) that has really started to irritate me is the constant use of the 'Fade In Fade Out Dissolve'.

This is where the image fades to black and then immediately fades up on the next shot. It can, of course, be a slow fade or last only a few frames. In most movie trailers your average Fade In Fade Out is around one second (although most contain a mix of fast and slow), which coincidentally happens to be the default length in Final Cut Pro. Need an example of how trailers use this effect? Watch just the first 22 seconds of this trailer for Tinker, Tailor, Soldier, Spy, which features 11 individual fades (there are no fades during the rest of the trailer):

Why do they do this? One reason is that it makes not-particularly-dynamic footage appear more dynamic and exciting. In Tinker, Tailor, Soldier, Spy they are trying to grab your attention right off the bat, and this editing technique adds a little pizzazz to some footage of guys sitting around talking.

In traditional film grammar, a fade often signifies the passing of time, and since trailers edit together footage from throughout a movie, it makes a certain amount of sense that fades would be employed. However, as often as not they're merely there to add excitement. For example, in the trailer for Captain America: The First Avenger, during the reveal of Steve Rogers' Super Serum-enhanced body (at approximately 1:07) a single continuous shot is broken up with three fades, presumably to heighten the drama of this pivotal moment (incidentally, the whole trailer contains 35 individual fades, not counting those on the titles).

Often this technique is used specifically to indicate that something is supposed to have dramatic weight. Comedy trailers use Fades the least of any genre (action and horror movies seem to use them the most), but they employ a fade or two at the requisite dramatic moment. The trailer for Crazy Stupid Love contains just two quick Fade In/Fade Outs (at approximately 1:47), but they come at the exact moment that poignant music fades up and we see images of actors crying and generally emoting. This brief moment lets us know that the movie will not only be funny, but will also have heart.

Here's the tally for a few of the trailers in IMDb's Featured Trailer gallery: There Be Dragons has 39 individual fades, Rise of the Planet of the Apes has 35, Twilight Saga: Breaking Dawn has 23 (plus two white Flash Frames, a less common but equally cliche trailer technique), The Thing prequel/reboot has 21, Friends With Benefits has 11, and Sherlock Holmes: A Game of Shadows is almost restrained with its mere 8 Fades (although it has 18 Flash Frames). The most Fade-happy trailer of late is for the completely unnecessary Spiderman reboot:

That's 54 'Fade In Fade Out's in the first 1:44 of the trailer, an average of roughly one every two seconds. Of course, there are a number of shots that last significantly longer than two seconds, which means that some of them come awfully quickly.

I would not be opposed to Fades if they hadn't become overused and abused. Like so many other things, the technique has become cliche. Once I began noticing it, every use started to take me out of the moment. I find it distracting.

Once something becomes cliche, it becomes parodied, and then it is off-limits. A few years ago the trailer for Jerry Seinfeld's Comedian parodied the cliches of trailer voice-overs ("In a world..." "One Man has to..." etc), and I honestly don't think I've seen a serious trailer with a voice-over since. Certainly the phrase "In a world..." will be forever unusable.

So Hollywood, if you like Fades so much, lay off them a little. Save them for when you really need them. Because once every trailer becomes a strobe Fade-fest like the Spiderman trailer, even people who aren't obsessive film geeks will begin to notice, and soon someone will produce a parody and the jig will be up. Once you know how the trick is done, the magic is gone.

Saturday, July 30, 2011

Whoever owns the rights to these posters could make a fortune...

No long post here, just a comment about a couple of images I stumbled across during some aimless surfing for vintage movie posters (something I do from time to time).

Having spent time at college, I just have a gut feeling that if this poster were made widely available it would become a ubiquitous fixture in every stoner's dorm room across the country. I imagine it would have been especially popular during the 1960's and 70's, when everybody would have been steeped in the sort of conservative cultural milieu that John Wayne represented and appreciated the juxtaposition of this icon of Americana and the hippie drug of choice. It has the same sort of appeal as the already popular "Nixon Bowling" and "Grow Hemp For the War" posters.

What makes it for me is the ambiguous nature of John Wayne's expression and body language. He could be fighting the evil dope peddlers and fiends, or all hopped up on the wacky weed, or, as I prefer to imagine, giving an arm-swingingly enthusiastic endorsement along the lines of: "Gee, golly! Marijuana sure is swell!!!"

Likewise, for those collegians who already possess refrigerator magnets of kitschy 1950's women with ironic captions (this demographic includes everyone from party girls to nerds who want to assert that, yes, they too have a naughty side, thank you very much), many may enjoy this vintage movie poster:

I'm telling you, if these posters were sold wherever Scarface, John-Belushi-wearing-a-"College"-sweatshirt, Pink-Floyd-album-covers-on-the-backs-of-naked-ladies or Le Chat Noir posters are, then they would sell like hotcakes.

Thursday, July 7, 2011

Luddite? Depends on your definition.

In my last post, wherein I ranted about why I'm not a fan of e-books, I offhandedly referred to myself as a 'Luddite.' ["As e-readers become more ubiquitous (as I'm sure they will, despite the grumblings of Luddites like myself)..."] This led to my wife questioning me on whether that was a hypocritical thing to call myself, being as I had just posted a link to Project Gutenberg, which she viewed as me promoting the very thing I claimed to be against. So for the record, let me state two things:

1) If I link to something, I do not necessarily approve of it, nor am I trying to promote it. I consider hyperlinks to be the internet equivalent of a footnote. I use them primarily to allow curious readers to find out a little more information, while less curious readers may just continue reading. (Although in this particular case, I do happen to support Project Gutenberg.)

2) I do consider myself a Luddite in the sense that I am often resistant to new technology that changes things in a way that I perceive to be negative (and, likewise, I happily embrace technology that I view as improving things). To use one example, I steadfastly maintain that 35mm film produces a superior image to digital capture. As a professional videographer, the realities of the market have forced me to work mostly in HD since I left film school, but my preference is still to shoot film whenever possible. I have also sometimes used the term 'Luddite' to describe myself due to the fact that, despite working in a technology-dependent field, I am often the last to adopt new technologies. I tend to be stubbornly old-school in my tech habits.

This second point got me thinking about the various ways in which the term 'Luddite' is used and abused. There are several legitimate definitions for 'Luddite' which differ from each other, as well as many ways that the term is frequently misapplied. Language, of course, is infinitely malleable and constantly in flux, and so varying uses and definitions of a word are to be expected. That being said, although new uses for words are constantly being found, there are times when a word is used incorrectly. In order to clearly communicate, there must be some standard definitions. So...

What is a Luddite?

First, let's ask the Internet.

One of the first images that Google brings up for 'Luddite.'

Of the more respectable online dictionaries, Merriam-Webster offers a typical definition:
: one of a group of early 19th century English workmen destroying laborsaving machinery as a protest; broadly : one who is opposed to especially technological change
This is the standard dictionary definition of the term, but it hardly encompasses the range of meanings the word has assumed. Other common uses of the word are summed up by the following pair of definitions:

1. A Luddite is a person who dislikes technology, especially technological devices that threaten existing jobs or interfere with personal privacy.
2. A Luddite is someone who is incompetent when using new technology.
By the first definition, an author who still prefers a manual typewriter can be called a Luddite; more specifically, a projectionist who sees his job threatened by "foolproof" digital projectors is also a Luddite. By the second definition, your Grandmother who can't program her VCR is a Luddite (for that matter, the fact that she still uses a VCR also makes her a Luddite).

So aside from the historical Luddites, a Luddite can be someone who is opposed to technological change, or is against technology that threatens people's jobs, someone who is technologically incompetent, or merely someone who dislikes technology. But wait, there's more...

From the user-contributed Urban Dictionary:
1. One who fears technology (or new technology, as they seem pleased with how things currently are...why can't everything just be the same?)

2. A group led by Mr. Luddite durring [sic] the industrial revolution who beleived [sic] machines would cause workers [sic] wages to be decreased and ended up burning a number of factories in protest
 A luddite [sic] generally claims things were "just fine" back in the day, and refuses to replace/update failing equipment/software/computers on the basis that they were just fine 10 years ago.
So according to this person, Luddite is synonymous with technophobe, as well as referring to the type of stick-in-the-mud who still runs Windows 95 or listens to 8-track tapes (depending on the age of the "Luddite" in question).

This assortment of definitions encompasses so broad a range as to render the word fairly meaningless. This is evident in some of the usage that Google turned up. For example...

A British politician commenting on Prince Charles' concerns about genetically modified food:
It's an entirely Luddite attitude to simply reject them out of hand.
An American politician commenting on George W. Bush vetoing a stem-cell research bill:
This will be remembered as a Luddite moment in American history, where fear triumphed over hope and ideology triumphed over science.
The bloggers Bottom of the Glass have offered their own definition, accompanied by the cartoon below:

 Luddites are the Amish. They are anyone who, at any point in time, drew a line and determined that all technology and modernization up to said point was acceptable while all of it beyond said point was evil, deplorable, of the devil, whatever. They are people who bury their head in the sand and wish new things would just go away...The goal of the Luddite is merely to freeze time, freeze assumptions, freeze change. And they seethe and growl at those who attempt to move things forward.

So which of these definitions/usages is the most accurate? And what does any of this have to do with the folks that smashed and burned textile machinery in England two hundred years ago? Perhaps we would do well to learn a little more about...

The Historical Luddites

[Note: The following information comes from a smattering of websites I perused, some of which can be found here, as well as the old standbys of Wikipedia and Brittanica Online, but my chief source was historian Kevin Binfield's site Luddites and Luddism: History, Texts, and Interpretation.]

Frame breaking (1812)
On March 11th, 1811, a group of Nottinghamshire stockingers (skilled artisans who knitted stockings) gathered to protest lowered wages, rising food prices, widespread unemployment and the hiring of unskilled laborers, as they had (peacefully) many times over the previous month. This particular evening's meeting ended with several of the stockingers smashing the wide-frame looms (a semi-automated knitting machine that could be used by unskilled laborers) at a local shop which epitomized their grievances. This sparked a frame-smashing fever in the area, and the Nottingham Journal reported several weeks of nightly frame-smashing throughout villages in Nottingham County, as well as arson and other forms of vandalism directed at textile mills. A typical account from the time relates that, "2000 men, many of them armed, were riotously traversing the County of Nottingham."
King Lud.

These stockingers were not, as the misinformed user of Urban Dictionary claims, "led by Mr. Luddite;" but they soon began to be known as Luddites because the frame-smashings were usually attributed pseudonymously to one Ned Lud. According to the Oxford English Dictionary, the real Ned Lud was a fellow from Leicestershire who, in 1779, had destroyed a stocking loom "in a fit of insane rage." We don't know much more about the guy than that, but people had apparently remembered his act of senseless destruction enough that "Lud must have been here" had become a catchphrase used whenever machines broke down. By late 1811, letters full of demands were being sent to textile mill owners and local papers signed by "Ned Lud," "General Lud," "Captain Lud," or, most famously, "King Lud." The character began to evolve into a Robin Hood-like folkloric figure. Protest leaders began dressing in outlandish getups and claiming to be King Lud, while the crowds shouted treasonous chants such as, "No King but Lud!"

Luddite protests, riots, and acts of sabotage spread from Nottingham County throughout the English Midlands, as well as spreading to related industries such as cotton manufacturing. This continued in sporadic bursts through 1816. A few of the Luddite demands were met (certain trade restrictions were rescinded, moderate wage increases were gained for some and food prices lowered slightly), but for the most part they met with stiff government reprisals. The Army was dispatched to quell Luddite riots, and for a time there were more British soldiers engaged with the Luddites than there were fighting Napoleon. In 1812, Parliament passed an act which made all industrial sabotage or "machine breaking" a capital offense. In 1813, seventeen Luddites were executed under this act and several hundred more were transported to the penal colonies in Australia and Tasmania. Three more men were executed for the murder of a mill owner later that year. The movement began to falter quickly thereafter.

The conventional telling of these events holds that the machines destroyed by the Luddites were new technology and thus perceived as newly threatening. The notion that the Luddites rebelled against a new innovation is central to the way the word is typically used today. This, however, was not the case. The stocking frames which the Luddites were smashing had been invented way back in 1589. Fearing that the invention would cost English stockinger jobs, Elizabeth I and James I had banned the machines in England, but from the 1660's onwards they had been playing an increasingly large role in the British textile industry. By the time the Luddites began smashing them there were over 25,000 of the machines in place in Nottingham County, and most of them had been there for many years.

So what provoked the Luddites in 1811? Essentially, there had never been a worse time to be a trained stocking-knitter in England owing to a whole host of factors. The mechanized stocking-frames had been putting skilled artisans out of work and driving down pay steadily for 200 years, but in the first decade of the 19th century several more blows came all at once. The first was this guy:

The Napoleonic Wars forced Great Britain into a massive depression from 1808 onwards. Hardest hit were those that produced commodities. Since Parliament had relented to pressure from the nation's wealthiest to eliminate the income tax, the only way to pay for the war was with high commodity tariffs and taxes as well as increased fees and sales taxes. The burden of all this hit the working classes the hardest. The government also attempted to get back at France by banning trade with them and all countries friendly to France (much of Europe, especially as Napoleon conquered more). This meant no foreign markets to sell textiles in.

As the wars dragged on, Britain began to recall its Army and focus its efforts on its navy and on subsidizing foreign allies, adding to the already sky-high unemployment when hundreds of thousands of soldiers returned to England in need of work. Wartime food rationing, several years of crop failures as well as price-support laws designed to protect farmers led to food prices being at an all-time high just as wages and employment were at an all-time low.

The hardships brought by the Napoleonic Wars also came at the culmination of several centuries' worth of economic and social change as the English economy shifted from Mercantilism to Free Market Capitalism, and as the Industrial Revolution continued to gather steam (no pun intended). Once upon a time, knitting had been a prime example of a cottage industry, where skilled artisans worked from home. Stockingers, like all skilled artisans, typically served an apprenticeship of seven years (as required by law). The introduction of automated machinery in the mid-1600's meant that mill owners could get away with hiring unskilled workers for dirt cheap, rather than paying apprenticed artisans the wages they demanded; however, for just this reason trade guilds successfully petitioned politicians to write laws requiring workers in traditional trades to serve their full apprenticeship and requiring mill owners to hire apprenticed artisans. With the expansion of colonialism (and thus the demand for goods and availability of resources), the textile industry shifted slowly from a cottage industry to the factory system in the 18th century. However, even though textiles were now made in factories using machines such as the stocking frames, it was still mostly skilled artisans who were doing the work, thanks to government protection and legislation as well as the influence of the trade guilds. Additionally, the economy was still largely a controlled Mercantilist economy with prices stiffly regulated and wages protected by orders of Parliament and the Privy Council.

At the turn of the 19th century this old economic system was breaking down and the tightly controlled economy of years past was giving way to modern free market capitalism. By the time of the Luddites, laws concerning price and wage regulation had been revoked, and mandatory seven year apprenticeships and required hiring of artisans were also no longer in place. The invisible hand of supply and demand now ran everything, and Britain's labor force shifted to being largely composed of hired unskilled laborers. With the economic hardships brought by the Napoleonic Wars, mill owners were forced to cut costs wherever they could, typically by slashing wages, and purchasing more machines so that they could hire unskilled laborers rather than skilled ones.

The Luddites were skilled artisans who had gone through seven years of apprenticeship only to find that they were competing for the same jobs as thousands of unskilled day laborers. These jobs had wages so low that, with rising food prices and widespread famine, it was increasingly hard to avoid the Work House. This is what they were protesting. They were not technophobes, but labor organizers. The industry was scattered enough to make something like a general strike difficult and impractical if not impossible; destroying machines was one way to make their demands heard and assert some leverage over their employers. It was an example of what historian E. J. Hobsbawm calls "collective bargaining by riot." This is well-evidenced by the various texts left by the Luddites; machine-smashings were always preceded by letters making specific demands about wages and hiring practices; the mill owners who met those demands (and a fair number did) did not have their machines destroyed.

It is especially worth noting that the Luddites were hardly the first to smash machines for this reason. Industrial sabotage has a long and illustrious history. As Kevin Binfield points out:
For example, in 1675 Spitalfields narrow weavers destroyed "engines," power machines that could each do the work of several people, and in 1710 a London hosier employing too many apprentices in violation of the Framework Knitters Charter had his machines broken by angry stockingers. Even parliamentary action in 1727, making the destruction of machines a capital felony, did little to stop the activity. In 1768 London sawyers attacked a mechanized sawmill. Following the failure in 1778 of the stockingers' petitions to Parliament to enact a law regulating "the Art and Mystery of Framework Knitting," Nottingham workers rioted, flinging machines into the streets. In 1792 Manchester weavers destroyed two dozen Cartwright steam looms owned by George Grimshaw. Sporadic attacks on machines (wide knitting frames, gig mills, shearing frames, and steam-powered looms and spinning jennies) continued, especially from 1799 to 1802 and through the period of economic distress after 1808.
There were many similar revolts after the Luddites as well, including the Pentrich Uprising of 1817 (a demonstration of several hundred unemployed stockingers, quarrymen and iron workers) and, most notably, the Swing Riots of 1830-31. The Swing Rioters were farm laborers struggling against cripplingly low wages who destroyed threshing machines and threatened those who had them in order to make their demands heard. Like the Luddites, the Swing Rioters sent pseudonymous letters signed by a mythical "leader" of the revolt, in this case the entirely fictional Captain Swing. Some samples from the letters:
Sir, Your name is down amongst the Black hearts in the Black Book and this is to advise you and the like of you, who are Parson Justasses, to make your wills. Ye have been the Blackguard Enemies of the People on all occasions, Ye have not yet done as ye ought,.... Swing
Sir, This is to acquaint you that if your thrashing machines are not destroyed by you directly we shall commence our labours. Signed on behalf of the whole ... Swing
On an unrelated note, Captain Swing has recently been reborn as a steampunk master criminal who can spontaneously generate electricity and fights space pirates (or something) in a comic book by Warren Ellis. He also lent his name to a bizarre 1960's Italian comic book (fumetti) and an even more bizarre Turkish film adaptation.
I'm not making this up.

Anyway, returning to my original query, the question remains: How did we get from a 19th century labor revolt to the modern usage of the word?

How the term 'Luddite' was transformed

I believe the main reason why the Luddites, out of the many industrial saboteurs and machine-breakers throughout history, have been singled out as an example is that there is a specific word to refer to them. In the grand scheme of history the Luddite riots were not all that unique, but talking about the "Swing rioters" or "the 15th century Dutch workers who protested new automated looms by throwing their wooden shoes into the machinery" just isn't as direct as saying 'Luddite.'

As near as I can tell, the endurance of the term is due mostly to neoclassical development economists, who seized upon the Luddites as a rhetorical device that could be used to illustrate something they termed the "Luddite Fallacy."  According to these economists, the "Luddite Fallacy" is the mistaken belief that labor-saving technology will lead to fewer jobs. I believe this seriously oversimplifies the actual history of the real Luddites, but it has been an effective enough rhetorical example that anyone from the early-20th century onwards who has taken an Econ 101 class has encountered the term. This kept the word 'Luddite' circulating in the language and perpetuated the oversimplified story of the Luddites as people opposed to new technology.

From the economists the term Luddite was transmitted to the college radicals of the 1960's, some of whom began to identify themselves as Neo-Luddites. The sixties were a time when many leftist and anti-establishment groups happily claimed to be modern-day heirs to the Luddites. Neo-Marxists came closest to the original Luddites' aims, praising the workers for seizing the means of production, and Neo-Leninists and Anarchists praised the Luddite vandalism as 'propaganda of the deed.' However, those that most successfully adopted the term (or had it derogatorily applied to them) were those with objections to technology.

It was a great time to be anti-tech: the hippie counter-culture was in full swing, the conformism of the 1950's was seemingly embodied by new technology such as television and home appliances, and the Cold War threat of nuclear holocaust and the napalm bombings in Vietnam were on everyone's mind. This was the era that saw the flowering of such schools of thought as anarcho-primitivism and environmentalism. The admiration Thoreau and Emerson felt a century earlier for "nature unadorned" was being echoed throughout popular culture; it was a time to "unplug" and "get back to the land." 'Luddite' became firmly cemented in the English language as centrally relating to technology. The economic and social hardships being fought by the original Luddites were all but forgotten. The important thing to remember was that they smashed machines.


The Next Frontier for Luddism

And so now we have the word as used today, with about eight different varying definitions, all in agreement that a Luddite hates new technology, something that was not necessarily true of the original frame-smashers. Given this ambiguity in language, I think a new type of Luddite may currently be arising: the Language Luddite.

The English language used to be (at least, from the 1600's on) in the hands of the few and the highly trained, like the skilled and apprenticed textile artisans of old. These were the English professors, professional grammarians and the dictionary editors. It used to be a big deal when the staff of the Oxford English Dictionary added a new word due to common usage, or changed a definition owing to officially recognized shifts in language. It might still be a big deal, if anyone still consulted the Oxford English Dictionary for reasons other than trying to sound smart. But they don't.

The OED: Helping people sound smart and needlessly lengthen research papers since 1895.

Just as the economic climate of the textile industry had been shifting for some time before the Luddites, language snobs have always been fighting a losing battle against constant shifts in "proper" language use (as chronicled in Jack Lynch's bestselling The Lexicographer's Dilemma). But language police today face something as catastrophic to them as the Napoleonic Wars and repeal of protective legislation were for skilled stockingers: the Internet.

Dictionary, thesaurus and reference book sales have been plummeting in recent years, thanks to the Internet. No dictionary, not even the "Unabridged" varieties, could ever hold every word, but the Internet comes as close as anything yet. Besides, why would you need to own a dictionary when you can just search for the word and find dozens of definitions from dozens of sites? You don't even have to spell the word correctly; Google will still know what you mean.

The old standbys such as the OED, Merriam-Webster's and The American Heritage Dictionary are all online for free. But like the 19th century artisans, these reputable old institutions are competing with hundreds of non-professional, user-contributed sites such as Wiktionary or Urban Dictionary. Also due to the Internet, the print business is in a slump, publishers and newspapers are cutting staff, especially editors (they'll always need people to produce content, but any computer can do a spellcheck), and serious journalists compete with bloggers. Those who would seek to maintain an iron grip on the purity of language are in dire straits. Just Google "all right or alright" and have a look at today's intense lack of consensus over how to properly speak and write the English language.

There must certainly be many language purists who are attempting to resist the changes being wrought by the Internet. The widespread confusion over a term like 'Luddite' must irritate them to no end. Surely, if anyone deserves to apply the term Luddite to themselves, it would be these "Language Luddites." Of course, they never will. They have too much respect for the term in its original and specific context.