Why Can’t Hollywood Get Computers Right?

David Pogue
March 3, 2014

I’m glad Her won the Academy Award for Best Original Screenplay. The story’s good, the acting’s good and, above all, the technology story makes sense.

You might think that’s a little odd, considering that Her is the story of a guy who falls in love with a Siri-like operating system on his phone. 

This futuristic OS is so sophisticated that you can’t tell it’s not a real person. She sounds like Scarlett Johansson, in fact.

That’s a big technological leap. But given that this movie is set in the “slight future” (as director Spike Jonze says), it’s not an implausible, silly leap. Especially because all the other technologies in the movie are brilliantly plausible.

In Her, we’ve finally standardized on a wearable technology. Not glasses, not watches — it’s an earpiece. It’s always online, it understands speech, and it doesn’t require taking your eyes off the road.

People still have smartphones and computers in the Her world, but you don’t see keyboards anymore. They’ve evolved away as speech recognition has improved.

Anyway, it may be that I was so grateful for the future technology in Her because I’m so exasperated by the technology depicted in all other movies.

Why does Hollywood seem to think that text on PCs in movies pours onto the screen, left to right, like a teletype machine? And that it chirps as it appears?

See Jurassic Park, Mission: Impossible, Disclosure and Broken Arrow. Or see — or, rather, don’t see — The Net, where Sandra Bullock’s keyboard chirps as she types.

For heaven’s sake: We all use computers. We know how they work! And we know that text does not chirp and beep as it spills onto the screen. It never has, actually. And it never will. 

In Bridget Jones’s Diary, an incoming email even appears onscreen letter by letter, as it’s being typed by the sender. Yeah? In what universe?

Furthermore: How is it that, in 2014, entering the wrong password in a movie makes a huge red error message blink across the screen that says “ACCESS DENIED”?

We’ve all mistyped our passwords. And we all know that “ACCESS DENIED” is not the result. Worst case, you see “There was an error with your email/password combination. Please try again,” or something.

And if you do type the right password, you don’t see a huge green “ACCESS GRANTED” message, either. You just get in and start working.

(A screenshot from the TV show Revolution, via the Fake UI Tumblr)

Then there’s Hollywood’s idea of hackers. In the real world, getting hacked usually means that a bot (automated software) is quietly deposited on your hard drive to do the bad guy’s bidding — usually sending out thousands of spam messages. It’s in the hacker’s interests to make the infection as subtle and invisible as possible.

But in Hollywoodland, your screen gets taken over by full-screen animations, like the evil laughing skull in The Net, or the blizzard of random dialog boxes in one episode of the TV show NCIS.

(That episode gets extra comedy points for having its two IT professionals attempt to thwart a real-time hacker attack by typing on the same keyboard simultaneously.)

I also love the movies, like Olympus Has Fallen, where someone’s trying to guess a password by trying thousands of combinations of, say, a seven-digit number. We see all the guesses whizzing by on a screen, like a crazy odometer spinning — and every few minutes, one of the digits locks into place. And then another. And then another. The guessing program is closing in on the full password!

Well, except that guess what? A password is all or nothing. You don’t get to find out which of your digits is correct and which are incorrect. There’s no way a guessing program could “lock in” the digits that are right so far and keep guessing. (Besides: If you try more than about five guesses, most systems realize you’re up to no good and lock you out.)
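That all-or-nothing behavior is easy to see in code. Here’s a minimal sketch of how a login check actually works (the stored password “3141592” and the five-attempt limit are invented for illustration): the guess is hashed as a whole and compared against a stored hash, so the only information that ever comes back is a single yes or no.

```python
import hashlib

# Hypothetical seven-digit password, stored only as a hash.
STORED_HASH = hashlib.sha256(b"3141592").hexdigest()
MAX_ATTEMPTS = 5

attempts = 0

def check_password(guess: str) -> bool:
    """Return True only if the whole guess is correct.

    The guess is hashed in its entirety; nothing about *which*
    digits matched ever leaks out, so no digit can "lock into
    place" the way it does on a movie screen.
    """
    global attempts
    if attempts >= MAX_ATTEMPTS:
        raise PermissionError("account locked after too many failed attempts")
    attempts += 1
    return hashlib.sha256(guess.encode()).hexdigest() == STORED_HASH

# "3141593" is wrong in only its last digit, but the caller
# learns nothing more than a plain False.
print(check_password("3141593"))  # False
print(check_password("3141592"))  # True
```

Real systems differ in their details, but the principle holds: the comparison is one boolean, and the lockout counter is exactly why a movie-style odometer of millions of guesses wouldn’t get far anyway.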

Now, early on, when the first computer-based techno-thrillers came out, they established right up front that Hollywood either (a) had no clue or (b) thought that moviegoers were a bunch of clueless rubes who’d never seen a computer.

Thus, we get the silliness of Hackers (in which the operating system of the computer being hacked resembles skyscrapers, and you have to fly through them to do your hacking); the absurdity of Independence Day (in which Jeff Goldblum writes a virus that successfully infects the OS of the computers of a vastly superior alien starship); the much-mocked computers of Jurassic Park (in which the little girl says, “It’s a Unix system! I know this!”; in fairness, the swooping 3-D file browser she’s using, fsn, really was a Unix program that ran on Silicon Graphics workstations); and every single frame of The Net.

We can also cut the producers of sci-fi fantasies some slack. We accept some far-fetched nonsense in Minority Report, The Terminator and other movies set in a bizarre future.

But it isn’t 1985 anymore. We know what computers do and don’t do, and we notice when Hollywood depicts goofball technologies in movies set in the present day. Text doesn’t chirp, and passwords don’t congratulate us.

Fortunately, although it may have taken nearly 30 years and a lot of YouTube ridicule, the people who make Hollywood movies do sometimes catch on. I believe that we may at last be seeing the twilight of scenes where a detective, inspecting some surveillance video, barks, “Enhance!” — and the IT guy enlarges and sharpens a grainy, dark blob of pixels until it’s a crisp studio portrait of the bad guy.

A montage of that identical scene, cut together from multiple movies and TV shows, should shame any writer or producer who ever thinks of using that tactic again.
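The reason “Enhance!” can’t work is simple: the detail isn’t hidden in the grainy blob, it’s gone. Here’s a toy illustration with two invented 4×4 “license plates” (all the pixel values are made up): once a low-resolution camera has averaged fine detail into a few pixels, many different originals produce the identical recording, so no algorithm can tell which one was actually photographed.

```python
def downsample(img):
    """Average each 2x2 block of pixels -- roughly what a
    low-resolution camera records of a high-detail scene."""
    return [
        [sum(img[2 * r + i][2 * c + j] for i in (0, 1) for j in (0, 1)) // 4
         for c in range(len(img[0]) // 2)]
        for r in range(len(img) // 2)
    ]

# Two different hypothetical plates, full of fine detail...
plate_a = [
    [10, 30, 50, 70],
    [30, 10, 70, 50],
    [90, 110, 130, 150],
    [110, 90, 150, 130],
]
plate_b = [
    [20, 20, 60, 60],
    [20, 20, 60, 60],
    [100, 100, 140, 140],
    [100, 100, 140, 140],
]

# ...collapse to the exact same 2x2 recording. The fine detail
# cannot be recovered, because it was never stored.
print(downsample(plate_a))  # [[20, 60], [100, 140]]
print(downsample(plate_b))  # [[20, 60], [100, 140]]
```

Sharpening filters can make the pixels that survived look crisper, but they can’t conjure back information the sensor never captured.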

And maybe, just maybe, we’ve seen the last of the movies where a character’s face bears a projection of the computer screen he’s looking at.

So what’s wrong with a little techno-absurdity in a movie plot? The same thing that’s wrong with plot absurdity in human behavior, medicine or science: When we go to a movie, we’re willing to suspend our disbelief only so much. Once we witness screaming implausibility, that suspension crashes to earth. We no longer believe the story or trust its creators. The movie’s ruined.

The tragedy is that there are enough fascinating aspects of real technology to fill a thousand movie plots. With just a little bit of effort and creative thinking, a screenwriter or production designer should be able to tell a great story without making our everyday machines do things we know they don’t do.

It’s not 1985 anymore, Hollywood. How about you catch up?
