Movies vs Life: IT edition
Today Aurelien Thieriot, Software Developer with a love for open data and new technologies, discusses Movies versus Life in this guest blog.
As a Software Developer working every day on IT-related topics, it is sometimes difficult to grasp the fact that other people don't have the same affinity with computers. My laptop, which I use 8-12 hours a day, is a tool I learned to master in my teenage years, and some of the things I do with it are so deeply ingrained in my habits that I'm often surprised when someone asks me to explain them.
What does my family think I do? ...
Between my family, who ask me to repair their printer, and the younger generation, who use tablets as easily as we turn the pages of a book, I genuinely wonder what people know about my job. Do they imagine me in a lab coat, casting spells over dozens of coloured cables? Do they think I often smoke a cigar in my basement after single-handedly programming the next Facebook? Or maybe they have no clue at all and wonder where the money I earn comes from (but it doesn't matter, Son, we are proud of you anyway).
... Certainly not what you see on TV!
I become more and more sceptical every time I watch a film depicting the life and work of the so-called IT guy. "It's just for the sake of the plot", I tell myself. Nobody actually thinks that a computer works like that... right? To be fair, I guess it is the same for lots of professions: no archaeologist on earth acts like Lara Croft, there is no chance of ever resurrecting dinosaurs, and I bet most scientists choke on their food when Armageddon or The Day After Tomorrow is on the telly.
Even if it doesn't matter that much - we all very much enjoyed going to the cinema six times for the early screenings of The Matrix (the first one, obviously) - I think it is time to re-establish the truth about technology aficionados. Put your children to bed (it may get messy), put the kettle on and make yourself comfortable.
Imagination is not - yet - the only limit
Let's clear the air now so we can get going: if you see an IT guy doing something in a movie, chances are that's not how it's done in real life. There are actual, physical constraints dictating how a computer can behave. Some people call it "science". Everyone will have noticed how easy it seems to zoom in on a licence plate in movies and TV shows. The worst offender currently broadcast is probably CSI. If the video above (from Blade Runner) takes place in the future, we can't say the same for our friendly forensic cops. There are plenty of videos on YouTube making fun of the actors as they come up with incredible (yet overly simplistic) theories and produce crystal-clear images from a very low quality surveillance camera.
We can't invent data that isn't there
The reason we can't zoom indefinitely into a picture is that there is simply no data left to see, a limitation of the current state of cameras and lenses. No clever algorithm can change that fact. If you took a shot facing the sun during your holidays, there is no way to completely recover your subject afterwards. For the same reason, the only way to know what is hidden on the far side of the Moon is to physically go behind it (+1 Transformers!). Don't get me wrong though: expensive cameras capture more data than you can see with your own eyes, and, given enough time, resources and data, NASA could aggregate lots of OK pictures to make one excellent one. But there are still physical laws to respect, and it would take time.
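The point can be shown with a toy example in Python (purely illustrative - a made-up one-row "image" of brightness values): once the camera has averaged fine detail away, no amount of clever upscaling can bring it back.

```python
# A toy "image": one row of pixel brightness values (0-255).
original = [10, 200, 10, 200, 10, 200, 10, 200]

# The camera only captured a low-resolution version:
# each pair of pixels is averaged into one.
captured = [(original[i] + original[i + 1]) // 2
            for i in range(0, len(original), 2)]

# "Enhance!": upscale by interpolating between the captured pixels.
def enhance(pixels):
    out = []
    for a, b in zip(pixels, pixels[1:]):
        out.extend([a, (a + b) // 2])
    out.extend([pixels[-1]] * 2)
    return out

print(captured)           # [105, 105, 105, 105]
print(enhance(captured))  # [105, 105, 105, 105, 105, 105, 105, 105]
# The alternating detail of the original is gone for good: no algorithm
# can tell this averaged row apart from a uniformly grey wall.
```

The "enhanced" row is just a smooth grey; the stripes of the original were never recorded, so they cannot be recovered.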
No, satellite Internet is not reliable on a plane
Recently, there was an episode of Castle that took place on a plane, and the main character (Hello Nathan!) had spotless, continuous phone coverage - by which I mean he had perfect Skype reception. Have you ever tried sending an email from a plane? ... Exactly! Internet connection on a plane is spotty at best, because it relies on satellite connections, which are very sensitive to high-speed movement.
Similarly, there are countless things that you cannot do in real life, such as:
- You can’t download every map of every building out there (yes, paper based archives are still a thing);
- There is almost never a "central mainframe" to destroy;
- Trashing a computer screen is not enough to actually destroy the data stored on the hard drive; and
- There is no guarantee, if you were a spy, that your super-usb-key-that-steals-valuable-information would work on the Russian mob’s computer.
We are not alone
I am a very big fan of Fringe. The characters are likeable, there is plenty of action to keep you busy, and who doesn't love parallel universes? But something always bugs me. Even though the show pushes the idea that the characters make a good team, they are all, in fact, working alone in their own field. The most visible of them is Walter Bishop, who manages to build a machine to travel between universes all alone (and it probably took him only a few months).
It's all about team work
Every single science project humankind has ever worked on was carried out by a team. Usually a large team. Even historical breakthroughs, like the discovery of the rabies vaccine, came after several discoveries and previous work by other scientists, inspired by other people's papers. Louis Pasteur probably had the help of a few interns as well.
There are all kinds of jobs and techniques in IT dedicated to managing teams, finding better ways to organise, and investing to deliver better quality software. Like any other business, really. Five guys in a garage, as in the early days of Google, are rare these days. Lonesome heroes are rarer still (and not always nice people to work with). The entire Internet has been built around the idea of collaboration, and it has never been easier to share ideas.
What is true, however, is that there is some period during the development of a piece of software when we like to be alone. When the goal is clear and the task big enough, some people like to put their headphones on and just start coding. We call it being "in the zone".
It takes more than a few minutes to program something useful that works
Threats don’t make anyone work better
Let me tell you right away: if, on top of all that, someone were holding a gun to my head, I would probably just wet my pants. Please, if you are working as a screenwriter, stop making people believe that pressure always gets better results. Read xkcd instead. There is a creative side to programming, and a program has the potential to get better with time, thought and care.
Another thing the screenwriters of Swordfish didn't bother with is that cryptographic programs like password crackers are very expensive to run. They perform heavy mathematical computation and quickly run into real-world limits like the speed of processors. After years of research, even geniuses can only make those computations a little bit quicker (I am looking at you, Scorpion), while password protection keeps getting more secure at the same time.
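To give a rough sense of the numbers, here is a back-of-the-envelope sketch in Python. The guess rate is an assumption picked purely for illustration, not a benchmark of any real cracker:

```python
# Assumption: an attacker testing one billion guesses per second,
# against passwords built from 62 characters (a-z, A-Z, 0-9).
GUESSES_PER_SECOND = 1_000_000_000
ALPHABET = 62

for length in (6, 8, 10, 12):
    combinations = ALPHABET ** length          # 62^length possible passwords
    seconds = combinations / GUESSES_PER_SECOND
    years = seconds / (365 * 24 * 3600)
    print(f"{length} chars: {combinations:.2e} guesses, ~{years:,.1f} years")
```

Each extra character multiplies the work by 62, which is why a couple of minutes of furious typing won't cut it: at this (generous) rate, a 12-character password would still take tens of thousands of years to exhaust.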
A lot of trial and error is involved
The real crux of the job is trying to find the best way to do something, and this generally involves a huge (huge) amount of trial and error. Once you know what you want to achieve, you have to find the right tool for the job, write a basic program, and optimise it to get the result you expect, all while fixing bugs that may be due to something complicated, or something as silly as a double space where there shouldn't be one.
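For a hypothetical example of such a "silly" bug, in Python (the input line is made up for illustration):

```python
# Splitting a line of input into fields - with an accidental double space.
line = "Pasteur  Louis 1885"

fields = line.split(" ")
print(fields)   # ['Pasteur', '', 'Louis', '1885'] - a surprise empty field!

# The fix, found after some trial and error:
# split on any run of whitespace instead of a single space.
fields = line.split()
print(fields)   # ['Pasteur', 'Louis', '1885']
```

An invisible extra space, an empty field, and a program that crashes somewhere far downstream: that is the shape of a great many real bugs.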
The more the merrier: not on a single keyboard
You must have figured it out by now, but the IT crowd is bound by the limits of physics and time. NCIS is probably the most infuriating show of all time on that point: Abby reading raw binary code with her own eyes, or Abby and Tim typing on the same keyboard at the same time (a keyboard is NOT a piano).
Humans make mistakes. Developers are human.
Developers make mistakes. A program is something you refine: you can't always know beforehand how users will use it, or how different machines will react to a set of instructions they don't expect. If you watch The Flash, you may have heard Ray Palmer say something like: "I don't know if my ATOM suit will work, I haven't tested it yet". Famous last words if you ask me.
Techniques and languages are constantly evolving
Finally, things evolve. The Flash also shows a scene where the genius character modifies - in about 2 minutes - a computer that actually came from 200 years in the future. If you have ever seen a punched card from 50 years ago, you know that things can change fast, and there's no way programming today will be similar to programming in 200 years.
Pretty doesn’t mean efficient
We definitely can do things you can't though
We may not be able to zoom into an image past a certain point, but we do have some skill at leaving 'normal' humans a bit confused. See the keyboard above? It is a real thing. Lots of people use keyboards without letters, to avoid useless distractions and be more efficient. Shortcuts were invented for the same reason: CTRL+C / CTRL+V is much faster than five clicks.
Pretty interfaces are nice, but not very effective
One thing I find interesting in movies is that actors are generally really good at faking typing on a keyboard. In fact, they are really good at faking the use of any kind of User Interface (UI). Who hasn't dreamed of trying Minority Report's touch screen, James Bond's MI6 computer or the console of the TARDIS? But the truth is, there is a good chance those fancy UIs are in fact pretty useless. Lots of people work on User eXperience and usability, trying to make things simple for us, but they are not often successful at it.
No hard feelings
I could spend hours talking about weird scenarios in movies but, generally, inaccuracy doesn't entirely kill the magic of a movie. Apart from obviously wrong choices (like the sound of ships in space), a movie or a TV show is a place of fiction where our inner kid likes to dive - and it does the job.
It is also a good place to run world-scale thought experiments, to see what a given technology could become in a distant future, or seen from a different point of view.
Enjoy the show but don't trust everything you see!