“Verisimilitude” is one of the ten-dollar words I learned in film school. It means the quality or state of having the appearance of realism. Narrative filmmaking is all about achieving verisimilitude (so is documentary filmmaking, but that’s another discussion). The whole point is to artificially create dramatic situations that seem “real” enough that a willing suspension of disbelief is possible on the part of the viewer. If verisimilitude is not achieved in a given film — due to bad acting or artificial dialogue or cheesy special effects — then the viewer cannot get “lost” in the drama and the film is a failure.
The language of cinema has evolved over the last hundred or so years to maximize verisimilitude. Early in the 20th century, D.W. Griffith established most of what we now call the grammar of film, laying down basic screen direction conventions and “rules” of editing that are still the basis of all narrative films today. These rules govern the spatial relationships of characters and objects on the screen so that, for example, in a scene that cuts back and forth between two people having a conversation, we perceive the characters as facing each other rather than staring off at some unrelated third point, even though we can only see one character at a time.
Since Griffith’s time, cinematographic language has been largely dominated by an attempt to make the camera invisible. For most of cinematic history the prevailing wisdom has held that in order to achieve verisimilitude the camera should be omniscient — an invisible fly on the wall that reveals the scene to us without ever calling attention to itself. Exceptions to this rule (breaking the fourth wall, literal POV shots, etc.) are notable because they tend to take us out of the action — breaking verisimilitude in order to achieve a desired effect.
Similarly, editing conventions have been established that can make cutting from one shot to another “invisible.” The “match cut” (maintaining visual continuity between shots) has thus become the option of choice for cinema editors hoping to avoid “jump cuts,” which can have the effect of diminishing verisimilitude.
In the last few decades, however, conventional wisdom has changed regarding how best to maintain verisimilitude in narrative cinema. What was once considered verboten is now accepted. The avant-garde has become the mundane.
Cinematic traditionalists can get rather curmudgeonly about these new acceptable heresies, harrumphing as the 180-degree rule is ignored and as hotshot directors insert their camera acrobatics into the drama.
As the closing credits rolled on the new Star Trek film, I found myself harrumphing a bit as well. Why? Well, for one thing, look at this frame from the film…
This isn’t some aberration – some brief glare that I had to hunt for and that only a cinema nerd like me would notice. Star Trek is LOADED with these lens flares. In fact, J.J. Abrams admitted to intentionally shining spotlights into the lens of the camera in order to get this effect…
Our DP would be off-camera with this incredibly powerful flashlight aiming it at the lens. It became an art because different lenses required different angles and different proximity to the lens. Sometimes, when we were outside we’d use mirrors. Certain sizes were too big… literally, it was ridiculous. It was like another actor in the scene…
Why would Abrams deliberately distress the image of his very expensive film? In service of verisimilitude, of course…
There was always a sense of something, and also there is a really cool organic layer that’s a quality of it. They were all done live, they weren’t added later. There is something about those flares, especially in a movie that can potentially be very sterile and CG and overly controlled. There is something incredibly unpredictable and gorgeous about them.
It turns out that as CGI has gotten better at making things look more realistic, we as an audience have become wary of things looking too real. One way to make the digital effects look more organic is to distress the image with lens flares and add an artificial shakiness to the CGI “camera,” as though shots in outer space were handheld.
All of which is quite counterintuitive. After all, a lens flare is an artifact specific to a glass lens. Our naked eyes don’t see lens flares (at least not the way a camera lens sees them) — so the lens flares prevalent in Star Trek and the films of Michael Mann and P.T. Anderson, for example, are by definition artificial. And yet the effect helps these films achieve verisimilitude all the same. I think the reason has to do with the increasing degree to which we perceive reality through lenses.
Consider the “shaky cam.” South African TV commercial director Leslie Dektor is credited with bringing this aesthetic to narrative filmmaking. It was Dektor’s work on Levi’s and AT&T spots that the camera operators and DPs on the police procedural drama NYPD Blue emulated in the 1990s. Dektor’s mother had been a documentarian, and he developed his style by watching her work. Done well, the effect is much like a documentary style of vérité shooting. The camera follows the action as it unfolds — as our own wandering eyes might if we were sitting in the room with the characters, watching the scene play out.
Unfortunately, early attempts at imitating Dektor were disastrous. The NYPD Blue camera was neither handheld nor steady — it was on a fluid-head tripod and intentionally jerked about in a horrible, mechanical imitation of vérité. It was as if the cinematographers knew there was some good energy to be had in an ad-libbed shooting style but they didn’t want to commit 100% and actually carry the camera on their shoulders. Watching early NYPD Blue episodes now is like watching all the crazy zooms in films from the late 60s and early 70s after zoom lenses became fast enough for motion picture work — the style is dated and specific to a particular era of filmmaking.
Nowadays, cinematographers have no problem shooting entire feature films handheld. This is a post-Dogme 95 world. The lines between documentary and feature-film cinematography styles are so blurred they basically don’t exist anymore.
Again, the relevant recent example is Star Trek. Much of the film is covered in simple handheld shots. Swooping Steadicam shots and expensive f/x shots are here too; it is a big-budget Hollywood effects film, after all. But the camera work in this scene, for example, is not far removed from the camera work in this scene from Thomas Vinterberg’s masterpiece of austerity, The Celebration.
And again, the rough, handheld style helps the film achieve verisimilitude, even though our own eyes would not be so shaky were we actually present in these scenes.
We are so used to seeing the world through a camera lens that intentionally adding the messy artifacts of documentary filmmaking (lens flares, shaky handheld shooting, jump cuts) makes these films seem more real. And since in our actual lives we don’t carry around movie cameras shooting everything we see (yet), the result is that these films achieve verisimilitude by distancing themselves from reality.
I’m not sure what technique cinematographers will adopt when all of this documentary vérité stuff becomes passé, but I hope I’ll still be able to tell what I’m looking at. In Star Trek, it was sometimes difficult to discern through all the glare what was being shot, and this should be a red flag to filmmakers. Seriously, we need to be able to see the film. Harrumph.