
Real Film Strikes Back

Against all odds, and despite the best efforts of Hollywood and Silicon Valley, old-fashioned, analog, motion-picture film is hotter than ever. What’s the magic in this old medium that digital technology can’t seem to match?

Issue: Creating a World Built To Last


Christopher Nolan oversaw the 2018 re-release of Stanley Kubrick’s “2001: A Space Odyssey,” shot in 1968 on 70mm celluloid film. Nolan says he chose analog restoration because he wanted audiences to experience the film in its original medium, with all its richness and flaws. photo by Roko Belic

Written by DAVID MUNRO

  1. The Nuts and Bolts of Cinema
  2. Film vs. Digital: A Primer
  3. A Moving Image, Literally
  4. Splendid Imperfection
  5. The Rise of Digital Video
  6. The Power of Ones and Zeroes
  7. Digital Imaging Grows Up
  8. Mining Film’s “Intangibles”
  9. “It’s About the Basics”
  10. Celluloid’s Next Act

Film was declared dead on November 2, 2011. On that day, the late, great Roger Ebert penned a blog post-cum-eulogy titled, “The Sudden Death of Film.” “The victory of video was quick and merciless,” he wrote. “I insisted, like many other critics, that I always knew when I was not being shown a true celluloid print. The day came when I didn’t.”

The “print” that Ebert was referring to would be a foreign concept to many of today’s filmmakers. He was talking about emulsion-coated, silver-halide-and-color-dye-embedded, dripping-wet-from-the-developer strips of image-laden celluloid. That film. Real film. Film film.

Around the time of Ebert’s epitaph, signs of film’s demise were everywhere, and you didn’t have to be a movie critic to notice. Kodak was in Chapter 11 bankruptcy. Film processing labs were shuttering seemingly by the hour. And theater chains were transitioning to digital exhibition formats at a madcap pace. Just like that, the repository of a century’s dreams and nightmares—Marilyn, moon landings, Hitchcock, The Hindenburg—became a thing that once was, and could no longer be.

In 2013, the New York Philharmonic played a live (analog) score accompaniment to a screening of “2001: A Space Odyssey,” an event one critic called “the ultimate 2001 experience.” photo courtesy of Warner Brothers/Philharmonic Symphony Society of New York

But a funny thing happened on the way to the coroner’s table. Film produced a pulse. In the spring of 2014, two years after Ebert’s eulogy, the heads of six major studios were approached by a group of iconic, A-list directors. Like the cavalry in a John Ford Western, the directors rode in and strong-armed a deal whereby the major studios would buy enough film each year to ensure Kodak’s survival. “The studios listened,” says Linda Brown, head of USC’s cinematography department, “because the directors were named Quentin Tarantino, Christopher Nolan, and Steven Spielberg.” In the years since, never more so than this year, these and other top directors have kept winning Academy Awards for movies shot on celluloid.

The sign on this cabinet, in a Bay Area camera store, explains some of the rise in analog film sales. As with vinyl records and hardcover books, a renewed interest in film is driven by millennials who are burned out on the ‘stream’ and seeking hands-on experiences. photo by David Munro

Of today’s movie productions, 99 percent are now 100-percent digital. So why would a posse of Oscar-winning directors go to the mat for a dying medium with a reputation for being costly, cumbersome, and commercially non-viable? As I explored this question, I discovered not just film diehards, but digital pioneers who continue to use celluloid as the source of their inspiration. What is it about film that evokes so much devotion that it’s now being summoned back to life?

THE NUTS AND BOLTS OF CINEMA

My introduction to filmmaking happened in 1989 at the University of Southern California. A disillusioned young ad man, I’d been bitten by the movie bug and decided to attend the school’s summer filmmaking “boot camp” to see if my big screen infatuation was more than a temporary fling. It was the best six weeks of my life.

The program was hands-on and fully analog, and I immersed myself in the artistic and technical aspects of a craft that had not changed much since its invention nearly a century before. Like the internal combustion engine, it had seen many refinements, but the basic mechanics were the same. In film’s materiality, I found the world. Subjects I’d spent most of my life avoiding—chemistry, physics, engineering, math—were suddenly captivating and essential.

USC’s sound stages and cutting rooms were playgrounds of hefty, manually operated, industrial-age machinery. Although analog video formats (primarily VHS) had overtaken the home entertainment market, film remained the undisputed medium of origin for industry pros and independents alike. I learned to edit on a Moviola (patented 1924), to shoot on an Arriflex-S camera (1952), to record sound on a Nagra tape recorder (1951), and to load a camera using Kodak 7278 Tri-X reversal film stock (1955). By every qualitative measure—resolution, dynamic range, color reproduction—film was demonstrably superior to video. It wasn’t even close.

In 1997, George Lucas’s growing infatuation with digital tools led him to re-release the original “Star Wars,” 20 years after its debut, in a digitally updated version. Yet many fans prefer the analog versions (top) of iconic characters like the slug-like crime boss, Jabba the Hutt, over their computer-generated replacements. photo courtesy of 20th Century Fox/Disney Corp.

FILM VS. DIGITAL: A PRIMER

As a TV baby, when I got too close to the screen I did not go blind as my mother warned, but instead saw a field of tri-color dots that were fixed, finite, and unflinching. When I did the same as a cinema-smitten young man at a screening of David Lean’s “Lawrence of Arabia,” in glorious Super Panavision Technicolor, I did not see dots. I saw God. Actually, what I saw was grain—the floating, shifting, light-sensitive particles embedded in film’s emulsion coating.

“Now that we live in a medicinal, anesthetic, digital world,” says Kodak’s Steve Bellamy, “grain is beautiful, and we need it back.”

Film is, quite literally, matter infused with time. In the blink of a camera’s eye, light-limned images are etched onto film’s gelatin-based substrate. As John Alton suggests in the title of his seminal book on cinematography, “Painting With Light” (University of California Press, 1995), film is a canvas, not a pass-through medium. Each film type and format lends its own textural nuance.

Digital imagery, by contrast, is a facsimile. When an HD camera blinks, light strikes a photo sensor that turns images into information. Light values are assigned ones and zeroes—binary digits, or “bits”—that are stored as numerical code. Techno-historian George Dyson is not wrong when he says that “a Pixar movie is just a very large number, sitting idle on a disc.”
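For the technically curious, here is a minimal sketch of Dyson’s point in Python; the frame dimensions are mine, purely for illustration:

```python
import numpy as np

# Illustrative dimensions for a 4K frame: 2160 x 3840 pixels,
# three 8-bit color channels. (Real cameras and codecs vary.)
frame = np.zeros((2160, 3840, 3), dtype=np.uint8)

# Every pixel is just three brightness values between 0 and 255...
frame[0, 0] = [17, 203, 88]  # one pixel's red, green, blue

# ...so each frame is a long string of bits, and a two-hour movie
# at 24 frames per second is, as Dyson says, one very large number.
bits_per_frame = frame.size * 8
print(f"{bits_per_frame:,} bits per uncompressed frame")
print(f"{bits_per_frame * 24 * 60 * 120:,} bits in a two-hour film")
```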

When that number wasn’t so large, film had nothing to worry about. Enter today’s ultra-high-definition technology: Gone are the blocky pixels and electronic “noise” that were video’s tawdry MTV-era signatures. In their place is an exponentially refinable picture quality that is blowing past the limits of human perception itself.

By now, most filmmakers (myself included) have passed through the grief stages of film’s demise and embraced digital video’s undeniable advantages. As a virtually cost-free capture medium, it allows endless takes with multiple cameras. Footage (a carryover term that has amusingly outlived its referent) can be logged and edited right on the set, even as filming continues. And the dreaded words “there’s a hair in the gate” will never again vex a director’s ears, for the simple reason that with no film strip to be locked into place behind a camera lens, there is no gate where a shot-ruining human hair can get stuck.

All this implies that continuing to work in the filmed medium is an act of either retro-fetishism or neo-Luddite masochism. Yet neither of those things is the case. Digital may have the metrics, but film has the immeasurables.

In January 2020, at the 4th Annual Kodak Film Awards, Steve Bellamy (left), president of Kodak’s motion picture division, presented director Quentin Tarantino with a Lifetime Achievement Award. All of Tarantino’s movies have been shot on film. photo courtesy of the Eastman Kodak Company

A MOVING IMAGE, LITERALLY

If you’re not sure what grain looks like, think of a time when you stared at the negative space in a projected image and wondered why the surface seemed to be moving: old home movies, 7th-grade educational films, “March of Time” newsreels. That’s grain.

Unlike digital video’s fixed grid of photosensitive picture elements (“pixels”), film’s randomly distributed, light-reactive crystals stamp ever-changing pointillist patterns onto each successive frame. This textural effect, devotees say, is something audiences can feel, even if they don’t necessarily notice.
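A toy sketch of that difference, in Python; the noise model is a deliberate simplification of my own, not how emulsion actually behaves:

```python
import numpy as np

rng = np.random.default_rng()

def add_grain(frame, strength=0.05):
    """Overlay a freshly drawn random 'grain' field on one frame.

    Unlike a sensor's fixed grid of pixels, the noise pattern is
    re-generated for every frame, so the texture itself shifts from
    frame to frame -- a crude stand-in for film's scattered crystals.
    """
    grain = rng.normal(0.0, strength, size=frame.shape)
    return np.clip(frame + grain, 0.0, 1.0)

# Two successive frames of the same static shot get different grain:
flat = np.full((1080, 1920), 0.5)  # a featureless mid-gray frame
print(np.array_equal(add_grain(flat), add_grain(flat)))  # False
```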

In fact, grain was long regarded as a flaw, and Kodak spent nearly a century trying to get rid of it by developing ever-finer film stocks. Now it’s the company’s secret sauce. “When there was no digital, everyone was trying to make grain fade into the background,” Steve Bellamy, president of Kodak’s motion picture and entertainment division, told me. “Now that we live in a medicinal, anesthetic, digital world, grain is beautiful, and we need it back.”

Bellamy was living a perfectly nice life as a cable TV impresario when director friends urged him to drop everything and save an imperiled medium. “I didn’t take the job to wage war on digital,” he says. “I drive a Tesla, I own every gadget you can think of. The issue for me isn’t ‘this versus that.’ It’s about choice. When acrylic paint came along, no one said, ‘Sorry, you can’t use oil anymore.’ That’s not how art works.”

Under Bellamy’s stewardship, the Kodak Goliath has embraced its inner, artsier David, and the results have been impressive. Movies shot on film in 2019 alone—“The Irishman,” “Little Women,” “Once Upon a Time in Hollywood,” “Marriage Story”—topped their award-season shortlists, despite the fact that less than 1 percent of today’s movie productions originate on celluloid. “I want the building blocks of my art to be photochemical,” Bellamy states emphatically. “My bank statements and medical records are digital. I want my art to be something different.”

“My family is always on me about working with film,” says Linda Scobie, whose acclaimed short, “Craig’s Cutting Room Floor,” is constructed entirely from discarded bits of old movies. “I tell them, ‘I will not be turned into ones and zeroes.’” photo courtesy of Linda Scobie

SPLENDID IMPERFECTION

The problem with perfection is that human beings have a hard time relating to it. We like Tom Cruise, but we relate to Tom Hanks. One makes us swoon; the other makes us care. Kodak’s Bellamy may choose other words, but imperfection is what he’s talking about when he refers to film as “something different.”

For the ‘Star Wars’ prequels, George Lucas stunned Hollywood by announcing that he was done with analog film for good. In a poetic twist, the recent and highly acclaimed ‘Star Wars’ reboots have all been shot on film—with Lucas’s enthusiastic blessing.

“When I watch a film, I don’t want it to look like my Steelers game on Sunday,” says Pittsburgh-born filmmaker V. Scott Balcerek, referring to digital video’s super high-definition image quality. “That degree of hyper-clarity, where you can see each hair in a person’s beard, runs counter to the art and illusion of cinema. Ultra HD digital may be closer to how we really perceive things, but it goes against our desire as moviegoers to be transported.”

A former Lucasfilm editor, Balcerek knows a bit about the film world and its digital divide. His deeply moving new documentary, “Satan and Adam,” was shot over 23 years on virtually every film and video format in existence. He also worked on the “Star Wars” prequels when George Lucas stunned Hollywood by announcing that he was done with film for good.

After the original “Star Wars” was released in 1977, as a purely analog creation (even the Death Star was cardboard), Lucas took a hiatus from directing and dove headlong into digital image tech development. When he re-emerged to direct the three “Star Wars” prequels, the worst of video’s quirks and glitches had been overcome by his own stable of geniuses. Quoted at the time, Lucas praised the control and precision that shooting digitally gave him, adding that “audiences can’t tell the difference.” But audiences could, and did.

As beloved as the original “Star Wars” films are to billions of fans the world over, the digitally made prequels may be loathed even more intensely. In a poetic twist, the highly acclaimed “Star Wars” reboots directed in recent years by J.J. Abrams and Rian Johnson have all been shot on film—with Lucas’s enthusiastic blessing.

As the childlike Alby Cutrera in David Munro’s “Full Grown Men,” an indie comedy shot on celluloid, actor Matt McGrath wore bangs that cast shadows over his eyes in many scenes. Digital tools, like selective brightening, rescued much of McGrath’s performance. photo by Dan Littlejohn

THE RISE OF DIGITAL VIDEO

By the time I was ready to direct my first feature film—”Full Grown Men,” in 2005—the pressure to shoot digitally was intense. If digital video hadn’t yet matched film’s epic image quality, the savings alone made it impossible to ignore. And in the independent film world, success stories on video were starting to pile up.

After his first encounter with the Arri Alexa, a next-generation digital camera, director Steven Soderbergh said, “I feel like I should call film up on the phone and say, ‘I’ve met someone.’”

Early indies that incorporated analog video, such as “Sex, Lies, and Videotape” (1989) and “Reality Bites” (1994), justified its use by building videographer characters into the story, contrasting video’s diaristic “reality” with film’s dramatic “transparency.” A few years later, foreign hits like “The Celebration” (1998) and “Russian Ark” (2002) proved that audiences were ready to accept video entirely on its own terms. Then, in 2004, cinematographer Dion Beebe shot Michael Mann’s jaw-droppingly gorgeous neo-noir action-thriller “Collateral” using a next-generation Viper HD camera, and all bets were off.

I loved these films and marveled at the rewards, both technical and artistic, that video gave them. The unsettling meltdown at the sordid family reunion in “The Celebration” benefited from early HD’s electronic edginess, as well as its ability to handle extreme low-light situations (many scenes were lit with nothing but candles). And “Russian Ark” was shot in a single, 96-minute take inside St. Petersburg’s Hermitage Museum. Good luck trying that with a film camera’s 1,000-foot magazine (which gives you only about 10 minutes of shooting time).
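(The arithmetic is unforgiving: standard 35mm film carries 16 frames per foot and runs at 24 frames per second, so a 1,000-foot load holds 16,000 frames, and 16,000 ÷ 24 ≈ 667 seconds, or roughly 11 minutes before the camera must be reloaded.)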

Yet despite their cinematic nature, these digital movies were part of a new aesthetic. As our experience of the world became less tactile and our personal interactions less intimate, it followed that video, with its cold, detached immediacy, was the medium suited to reflect our new reality.

My film, however, was not in this mold. “Full Grown Men” tells the story of a post-adolescent Florida man who romanticizes his childhood to a pathological degree. As my main character, Alby, says in a bit of opening voiceover, “If I could live in the past, I’d be on a plane tomorrow.” Alby sees the world through rearview, candy-colored glasses. Visually, this means warmth, saturation, and—above all else—illusion. In a similar vein, Tarantino films, even when set in the present, mine the past through classic genre tropes. I shot on film, and I have never regretted the decision. It’s unlikely, however, that I will ever make that decision again.

Patrick Lin, Director of Photography for Pixar Animation, conducted live-action camera tests in Pixar’s hallways for the Academy Award-winning “Toy Story 4.” photo courtesy of Pixar Animation Studios

THE POWER OF ONES AND ZEROES

My film/digital crossroads came unexpectedly, when it was time to strike a “release print” for “Full Grown Men”—the print, in celluloid, that would be projected in theaters nationwide. A lighting problem had occurred during filming that I assumed we could “fix in post.”

Matt McGrath, who played our main character, Alby, is a very expressive actor. But the haircut we gave him—a sort of surfer-dude-meets-late-‘60s-Jane-Fonda pageboy—threw a shadow over his eyes that robbed much of his performance of its subtlety. I thought I could boost the exposure at the lab and add light to his eyes, but increasing “brightness” affects the rest of the image, too. The shadow stayed. Every time I watched the film at festivals and during its theatrical run, I got sad knowing that Matt’s inspired work was being under-appreciated.

To my surprise, digital technology redeemed Matt’s performance. As part of our DVD and cable TV deals, we had to deliver an HD digital version of our movie. This meant re-outputting from the original film negative, only instead of doing it with an Edison-era optical printer, we did it with mind-blowing digital scanning software. Using digital tools, we adjusted qualities like brightness, contrast, and offending hair bangs selectively.
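For readers who want to see the idea behind that kind of masked, selective adjustment, here is a minimal sketch in Python; the function and its parameters are my own illustration, not the actual colorist software we used:

```python
import numpy as np

def brighten_region(frame, mask, gain=1.5):
    """Lift brightness only where a matte says to.

    `frame`: float RGB image with values in [0, 1].
    `mask`:  same height/width; 1.0 inside the region to fix (say,
             a hand-drawn matte around an actor's eyes), 0.0
             elsewhere. Fractional values feather the edge.
    The rest of the image -- and the lighting the cinematographer
    intended -- is left untouched.
    """
    lifted = np.clip(frame * gain, 0.0, 1.0)
    m = mask[..., None]  # broadcast the matte across color channels
    return frame * (1.0 - m) + lifted * m
```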

Given a new lease on cinematic life, I went back through my film—frame by frame, gaffe by gaffe—adding light to Matt’s eyes, reframing rookie compositions, and punching the saturation on every neon motel sign and mermaid encampment in my offbeat Florida fantasia. The resulting HD master of “Full Grown Men” is as close to a perfect version of my film as I could ever hope to make. And it was a suite of algorithms that made it possible.

Lin uses analog lenses to inform the “virtual cameras” that give Pixar movies their uniquely cinematic quality. photo courtesy of Pixar Animation Studios

DIGITAL IMAGING GROWS UP

At the time, “Full Grown Men” seemed like the best of two worlds: filmed by artists, digitally enhanced by nerds. Though still a winning formula for some, the technologies underlying this calculation have shifted dramatically. In 2005, when we were making “Full Grown Men,” the percentage of digitally shot productions was in the low single digits. Ten years later, that ratio had flipped. Pundits differ on the exact tipping point, but it wasn’t long after our premiere that the words “digital” and “cinema” began appearing together in the same sentence.

Next-generation HD cameras like the Red One, Thomson Viper, and Panavision Genesis adopted film’s languid frame rates and large exposure areas, prompting high-profile directors to wade into digital waters. The result was David Lynch’s “Inland Empire” in 2006, David Fincher’s “Zodiac” in 2007, and Danny Boyle’s “Slumdog Millionaire” in 2008, which won the 2009 Best Cinematography Oscar—a first for a digitally shot movie.

Gradually, inexorably, video was no longer the imposter and pale imitation. It had arrived as a full-fledged artistic medium. And with the introduction of the Arri Alexa in 2010—the first digital camera made by the industry’s most venerable maker of film cameras—the war seemed won. In its first nine years of existence, seven films shot on the Alexa won the Academy Award for Best Picture, and eight won for Best Cinematography (including 2020’s “Parasite” and “1917,” respectively). As director Steven Soderbergh quipped after his first encounter with the Alexa, “I feel like I should call film up on the phone and say, ‘I’ve met someone.’”

While studio bean counters had been pushing directors and cinematographers to shoot digitally for years, it wasn’t until the Alexa came along that the artists largely stopped pushing back. The reason, according to French-Canadian cinematographer Yves Bélanger, who shot “Dallas Buyers Club” (2013) and “Wild” (2014) with the Alexa, was simple: “It looks like film.”

“Imperfection creates trust in an imaginary environment,” says Pixar DP Patrick Lin. “We let focus slip at times. We use lens distortion to add texture. We ‘film’ animation, except it’s a virtual camera.” photo courtesy of Pixar Animation Studios

MINING FILM’S “INTANGIBLES”

The irony of digital’s industry takeover is that it edged out celluloid only when it finally learned to evoke it. Exhibit A: digital colorists like the ones who gave my film its high-res sheen are now degrading digital content for the express purpose of replicating film’s unrefined charms.

“Our mental image of what cinema looks like is film,” says Shane Reed, partner and senior colorist at L.A.’s Apache Digital. Reed is also the creator of CineGrain, a collection of motion picture film textures produced in digital form. “For one hundred years, analog film has been our touchstone. My job is to give it a digital afterlife.”

An ultra-high-resolution scanner invented to film the 2019 documentary, “Apollo 11,” recently revealed that a single frame of celluloid film contains 40 times more information than an entire digital chip.

Reed was an analog cinematographer, shooting real celluloid before becoming a digital colorist. So he gets it—both film’s inexpressible beauty and digital’s infinite creative possibilities. When his clients kept asking for “filmic” looks to add a cinematic patina to their digitally shot movies and commercials, he honored their requests—but only after honoring the integrity of film’s original material.

Unlike products that simulate film using graphics software, Reed’s CineGrain is the genuine article. “I approach it like a cultural scientist,” Reed says. “How can I preserve film’s intangibles in our digital sandbox? My answer is, go to the source.” CineGrain’s film-grain overlays (FGOs) are textural layers sampled from actual film stock that Reed collects and shoots himself. Think of it as straight-edge digital image files dragged back through celluloid’s groovy wardrobe closet.
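One standard way to composite a scanned grain plate over digital footage (an assumption on my part; Reed’s exact method isn’t described here) is the classic “overlay” blend, sketched below in Python:

```python
import numpy as np

def overlay_blend(frame, grain_plate):
    """Composite a scanned film-grain plate onto a digital frame.

    Both arrays are floats in [0, 1] with matching shapes. A
    neutral-gray plate (0.5) leaves the frame untouched; darker and
    lighter grain specks push pixel values down and up. This is the
    classic 'overlay' blend, offered as a plausible sketch rather
    than CineGrain's actual method.
    """
    return np.where(
        frame < 0.5,
        2.0 * frame * grain_plate,
        1.0 - 2.0 * (1.0 - frame) * (1.0 - grain_plate),
    )
```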

Among the “looks” in the CineGrain FX menu are lens flares, motion blurs, and soft focus—in other words, mistakes. Technically speaking, these are photographic errors caused by improper focus-pulling, faulty panning, or failure to shield a lens from the sun or excessively bright lights. Emotionally speaking, however, these effects rivet us with a sense of realism and authenticity. They suggest that what we are watching is not the result of technological calculation, but rather of human idiosyncrasy, fallibility, and luck.

Even animation filmmakers—particularly those who are masters of the form—seem to understand the limits of digital imagery. “The math in our virtual camera is amazing,” Patrick Lin, a live-action cinematographer who leads Pixar’s virtual camera department, told me. “We can put the camera anywhere and have it do anything. But that’s exactly why we shouldn’t. It’s about moment-to-moment believability, and that means being grounded in human perception. If we don’t follow the laws of motion in the actual world, it won’t feel real to the audience.” Lin’s team even adds virtual grain to animated surfaces—for the same reasons that Reed’s clients request it: It’s comfort food for the eyes.

Before the camera was invented in 1839, a painter’s primary job was essentially to reproduce reality. A prime example is the painting on the left: “Two Sisters,” by Théodore Chassériau, done in 1843. (The photographer and subjects of the photo on the right, perhaps inspired by the old painting, are unknown.) photos courtesy of Musée du Louvre/Musée d’Orsay, Paris

“IT’S ABOUT THE BASICS”

Sentiments like Lin’s speak to why teachers of the trade, like USC’s Linda Brown, continue to put students through the paces of analog film. “It’s about the basics,” Brown says. “Technology will change, but the laws of physics will stay the same. What is light and how does it travel? Film forces you to learn the fundamentals.”

Brown, an accomplished filmmaker herself, describes digital camera classes where she has to yank her students off “the nipple of the monitor” to assess the true, physical nature of what they’re seeing. She also sees an increase in the amount of footage they shoot. “Students ask why they can’t just keep shooting forever, since it’s digital,” she says. “I tell them, this is why we start with film: to instill the concepts that inform the creative choices. They’re so used to everything being instant.”

Brown admits to an aesthetic predisposition, teasing her students that if film and digital were water, “digital would be a dripping faucet when you’re trying to sleep, while film is when you’re lying in bed with a lover and it’s raining outside.” But she emphasizes that the power is in the image, not the technology, and that true beauty is in the eye—and vision—of the beholder.

Filmmaker David Munro used an analog visual-effects technique called “bi-packing” to create an ethereal scene of children dancing in a hallway for his 1994 short film, “Bullethead.” The shot, at 6:17, involved stacking film strips on top of one another and then re-photographing them onto a single print. photo by Steve Muller

CELLULOID’S NEXT ACT

When the French painter Paul Delaroche saw the first daguerreotype photograph, created in 1839, he exclaimed, “As from today, painting is dead!” His fear, widely shared at the time, was that painting, now surpassed in its representational function, would wither into obsolescence. Instead, a man named Claude Monet started painting water lilies in a rather curious fashion, and the age of Impressionism was born. Rather than kill painting as an art form, photography liberated it from mimetic servitude. And we all know how that turned out. [To see how some film artists are creating a new era of impressionism, of sorts, this time with celluloid, see my companion story, “Film’s New Generation of Experimentalists.”]

“It’s a mistake to think that new technologies completely supplant the old,” says Howard Besser, director of New York University’s Moving Image Archiving and Preservation Program. “Remember before TV? There was something called radio. It didn’t go away, it just became less ‘mass.’ Maybe that’s a good thing.”

When I ask why, Besser thinks a moment before responding. “People who choose to work with film now are craftspeople. There’s a renewed closeness to the object itself in a practice that’s hard to master.” Kodak’s Bellamy agrees. “No doubt, it’s harder to make a film than a file,” he says. “Film is about craft. It’s about making a piece of art that will stand the test of time.”

As a career filmmaker who has shot extensively in both film and digital, with equally superlative results, I have cried only once while viewing my work. It was in the screening room of a now-defunct Bay Area film laboratory watching the release print of my first short film, “Bullethead,” which would go on to play at the Sundance Film Festival. In a flashback scene of children dancing in a hallway, I superimposed multiple takes of slow-motion footage using an optical printer hacked from a push-button telephone. Weeks went by before I could see what we shot, and months more before I could see it in the full context of the finished project. When the film was finally spooled up and projected, the effect exceeded my wildest dreams. It was pure cinema. Something happened in the camera when light hit those children and activated those crystals; something beyond digital media’s matrix of ones and zeroes. Something that was not quantifiable.

What is quantifiable, and perhaps astonishing, is the new generation of moviegoers willing to pay a premium for an analog experience. The Alamo Drafthouse theater chain has installed more than 40 film projectors in 39 locations across the country, after consistently selling more tickets for film screenings than for the same titles on digital. Limited engagements of Tarantino’s “Once Upon a Time in Hollywood,” shot and projected on wide-format, 70mm celluloid, have enjoyed the kind of hype—and upcharge—typically reserved for 3D and IMAX.

Perhaps moviegoers sense something special going on, after all. The legendary film editor Walter Murch once shot an empty room on both film and digital video to gauge the difference. The feeling he got from the room on film was “a rising potential, as if somebody was about to come in.” The feeling he got from the room on video, however, was “of somebody just having left.”

Just as I was wrapping up this article, I got a text from Kodak’s Bellamy—he was at the 2020 Oscars, where Kodak-shot films had notched a staggering 37 nominations. When I text-teased him that “Parasite,” which won Best Picture, was shot on digital, he immediately shot back. “Bong Joon-ho [the film’s director] wanted to shoot on film,” he said, “but wasn’t confident in Korean labs. We’ll get him next time.”

In a follow-up call the next day, Bellamy, still buzzing from his red-carpet glories, had more good news. An ultra-high-resolution scanner invented to film the 2019 documentary, “Apollo 11,” recently revealed that a single frame of celluloid film contains 40 times more information than an entire digital chip. Over time, even if those chips produce images with higher and higher resolution, Bellamy feels confident the chips will never catch up. “Think about the shape of pixels,” he said. “The basic unit of a digital image is a box. You can have a billion-trillion boxes, but you will never find a square in a human face.”

Bellamy catches his breath, taking a moment to reflect. “Five years ago, we were just hanging in,” he says. “Two years ago, we were making a comeback. Now, we’re dominating again. There will never be a better capture medium than film, because the depth of film is infinite.”
