How I Fell Back in Love with iPhone Photography

There’s a Japanese word, komorebi, that describes beams of light and dappled shadows that result when the sun shines through trees. When I take my dog on walks around my leafy neighborhood in Washington, D.C., komorebi is what most often catches my eye, especially in this autumnal moment when dense, green summer foliage is starting to thin and turn golden. As the sun sets and the shadows grow long on the edge of a precipitous valley near my apartment, the foliage creates fluttering patterns of warm and cool colors. I try to photograph these apparitions with my iPhone camera, but I’m always disappointed in the results: the device’s automated image processing treats contrast as a problem to be solved, aggressively darkening the highlights and lightening up the shadows to achieve a bland flatness. Little of the lambent atmosphere I see in real life survives in the image.

Downloading a new camera app recently changed things for me. Halide, which launched in 2017, is an elegant program that can be used in place of your phone’s default camera. It mimics the controls of a digital S.L.R., allowing, for instance, the user to manually adjust the focus. Halide is a complex app that’s suited for experienced photographers (the name comes from a chemical used in photographic film), but it can also be made very simple, because in August it added a new setting called Process Zero. Once the mode is switched on, the camera does as little processing as possible, avoiding artificial-intelligence optimization and any other dramatic editing of the photo. (It still performs basic tasks like correcting the white balance and lens distortion.) The iPhone typically merges many separate images together to create one composite; with Halide, you get a single digital image that preserves the richness and the contrast of what you see in front of you. Shadows survive. Highlights may be blown out, but the camera doesn’t bring out detail that the eye wouldn’t necessarily catch, as the iPhone does, for instance, with clouds in a bright sky. Where Apple’s automatic editing irreversibly smooths out the digital grain that you get in dim images, Halide preserves it, yielding images that appear more textured. Eschewing the uncanny perfection that marks so much iPhone photography, Process Zero has made me enjoy taking photos with my phone again, because I don’t feel like I’m constantly fighting against algorithmic editing that I can’t control or predict. Ben Sandofsky, the co-creator of Halide, told me, of the program’s ethos, “We’re going to keep this as dumb as possible.”

Process Zero has proved a hit for Halide, which is the product of Lux, a parent company that makes several niche, high-powered camera apps. (The Halide app costs almost sixty dollars but can also be paid for monthly or annually.) Since the feature launched, Halide has been downloaded more than two hundred thousand times; the app has hundreds of thousands of monthly active users. “It ended up blowing up like we couldn’t believe,” Sandofsky said. Along with trends like the dumbphone and the Fujifilm X100 digital camera, the popularity of Process Zero is another sign of a surging demand for technology that resists aggressive A.I. decision-making. These new tools are satisfying our urge to go backward, to technology that lets us explore our preferences rather than imposing its own defaults upon us.

Sandofsky, a forty-two-year-old living in Manhattan, always wanted to pursue visual art, but his father urged him toward a computer-science degree. After college, he eventually landed a job in San Francisco at Twitter, in 2009, and the windfall from the company’s I.P.O., in 2013, gave him the financial freedom to pursue a passion project. Apple’s iPhone cameras were getting better and better; they were “magical devices” that let you “press a button and get a good photo,” Sandofsky said. Their convenience presented an existential question for traditional photography, as he put it. “Is there a place in the world for a manual camera?” He partnered with Sebastiaan de With, a former Apple designer and amateur photographer, to build Halide and restore to users the power to decide for themselves what counts as “good.” Over most of photography’s two-century history, photographs “were never hyperrealistic,” Sandofsky said. They were sometimes unusually blurry or focussed in unintended spots; they were tinted by the chemical makeup of different types of color film. Apple’s camera, by contrast, seems to put everything into focus at once and saturates each color, exposing each plane of the image equally. “Modern computational photography is almost in this uncanny valley,” Sandofsky said. The Apple image’s quality of, let’s say, all-overness can lead to a kind of confusion in the viewer: if nothing in a picture is visually emphasized over anything else, we don’t know what we’re supposed to be looking at.

With the emergence of optimized digital images has come the realization that the “realest” image may not be the most pleasing one. I’ve found that I prefer photographs that look like photographs. The composer Brian Eno wrote something in his book, “A Year with Swollen Appendices,” that I think of often: “Whatever you now find weird, ugly, uncomfortable and nasty about a new medium will surely become its signature. CD distortion, the jitteriness of digital video, the crap sound of 8-bit—all these will be cherished and emulated as soon as they can be avoided.” The iPhone can rid digital photography of its aesthetic flaws, but what if the flaws are actually positive features?

In his new book of memoiristic essays, “The Picture Not Taken,” the photographer Benjamin Swett writes, “The camera sees more than the photographer.” In other words, moods and details are captured in a photo that might not be apparent until after the fact; you try your best to frame a shot, but what it ultimately contains may still surprise you. Accidental imperfections may become a photo’s main achievement, so long as those imperfections are not erased first. Using Halide, I feel happily surprised by phone photos again. The process reminds me of messing around with a medium-format film camera when I was in high school, before phone photos were more than just messy pixels. In college, I joined a photojournalism group, where one lesson, hammered home during a field trip with the photojournalist Gary Knight, was that photographs should be made while you’re looking through the viewfinder. The digital S.L.R. cameras that we were using on the trip could take a dozen photos in a heartbeat, but the point was to think through the scene and the composition, as well as the settings of your camera, in advance of taking a shot. All the editing or processing in the world can’t fix a fundamentally bad or boring photo. The knowledge that Halide won’t gloss over imperfections makes me slow down and consider the creative process for a beat longer. It makes me think more about what I’m seeing.

With Apple’s iOS 18, you can edit your phone’s lock screen so that Halide, rather than the default camera app, is available with a single tap. (The app’s logo appears in the bottom right-hand corner in place of the usual camera, though the Apple default is still available with a swipe.) The Process Zero camera is now the only one I use. The interface is very similar to the iPhone camera’s, with a big preview frame and a button for taking photos. The app just does less on the back end, though if you want to, you can also go deeper into the settings or edit the photo after the fact. Taking the photo without A.I. to begin with doesn’t mean you can’t apply it later.

My phone camera roll looks different now. There are fewer repeats or slight variations on the same image, taken in a burst. The compositions are less static or symmetrical, and the colors are funkier. I have a Halide snapshot of some leaves casting shadows on a tree’s trunk, which occupies most of the frame. Whereas the Apple phone app tends to turn every color warm, this one is very blue and cool, and the background behind the tree is dim rather than eye-burningly brightened. But I prefer the photo that way: the visual qualities of the specific scene are still there to be appreciated. It doesn’t, and shouldn’t, look like everything else. ♦
