Constant Dullaart

Artist and professor of Networked Materialities at AdBK Nürnberg

Accepting the Job (2023) by Constant Dullaart
*Refusing to build another highstreet.* In light of the synthesis of "photo-realistic" depictions, the representational value of lens-based documentation media becomes comparable to that of a painting. Think about it. Most images have become as trustworthy as a painting—a subjective depiction of something that either happened in some form or was described and generated. The technical details that could reveal that an image is authentically lens-based, as opposed to generated or synthesized, are being imitated or spoofed one by one. From pixel-compression artifacts to the finger problem, the incentive is to convince beyond any doubt that the generated—or, one might say, spoofed—images could be photographs. This undermines the systems of verification we have lazily grown accustomed to since the wide adoption of photography, throwing us into a period of iconoclasm. (Hi! Welcome, enjoy! 😊) Current generations were raised considering certain types of documentation as technical evidence and are now effectively thrown back to living at the beginning of the 19th century. A painting is just as true as an iPhone snapshot—and vice versa. Setting aside specialized forensic and evidence photography, the discourse about truthful photographic documentation already included manipulation through negatives, framing, and, in recent decades, Photoshop and color changes. Photography as a reference medium is dead. With the advent of fully spoofed images, we are back to an age of folklore, myths, and riddles. However tempting an age of alternative interpretations to our increasingly fascist future might seem, the hunger for simple narratives persists. How do we, as artists, respond to this new technology enabling this spoofing of depictions? Do we enter the demo phase of a new medium, showing and celebrating what is possible—perhaps with a dash of critical reflection about bias? 
In this light, it is interesting to see how Vincent van Gogh wrote to his brother Theo about the impact of photography. He spoke about the scale of reproductions and about the images lacking certain convincing qualities that painting had to offer. He discussed in detail how photography offered him a method to study other artists' techniques, but also how it provided him a window to develop his work beyond mere depiction (as an Impressionist). Loving the printing press over the locomotive, van Gogh knew he was indebted to new technologies for the development of his work. If all you have is a hammer, everything starts to look like a nail. If all you have is Sora, does everything start to look generated? And following this analogy, can you tweak the hammer, add some cultural preferences, or remove some biases and add others? The core issue with outsourced data interpretation remains the biases in the input data. Open-source technology enables interaction with these biases, while high-street, closed-source solutions limit it in favor of ease of use, stifling idealistic open-source incentives. Clearly, the answer from the artists asked to demo and test Sora for OpenAI was that this form of the technology should not be under the sole control of a predatory commercial company using artists as beta testers for a product. The importance of leaving multiple on-ramps and off-ramps to this technology during its development by supporting open-source culture is something I wholeheartedly support. Although the model was not actually released, access to it was made available to anyone for a short period. The symbolic value of that action remains: giving up privileged access in order to make a point. Another high-street, funneled access to technology under the control of a culturally dominant and repressive climate won't help anyone deal with the troubles that are still to come—let alone with the dismantling of the depictive value of media. 
While Alphabet discards its old models and destroys culturally significant development history, and while we cannot see into the black box of OpenAI, we need all the symbolic action we can get to resist these practices. When anyone resists big money and demands open access to enable culturally diverse development, that deserves support. It does not need to be this way. Demand transparency. Support open-source development.