Anonymous #4

By anon guests

This has happened before. Remember the Llama torrents leak? This is a big deal, for sure, but it was never going to be a 'checkmate' against OpenAI. Critics of the leak have argued that it makes the prospect of future artist access to OpenAI (and similar) development processes more fraught. As Domenick Ammirati [suggested](https://spikeartmagazine.com/articles/do-you-have-to-be-hot-to-get-ahead), 'the problem is that securing the bag *becomes* the art'. We also know, from art, the lore generated by [setting that bag on fire](http://klf.de/home/publication/k-foundation-burn-million-quid/).

What at first appears to be a straightforward question about artist remuneration (which is enduring and general) conceals a more specific one: what is the value of artist 'beta-testing' as a *quid pro quo* transaction? The suggestion in circulation is that artists are actually providing content-indexing as a 'bug-fixing' service. In fact this type of work more closely resembles search-engine optimisation for content production. Here, the role of artists is more constructive than corrective: less 'fix the model' and more 'show the model can say something interesting'. This small contribution keeps the artists' input at what Venkatesh Rao terms [*premium mediocre*](https://www.ribbonfarm.com/2017/08/17/the-premium-mediocre-life-of-maya-millennial/). Trust-boosted conditions, although they require both parties, can only be established by the host company. Without them, corporations like OpenAI risk artists shitposting and pumping generated prompts back into their freshly curated models; mid-fidelity content that entrenches a model's mediocrity. Indeed there are important reasons *for* something *like* beta-testing under better conditions: where the artist's purpose in such a context is either to be a full-blown developer on the project or, more than a 'beta-tester', a *purple teamer* of the system.
If artistic production is indeed 'caged' by such developmental contexts, it is not because artists are structurally oppressed by technology companies. Rather, it is better understood as a constraint on the much more fundamental participation artists could have in research and development if they were brought in earlier and resourced more generously. Placed at a more opportune point in the development process, artists could make a much greater contribution to the overall production of the technology (understood here in its most holistic sense). In the end, such leaks may only serve to deepen the lore of the OpenAI conspiracy complex, extracting some PR value for the artists involved as a byproduct. In the absence of serious involvement in the development of such technologies, we encourage artists to let the heat dissipate from the 'training data IP' debate, which operates as a distraction from the more important issue of model and infrastructure ownership.