Daniel Leiter
Information
2023
For the work, a few hundred photos of industrial human anatomy models were web-scraped to build a custom dataset, which was used to train a Generative Adversarial Network (an image-generating algorithm). Over almost two weeks of training, the algorithm processed 2.3 million images, progressively approaching the realism of the input images but never reaching it, due to the variation among the input images and the small size of the dataset, resulting in a “hallucination” of the AI.
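The adversarial training described above can be sketched in miniature. The following is a hedged toy illustration, not the artist's actual pipeline: a one-dimensional GAN in which samples from a Gaussian stand in for the photo dataset, the generator is a small affine map, and the discriminator is a logistic classifier. The two models pull against each other exactly as in the work, with the generator slowly drifting toward the statistics of the "real" data without ever matching them perfectly.

```python
import numpy as np

# Toy GAN sketch (assumption: not the artist's code or architecture).
# Generator G maps noise to samples; discriminator D scores samples as
# real or fake. Both are tiny linear models updated with the standard
# GAN gradients.

rng = np.random.default_rng(0)

def real_batch(n):
    # "Real" data standing in for the scraped anatomy photos:
    # scalars drawn from a normal distribution with mean 4.
    return rng.normal(4.0, 1.0, size=n)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Generator parameters (affine map of noise): g = a*z + b
a, b = 0.1, 0.0
# Discriminator parameters (logistic regression): D(s) = sigmoid(w*s + c)
w, c = 0.1, 0.0

lr, n = 0.05, 64
for step in range(2000):
    # Discriminator update: push D(real) toward 1 and D(fake) toward 0
    # (gradient ascent on log D(x) + log(1 - D(g))).
    x = real_batch(n)
    z = rng.normal(size=n)
    g = a * z + b                      # fake samples
    d_real = sigmoid(w * x + c)
    d_fake = sigmoid(w * g + c)
    w += lr * (np.mean((1 - d_real) * x) - np.mean(d_fake * g))
    c += lr * (np.mean(1 - d_real) - np.mean(d_fake))

    # Generator update: push D(fake) toward 1
    # (non-saturating objective, gradient ascent on log D(g)).
    z = rng.normal(size=n)
    g = a * z + b
    d_fake = sigmoid(w * g + c)
    a += lr * np.mean((1 - d_fake) * w * z)
    b += lr * np.mean((1 - d_fake) * w)

# After training, the generator's output distribution has drifted
# toward the real data's mean of 4, though it need not match it exactly.
fakes = a * rng.normal(size=10000) + b
```

The same dynamic, scaled up to images and deep networks, is what produces the near-realistic but never fully realistic anatomy in the work: the generator only approximates the data it is shown.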
The term, coined by the tech industry, describes the false or invented output of generative AI and, in doing so, mystifies a major problem of the new technology.
Working only from the visual information in the input data, the algorithm generated models that disregard anatomical reality, writing itself and its history into the depicted anatomy and creating technologically distorted biological forms.
One of the images was hand-modeled into a digital 3D model, CNC-milled, and cast in ceramic multiple times. The strange coloring of the AI-generated images was reimagined by pouring colored ceramic into the mold, resulting in a similarly fluid coloring.