Hallucination in NOVATRON by Deyu Zeng (China)

Hallucination in NOVATRON is an immersive project that explores AI hallucination in deepfake technologies and its impact on our understanding of reality. Through digital storytelling, performance, and media experiments, it examines how AI hallucination changes the way we see and understand the world. By creating fictional AI-driven roles that interact with real people, Hallucination in NOVATRON questions the growing influence of AI hallucination. It highlights the fine line between simulation and deception, inviting audiences to experience a world where it becomes difficult to tell what is real and what is not.

Hallucination in NOVATRON: Speculative Design and Machine Learning-driven Immersive Installations
IRCAM Forum Workshops 2025, Taipei


This project examines the phenomenon of AI hallucination—when artificial intelligence generates fabricated or misleading content—through the lens of deepfake technologies. With the mainstream adoption of large language models (LLMs) and synthetic media, hallucination has moved from a technical limitation to a pressing societal challenge, shaping how we experience trust, deception, and truth.

Hallucination in NOVATRON creates a fictional AI hub, “NOVATRON,” as a speculative platform where simulation and deception intertwine. The research traces the conceptual evolution of hallucination from early neural-network creativity to its mainstream recognition through LLMs, focusing on its sensory dimensions: visual (deepfake video manipulation), auditory (voice cloning and synthetic speech), and verbal (AI-generated narratives). These forms reveal how deepfake technologies amplify hallucination, intensifying fraud, eroding social trust, and complicating our ability to distinguish fact from fiction.

Methodologically, the project combined performance-based interviews, symbolic props (masks, prosthetics), and role-play analysis to explore lived experiences of deception in professional and everyday contexts. The design process materialized as a multi-layered installation: a NOVATRON website featuring fictional staff profiles, printed correspondence that blurred digital and tangible communication, and AI-synthesized interview videos in which non-English speakers were voiced through AI-generated translations.

The final exhibition invites audiences into this staged hallucination: navigating the website, listening to synthetic voices, reading deceptive letters, and watching uncanny interviews. By orchestrating these encounters, Hallucination in NOVATRON asks participants to confront the fragile boundary between simulation and deception, highlighting how AI hallucination destabilizes shared reality and challenges the foundations of social trust in an AI-mediated future.