Sonic Lens
Interactive art-and-technology project that transforms uploaded video into immersive sound, projected visual lenses, and a participatory installation experience.
Presented at:
Sonic Lens explores a simple question: what if sound could once again become a lens through which humans experience the world?
Inspired by bells—one of humanity’s oldest public sound technologies—the project transforms uploaded video into layered sonic and visual interpretations. Through modes such as place, tension, calm, memory, rhythm, and atmosphere, each video becomes a unique sensory translation.
The work is evolving both as a web-based application and as an interactive indoor/outdoor installation. Visitors can upload video through a QR code or activate autoplay content, and the system generates a living soundscape alongside projected visual “lens” tracks that reshape how the original moment is perceived.
Collaboration:
Sonic Lens is shaped through collaboration across story, sensory design, system prototyping, and spatial installation thinking. It brings together different disciplines while also connecting multiple systems — uploaded video, sound translation, projection, interface flow, and audience interaction — into one shared experience. Collaboration is therefore not only part of the team structure, but also part of the work itself: image, code, sound, light, and public participation continuously inform one another.
Art and Technology:
In Sonic Lens, technology acts as the translation engine, while art shapes the emotional and spatial experience. Video input is transformed into sound, visual lens tracks, and installation output, while the frame, bells, light, and projection turn that process into something sculptural, atmospheric, and experiential. The project treats technology not as a hidden utility, but as a visible material that can be seen, heard, and felt.
Depth:
Sonic Lens asks what happens when humans stop treating sound as background and begin to read the world through it again. It reflects on perception, atmosphere, tension, memory, and the role of machines in helping us sense environments differently, not simply faster. At a deeper level, the project asks whether future technologies can make us more attentive, more embodied, and more connected to the richness of the world around us.
White Mirror:
Sonic Lens imagines a positive, human-centered future in which technology expands human sensitivity rather than reducing it. Instead of extracting attention, it turns everyday video into a reflective, participatory, and shared sensory experience that invites curiosity, accessibility, wonder, and presence. It proposes a future where digital systems help people listen more deeply to places, moods, and one another.
Tools and materials used:
Software / AI / Data:
Web-based application prototype
Video upload and playback flow
Visual analysis and lens-generation logic
Audio-mapping / sound translation system
QR-based interaction pathway
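To make the pipeline above concrete, here is a minimal illustrative sketch, not the project's actual code, of what an audio-mapping / sound-translation step could look like: per-frame video features (here, just average brightness) are translated into sound parameters (here, a MIDI-like pitch), with the chosen lens mode shifting the register. All function names, mode values, and numeric mappings are hypothetical.

```python
# Hypothetical sketch: translate video frames into pitches for one "lens" mode.
# This is an assumption-laden illustration, not Sonic Lens's real mapping logic.

def frame_brightness(frame):
    """Average brightness of a frame given as a flat list of 0-255 pixel values."""
    return sum(frame) / len(frame) if frame else 0.0

def translate(frames, mode="calm"):
    """Map each frame's brightness onto a MIDI-like pitch.

    The mode shifts the base register: 'calm' sits low, 'tension' high.
    Mode names echo the project's lenses; the numbers are invented.
    """
    base = {"calm": 48, "place": 60, "tension": 72}.get(mode, 60)
    notes = []
    for frame in frames:
        brightness = frame_brightness(frame)         # 0..255
        pitch = base + round(brightness / 255 * 12)  # one octave of range
        notes.append(pitch)
    return notes

# Three frames of increasing brightness rise in pitch within the mode's octave.
print(translate([[0], [128], [255]], mode="calm"))  # [48, 54, 60]
```

A real implementation would add motion, color, and rhythm analysis and drive a synthesis engine, but the shape is the same: visual features in, sound parameters out, with each lens mode as a different mapping.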
Hardware / Sensors:
Smartphone or camera video input
Embedded sound system / speakers inside bell-inspired structures
Projection system for wall or ground
LED lighting integrated into the installation
Laptop / screen for prototype control and demonstration
Media / Output:
Interactive installation
Web experience
Sound
Projected visual tracks / lenses
Demo video
Physical sculptural unit
Other tools / materials:
Clear acrylic, translucent or black optic finishes
Visible data wires with LED lighting
Bell-inspired sound vessels
Architectural frame inspired by historic bell structures
QR code access for audience participation
Project demo / how to experience it:
Sonic Lens can currently be experienced in three ways: through the presentation, the live prototype, and the demo video. In its installation form, visitors scan a QR code to upload a video or activate autoplay content, then experience how the system translates that source into sound, projected lens tracks, and an immersive spatial environment. During the presentation, reviewers can see how one video becomes multiple sensory interpretations through sound, light, and visual translation.
Team:
Silvie Claes — Storyworld Architect
@Srix — The Alchemist
Bart Cuppens — Spatial experience / architectural support
Presentation: canva.com
Application: soniclensart.netlify.app