Interactive Systems · Data Visualization · AI-Driven Tools
Designing interactive systems—advanced product tools and UIs, immersive experiences, and AI-driven interfaces. End-to-end ownership from exploratory prototyping to stable, production-quality implementations for advanced product teams.
Skills & Capabilities
Interaction & UX Engineering
Interaction architecture for complex tools and workflows
UI design for dashboards, timelines, and multi-panel layouts
Rapid prototyping for UX research and stakeholder demos
Front-End & Interface Engineering
TypeScript / JavaScript
Vue, React and component-based architectures
Three.js / WebGL, Canvas, rendering and performance optimization
Data, Realtime & AI Integration
Real-time data pipelines (sockets, Firebase, streaming dashboards)
Sensor and signal processing for audio/motion-driven interfaces
Integrating AI/ML outputs into explainable, usable UX
Advanced Prototyping & Real-Time 3D Systems
Real-time 3D interfaces for tools, simulation, and visualization
3D scene, camera, and interaction design across devices and environments
Environment-aware and multi-device interaction patterns (spatial audio, head tracking, device sensors, WebXR / ARKit / ARCore / 8th Wall) used to de-risk novel interactions before production
Intuitive Surgical — Advanced Product Design
Roles: Design Technologist, Data Visualization, AI/ML
Designing and engineering advanced prototypes for future surgical video and analytics tools, exploring new ways for clinicians and researchers to navigate, understand, and summarize complex procedures.
Sole design engineer / design technologist embedded in an advanced product design team, responsible for building future-facing prototypes for surgical video and data tools. Developed internal case-explorer concepts that link procedure video with rich system data and event timelines, and prototyped interfaces for structured procedure insight and automated post-case review.
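The sketch below illustrates the core case-explorer interaction in simplified form: timeline events that index into the procedure video. The data model and names are hypothetical stand-ins, not the internal implementation.

// Hypothetical sketch of the case-explorer idea: system events that
// index into the procedure video. Field names are illustrative only.
interface CaseEvent {
  id: string;
  label: string;   // e.g. an instrument exchange or a system notification
  timeMs: number;  // offset from the start of the recording
}

// Clicking an event on the timeline seeks the video to that moment.
function seekToEvent(video: HTMLVideoElement, event: CaseEvent): void {
  video.currentTime = event.timeMs / 1000;
}

// Only render markers for events inside the visible timeline window.
function eventsInRange(events: CaseEvent[], startMs: number, endMs: number): CaseEvent[] {
  return events.filter((e) => e.timeMs >= startMs && e.timeMs <= endMs);
}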
Founded Extrasensory, a digital product studio focused on creating professional tools for creative expression. Led the complete product development lifecycle from concept to commercial launch, establishing brand identity, technical infrastructure, and go-to-market strategy.
First product: VEX MIDI Expression, a cross-platform audio plugin that transforms MIDI controllers into expressive instruments using real-time physics simulation. Developed hybrid architecture combining JUCE C++ for DSP with React/TypeScript UI, achieving sub-10ms latency with sample-accurate MIDI timing.
Built automated CI/CD pipeline for cross-platform distribution (macOS/Windows/Linux, VST3/AU formats). Designed and launched e-commerce platform with Stripe integration and automated delivery system. Product successfully launched and actively used by music producers worldwide.
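As a simplified illustration of the physics-based expression mapping described above: a small spring-damper model smooths raw controller input into a continuous expressive parameter. The shipping plugin implements this as JUCE C++ DSP with sample-accurate timing; the TypeScript below is a sketch of the idea only.

// Illustrative sketch: a mass-spring-damper smooths a raw MIDI CC value
// (0-127) into a continuous, expressive parameter. The real implementation
// is JUCE C++ DSP running on the audio thread; this version is per-frame.
class SpringFollower {
  private position = 0;   // current expressive value (0..1)
  private velocity = 0;

  constructor(
    private stiffness = 120, // how quickly the value chases the target
    private damping = 18     // how strongly overshoot is suppressed
  ) {}

  // Advance the simulation by dt seconds toward the latest CC target.
  update(targetCc: number, dt: number): number {
    const target = targetCc / 127; // normalize to 0..1
    const accel =
      this.stiffness * (target - this.position) -
      this.damping * this.velocity;
    this.velocity += accel * dt;
    this.position += this.velocity * dt;
    return this.position;
  }
}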
Roles: Lead Developer, Exhibition Design, AR/VR
For the permanent AR exhibition at Adidas HQ in Germany, I acted as creative technical lead. The exhibition featured a series of interactive AR experiences triggered by physical markers.
Exquisite Landscape
LandscapeClock is a generative AI art piece that creates a continuously panning 24-hour landscape panorama synchronized to real time. A Railway background worker runs daily, using LangChain + OpenAI to generate 24 chained narrative prompts—each referencing the previous for continuity—then iteratively builds a seamless panorama using Stability AI's mask-based inpainting, preserving existing content while extending the scene segment by segment. The resulting image and continuity files are uploaded to Vercel Blob storage, where a Nuxt/Vue frontend pans through the panorama based on current time, with hourly descriptions appearing via typewriter animation. Multi-day continuity is achieved by using each day's final segment as the next day's starting point.
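The time synchronization on the frontend is simple in principle; the sketch below shows the idea (names are illustrative, not the production Nuxt code).

// Minimal sketch of the time-synchronized panning. The panorama covers
// 24 hours, so the horizontal offset is the fraction of the day elapsed.
const MS_PER_DAY = 24 * 60 * 60 * 1000;

function panoramaOffsetPx(panoramaWidthPx: number, viewportWidthPx: number, now = new Date()): number {
  const midnight = new Date(now);
  midnight.setHours(0, 0, 0, 0);
  const dayFraction = (now.getTime() - midnight.getTime()) / MS_PER_DAY;
  // Scroll the full panorama across the viewport over the course of the day.
  return dayFraction * (panoramaWidthPx - viewportWidthPx);
}

function currentHourDescription(descriptions: string[], now = new Date()): string {
  // One generated narrative description per hour, shown with a typewriter effect.
  return descriptions[now.getHours()];
}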
This prototype shows the potential of dynamic head tracking and AR interaction with remote content using AirPods. This opens up a range of new spatial interactions, in particular wayfinding using spatial audio cues. The AirPods send motion data to an iPhone app, which then forwards it to a server; multiple apps running on other screens can then leverage the motion data.
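A minimal sketch of one receiving app, assuming the server relays head orientation as JSON over a WebSocket; the endpoint and message shape are placeholders for illustration.

// Hypothetical receiving side: a browser app subscribes to relayed
// AirPods motion data and steers a Three.js camera with it.
import * as THREE from 'three';

interface HeadPose {
  yaw: number;   // radians
  pitch: number; // radians
  roll: number;  // radians
}

const camera = new THREE.PerspectiveCamera(60, innerWidth / innerHeight, 0.1, 100);

const socket = new WebSocket('wss://example.com/head-pose'); // placeholder URL
socket.onmessage = (event) => {
  const pose: HeadPose = JSON.parse(event.data);
  // Apply the listener's head orientation to the virtual camera.
  camera.rotation.set(pose.pitch, pose.yaw, pose.roll, 'YXZ');
};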
Immersive Motion Drawer
This prototype uses an immersive 3D particle system to transform body motion into an interactive experience. The technology behind the demo has two main components: 1. an interactive 3D particle environment built in Unity3D, and 2. a web app that uses rotation data from a mobile device to remotely interact with the 3D display. A Firebase realtime database transmits the data from the web app to the interactive display.
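The web-app side amounts to forwarding device orientation into Firebase, where the Unity3D display reads it; a minimal sketch (the Firebase config and database path are placeholders):

// Forward device orientation into the Firebase realtime database.
import { initializeApp } from 'firebase/app';
import { getDatabase, ref, set } from 'firebase/database';

const app = initializeApp({ /* firebase config */ });
const rotationRef = ref(getDatabase(app), 'motion/rotation');

window.addEventListener('deviceorientation', (e: DeviceOrientationEvent) => {
  // alpha/beta/gamma describe the device's rotation in degrees.
  set(rotationRef, { alpha: e.alpha, beta: e.beta, gamma: e.gamma });
});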
Interactive 3D Web Campaign
Led software development for a seamless 3D web experience that let users explore a new product in a playful way. The experience was built with Three.js and WebGL as a fully interactive 3D environment that could be explored in the browser.
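At its core this kind of experience is a Three.js scene with camera controls; a minimal skeleton is sketched below (the campaign's product model, assets, and interactions are omitted).

// Minimal skeleton of a browser-based interactive 3D product scene.
import * as THREE from 'three';
import { OrbitControls } from 'three/examples/jsm/controls/OrbitControls.js';

const renderer = new THREE.WebGLRenderer({ antialias: true });
renderer.setSize(innerWidth, innerHeight);
document.body.appendChild(renderer.domElement);

const scene = new THREE.Scene();
const camera = new THREE.PerspectiveCamera(50, innerWidth / innerHeight, 0.1, 100);
camera.position.set(0, 1, 3);

scene.add(new THREE.HemisphereLight(0xffffff, 0x404040, 1));
const product = new THREE.Mesh(
  new THREE.BoxGeometry(1, 1, 1),                     // stand-in for the product model
  new THREE.MeshStandardMaterial({ color: 0x3366ff })
);
scene.add(product);

const controls = new OrbitControls(camera, renderer.domElement);
renderer.setAnimationLoop(() => {
  controls.update();
  renderer.render(scene, camera);
});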
Touchless Web Prototypes
Roles: Lead Developer, AR/VR, JavaScript, Three.js
Created in response to the COVID-19 public health crisis, Touchless is a series of prototypes envisioning ways touchless technology can be used in physical environments. I was the creative technology lead for the remote manipulation prototype, which lets viewers interact with exhibit artifacts from their smartphones.
Future Studio Prototypes
For the creative-research-oriented Future Studio at Valtech, I created a number of prototypes. This emotion detection prototype explores the aesthetic and technological potential of realtime facial recognition tools: it captures facial movements, renders them onto a 3D avatar, and attempts to infer emotional states from Apple's ARKit face-tracking data.
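A simplified sketch of the inference step, assuming ARKit blendshape coefficients (each 0..1) are forwarded to the prototype; the weighting, thresholds, and emotion categories here are illustrative only.

// Infer a coarse emotional state from ARKit face blendshape coefficients.
type BlendShapes = Record<string, number>;

function inferEmotion(shapes: BlendShapes): 'happy' | 'surprised' | 'angry' | 'neutral' {
  const smile = ((shapes['mouthSmileLeft'] ?? 0) + (shapes['mouthSmileRight'] ?? 0)) / 2;
  const browRaise = ((shapes['browInnerUp'] ?? 0) + (shapes['browOuterUpLeft'] ?? 0)) / 2;
  const browFurrow = ((shapes['browDownLeft'] ?? 0) + (shapes['browDownRight'] ?? 0)) / 2;
  const jawOpen = shapes['jawOpen'] ?? 0;

  if (smile > 0.5) return 'happy';
  if (browRaise > 0.5 && jawOpen > 0.4) return 'surprised';
  if (browFurrow > 0.5) return 'angry';
  return 'neutral';
}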
8th Wall AR Experiments
A prototype built with 8th Wall demonstrating an interactive AR experience in which users explore a 3D space with their mobile device.
WebXR Experiments
Roles: Lead Developer, AR/VR, JavaScript, Three.js
Experiments with fully web-based XR using Three.js
An AR 3D drawing system that traces the outlines of 3D objects using custom software
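The common starting point for these experiments is a minimal Three.js WebXR AR session, sketched below; the drawing and outline logic sits on top of this and is omitted.

// Minimal starting point for a web-based AR session with Three.js.
import * as THREE from 'three';
import { ARButton } from 'three/examples/jsm/webxr/ARButton.js';

const renderer = new THREE.WebGLRenderer({ antialias: true, alpha: true });
renderer.setSize(innerWidth, innerHeight);
renderer.xr.enabled = true;                      // hand the frame loop to WebXR
document.body.appendChild(renderer.domElement);
document.body.appendChild(ARButton.createButton(renderer));

const scene = new THREE.Scene();
const camera = new THREE.PerspectiveCamera(70, innerWidth / innerHeight, 0.01, 20);

// Line segments can be appended to this group to draw outlines in space.
const drawing = new THREE.Group();
scene.add(drawing);

renderer.setAnimationLoop(() => renderer.render(scene, camera));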
Permanent Installation at Microsoft Cybercrime Center
Roles: Software Development, Creative Technology, Interactive Data Art and Exhibition Design
At The Office for Creative Research, we created a permanent installation for the Microsoft Cybercrime Center which maps and visualizes botnets in the wild to give researchers a more intuitive way of understanding their activity over time. Using realtime datasets from millions of infected computers, we built an interactive application that allowed the data to be explored visually and sonically.
ScreamOmeter – Breaking glass with sound at Norwegian Science Museum
A collaboration with Gagarin on an installation where visitors get the chance to break a wine glass using nothing but their own voice. Framed as a game, the installation demonstrates the physics of sympathetic resonance: an audience member's voice causes a real glass to shatter. A custom system incorporating architecture, software, and physical computing brought the experience to life.
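A minimal sketch of the core detection idea, assuming a Web Audio implementation (the installation itself used a custom hardware/software system): find the dominant frequency of the voice and check whether it sits near the glass's resonant frequency for long enough. The resonance value and thresholds below are placeholders.

// Illustrative sketch of sympathetic-resonance detection via Web Audio.
const GLASS_RESONANCE_HZ = 740;
const TOLERANCE_HZ = 15;

async function watchForResonance(onShatter: () => void) {
  const stream = await navigator.mediaDevices.getUserMedia({ audio: true });
  const ctx = new AudioContext();
  const analyser = ctx.createAnalyser();
  analyser.fftSize = 4096;
  ctx.createMediaStreamSource(stream).connect(analyser);

  const bins = new Float32Array(analyser.frequencyBinCount);
  let heldMs = 0;
  let last = performance.now();

  const tick = () => {
    analyser.getFloatFrequencyData(bins);
    // Find the loudest frequency bin (the dominant pitch of the voice).
    let peak = 1;
    for (let i = 1; i < bins.length; i++) if (bins[i] > bins[peak]) peak = i;
    const peakHz = (peak * ctx.sampleRate) / analyser.fftSize;

    const now = performance.now();
    const near = Math.abs(peakHz - GLASS_RESONANCE_HZ) < TOLERANCE_HZ && bins[peak] > -40;
    heldMs = near ? heldMs + (now - last) : 0;
    last = now;

    if (heldMs > 1500) onShatter();   // sustained match: trigger the shatter
    else requestAnimationFrame(tick);
  };
  tick();
}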
Wonwei is a research-driven design & technology studio working on commissions, products, and artworks; I work there as Art Director and Technical Lead on a number of projects. Wonwei was commissioned by Universal Music Group to create a realtime, immersive 3D visual show for musician Ólafur Arnalds' world tour. A software system was built to generate a series of landscapes that form an atmospheric narrative in response to the music during the concert. Each landscape reacts to the live music and to the performer's movement, captured with a Kinect camera.
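A simplified sketch of the reactive mapping: audio energy and a performer joint position drive landscape parameters. The Kinect joint feed here is a stand-in, and the production system and its mappings were considerably more elaborate.

// Map live audio energy and performer movement to landscape parameters.
interface PerformerPose {
  handY: number;   // normalized 0..1 height of the performer's hand (stand-in for Kinect data)
}

function landscapeParams(analyser: AnalyserNode, pose: PerformerPose) {
  const data = new Uint8Array(analyser.frequencyBinCount);
  analyser.getByteFrequencyData(data);
  // Average spectral energy drives the intensity of the visuals.
  const energy = data.reduce((sum, v) => sum + v, 0) / (data.length * 255);
  return {
    cloudDensity: energy,        // louder passages thicken the atmosphere
    cameraHeight: pose.handY,    // the performer's gesture lifts the viewpoint
  };
}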
Study For Resonators
Roles: Art Direction, Software Development, Circuit Design, Creative Technology
Fifty resonating structures form an evolving polyrhythmic installation that transforms the gallery space into a living sound sculpture. The percussive instruments play a perpetually shifting musical composition, driven by custom software and physical computing that activate the custom-designed instruments. Commissioned by the media art festival Raflost in Reykjavík, Iceland.
Microperception Window Installation
Roles: Digital Laser Fabrication, Rapid Prototyping, Software Development, Art Direction
A series of visual compositions was created for an exhibition at Third Space Gallery in Helsinki, Finland. The works explore light phenomena and forms that are barely perceptible from a distance but become clear on closer inspection. A new technique was developed in which custom software generates microscopic etching patterns on a glass surface, producing a perceptual play of light, color, and reflections.