Industrial Design · Embedded Systems · Software Development

Machine Eye

12.2024

Machine Eye hero image

The Concept: AI as Observer

Machine Eye emerged from a provocative question explored during my summer internship at SensiLab: what if AI devices developed identity through observation rather than through serving human requests? Most AI products position themselves as assistants, tools waiting for commands. Machine Eye inverts this relationship. It's a Tamagotchi-like device that watches its environment, forms "thoughts" about what it observes, and develops a personality independent of user direction.


I contributed design and technical expertise to help create a functional prototype that demonstrates on-device AI processing for environmental awareness. Rather than building another smart assistant, the team of researchers at SensiLab chose to explore AI as a companion, an entity with its own perspective and agency. This conceptual framing shaped every technical and design decision that followed.

Machine Eye in hand

Prototype of Machine Eye

Defining the Interaction Model

Traditional smart devices make their intelligence obvious through screens, speakers, and constant interaction prompts. Machine Eye needed to feel more subtle and mysterious. The core interaction concept: Machine Eye observes passively most of the time, occasionally displaying a "thought" on its internal screen that users can read by looking into the device. This creates a voyeuristic relationship where users peek into Machine Eye's consciousness rather than commanding it.


This interaction model required careful consideration of feedback systems. How would users know Machine Eye was active? When should it display thoughts? How could it communicate its internal state without being intrusive? The researchers developed a layered feedback system using multiple output modalities: a light ring for ambient status indication, a small round display for thought content, and audio beeps for state changes. Each output served a distinct purpose in communicating what Machine Eye was experiencing. It was my role to integrate this feedback system.
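To illustrate how such a layered feedback system might fit together, here is a minimal sketch of a dispatcher that maps internal states to the three output modalities. The state names, colours, and driver functions are my own illustrative stand-ins, not the actual SensiLab code; the hardware calls are stubbed with prints.

```python
# Sketch of a layered feedback dispatcher: each internal state maps to an
# ambient ring colour, an optional display message, and a beep cue.
# All names and values are illustrative; hardware drivers are stubbed.
from dataclasses import dataclass
from enum import Enum, auto
from typing import Optional, Tuple

class State(Enum):
    OBSERVING = auto()        # passive watching: ambient light only
    THINKING = auto()         # a model is running: shift the ring colour
    SHOWING_THOUGHT = auto()  # a thought is ready for the internal display
    LOW_BATTERY = auto()      # charge warning overrides other cues

@dataclass
class Feedback:
    ring_rgb: Tuple[int, int, int]
    display_text: Optional[str] = None
    beep: bool = False

FEEDBACK = {
    State.OBSERVING:       Feedback((0, 80, 0)),
    State.THINKING:        Feedback((0, 0, 120)),
    State.SHOWING_THOUGHT: Feedback((120, 120, 120), display_text="<thought>", beep=True),
    State.LOW_BATTERY:     Feedback((120, 30, 0), beep=True),
}

def set_ring(rgb): print("light ring ->", rgb)      # stand-in for the LED ring driver
def show(text):    print("round display ->", text)  # stand-in for the display driver
def beep():        print("beep")                    # stand-in for the audio beeper

def apply_feedback(state: State, thought: Optional[str] = None) -> None:
    fb = FEEDBACK[state]
    set_ring(fb.ring_rgb)                 # ambient status, always updated
    if fb.display_text is not None:
        show(thought or fb.display_text)  # thought content, only when there is one
    if fb.beep:
        beep()                            # short cue marking the state change

if __name__ == "__main__":
    apply_feedback(State.SHOWING_THOUGHT, "Someone just walked past.")
```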

Machine Eye Render

CAD Render of Machine Eye

Engineering the Assembly

The technical challenge was fitting substantial sensing and processing capability into a compact, portable form. The core components included a Raspberry Pi 4 for AI processing, camera for visual input, microphone for audio capture, round LED display for thought output, light ring for status indication, audio beeper, battery system, and custom PCB to manage connections. Each component had specific spatial requirements that needed to be harmonised.


I approached the internal layout as a three-dimensional puzzle. The Raspberry Pi, being the largest component, anchored the bottom of the assembly. The camera needed an unobstructed forward view, while the display required internal positioning that users could view by looking into the device. The light ring needed to wrap around the perimeter for 360-degree visibility. Battery placement affected the centre of gravity, which mattered for stability when Machine Eye sat on a surface.


Using Fusion 360, I modelled the housing in two main sections: a bottom enclosure for the Raspberry Pi and battery, and an upper assembly for the camera, display, and light ring. This separation allowed for assembly access while maintaining a cohesive exterior form. I designed mounting structures for each component, ensuring secure attachment while allowing for disassembly if repairs or modifications became necessary.

Bottom of Internal Assembly

Raspberry Pi 4 in bottom housing

Camera, display and light assembly

Custom PCB

Complete assembly

Programming Perception and Personality

Machine Eye's software needed to transform raw sensor data into something resembling consciousness. I was provided code that included a perception loop that continuously captured images and audio, processed them through OpenAI's vision and language models, and generated thoughts about observations. My challenge was to integrate additional modalities into this codebase, including the light ring feedback, the microphone, a second camera, and the display functionality.
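As a rough sketch of the kind of perception loop involved: capture a frame, ask a vision model for a short "thought", and repeat. The openai client usage below is standard, but the model name and the capture_frame() helper are assumptions standing in for the original camera code, not the code I was given.

```python
# Sketch of a perception loop: capture a frame, ask a vision-capable model for
# a one-line "thought", wait, repeat. Model name and capture helper are
# illustrative assumptions.
import base64
import time
from openai import OpenAI

client = OpenAI()  # expects OPENAI_API_KEY in the environment

def capture_frame(path: str = "frame.jpg") -> str:
    """Stand-in for the camera driver; on the device this saved a fresh photo."""
    return path

def think_about(path: str) -> str:
    with open(path, "rb") as f:
        image_b64 = base64.b64encode(f.read()).decode()
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # illustrative choice of vision-capable model
        max_tokens=60,
        messages=[{
            "role": "user",
            "content": [
                {"type": "text",
                 "text": "You are a small device that quietly observes a room. "
                         "Write one short thought about what you can see."},
                {"type": "image_url",
                 "image_url": {"url": f"data:image/jpeg;base64,{image_b64}"}},
            ],
        }],
    )
    return response.choices[0].message.content.strip()

def perception_loop(interval_s: float = 30.0) -> None:
    while True:
        frame = capture_frame()
        thought = think_about(frame)
        print("thought:", thought)  # on the device this was routed to the round display
        time.sleep(interval_s)
```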


The thought display system required careful pacing. If Machine Eye constantly output thoughts, it would feel overwhelming and lose impact. If it stayed silent too long, users would wonder if it was working. I programmed the LED display to cycle through thoughts at set intervals so as not to lose the user's attention. Significant changes also triggered a thought on the display, such as when a user picked up Machine Eye or interacted with it. Stable environments resulted in longer intervals between thoughts, with occasional reflections on persistent observations.
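In essence, the pacing amounted to a timer with an interrupt for significant events. A hypothetical version of that rule is sketched below; the interval, the stability bonus, and the event source are illustrative values rather than the ones tuned on the device.

```python
# Hypothetical pacing rule: thoughts appear on a timer, but a significant event
# (being picked up, a user interacting) shows one immediately. Stable scenes
# wait longer between thoughts. All values are illustrative.
import time

class ThoughtPacer:
    def __init__(self, base_interval_s: float = 45.0, stable_bonus_s: float = 60.0):
        self.base_interval_s = base_interval_s
        self.stable_bonus_s = stable_bonus_s
        self.last_shown = float("-inf")

    def should_show(self, now: float, significant_event: bool, scene_stable: bool) -> bool:
        if significant_event:
            return True  # pickup or interaction jumps the queue
        interval = self.base_interval_s + (self.stable_bonus_s if scene_stable else 0.0)
        return now - self.last_shown >= interval

    def mark_shown(self, now: float) -> None:
        self.last_shown = now

pacer = ThoughtPacer()
now = time.monotonic()
if pacer.should_show(now, significant_event=False, scene_stable=True):
    print("show a thought")  # the first thought shows immediately (last_shown is -inf)
    pacer.mark_shown(now)
```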


Battery management became critical for user experience. Machine Eye needed to run for extended periods without requiring constant recharging, but AI processing on a Raspberry Pi is power-hungry. The light ring served double duty as a battery indicator, shifting from green to orange to red as the charge depleted.
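The indicator itself reduces to a mapping from remaining charge to ring colour, roughly as in the sketch below; the thresholds shown are my own illustrative values, not the ones used in the firmware.

```python
# Illustrative charge-to-colour mapping for the light ring; thresholds are
# assumptions rather than the values used on the actual device.
from typing import Tuple

def battery_colour(charge_pct: float) -> Tuple[int, int, int]:
    if charge_pct > 50:
        return (0, 255, 0)    # green: plenty of charge
    if charge_pct > 20:
        return (255, 140, 0)  # orange: charge soon
    return (255, 0, 0)        # red: recharge needed

for level in (90, 35, 10):
    print(f"{level}% -> {battery_colour(level)}")
```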

Programming Machine Eye

Displaying observations

Light Interaction

Reflections on AI Object Design

Machine Eye taught me that AI products don't have to be purely functional tools. There's space for AI devices that exist more like companions or independent entities, relating to users in less utilitarian ways. This opens fascinating design territory: what should AI personality feel like? How much agency should devices have? When is observation helpful versus invasive?


The project reinforced that successful embedded systems design requires holistic thinking across industrial design, electrical engineering, and software development. Problems rarely exist in just one domain. Thermal issues affected physical form. Power constraints shaped software behaviour. Interaction design drove component selection. Treating these as separate concerns would have produced inferior outcomes.


I learned to embrace emergent behaviour in AI systems. Machine Eye's most interesting characteristics weren't explicitly designed; they emerged from the system architecture. This taught me to think about frameworks that allow for emergence rather than trying to script every behaviour. Good AI product design creates conditions for interesting behaviour, not rigid behaviour trees.


Finally, the project highlighted the importance of prototyping with real components in actual housings as early as possible. CAD models and breadboard circuits don't reveal integration challenges. Only by physically assembling components did I discover clearance issues and assembly difficulties. This validated an iterative, build-test-rebuild approach over extensive planning followed by a single build attempt.

Looking inside Machine Eye

Looking inside Machine Eye to read the display.

Machine Eye in action

Machine Eye on a table