A wearable camera that uses mood sensors to trigger video capture of memories and creates immersive experiences for retrieval.

M.I.A

Life is full of meaningful and exciting moments, but we can't possibly remember every small moment we live through. What happens when we forget those precious experiences? Mia (Mood Intelligent Assistant) is a wearable line-of-sight camera that uses mood sensors to capture memories and creates an immersive experience for retrieval. When Mia is activated, it records video and biometric data that are stored in the cloud. Each memory is tagged with parametric information based on the user's emotions. This also enables users to alter or enhance their mood in the future by re-experiencing memories suggested by Mia.

Context

The goal of this project was to explore new interactions for a line-of-sight camera. It was a three-week group project in an interaction design course. We designed from an overall systemic perspective, thinking about what it means to see the world through a line-of-sight camera. What would we want to capture? How is it stored, and what happens to the information in the future? We produced a video prototype to communicate our design concept and to illustrate potential contexts of future use.

During the project, I learned that a good interaction designer creates an environment for interactions between people and artifacts, and turns operations into experiences. The key was to always focus on designing relationships - the interactions between people, products, and the world - instead of simply designing the product itself. I have taken this as a mantra in every design project since.


Team
Ann Lin
Adam Riddle
James Pai
Elsa Ho
DD Ding

Timeline
2016 Nov - Dec

01
Understand

Functional Decomposition

We started by understanding a camera's functional purpose to better explore new concepts for a line-of-sight camera. What does it mean to see the world through a line-of-sight camera? What was it designed for? We deconstructed the concrete idea of a camera step by step into abstraction, from its physical form and physical functions to its functional purpose. The result of our decomposition was the core of a line-of-sight camera: creating amazing moments that can be re-lived. Using this as a baseline, our team decided to focus the design on personal memory retrieval. We wanted to bring users back to cherished memories and let them experience those moments as if they were there again.
02
Understand

Ideation

After a series of team brainstorming sessions, we arrived at a new camera concept: a wearable camera that captures sensory information and recreates the mood of an experience based on the user's emotional state. For example, the device knows when the user is in a cheerful mood and records the moment for later retrieval. Beyond that, when the user is feeling stressed, it can suggest a calming experience and project it to aid relaxation. We then fleshed out a complete usage scenario for our design context. This helped us define what the device can do, who uses it, what situation they are in, why they use it, and how they use it.
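Mia was a video prototype rather than a working device, but the mood-triggered behavior described above can be sketched as a toy decision function. Everything here is an illustrative assumption: the valence/arousal inputs, the thresholds, and the action names are invented for the sketch, not part of the actual concept's specification.

```python
# Hypothetical sketch of Mia's mood-triggered behavior.
# Inputs, thresholds, and action names are all assumptions for illustration.

def decide_action(valence: float, arousal: float) -> str:
    """Map a (hypothetical) biometric mood reading to a device action.

    valence: -1.0 (negative mood) .. 1.0 (positive mood)
    arousal:  0.0 (calm)          .. 1.0 (excited/stressed)
    """
    if valence > 0.6:
        # Cheerful moment: worth capturing for later retrieval.
        return "record"
    if valence < -0.3 and arousal > 0.7:
        # Stressed: offer a calming memory to re-experience.
        return "suggest_calming_memory"
    # Otherwise stay passive.
    return "idle"
```

The point of the sketch is the split between the two touch points: positive moods trigger capture, while negative high-arousal states trigger retrieval suggestions.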
03
Build

Interaction model

The concept led to two interaction touch points: capturing videos and retrieving memories. For the physical device, we incorporated a mood-sensing mechanism based on EEG technology that automatically records experiences. For memory retrieval, we designed an interactive mood wheel for the dashboard concept. The wheel lets users choose suggested videos by mood color value: somber memories appear in darker colors, cheerful memories in lighter ones. Since emotion is fuzzier and more abstract than precise adjectives, we offered a visual way to represent moods instead of a traditional search menu. The mood wheel was designed to lift the burden of searching and browsing from the viewer.
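The darker-somber, lighter-cheerful mapping of the mood wheel can be sketched in code. This is only a sketch under stated assumptions: the hue angle, valence range, and fixed saturation are invented for illustration and were not part of the prototype.

```python
import colorsys

# Illustrative mood-wheel color mapping (all names/ranges are assumptions):
# hue comes from the mood's position on the wheel, and lightness rises
# with valence, so cheerful memories read lighter and somber ones darker.

def mood_to_rgb(hue_degrees: float, valence: float) -> tuple:
    """hue_degrees: position on the mood wheel, 0-360.
    valence: -1.0 (somber) .. 1.0 (cheerful)."""
    hue = (hue_degrees % 360) / 360.0
    # Map valence to lightness in [0.3, 0.6]: darker for somber moods.
    lightness = 0.3 + 0.3 * (valence + 1) / 2
    r, g, b = colorsys.hls_to_rgb(hue, lightness, 0.8)
    return (round(r, 3), round(g, 3), round(b, 3))
```

For a fixed hue, a cheerful memory (high valence) yields a visibly lighter swatch than a somber one, which is the visual cue the mood wheel relies on instead of text search.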
Physical Device
Building the interactive system
04
Deliverable
The gesture-based interface allows the wearer to take control of curation and fine-tune the experience based on preference.
This thin device loops over the ear and records/projects video from the same line of sight as the user.