Save, store and explore memories in hybrid reality
Gems is a virtual container for memories: remarkable moments are stored and visualized as distinctive gemstones, which can be viewed at the place of recording via a mixed-reality smartphone app.
In a nutshell
When visiting a place, we only see its present scene and what is happening at the moment, but rarely know what happened there in the past. The memory of a place stays only in the minds of those who encountered it and fades with time, leaving no trace.
The objective of this project is to add another layer to physical space, enabling people to record, save, and explore remarkable moments on site. This overlay of memories provides a new perspective on a space across time.
The target users are people who want to save their memories in an external virtual container and explore more stories of a place. The only basic requirement is a smartphone that supports AR.
An AR application that creates a container for people to store and share their memories and experiences, visualizes them, associates them with the physical surroundings, and enables people to explore the memories of others.
Record, Share, and Explore
The goal of this project is to realize the experience of recording, sharing, and exploring memories. The application will:
Sharable Memories in AR
First, we needed to narrow down and define what kind of memory we want to put in the AR container.
Memories come with different levels of privacy and secrecy. We focused on memories that people are willing to share publicly, even with strangers.
Mapping the experience
The user journey helped us map the user experience and identify the problems that needed to be solved. The biggest challenge: taking out the phone to record a memory disturbs the experience of the moment itself. So we decided to create a wearable device as a shortcut, letting users record a moment with the smallest action possible.
Visualize as Gemstone
We wanted to visualize the memories in AR so that users can explore them in a playful way.
We considered different ways of visualizing them. The idea of embodying memories as gemstones popped up in a brainstorm. Just like memories, each gemstone is one of a kind: it is shaped by external factors and changes over time. We think it is a great metaphor for unique and priceless memories.
To improve cooperation between group members, I created a project map that listed all the tasks in the project: what has been done, what is being done, what still has to be done, and who is in charge of each task.
To record a remarkable moment without disturbance, we didn't want users to take out their phones every time they want to capture something. Instead, we designed a wearable memory recorder that captures a remarkable moment with the smallest action possible.
The necklace combines multiple textures and materials. To record a memory, users simply turn the pendant toward their mouth and speak to it. This quick action records the base of a memory, which consists of time, location, weather, and sound.
Wearing the recorder
Recording by speaking to it
Technical prototype of the recorder
Using a tilt sensor, the recorder starts recording when it is turned up and stops when it is turned back. The Photon then sends the recorded audio to the cloud for later uploading.
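The control logic of the recorder can be sketched as a tiny state machine. This is an illustrative Python prototype (the actual firmware runs on the Particle Photon); the function name and string labels are assumptions, not the real firmware API.

```python
def recorder_action(tilted_up: bool, recording: bool) -> str:
    """Decide the recorder's next action from the tilt-sensor reading.

    tilted_up: True when the pendant is turned toward the mouth.
    recording: True while an audio recording is in progress.
    """
    if tilted_up and not recording:
        return "start"  # pendant turned up: begin recording
    if not tilted_up and recording:
        return "stop"   # pendant turned back: finish and queue upload
    return "idle"       # no state change
```

Keeping the decision logic as a pure function like this made it easy to test without the hardware in the loop.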
Upload the recorded memory
When users are ready, they can edit a memory by adding extra text or photos and then upload it to the app.
Every memory is one of a kind. To visualize memories in augmented reality, we adopted the metaphor of gemstones for their uniqueness and pricelessness. We visualized each memory based on its data: geoinformation, sound, images, emotion, and time. Every uploaded memory is embodied as a "Memstone" through the following visualization process:
#1 Shape from sound
The recorded audio track is divided into 8 equal parts. The corners of a cube are trimmed by 8 surfaces, each changing its angle and position according to the average energy (volume) of the corresponding part of the recorded sound.
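The energy computation can be sketched in a few lines of Python, in the spirit of our prototyping. This is a simplified sketch assuming the audio arrives as a list of amplitude samples; energy is computed here as mean squared amplitude.

```python
def segment_energies(samples, parts=8):
    """Split an audio track into equal parts and return the average
    energy (mean squared amplitude) of each part -- one value per
    trimming surface of the Memstone cube."""
    n = len(samples) // parts
    return [
        sum(s * s for s in samples[i * n:(i + 1) * n]) / n
        for i in range(parts)
    ]
```

The 8 resulting values then drive the angle and offset of the 8 corner-trimming surfaces.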
#2 Texture from Weather
The weather at the time of recording is turned into the texture of the stone.
#3 Animation from Tone & Emotion
The recorded sound is turned into text using CMU Sphinx, an open-source speech recognition system. Next, with the aid of the Watson Tone Analyzer, the emotions in the text are detected and converted into the animation of the Memstone:
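The emotion-to-animation step can be sketched as a lookup from the dominant detected tone. The tone names below follow Watson Tone Analyzer's emotional tones, but the animation names and speed values are hypothetical placeholders, not our final parameters.

```python
# Hypothetical mapping from dominant emotion to Memstone animation.
TONE_TO_ANIMATION = {
    "joy":     {"motion": "bounce",  "speed": 1.5},
    "sadness": {"motion": "sink",    "speed": 0.5},
    "anger":   {"motion": "shake",   "speed": 2.0},
    "fear":    {"motion": "tremble", "speed": 1.8},
}

def pick_animation(tone_scores):
    """Choose the animation for the dominant detected emotion.

    tone_scores: dict mapping tone name -> confidence score (0..1),
    as produced by the tone-analysis step."""
    dominant = max(tone_scores, key=tone_scores.get)
    return TONE_TO_ANIMATION.get(dominant, {"motion": "still", "speed": 1.0})
```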
#4 Color from pictures
If the user uploads a picture with their memory, its main colors are extracted and applied to the Memstone.
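A simple way to extract the main colors is to coarsely quantize the pixels and count the most frequent buckets. This is a minimal sketch, assuming the picture is available as an iterable of RGB tuples; the function name and the quantization step are illustrative choices.

```python
from collections import Counter

def main_colors(pixels, k=3, step=32):
    """Return the k most frequent colors after coarse quantization.

    pixels: iterable of (r, g, b) tuples from the uploaded photo.
    step:   quantization bucket size; merges near-identical shades."""
    quantized = Counter(
        (r // step * step, g // step * step, b // step * step)
        for r, g, b in pixels
    )
    return [color for color, _ in quantized.most_common(k)]
```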
#5 Memory ageing
Like real stones, a Memstone gets rounder and rounder as time goes by…
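The ageing effect can be modeled as a roundness factor that grows with the age of the memory. A minimal sketch, assuming an exponential-decay shape and a tunable half-life (both are our assumptions, not measured values):

```python
import math

def roundness(age_days, half_life=90.0):
    """Return a bevel factor in [0, 1): 0 means a sharp-edged new
    stone; the value approaches 1 as the memory ages.

    half_life: assumed number of days after which the stone is
    halfway to fully rounded."""
    return 1.0 - math.exp(-age_days * math.log(2) / half_life)
```

The factor can then scale how strongly the edges of the 3D model are beveled.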
We prototyped in Python to turn the data into 3D models.
Every memory is one of a kind!
Exploring Memories in MR
Knock on the memstone to view details
Users can view memories as Memstones through the AR camera. To explore the details of a memory, they simply knock on the stone with their smartphone. The content of the memory is then displayed.
Explore with the phone
The memories can be experienced through the camera in the AR environment. The emotion (animation) of a memory appears as the user gets close to it. Further details of a memory can be viewed by knocking on the Memstone with the phone.
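The knock gesture can be detected as a short spike in the phone's accelerometer readings. This is an illustrative sketch only; the threshold value and function name are assumptions, not the app's actual detection code.

```python
def is_knock(accel_magnitudes, threshold=2.5):
    """Detect a 'knock' gesture as a spike in accelerometer magnitude.

    accel_magnitudes: recent acceleration magnitudes (in g) from the
    phone's motion sensor.
    threshold: assumed spike level that counts as a knock."""
    return max(accel_magnitudes) >= threshold
```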