Izzi is a smartbot device that configures your home's smart environment based on your daily mood. We all have tough or long days at work, and we need a comfortable environment to come home to. Izzi creates that environment for you by learning your feelings through a device worn on your person. Combining that with observations of your smart home preferences, or with routines you set up manually, Izzi can set your home to exactly how you want it, when you want it.
I took time to investigate the current product space: a literature review to understand the lay of the land, and competitive analysis of both bots that evoke emotion and smart technology in the home. I also watched a few Pixar movies featuring animated bots to study their facial emotions (looking at you, WALL-E).
One key product I looked to for inspiration was the Anki Cozmo. If you have not seen him, I highly recommend checking him out. Cozmo is a fun little robot that expresses emotion, looks like a joy to interact with, and has a personality of his own. This little robot is what drove me to this idea; it had me thinking, "How might something like this fit into a smart home environment?" From there, more questions and curiosity arose.
To understand the reality of this device and how its companion app would look and feel, I iterated on ideas for both Izzi's look and the app's look. For Izzi herself, I landed on an egg shape, as rounder shapes generally feel softer and more friendly.
I designed multiple faceplates Izzi may use as stand-ins for her switching emotions. When users enter their home, Izzi would be waiting on her stand to greet them. The necklace is an example of what a user would wear on their daily trip; it is inconspicuous and can be worn under clothing, which is a big reason why I chose it.
For the rest of the project I focused on iterating on the mobile app that a user of Izzi would have. The app has four key areas of focus, each with an important use case: Dashboard, Home, Sense, and Settings.
I tested these parts of the app and gathered some feedback on the interface.
Dashboard gives an overall view of Izzi: what devices she's connected to, what her current thoughts are, who she is connected to, and the status of the house.
Sense gives users the ability to see all the feelings Izzi has collected about them. Users can also manually talk to Izzi, overriding her collected readings with how they really feel.
Home allows a user to see the current home setup, other created scenes, all connected items, the people Izzi knows, and a help section for getting set up and maximizing Izzi's use.
My Izzi (the settings area) allows users to manage Izzi's setup assistant, which notifications Izzi sends to whom, the user's profile settings, the color of Izzi's interface, and options for formatting Izzi.
I conducted an informal walkthrough of the application with three people, wanting to get iterative feedback early in the process. One of the three had seen Izzi's previous physical form; the other two were new to the product.
Most of the findings were small details to tweak, such as changing the name of Izzi's settings area to...actual settings, or adjusting Sense to remove the illusion that the app was referring to Izzi's own emotions.
See the Full Low-Fi Prototype
Using the feedback I received, I continued to focus my iterations on the mobile app, as it was the most interaction-rich part of the product. As I designed, I kept in mind the scenarios each screen housed and why those scenarios would happen.
I crafted a prototype with these screens to answer questions around ease of use, emotional responses to some parts of the app, default actions, and more.
In general there was a large amount of positive feedback on most of the assets in the interface. Two interesting findings revolved around the coloring of the app itself (the blue/green gradient) and the sadness some assets provoked. Half of the users commented on using color as an indirect indicator of the mood Izzi was picking up for a user. This led me to think about future iterations where color plays a more important role; I'd have to consider how that might impact accessibility.
When I had users navigate to format Izzi, each one displayed true empathy on seeing Izzi's sad face and did not want to hit "Forget Me". This was the response I was going for, and it led me to consider other key actions in the app that could carry emotional responses in the future.
See the Full Hi-Fi Prototype