Amy Roberts

Copyright © 2016 Amy Roberts





Functional prototype, UI, and video


2016 (3 weeks)


Prototyping course, University of Washington


UX Design, UI Design, Identity


Xiao Yan
Chris Chung
Catherine Jou
Amy Roberts


Patients can feel isolated from their loved ones while in the hospital. Nausea, pain, and reduced mobility can all impair their ability to use mobile technology. Their cell phones can harbor harmful bacteria and can spread contagious diseases.


Convey is a phone dock that helps patients communicate with their loved ones by providing a form that is easy to touch and a voice-controlled interface for making and receiving calls.


I worked as part of a four person team to create this prototype. My role included designing the logo and user interface, investigating real-time digital prototyping methods, manipulating the interface wirelessly for user testing, and creating the poster and other visual artifacts.


Convey is a phone dock that helps patients communicate with their loved ones by providing a form that is easy to touch and a voice-controlled interface for making and receiving calls. Convey allows users of varying abilities to easily communicate through a simple user interface, touch input, and voice commands. The plastic enclosure of Convey keeps the phone isolated, preventing the spread of bacteria and contagious illnesses. When an incoming call is received, Convey pulsates softly with soothing ambient light.


Design Goals

We had three main design goals for our prototype:


Accessible

People in the hospital have varying abilities, so we wanted to make the phone dock as accessible as possible so they could easily communicate with their loved ones. We would achieve this by using a simple user interface, touch input, and voice commands.

Emotionally Comforting

We wanted to make the phone dock comforting to use for patients who might feel otherwise very sick or weak. We would achieve this by using calming lighting effects that change with the patient’s interactions.


Hygienic

We wanted the phone dock to be hygienic for the hospital setting. We would achieve this by enclosing the phone in plastic, keeping it isolated and preventing the spread of bacteria and contagious illnesses.



Our work included interface design, physical fabrication, and Arduino development. The prototyping methods we used were high-fidelity wireframing, laser cutting, 3D printing, and Arduino. We divided this work among team members at the start, then brought it all together.

This involved fitting the Arduino components into the physical prototype, testing the lighting and sound, loading and syncing the interface on the iPhone, and placing the iPhone inside the enclosure. Once the prototype was assembled and working well, we moved on to shooting the video collaboratively, then to the final editing and production.


3D Printing

We chose to focus on the look and feel of our prototype by creating a 3D model and 3D printing it so users could touch our design. The concept was built in SolidWorks and printed on a MakerBot.

We made our model translucent to allow light to pass through. This was essential to the overall finish of our prototype, as the LED lighting inside provides ambient visual feedback through the glow of the translucent plastic.


User Interface

We wanted to create a user interface that was easily visible and simple enough to use for someone in bed at a hospital, who might have an illness or motor impairment. In addition to this, we wanted to make something that would complement both voice and touch interactions, making things easier for patients of varying ability who might not be able to easily communicate.

We created a user interface to simulate an outgoing call and an incoming call. In our design, we used large text and contrast to provide clarity for the user. Simple diagrams aid the user in understanding how the touch interactions work and are large enough for them to view through the acrylic screen. During calls, a comforting image of their loved one is displayed.

We controlled the interface with an app called Skala. It connects wirelessly to Photoshop, allowing us to remotely toggle layers on and off to simulate an interactive UI.



Since the target users of our prototype were patients in the hospital, we wanted it to provide clearly distinguishable states through audio and visual feedback. However, we did not want the feedback to be so strong that it would distract the user. We decided on LEDs for visual feedback and a buzzer for audio feedback.

We used a light sensor as a trigger to detect when a user had touched and activated the device. The sensor could detect a hand touching the prototype through the translucent material, since the hand blocks much of the light reaching the sensor. Because lighting conditions vary, we took several light readings each time the device was turned on and averaged them to establish a baseline and threshold. Conditionals then determined when to turn on the LEDs and buzzer, based on the change in light read by the sensor. For our lighting, we used an LED strip wrapped around the base of the prototype and a powerful BlinkM LED at the top.

Six stages of interaction

  1. Hand touches shell to wake up device
  2. Hand is removed and device is awake
  3. Hand touches shell to initiate call
  4. Hand removed from shell and call initiated
  5. Hand touches the shell to hang up
  6. Hand removed from shell and call terminated. Device goes back to sleep.



The storyline of our concept video follows the storyboard we created. The story has two parts: in the first, the patient initiates a phone call with his family; in the second, the patient answers a call from his family.

We shot our video in a medical training room on campus in order to mimic a real hospital context. We shot three sets of footage of the whole story from three different angles, highlighting the overall medical setting, the patient's movements and facial expressions, and interactions with the product. We designed our video to tell a story and introduce the product so the audience can easily understand its usage and functionality.


Our filming location


We wanted our logo to embody simplicity and clarity to express the accessibility of our prototype. We designed a simple, rounded image of a hand reaching out and rounded the ends of a sans-serif typeface. The blue color is soothing while still reflecting the hygienic nature of the medical field.


Rendering and Poster

We created a realistic rendering of the form in V-Ray and placed it in a hospital setting to create context. This served as the header of our poster, which we designed for our demo to help communicate what our prototype is and how it works.



Throughout our prototyping process, we conducted informal tests with others to gauge the effectiveness of our prototype. The insights we gained by watching people interact with our prototype gave us some direction for our next iteration.

We evaluated our final prototype by conducting three user tests with strangers. After giving users a brief overview of how Convey works, we gave them instructions for making a call. As the user interacted with the phone dock, we had someone sitting behind a laptop changing the screens accordingly.

Make a call

  1. Touch Convey to turn on
  2. Give a voice command to Call Catherine
  3. Touch Convey to confirm call
  4. Talk to Catherine
  5. Touch Convey to hang up

We then asked each person a short series of questions to learn their thoughts on the prototype and what they liked and didn't like about the experience. Because we were not able to test with actual patients, we opened with questions about their previous hospital experiences to better frame the interview.



Interactions with form

When we conducted the user tests, the first thing we noticed was that it wasn't always intuitive how to activate the device. This was surprising, since the top is the best place for a user to touch the device: the curves at the top conform to the user's hands, and the top is the largest exposed portion of the prototype, as the front is blocked by the acrylic screen. However, once we explained it afterwards, users were immediately able to get it.

Overall Experience

When asked about their overall experience interacting with the prototype, we received generally positive feedback. Many users commented on the form of the device, from the curved aesthetic to the visual feedback through the LED strip. One suggestion was to eliminate the touch-to-confirm step when initiating a call; some users felt it was unnecessary given voice control. Unfortunately, due to the limitations of the Arduino, we had to use the user's touch as a trigger to differentiate between states, so that we could advance to the correct sequence when prompted by the user's action. As a result, we could not remove the touch to confirm unless we also removed the visual and audio feedback from the LEDs and buzzer during this stage.

Could this be used in a hospital?

With regard to the hospital experience, we found that the users had varying levels of experience yet shared the same mindset. They viewed an extended hospital stay as negative because of the isolation from their normal social lives. One pain point was the hospital environment itself, seen as unfriendly and sometimes artificial, with everything colored white and surrounded by medical equipment. One possibility for improvement was making the experience more personal. Users felt that the Convey prototype was a great way of making a hospital room more personal without having to make too many changes. The light and sound feedback was a welcome change from the normally dull hospital room.

Future Work

Our prototype was an effective test in demonstrating that a phone dock could be used to easily make and receive calls through voice and touch input. Through our user testing, we found that the form was ergonomic to touch and that the simplicity of the user interface, paired with the voice and touch commands, made the prototype easy to use.

While our prototype was effective in demonstrating the basic interactions of the phone dock, we would probably need to make an additional iteration integrating real-time voice interactions for a more complete behavioral prototype before moving forward. After making these changes, we could move on to testing our prototype with patients in the hospital.

The next steps would be to conduct more in-depth testing to specifically address each of our design goals: accessible, emotionally comforting, and hygienic. Testing users of varying abilities (motor impairments, weak grip, soft voice, blindness, or another disability) would help us tweak the features to make our prototype easier to use. A diary study would allow us to track and gauge emotional comfort received from the device over a period of time. The hygienic aspect would be the most difficult thing to test, and we would likely have to work with medical professionals on this.

While further testing would reveal clearer design directions, some ideas for future work involve investigating whether UV lights could be used to sanitize the phone dock, integrating a video chat option for a more realistic experience, and adding features for patient entertainment.