Milestone 4: Prototype, Presentation & Final Demo
System Concept & Architecture
As seen in the diagram below, our system consists of features on both the internal backend and the user-facing side of the product. On the backend, our product uses artificial intelligence to recognize and respond to a user's voice commands. It draws on user input to learn their common habits, substitutions, allergies, and preferences in order to better assist them in the cooking process. The internal architecture also includes a recipe database that serves as a search engine, generating recipe ideas for the user. Finally, the backend supports Bluetooth, which lets the user easily connect their cooking assistant to Wi-Fi, pair it with their smartphone and contacts, and share recipes instantly.
The user-facing side of the product features a visual display, a light, and a combined speaker and microphone. The visual display presents the user with recipe results, recipe detail pages, and recipe reviews. The speaker serves as a voice assistant, providing the user with audio feedback such as ingredient substitutions and recipe instructions; the same unit also acts as a microphone so the user can talk to the assistant. Lastly, there is a circular light at the center of the speaker, indicated by the blue circle in the diagram. This light gives the user visual feedback about the current state of the device: purple means the device is on standby, light blue means it is processing user input, dark blue means it is speaking, and yellow means it is listening and waiting for the user to respond. This helps guide the user through voice interactions with the product.
Our goal for the demo is to highlight several main interactions that users would have with Sous Chef when using it to prepare a meal. In this way, the demo reveals how our product provides a personalized experience to guide users step-by-step throughout the cooking process in a hands-free way. The core interactions demonstrated in the demo video include the following:
Onboarding users by learning their dietary preferences, restrictions, and/or allergies
Helping users choose a recipe based on available ingredients and other preferences/restrictions
Providing instruction through visual and audio feedback to guide users through executing a recipe
Social affordances that connect users to friends and family by sharing recipes and cooking experiences
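The second interaction above, choosing a recipe from available ingredients and restrictions, can be sketched in code. This is a minimal, hypothetical illustration (the function, recipe data, and ranking rule are assumptions, not Sous Chef's actual implementation):

```python
# Hypothetical sketch: rank recipes by how many of the user's available
# ingredients they use, excluding any recipe that violates a restriction.
def suggest_recipes(recipes, available, restrictions):
    available = {i.lower() for i in available}
    restricted = {r.lower() for r in restrictions}
    results = []
    for recipe in recipes:
        ingredients = {i.lower() for i in recipe["ingredients"]}
        if ingredients & restricted:
            continue  # skip recipes containing a restricted ingredient
        matches = len(ingredients & available)
        if matches:
            results.append((matches, recipe["name"]))
    # Most ingredient matches first
    return [name for matches, name in sorted(results, reverse=True)]

recipes = [
    {"name": "penne alla vodka", "ingredients": ["penne", "tomato", "cream", "shallot"]},
    {"name": "peanut noodles", "ingredients": ["noodles", "peanut", "soy sauce"]},
]
print(suggest_recipes(recipes, ["penne", "tomato", "cream"], ["peanut"]))
# ['penne alla vodka']
```

A production system would also need fuzzy ingredient matching and user ratings, but the core filter-then-rank structure would look similar.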
The demo is also meant to illustrate the product's multimodal feedback. The device is voice-activated, includes a digital display, and gives visual cues through light whenever a user speaks to their Sous Chef. In the demo, users interact with Sous Chef and receive these different forms of audio and visual feedback, which guide them through the cooking process. Overall, our demo shows how Sous Chef is an innovative device that enhances the cooking experience for a wide variety of users.
Sofia is looking to cook dinner using the ingredients available in her fridge. She gives Sous Chef four or five ingredients from her fridge, and the device then generates a list of recipes that she can choose from. Sofia views the recipe results on Sous Chef's visual display. Based on the ratings of other users, Sofia decides to make the penne alla vodka, but realizes she is out of shallots. Sous Chef presents her with substitutions and the next steps in the cooking process. The visual light feedback from Sous Chef helps her know whether the device is processing her request or waiting for a command. When Sofia is done cooking, she is impressed by the end results. She decides to use Sous Chef to share the recipe with her friend, Orly. Orly receives the recipe from Sofia and decides to make the same dish. Sous Chef adjusts the measurements of the ingredients based on the number of people Orly is cooking for, and gives her step-by-step instructions. She is impressed with the end results and posts a picture of her penne alla vodka on Instagram. Anandita sees Orly's Instagram post and decides to buy the Sous Chef. The device arrives, and she begins to set up Sous Chef by stating her food preferences and dietary restrictions. Finally, Anandita finishes setting up her device and starts cooking.
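The measurement adjustment Orly experiences can be sketched as simple proportional scaling. This is a hypothetical helper for illustration, with made-up quantities, not the product's actual code:

```python
def scale_ingredients(ingredients, base_servings, target_servings):
    """Scale each ingredient quantity proportionally to the serving count."""
    factor = target_servings / base_servings
    return {name: round(qty * factor, 2) for name, qty in ingredients.items()}

# Recipe written for 4 servings, scaled down for 2 people
base = {"penne (g)": 400, "vodka (ml)": 60, "cream (ml)": 240}
print(scale_ingredients(base, base_servings=4, target_servings=2))
# {'penne (g)': 200.0, 'vodka (ml)': 30.0, 'cream (ml)': 120.0}
```

Real recipes would also need sensible rounding for count-based items (e.g. you can't use 1.5 eggs), which is one reason voice confirmation of adjusted quantities matters.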
The prototyping process was challenging since we were working remotely and each had to build our own prototype. That said, we chose to use common household products and class materials so that we could build nearly identical prototypes. The first step in prototyping was to sketch how the audio, visual, and light feedback would come together to create Sous Chef.
The next step was to write the code to drive the NeoPixel, which provides the visual light feedback. Once the breadboard was wired correctly and the code was running, we were able to receive visual feedback on the NeoPixel. This feedback lets the user understand the current status of the Sous Chef (i.e., listening, processing a request, waiting for a response, or standby).
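The status-to-color logic behind this step can be sketched without hardware. On the actual breadboard the colors would be sent to the NeoPixel through a driver library such as Adafruit's; the RGB values below are illustrative guesses, not the values we used:

```python
# Hardware-free sketch of the status-to-color mapping for the light ring.
# On a real device, the returned tuple would be written to the NeoPixel
# via a library call (e.g. Adafruit's setPixelColor/show pattern).
STATUS_COLORS = {
    "standby":    (128, 0, 128),    # purple
    "processing": (100, 180, 255),  # light blue
    "speaking":   (0, 0, 139),      # dark blue
    "listening":  (255, 200, 0),    # yellow
}

def color_for(status):
    """Return the RGB tuple shown for a device status; default to standby."""
    return STATUS_COLORS.get(status, STATUS_COLORS["standby"])

print(color_for("listening"))  # (255, 200, 0)
```

Keeping the mapping in one table makes it easy to adjust colors after user testing without touching the rest of the device logic.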
We then built our physical prototype, placing the NeoPixel into a paper towel roll and wrapping it in tinfoil (as seen in the image below). This assembly serves as the speaker and microphone for the voice assistant. We attached it to the tablet that displays the recipes, using screens we made in Figma. Lastly, we pre-recorded the audio for the voice interactions and followed the script to complete our demo.
The demo captured key elements of the intended experience, including onboarding, recipe suggestions, recipe instructions, and social affordances. The system gives audio, visual, and light feedback when interacting with users, helping them better leverage the tool while cooking. The demo shows the different light displays tied to each interaction: listening (yellow light), processing input (light blue light), device response (solid dark blue light), and standby (solid purple light). The device also has a digital display that provides additional visual output, such as recipe suggestions, instructions, and prompts that help personalize the user experience. Primarily, the demo features voice-activated interactions and a step-by-step guide throughout the cooking process, allowing for a hands-free digital assistant in the kitchen. This is shown by having the user interact with the product to receive recipe results, execute a recipe, send recipes, ask clarifying questions, and go through the onboarding process.
Insights & Takeaways
While preparing the demo, we confirmed the importance of having both audio and visual feedback within our assistant. Based on standard conventions of voice assistants, the visual feedback through the use of light provides a clear indication as to the current status of the assistant. While the main purpose of the Sous Chef is to provide auditory guidance to users in the kitchen, having more information about the product's status allows for more intuitive interactions.
Additionally, the demo reflected the importance of ensuring that the user feels in control of the process when using Sous Chef. This helped confirm multiple design decisions we included to empower the user and provide more freedom in their interactions. For example, we decided to include a touch screen display to give users the ability to scroll through recipes themselves or let the device auto-scroll. We also chose to build in several commands, such as "repeat that step" and "slow down", which allow the user to control the pace of their interactions with Sous Chef. This way, users' needs are accommodated, and they can personalize their guidance through the entire cooking process.
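Pacing commands like these can be sketched as a small dispatcher over recipe steps. The class, command phrases, and speech-rate factor below are hypothetical illustrations of the idea, not the product's implementation:

```python
# Hypothetical sketch of how pacing commands might control recipe playback.
class RecipePlayer:
    def __init__(self, steps):
        self.steps = steps
        self.index = 0
        self.speech_rate = 1.0  # relative speaking speed

    def current_step(self):
        return self.steps[self.index]

    def handle_command(self, command):
        if command == "repeat that step":
            return self.current_step()   # re-read without advancing
        if command == "slow down":
            self.speech_rate *= 0.8      # speak 20% slower from now on
            return self.current_step()
        if command == "next step":
            self.index = min(self.index + 1, len(self.steps) - 1)
            return self.current_step()
        return "Sorry, I didn't catch that."

player = RecipePlayer(["Boil the penne.", "Simmer the sauce."])
print(player.handle_command("next step"))         # Simmer the sauce.
print(player.handle_command("repeat that step"))  # Simmer the sauce.
```

The key design point is that every command leaves the user in a known place in the recipe, which is what makes the interaction feel controllable.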