
With Google Cloud’s AI tools, students make user research more user-friendly

A new service, smileML, draws on a facial-expression recognition algorithm so product developers can analyze user responses in real time.

In the spring of 2017, Shannon Anderson and Ethan Petersen were developing a mobile app, Guess Less, to help people share clothing sizes and gift ideas. Anderson had conceived the idea after a boyfriend mentioned that he never knew what to buy her for the holidays. “It was also frustrating playing phone tag with family members when shopping for nieces and nephews for the holidays,” she says. “I thought it would be nice if all my friends and family had Pinterest boards with things they like, and their sizes, so I’d buy the right size. I searched all over the App Store and there wasn’t anything that accomplished this. So I asked Ethan if he wanted to work with me to make an app for this problem.” Since then, she and Petersen have been getting great feedback.

The friends met as undergraduates in Terre Haute, Indiana, where Petersen studied at the Rose-Hulman Institute of Technology and Anderson at Indiana State University, and they shared an interest in business and technology. To test their app, they set up an elaborate protocol for gathering user feedback.

“We would video-call with friends and family and watch them go through the app step-by-step,” Anderson says. “This was very time-consuming and also made it difficult to get an accurate sample of our user base. We realized how tedious traditional market research can be, so we decided to create a solution.”

The duo wanted to see their users' interactions, as well as their reactions, and realized that wasn't possible by simply analyzing crash reports and logs. The typical market research process was expensive and time-consuming, and the potential for bias was problematic. And so smileML was born: a way to obtain genuine user feedback, in real time, by using AI to capture human sentiment in mobile apps.

Within a few months, Anderson and Petersen built a prototype: a software development kit (SDK) that lets app developers conduct market research by using machine learning to record, interpret, and analyze users' emotional responses during testing. For now, the SDK supports only iOS apps, but the duo plans to expand to Android and the web.

The SDK records the user through the device's front-facing camera every quarter-second. Training data comes from an iOS game the pair developed called MatchMoji, in which players try to match their facial expressions to emojis. When played in training mode, the app uploads data to Google Cloud, where a machine-learning algorithm running on Google Compute Engine tests it for accurate recognition. As users respond during testing, the results are streamed through BigQuery for real-time display on a dashboard, so UX researchers can analyze responses such as “the user was 70% happy on XYZ screen at 11:15:32 pm.”
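To make the dashboard step concrete, here is a minimal sketch of how per-frame emotion scores could be streamed into BigQuery with the google-cloud-bigquery Python client. The project, table, and field names are illustrative assumptions, not smileML's actual schema.

```python
# Hypothetical sketch: streaming per-frame emotion scores into BigQuery so a
# dashboard can show results like "the user was 70% happy on XYZ screen".
# The table ID and field names below are assumptions, not smileML's pipeline.
from datetime import datetime, timezone

from google.cloud import bigquery

client = bigquery.Client()
TABLE_ID = "my-project.user_testing.emotion_frames"  # assumed table


def stream_frame_result(session_id: str, screen: str, scores: dict) -> None:
    """Insert one quarter-second frame classification as a BigQuery row."""
    row = {
        "session_id": session_id,
        "screen": screen,
        "captured_at": datetime.now(timezone.utc).isoformat(),
        # scores is a dict like {"happy": 0.70, "neutral": 0.22, "sad": 0.08}
        **scores,
    }
    errors = client.insert_rows_json(TABLE_ID, [row])
    if errors:
        raise RuntimeError(f"BigQuery streaming insert failed: {errors}")


# Example row behind "the user was 70% happy on XYZ screen"
stream_frame_result("session-42", "XYZ", {"happy": 0.70, "neutral": 0.22, "sad": 0.08})
```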

The training pipeline runs on Google Compute Engine, and at scale the pair plans to move to TPU-backed Compute Engine VMs to train their emotion recognition models. To protect privacy, the SDK downloads the best model to the user's device, and results are streamed to Firebase, Google’s mobile app development platform. Anderson explains that “we decided to use Firebase because it was easy for me to pick up and design our database architecture, and it was also very affordable.” Petersen adds that “Google Cloud allows us to automate a lot of tedious processes. And we can scale once we need to, which takes away a lot of stress.”
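On the results side, a backend service could append classification results to Firebase with the firebase_admin Python SDK, roughly as sketched below. The service account file, database URL, and node layout are assumptions for illustration, not smileML's actual setup.

```python
# Hypothetical sketch: pushing classification results to the Firebase Realtime
# Database from a backend service. Credentials path, database URL, and node
# names are assumptions made for this example.
import firebase_admin
from firebase_admin import credentials, db

cred = firebase_admin.credentials.Certificate("service-account.json")  # assumed key file
firebase_admin.initialize_app(cred, {
    "databaseURL": "https://smileml-demo.firebaseio.com"  # assumed database URL
})


def push_result(session_id: str, screen: str, scores: dict) -> None:
    """Append one emotion-classification result under the session's node."""
    db.reference(f"sessions/{session_id}/results").push({
        "screen": screen,
        "scores": scores,  # e.g. {"happy": 0.70, "neutral": 0.22, "sad": 0.08}
    })


push_result("session-42", "XYZ", {"happy": 0.70, "neutral": 0.22, "sad": 0.08})
```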

Combining the power of artificial intelligence with the power of human emotion

With smileML, product developers can see exactly how users respond to a mobile product in real time.

“Getting user feedback is an incredibly difficult problem,” says Petersen. “Surveys exist to get simple insights at scale, and people can be brought in for user testing in small groups. But there’s nothing in between. We’re building a service powered by emotion recognition to capture reactions in mobile applications. These reactions can inform product managers and UX researchers about user behavior much more extensively than surveys and at a much larger scale than usability testing.”

Yet Anderson and Petersen are also committed to protecting user privacy and to building relationships with users through transparency and trust.

“Google Cloud allows us to automate a lot of tedious processes. Further, we can scale once we need to and that takes away a lot of stress.”

Ethan Petersen, Ph.D. Candidate in Computer Science, Indiana University

Connecting UX designers and developers with their users

Petersen and Anderson hope to launch to the public in early 2019, but in the meantime they are working closely with the UX community, interviewing dozens of UX researchers who are excited about the app’s benefits. Petersen says that “one product designer we’ve worked with said that even though he tries very hard to remain neutral in user interviews, it’s hard to completely get rid of a confirmation bias. For him, our tool would act as a benchmark to make sure the reactions he’s seeing are really there and not wishful thinking.”

Anderson notes that “smileML allows users to participate in usability testing on their own time, from the comfort of their own homes. It also makes it significantly easier for companies to reach their users in different parts of the world.” At some point, the duo would like to integrate a chatbot to act as an AI UX Researcher, prompting users with questions after sudden positive or negative expressions—which register as emotional “spikes” on a graph.
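As a rough illustration of the “spikes” idea, the snippet below flags frames where the happiness score jumps sharply between consecutive quarter-second readings. The threshold and sample scores are made up, and this is not smileML's implementation.

```python
# Illustrative sketch (not smileML's implementation): flag emotional "spikes"
# by comparing consecutive quarter-second happiness scores against a threshold.
def find_spikes(scores, threshold=0.3):
    """Return (index, direction) pairs where the score jumps sharply.

    scores: list of happiness values in [0, 1], one per quarter-second frame.
    threshold: minimum absolute change that counts as a spike (assumed value).
    """
    spikes = []
    for i in range(1, len(scores)):
        delta = scores[i] - scores[i - 1]
        if abs(delta) >= threshold:
            spikes.append((i, "positive" if delta > 0 else "negative"))
    return spikes


# A sudden drop at frame 3 and a recovery at frame 5 would each trigger a follow-up prompt.
print(find_spikes([0.60, 0.62, 0.61, 0.20, 0.25, 0.70]))  # [(3, 'negative'), (5, 'positive')]
```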

Meanwhile, they’ve joined the Google Cloud for Startups program, and Petersen is starting a Ph.D. program in Computer Science at Indiana University, where he hopes to launch a career building cooperative software that understands its owner’s goals and emotions. For Anderson, “being able to help others is what motivates me to learn. I love developing apps to help people and business owners with everyday problems.”

“Being able to help others is what motivates me to learn. I love developing apps to help people and business owners with everyday problems.”

Shannon Anderson, Indiana State alumna
