In the spring of 2017, Shannon Anderson and Ethan Petersen were developing a mobile app, Guess Less, to help people share clothing sizes and gift ideas. Anderson had conceived the idea after a boyfriend mentioned that he never knew what to buy her for the holidays. “It was also frustrating playing phone tag with family members when shopping for nieces and nephews for the holidays,” she says. “I thought it would be nice if all my friends and family had Pinterest boards with things they like, and their sizes, so I’d buy the right size. I searched all over the App Store and there wasn’t anything that accomplished this. So I asked Ethan if he wanted to work with me to make an app for this problem.” Since then, she and Petersen have been getting great feedback.
The friends met as undergraduates in Terre Haute, Indiana—Petersen at the Rose-Hulman Institute of Technology, and Anderson at Indiana State University—and shared an interest in business and technology. To test their app, they set up an elaborate protocol for gathering user feedback.
“We would video-call with friends and family and watch them go through the app step-by-step,” Anderson says. “This was very time-consuming and also made it difficult to get an accurate sample of our user base. We realized how tedious traditional market research can be, so we decided to create a solution.”
The duo wanted to see their users' interactions, as well as their reactions, and realized that simply analyzing crash reports and logs couldn't show them that. The typical market research process was expensive and time-consuming, and the potential for bias was problematic.
And so smileML was born: a way to obtain genuine user feedback in real time by using AI to capture human sentiment in mobile apps.
Within a few months, Anderson and Petersen built a prototype: a software development kit (SDK) that lets app developers conduct market research by using machine learning to record, interpret, and analyze emotional responses during user testing. For now, the SDK supports only iOS apps, but the duo plans to expand to Android and the web.
The SDK works by recording user data through the device's front-facing camera every quarter second. It collects training data through an iOS game the pair developed called MatchMoji, where players try to match their facial expressions to emojis. When played in training mode, the app uploads data to Google Cloud and runs it through a machine-learning algorithm on Google Compute Engine that tests for accurate recognition. As users respond during testing, the results are streamed through BigQuery for real-time display on a dashboard. UX researchers can analyze responses such as “the user was 70% happy on XYZ screen at 11:15:32 pm.”
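To give a feel for what that quarter-second sampling loop might look like from inside an app, here is a minimal Swift sketch. The `SentimentEvent` fields, the `EmotionClassifier` protocol, and the capture callbacks are illustrative assumptions for this example, not smileML's actual API.

```swift
import Foundation
import CoreGraphics

// Hypothetical sentiment reading, mirroring the kind of result described above,
// e.g. "the user was 70% happy on XYZ screen at 11:15:32 pm".
struct SentimentEvent: Codable {
    let screen: String      // which app screen was on display
    let timestamp: Date     // when the frame was captured
    let happiness: Double   // 0.0–1.0 confidence that the user looked happy
}

// Illustrative stand-in for an on-device emotion model (an assumption, not the real SDK).
protocol EmotionClassifier {
    func score(_ frame: CGImage) -> Double
}

final class SentimentSampler {
    private let classifier: EmotionClassifier
    private var timer: Timer?
    var onEvent: ((SentimentEvent) -> Void)?

    init(classifier: EmotionClassifier) {
        self.classifier = classifier
    }

    // Sample a front-camera frame every quarter second, score it, and emit an event.
    func start(currentScreen: @escaping () -> String,
               captureFrame: @escaping () -> CGImage?) {
        timer = Timer.scheduledTimer(withTimeInterval: 0.25, repeats: true) { [weak self] _ in
            guard let self = self, let frame = captureFrame() else { return }
            let event = SentimentEvent(screen: currentScreen(),
                                       timestamp: Date(),
                                       happiness: self.classifier.score(frame))
            self.onEvent?(event)
        }
    }

    func stop() {
        timer?.invalidate()
        timer = nil
    }
}
```

Timestamped records in roughly this shape are what could then flow downstream for the kind of real-time dashboard the article describes.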
The training model runs on Google Compute Engine, and at scale the pair plans to move to TPU-based Compute Engine VMs to train their emotion-recognition models. To protect privacy, the SDK downloads the best model to the user's device, and results are streamed to Firebase, Google’s mobile app development platform. “We decided to use Firebase because it was easy for me to pick up and design our database architecture, and it was also very affordable,” Anderson explains. “Google Cloud allows us to automate a lot of tedious processes,” Petersen adds. “And we can scale once we need to, which takes away a lot of stress.”
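As a rough illustration of the streaming side, the sketch below writes a single sentiment reading to the Firebase Realtime Database from Swift. The `sessions/{id}/events` path and the field names are assumptions made for the example, not smileML's real schema.

```swift
import FirebaseCore
import FirebaseDatabase

// Send one sentiment reading to Firebase so a dashboard can pick it up in real time.
// The "sessions/{sessionID}/events" path and field names are illustrative assumptions.
func streamEvent(sessionID: String, screen: String, happiness: Double) {
    let ref = Database.database()
        .reference()
        .child("sessions")
        .child(sessionID)
        .child("events")
        .childByAutoId()

    let payload: [String: Any] = [
        "screen": screen,                    // which app screen was showing
        "happiness": happiness,              // model confidence, 0.0–1.0
        "timestamp": ServerValue.timestamp() // let Firebase stamp the server time
    ]
    ref.setValue(payload)
}

// Somewhere in app startup:
// FirebaseApp.configure()
// streamEvent(sessionID: "demo-session", screen: "CheckoutScreen", happiness: 0.7)
```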