With the earlier Avatar prototype, users were often confused during testing because we never fully explained why they were providing data or what security and convenience they would gain in return. So this time we focused on a narrative: a storyboard the user reads through that shows the value of providing personal data to make Continuous Authentication work.
A Need Validation Storyboard
A Technical Prototype
Some secondary research we did early on covered biometric authentication, including Apple’s Face ID technology. We found that Face ID has usability issues (link), and that similar-looking family members can sometimes unlock each other’s phones: this 10-year-old kid can consistently unlock his mom’s phone (link)!
Two-factor authentication might solve some of these problems. We created an Android app that uses Microsoft’s Emotion API to look at a photo of a person and determine what they’re feeling.
It can tell whether a person’s expression reads as anger, contempt, disgust, fear, happiness, neutral, sadness, or surprise! Initially the prototype was built mostly to see how far we could get in one week. The first iterations only detected emotions in still images of people, but we eventually built a version that runs on live video using a TensorFlow tech stack. See more on the Project Website.
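To give a feel for how the prototype used the API: the Emotion API returned, for each detected face, a set of confidence scores over those eight emotions, and the app simply surfaced the highest-scoring one. A minimal sketch of that logic (the response shape follows the Emotion API docs as we used them; the sample values, file name, and key are illustrative placeholders, not real data):

```python
# The eight emotion categories the Emotion API scored for each detected face.
EMOTIONS = ["anger", "contempt", "disgust", "fear",
            "happiness", "neutral", "sadness", "surprise"]

def dominant_emotion(scores):
    """Return the highest-scoring emotion from an API-style scores dict."""
    return max(scores, key=scores.get)

# A real request looked roughly like this (endpoint and key are placeholders;
# Microsoft has since folded the Emotion API into the Face API):
#
#   import requests
#   resp = requests.post(
#       "https://westus.api.cognitive.microsoft.com/emotion/v1.0/recognize",
#       headers={"Ocp-Apim-Subscription-Key": "<your-key>",
#                "Content-Type": "application/octet-stream"},
#       data=open("photo.jpg", "rb").read())
#   faces = resp.json()

# Example of the per-face response shape (values are made up for illustration):
sample_face = {
    "faceRectangle": {"left": 68, "top": 97, "width": 64, "height": 97},
    "scores": {"anger": 0.01, "contempt": 0.0, "disgust": 0.0, "fear": 0.0,
               "happiness": 0.92, "neutral": 0.05, "sadness": 0.01,
               "surprise": 0.01},
}

print(dominant_emotion(sample_face["scores"]))  # happiness
```

The live-video version applied the same idea frame by frame, swapping the cloud call for a local TensorFlow model.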
- How can we best demonstrate how Continuous Authentication works and its value?
- How do people respond to being asked to provide their personal data?
- In a physical space, how can this technology be more convenient than Apple Pay, or even just using your card?
Number of User Tests: 9
We gathered feedback from 2 other Capstone teams and user-tested the prototypes with 9 users.
Users read a storyboard aloud that brought the context of use to life. We then conducted a detailed interview about how they felt and what they thought of the concepts.
Users had drastically different opinions on this concept. Some found the avatars adorable, while others felt insecure about sharing the app with another person. That sense of insecurity comes partly from not understanding what the app is doing and how Continuous Authentication works.
01.The avatar concept was perceived as adorable by the younger generation, while useless for the older generation.
"This would be really cute, and will make me want to purchase more."
"Creepy? This would be adorable."
02.An ideal convenient checkout experience should require low user attention
"Seems like a lot just for a payment app"
"When I think about payments, I don't want to think about payments"
03.Skepticism about the hand-off scenario (where the CA system can distinguish between different people holding the same phone)
"I don't know how that would work. Would a calibration process take place? How reasonable is that in real world usage?"
04.Friction in the payment process may be preferable
"I like a little bit of friction, otherwise I might overspend, I don't trust myself"
05.There are concerns with data being collected constantly, as well as the types of data
"What does it mean that it mimics her movements? Like her whole body? That's really creepy…" (on the avatar mirroring the user's gestures)
"I don't want to give up all my information at once. I don't want one app tohave all the access to this information."
06.Social factors play a large part in user adoption
"I would probably not [use this service] unless it became popular and ubiquitous"
07.Mixed reactions to facial data being collected; people weren't as squeamish about sharing their facial data as we had expected.
"I don't want my face videoed all the time. I'm more comfortable with camera usage on my phone than my computer."
"I don't mind, I feel like it's already out there, I don't care."
"I don't want the camera to be on all the time."
"From a tech perspective it adds a layer of reassurance"
08.Designing (cute) products that people don't associate with serious domains like finance could be problematic.
[On the avatar concept] “I don’t see this and say ‘ah yes, this is how I want to conduct my payments’. Maybe younger folks would be into this.”