Continuous authentication requires a wide array of data, including types of data that users are not accustomed to providing today. This prototype investigates consumers' sentiments on alternative methods of personal data collection and consent.
We designed and prototyped a convenient experience that encourages users to provide biometric and behavioural information, such as facial data, typing patterns, and location data, which can be used to continuously authenticate them. It uses various types of personalised avatars to provide a secure, fun, and familiar way to collect a user's biometric information.
- How do people respond to being asked to provide their personal data?
- How can we succinctly convey the value of continuous authentication to the average person?
- Can "logging in" be playful or fun?
Number of User Tests: 4
We interviewed users about their thoughts on the concept and validated needs, and conducted Think-Aloud testing while users went through our paper prototypes. (Think-Aloud testing, or lab usability testing using the think-aloud protocol, is a common technique for testing an interface: a researcher directly observes users while they try to accomplish a task, asking them to verbalise their thoughts as they go.)
- We found that language was the most important factor during user testing.
- Terms such as "accelerometer" mean nothing to non-technical people, and can create confusion or misunderstanding. In fact, one user ended up selecting Camera and Location, two permissions people are generally more hesitant to share, only because those were the ones they understood.
- Even the term 'Accuracy' isn't consumer-centric enough: users asked what 'Accuracy' meant and we needed to explain it to them.
- People preferred the Weak - Medium - Strong scale when choosing which sensor data to share with the app; it reinforces the mental model they already have from password creation.
- Showing a percentage instead was far more confusing.
- As for the character, users thought it was a fun and interesting idea, though some expressed concerns like, "What does this have to do with my finances?" and "Why do I want something cutesy during authentication?"
- We also asked which data users feel least comfortable sharing. Amongst the five, 'Screen' and 'Camera' seem to be the most personal and most sensitive.
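The preferred Weak - Medium - Strong scale could be implemented as a simple mapping from an internal confidence or coverage score to the three consumer-friendly labels. This is only an illustrative sketch; the function name and threshold values are assumptions, not taken from the prototype:

```python
# Hypothetical mapping from an internal score (0.0-1.0) to the
# Weak/Medium/Strong scale users preferred over raw percentages.
# Thresholds are illustrative assumptions, not the prototype's values.

def strength_label(score: float) -> str:
    """Translate an internal 0.0-1.0 score into a consumer-friendly label."""
    if not 0.0 <= score <= 1.0:
        raise ValueError("score must be between 0.0 and 1.0")
    if score < 0.4:
        return "Weak"
    if score < 0.7:
        return "Medium"
    return "Strong"

print(strength_label(0.82))  # -> Strong
```

Hiding the raw number behind a familiar label is exactly the kind of language change the testing above pointed toward.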
Collecting permissions by making an avatar happier is coercive and easily slips into a dark pattern
"A child could do this by accident, as if they're playing a game."

The section where the app asks for permissions, using a greyed-out figure that brightens up as permissions are granted (tapping the hands grants typing-pattern data; tapping the face grants facial data), was universally praised. We did, however, receive valuable feedback that it might be coercive, and that we should be careful when incentivising data sharing in this way.
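The interaction described above can be sketched as a small state model: each tappable avatar region maps to one permission, and the avatar's brightness grows with the fraction of permissions granted. The region names and the brightness formula here are hypothetical illustrations, not the prototype's actual implementation:

```python
# Hypothetical sketch of the avatar permission flow described above.
# Region-to-permission mapping and brightness formula are assumptions.

AVATAR_REGIONS = {
    "hands": "typing_patterns",  # tapping the hands grants typing-pattern data
    "face": "facial_data",       # tapping the face grants facial data
    "feet": "location",
    "eyes": "camera",
    "torso": "screen",
}

class Avatar:
    def __init__(self) -> None:
        self.granted: set[str] = set()

    def tap(self, region: str) -> None:
        """Toggle the permission linked to a tapped avatar region."""
        permission = AVATAR_REGIONS[region]
        if permission in self.granted:
            self.granted.discard(permission)  # tapping again revokes consent
        else:
            self.granted.add(permission)

    def brightness(self) -> float:
        """0.0 = fully greyed out, 1.0 = fully bright (all permissions granted)."""
        return len(self.granted) / len(AVATAR_REGIONS)

avatar = Avatar()
avatar.tap("hands")
avatar.tap("face")
print(avatar.brightness())  # 2 of 5 permissions granted -> 0.4
```

Making revocation as easy as granting (tapping again) is one way to blunt the coercion concern raised in testing, since the brighter avatar is then a status display rather than a one-way reward.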