The good news is that 85 percent of people will likely head back to stores once restrictions are lifted. However, most will only return to stores with reservations and other precautions to ensure social distancing.
The bad news is that 45 percent of people will likely not be willing to touch a digital screen in a public place following the pandemic. The vast majority of screen interactions in a physical store environment are done with our hands. During the pandemic, and for the foreseeable future, we risk losing almost half of our users right at the onset of our experiences.
How Do People Feel About Touching Screens?
To understand how safe people feel using a touchscreen, we tested consumer reception of the following touchscreen additives:
- Hand sanitizer
- Personal stylus
- Self-cleaning screens
- Phone as controller
- Gesture control
- Voice control
- Eye tracking that controls the screen
The research shows that providing hand sanitizer and the ability to control the screen with a personal phone or personal stylus increased the likelihood of interaction, as did self-cleaning screens.
Key Results Summary
Hand sanitizer aside, consumers are most likely to interact with a screen by using their personal device as a controller.
Our findings revealed that 92 percent of respondents found it easy to use their personal device to control a public screen, and 71 percent of this group indicated having previous experience doing so. They rated both comfort and satisfaction with this interaction at eight out of 10.
The top concerns for users centered on a misconception that they needed to download an app, concerns about data security, and a lack of understanding of QR codes. We believe an elegant onboarding screen can significantly reduce these concerns. We will go deeper into this interaction in a bit.
Voice and Gesture Control
Aside from using their personal device, two other screen interfaces have begun to take shape as best practices across the industry: voice and gesture control. Despite significant pitfalls, if designed properly both can work well in a physical store environment.
Spurred on by technology such as Apple’s Siri, Amazon’s Alexa, and Google Assistant, speech recognition has become a very mature platform. We have used Google’s Dialogflow software to create both prototypes and production-ready versions of voice control in stores. While the technology reduces concern over touching germs on a shared screen, there are significant roadblocks to overcome:
- Understandable fear of sounding awkward or annoying in public
Talking to a screen in public feels strange to most users. We have had success minimizing this effect by introducing on-screen prompts. These prompts show what someone can say or ask—this gives them guidelines while also making the interaction look normal to nearby customers.
- Privacy issues divulging personal information aloud
Simply stated, it is not wise to ask a customer to announce their sizes to a screen in the middle of a store. That said, we have had a bit more success asking for personal information in a more private setting such as a fitting room.
- Potential interference of outside noise
A busy public space with a lot of background noise makes it difficult for voice recognition software to do its job. Prototypes with directional microphones and screen placement in low foot traffic areas have proven to alleviate noise interference.
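One practical way to combine the on-screen prompts with noise resistance is to accept only transcriptions that both come back with reasonable recognition confidence and closely match a prompt the screen is currently showing. The sketch below is illustrative only (the prompt list, thresholds, and function names are our assumptions, not a specific product's API):

```python
from __future__ import annotations
import difflib

# Hypothetical prompts displayed next to the microphone icon.
PROMPTS = [
    "show me new arrivals",
    "find my size",
    "add to bag",
    "checkout",
]

def match_prompt(utterance: str, confidence: float,
                 min_confidence: float = 0.6,
                 min_similarity: float = 0.7) -> str | None:
    """Map a speech-recognition result to an on-screen prompt.

    Low-confidence transcriptions (often background chatter) are
    rejected outright; otherwise the utterance is fuzzily matched
    against the prompts the screen is currently displaying.
    """
    if confidence < min_confidence:
        return None  # likely noise: ignore rather than misfire
    utterance = utterance.lower().strip()
    best = difflib.get_close_matches(utterance, PROMPTS, n=1,
                                     cutoff=min_similarity)
    return best[0] if best else None
```

Restricting recognition to the visible prompts has a second benefit: it reinforces the guidance that makes talking to the screen feel normal in the first place.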
Like voice recognition, gesture control systems have come a long way since Microsoft’s Kinect first introduced them to consumers in 2010.
To prototype and build gesture experiences in both retail spaces and theme parks, we use Microsoft’s Azure Kinect SDK. The technology works incredibly well in theme parks. Why? Because visitors are mentally prepared for fun and new experiences. In a retail setting, our research indicates that while the technology is interesting, customers do not believe it will work.
We tested two types of systems. One reacts to where the eyes are looking on screen to progressively dive deeper into content; the other reacts to a mid-air swiping motion to move from section to section in an experience. Eye tracking was met with skepticism about privacy, as people incorrectly assumed the system used facial recognition. The swiping experience raised concerns about looking silly in public. Both concerns could be eased with privacy language on the screen and by gamifying the swipe to make it fun.
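At its core, the swipe interaction reduces to classifying a short burst of hand-position samples from a body-tracking SDK. The sketch below shows the idea with displacement and duration thresholds; the sample format and threshold values are illustrative assumptions, not Azure Kinect SDK code:

```python
from __future__ import annotations
from dataclasses import dataclass

@dataclass
class HandSample:
    t: float  # seconds since tracking started
    x: float  # horizontal hand position in meters, screen-space

def detect_swipe(samples: list[HandSample],
                 min_distance: float = 0.30,
                 max_duration: float = 0.60) -> str | None:
    """Classify a burst of hand samples as a left or right swipe.

    A swipe is a horizontal displacement of at least min_distance
    meters completed within max_duration seconds. The thresholds
    here are placeholders; real values are tuned per installation.
    """
    if len(samples) < 2:
        return None
    duration = samples[-1].t - samples[0].t
    displacement = samples[-1].x - samples[0].x
    if duration > max_duration or abs(displacement) < min_distance:
        return None
    return "right" if displacement > 0 else "left"
```

Deliberately requiring a large, fast motion is what makes room for gamification: big gestures that would feel silly unprompted feel natural once the screen frames them as play.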
Phone as the Controller for Touchless Screen Interactions
The most effective and popular touchless interface lets the customer simply use their phone to control the public device. However, there is concern from both brands and customers about having to download an app to control a screen in a retail environment. The moment is too fleeting to ask a customer to invest the time and device space in an app so niche in its offering.
The good news? An app download is unnecessary. With a simple QR code and clear messaging on how to use it, any customer with a smartphone can participate effortlessly. Scanning the code with the phone camera opens a web application that starts the interface. Once inside, the controls make it easy to move around in the experience, push data to it, and save data from it.
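The mechanics of this handoff are simple: the screen generates an unguessable session token, embeds it in the URL behind the QR code, and the web controller reads it back so every command routes to the right screen. A minimal sketch, assuming a hypothetical controller URL (the domain and parameter name are placeholders):

```python
from __future__ import annotations
import secrets
from urllib.parse import urlencode, urlparse, parse_qs

# Hypothetical base URL; a real deployment points at the
# retailer's own controller web app.
CONTROLLER_BASE = "https://example.com/control"

def new_session_url() -> tuple[str, str]:
    """Create an unguessable session token and the URL to encode
    in the screen's QR code. Scanning opens the web controller
    already bound to this screen's session."""
    token = secrets.token_urlsafe(16)
    url = f"{CONTROLLER_BASE}?{urlencode({'session': token})}"
    return token, url

def session_from_url(url: str) -> str | None:
    """What the controller web app does on load: read the session
    token from the query string so commands reach this screen."""
    qs = parse_qs(urlparse(url).query)
    values = qs.get("session")
    return values[0] if values else None
```

Because the token is random and short-lived, a passerby cannot hijack another shopper's session by guessing the URL, which matters for the privacy expectations discussed below.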
This movement of data is key. Users are keenly aware of privacy issues with a public digital experience. Designed correctly, a personal device controlling that experience turns this awareness to your advantage: users are much more likely to sign in to an account or save data from their experience if they are using their own phone. We also make sure to explain to customers that none of their data will ever be displayed or shared with others on the public platform.
A good example of using a personal device to control an interface is our work for MAC Cosmetics.
We helped MAC create a next-generation flagship store in Queens Center, New York City. The store centers around what we call the Art Studio and three stations for Eyes, Lips and Face, complete with 16 virtual try-on mirrors. The digital screens use QR codes to check a customer into the experiences and allow them to move seamlessly from station to station while saving their favorite looks. They enable customers to choose from over a hundred product shades and to design their own personalized palettes. Blending customization and personalization across each station delivers an extra-special product: your perfect looks, saved directly to your phone.
Some might argue that digital screens are nothing new, that they are gimmicky or irrelevant in the touchless economy. What is so exciting here, however, is that as you move from station to station, the recognition software updates your online profile accordingly. If you choose a certain shade at the lip station, the matching recommendation will be sent your way at the eye station and saved to your phone simultaneously. This is digital enablement at its brightest and best.
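The cross-station behavior described above comes down to a per-visitor profile, keyed to the QR session, that accumulates saved shades and surfaces pairings at the next station. The sketch below is purely illustrative; the class, station names, and pairing table are our assumptions, not MAC's actual system:

```python
from __future__ import annotations
from collections import defaultdict

# Hypothetical shade-pairing table: a lip shade suggests eye shades.
PAIRINGS = {
    "ruby red": ["bronze shimmer", "soft brown"],
}

class VisitorProfile:
    """Follows one customer from station to station via their
    QR session token, accumulating saved looks."""

    def __init__(self, session_token: str):
        self.session_token = session_token
        self.saved_looks: dict[str, list[str]] = defaultdict(list)

    def save_look(self, station: str, shade: str) -> None:
        """Record a shade the customer saved at any station."""
        self.saved_looks[station].append(shade)

    def recommendations_for(self, station: str) -> list[str]:
        """At the eye station, surface pairings for lip shades
        chosen earlier; other stations get no pairings here."""
        if station != "eyes":
            return []
        recs: list[str] = []
        for shade in self.saved_looks.get("lips", []):
            recs.extend(PAIRINGS.get(shade, []))
        return recs
```

The same profile object is what ultimately syncs the saved looks to the customer's phone at the end of the visit.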
These touchless digital experiences were always in the making; the arrival of COVID-19 only accelerated their development. While the research tells us where to focus our attention, at the end of the day it is about putting the fun back into the experience in a safe way. After months of restrictions and isolation, customers are emerging from their homes with a new burst of energy, looking for experiences that bring them joy.
Choosing the right interaction for your users should be guided by this research, but we recommend prototyping multiple options to see what resonates most with your unique customer base. On a human level, this is about delivering safe, playful, and intuitive experiences that improve the buying journey. From a business perspective, these touchless interactions form the foundation for increased consumer interaction, engagement and loyalty.