- There is heightened awareness around the potential for contaminated surfaces due to COVID-19.
- People may turn away from touch-based technologies as they try to avoid germs and stay safe.
- Technologies such as speech recognition, facial recognition, and digital money, which many of us were still getting comfortable with, may quickly become the norm.
Sometimes, a technology arrives just in time to save us from a problem that the technology wasn’t created to solve. In the early 1900s, for instance, the automobile came along just as burgeoning cities became alarmed that an explosion in horse traffic might bury their streets in manure.1
We’re seeing a similar phenomenon today with three technologies: speech recognition, facial recognition, and digital money. None was invented to help us deal with COVID-19, but each recently came into its own and could be crucial to our post-crisis ‘new normal’ by allowing us to avoid touching things that a lot of other people have also touched.
Over the course of the past few decades, we’ve built a screen-tapping, button-pushing society that now, in our heightened state of germ sensitivity, seems downright frightening. When our forebears needed cash from a bank, they walked up to a teller and asked for it. Now, we have to poke an ATM screen that others might have smeared with a deadly contagion. Long ago, people got on an elevator and told an operator which floor number they needed. Now, we and hordes of potential virus carriers step in and push the same buttons.
We’re about to see a huge shift away from touch in response to the novel coronavirus. A lot of older technologies will still have roles. We’ve long gotten used to touchless water faucets in public bathrooms, for instance. And a simple website can solve the problem of multiple hands on restaurant menus: Everyone at the table could get the menu on their mobile phones. But newer technologies will drive some of the more interesting changes.
Only in the past few years have we gotten comfortable with speech recognition: Millions of people use Siri on iPhones or chat with Alexa devices at home. The first attempts at getting machines to understand speech go back to 1962, when IBM researchers built Shoebox, which could understand 16 words (none of them very well; most were the digits zero to nine, making Shoebox essentially a touchless adding machine).2 For the next 50 years, advances came slowly. Speech systems didn’t tip over into being truly useful until about 2010, thanks to a combination of new approaches to the software, enormous amounts of voice data that machines could analyse, and cloud computing that made it possible for a small gadget to get speech instantly deciphered by enormous computers in a data centre.
Now that machines can understand us almost as well as another human, we’ll see the technology take us back to a virtual version of the old days. We’ll be able to walk into an elevator and simply say, in any language, “Tenth floor, please.” Vending machines were invented to automate things such as candy and ticket stands, which had been operated by clerks people could speak to. In the coming years, we’ll again ask for what we want instead of pushing a button, but we won’t be talking to a person. Paris-based Thales, for example, is marketing its Transcity voice-recognition ticket machine to train stations: Travelers speak to tell it where they want to go, and it prints their ticket.3
Next-generation ATMs will veer toward becoming virtual tellers, according to Doug Brown, an executive at ATM maker NCR, who spoke about the technology in a recent news article.4 A conversational ATM could do more than just spit out cash or take a deposit — it could answer questions and handle more complex tasks, such as opening an account.
The effort to get machines to recognise faces also goes back to the 1960s, when an inventor named Woody Bledsoe, possibly funded by the CIA, laid down some of the field’s foundational research and dreamed of wearing glasses that would tell him the names of everyone he met.5 But as with speech technology, computers then didn’t have enough power or data or clever enough programming to make facial recognition work.
In the 1990s, the US’ Defense Advanced Research Projects Agency (DARPA) rolled out a program to encourage commercial development of facial recognition, in part so the military could use it.6 The Internet in the 2000s sucked in billions of digital photos, giving companies such as Facebook and Google huge repositories of faces to analyse. China dove into perfecting facial recognition for state security. And now, Apple’s iPhone X, released in 2017, has made millions of users comfortable with having their faces be their password.
By the time COVID-19 hit, facial recognition was so good that a version from Clearview AI was seen as an ominous threat to privacy — dangerous enough that its technology had to be restricted, as if it were nuclear material.7 Now a helmet from Chinese company Kuang-Chi Technology comes equipped with both an infrared camera and facial recognition.8 The wearer can supposedly spot someone with a fever from 15 feet away, and identify who that person is.
In other words, facial recognition is now highly accurate and can get built into almost anything. Of course, this technology raises a lot of privacy issues. But if we can get comfortable with those issues the way iPhone X users have, why would we ever again touch anything as a way to identify who we are?
Going through an airport, you constantly hand people a license or passport. The physical versions of both will become relics of the past, replaced by a database that matches your face to its records. US Customs is already testing facial recognition in a handful of airports as a replacement for handling paper passports (though it backed off an earlier plan to make it mandatory).9 In the years ahead, every ATM or checkout system that asks for a PIN will instead just identify your face. Doors that require a security code or card will recognise you and automatically open. Physical keys, which others might have touched or coughed on, can disappear. Instead, your house or car will simply see that it’s you and open up.
One way to think about digital money is the mobile wallet: simply a digital version of a credit or debit card, embedded on your phone à la Apple Pay, Google Pay, or China-based services such as Alipay and WeChat Pay. Services like Venmo are similar: money-transferring apps tied to a bank or credit card account. Those all got started around 2008, and have quickly caught on, much more in China and Europe than in the US.10 About 1.3 billion people worldwide now use mobile wallets, almost 14 percent more than in 2019.
We’ve recently adopted a lot of ways to digitally pay for stuff without touching anything but our own phones. There’s no reason to handle cash that others have touched, or give a credit card to someone who then hands it back. Retailers, too, want to protect employees from contact with customers, and since COVID-19, some, such as Publix Super Markets, have ramped up installation of digital payment options.11 Richard Crone, CEO of mobile payment research firm Crone Consulting, told Bloomberg he expects contactless payments to grow as much as 20 percent this year.12 “We shouldn’t be touching anything,” he said.
Which is the point. If the economy is going to safely bounce back, we need ways to do the things we used to do while touching as little as possible. Just as cars disrupted horses more than a century ago and saved cities from a manure crisis, new technologies are about to disrupt touch screens and buttons as a way to get us past the COVID-19 crisis.
A version of this article was originally published in strategy+business on June 19, 2020.