In Digital Pulse’s Uncovered series, we dissect the definitions behind digital trends and technologies.

Here, Daintree Peters offers an introduction to natural language processing (NLP). Signalling a change in the way computers understand humans, this emerging technology could see widespread adoption in just a couple of years.

Whether written, spoken or signed, language-based communication is not limited to words. It’s everywhere, from the icons and symbols embedded in history’s greatest artworks, to the cues and gestures we employ when interacting with digital devices.

Until recently, these digital interactions were decidedly one-sided. The highly structured, rules-based nature of software (which is itself powered by a programming language) has forced all of us to become digital diplomats. We funnel our interactions through strict workflows, moving from one button, list, or menu to another; executing simple direct commands such as OK or Cancel.

These rules of engagement, however, are beginning to change. Advances in natural language processing (NLP) – a technology that seeks to teach computers how to understand native human communication – have meant that we can now begin talking more candidly to computers and they’re beginning to answer back.

Natural language processing: an overview

Broadly speaking, NLP holds the promise that computers might be able to derive accurate meaning from our natural language, such as understanding conversational questions asked in our natural speaking voices, or extracting worthwhile meaning from large bodies of text. This is markedly different from conventional techniques, which require us to conform to rigid rules or syntax.

Arguably the most high-profile implementations of NLP can currently be found in services such as Apple’s Siri, Amazon’s Alexa and Google Now. On these platforms, NLP is combined with speech recognition capabilities to create a virtual assistant located in your smartphone, computer or smart home device.

But as anyone who has ever used one of these digital assistants will tell you, they don’t yet get it right all the time. That’s because understanding natural language is hard for computers and, when you get right down to the mechanics of our communication, it’s easy to see why.

Human language remains deeply ambiguous at almost all levels: syntax is inconsistent, semantics can be misleading, and phonetics and morphology can be difficult to distinguish. What’s more, the ability to understand the relationships within and between sentences frequently seems to just work by magic – and that’s all before throwing in human characteristics like sarcasm and word-play.

The power of virtual assistants and NLP

NLP attempts to make sense of these inconsistencies and ambiguities, changing the way we interact with software and applications in the process. With NLP, we can be freed from having to learn complex query languages or the intricate sets of menus and commands that come with every software platform.

Additional benefits can be achieved when NLP is combined with speech recognition, as it is in virtual assistants. With this pairing, digital accessibility can be vastly improved, bringing software capabilities to people previously denied such services, whether through a lack of training, education or disability.

Learning from unstructured data

As well as powering virtual assistant technology, NLP can provide new ways to understand data, in particular the understanding and categorisation of unstructured datasets found within large documents or text records.

Unstructured data is information that has yet to be organised in a tabular format, such as a database or spreadsheet. Examples of such unstructured data include call centre transcripts, social media content, or other forms of consumer feedback.

Applying NLP to these datasets (using products such as PwC’s Customer.Mind, for example) can uncover a richer, more authentic customer voice, distilling sentiment and meaning from these human responses.

How does natural language processing work?

Let’s consider the simple sentence, ‘I walked down the street in a hat’.

Most of us can gather the meaning fairly easily: I walked down a street, and I did so while wearing a hat. For a computer, however, instantly deriving this meaning is not so simple. It might think the street itself is in the hat.

NLP attempts to find the right interpretation within the many permutations found in human language. While most of these possible permutations are highly implausible (the street, in fact, is not likely to be in the hat), a traditional software application would be stuck dealing with all of the possibilities.

This complexity only grows as sentence lengths increase. The phrase, ‘I walked down the street in a hat with a smile’ raises more questions about the fate of the hat, not to mention who the owner is of the smile. Am I smiling, or is the smile walking beside me?

 


An example of the syntax analysis provided by Google’s Cloud NLP API, using the sentence ‘I walked down the street in a hat with a smile’. The API can provide the data to create syntax tree diagrams for any submitted text. 
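The attachment ambiguity described above can be counted directly. The toy sketch below is not a real parser; it assumes the simplification that each prepositional phrase may attach to the verb or to any noun that precedes it, and simply enumerates the resulting candidate readings:

```python
from itertools import product

# Each prepositional phrase (PP) may attach to the verb or to any noun
# appearing before it in the sentence. Listing the legal attachment
# points for each PP shows how readings multiply as PPs are added.
sentence_pps = [
    ("in a hat",     ["walked", "street"]),            # 2 attachment points
    ("with a smile", ["walked", "street", "hat"]),     # 3 attachment points
]

# Every combination of attachment choices is a distinct candidate reading.
readings = list(product(*[heads for _, heads in sentence_pps]))

for attach in readings:
    print({pp: head for (pp, _), head in zip(sentence_pps, attach)})

print(len(readings))  # 6 candidate readings from just two PPs
```

Adding a third phrase (‘with a smile on a Tuesday’, say) would multiply the count again, which is why a traditional application that must consider every possibility quickly becomes stuck.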

NLP and machine learning

Early NLP methods focused on using rules to ‘calculate’ meaning. However, this approach was constrained by the large number of rules and exceptions that were required to deal with even moderate variations within texts, such as context, topics or vocabulary.

To overcome this obstacle, NLP uses machine learning, a type of artificial intelligence that allows computers to learn from exposure to new data without being explicitly programmed. This is achieved by training and testing algorithms on a large body of real-world language, narrowing down the most likely meanings.
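A minimal sketch of this statistical approach, using invented training examples and a naive Bayes classifier written in pure Python (one of the simplest machine-learning techniques applied to text, not the specific method any given NLP service uses):

```python
import math
from collections import Counter, defaultdict

def train(examples):
    """examples: list of (text, label). Count words seen per label."""
    word_counts = defaultdict(Counter)
    label_counts = Counter()
    for text, label in examples:
        label_counts[label] += 1
        word_counts[label].update(text.lower().split())
    return word_counts, label_counts

def classify(text, word_counts, label_counts):
    """Return the label with the highest (log) posterior probability."""
    vocab = {w for c in word_counts.values() for w in c}
    total = sum(label_counts.values())
    best, best_score = None, float("-inf")
    for label in label_counts:
        score = math.log(label_counts[label] / total)
        n = sum(word_counts[label].values())
        for word in text.lower().split():
            # Laplace smoothing so unseen words don't zero out the score.
            score += math.log((word_counts[label][word] + 1) / (n + len(vocab)))
        if score > best_score:
            best, best_score = label, score
    return best

# Illustrative training data -- a real system would learn from
# a large corpus rather than four hand-written examples.
examples = [
    ("great service very helpful", "positive"),
    ("loved the quick friendly response", "positive"),
    ("terrible wait and rude staff", "negative"),
    ("awful experience very slow", "negative"),
]
word_counts, label_counts = train(examples)
print(classify("quick and friendly service", word_counts, label_counts))  # positive
```

No rule ever states which words are positive; the classifier narrows down the most likely label purely from the word statistics of the examples it was trained on.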

An emerging technology in the cloud

As NLP and machine learning have grown in power and capability, aided by the rising availability of cloud storage and computer processing, so too have expectations for the technology. A range of NLP service providers have entered the market, lowering the technology’s cost, improving its accuracy and making it easier to integrate within a range of business applications.

While a recent Gartner report, Hype Cycle for Emerging Technologies, forecast mainstream NLP adoption to occur over the next two to five years, the technology can begin to be incorporated into applications today using cloud-based APIs.

The setup works like this: an application sends text, such as transcribed call data from a customer contact centre, to a third-party API for analysis. Using NLP, a structured response is then sent back from the API to the application, providing attributes for the data such as sentiment (is the call content positive or negative?), categorisation (what topics is the call about?), or another specific form of indexing.
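The round trip might be sketched as follows. The endpoint payload and response fields here are hypothetical – each real provider defines its own schema – but the shape of the exchange is the same: the application sends raw text and receives structure back.

```python
import json

def build_request(transcript):
    """Wrap raw call-centre text in a JSON payload for an NLP API.
    Field names are illustrative, not any particular provider's schema."""
    return json.dumps({
        "document": {"type": "PLAIN_TEXT", "content": transcript},
        "features": ["sentiment", "categories"],
    })

# What a structured response might look like (illustrative values only).
sample_response = json.loads("""{
    "sentiment": {"score": -0.6, "magnitude": 0.8},
    "categories": [{"name": "Billing", "confidence": 0.92}]
}""")

def summarise(response):
    """Reduce the structured response to the attributes the app needs."""
    tone = "negative" if response["sentiment"]["score"] < 0 else "positive"
    topics = [c["name"] for c in response["categories"]]
    return {"tone": tone, "topics": topics}

print(summarise(sample_response))  # {'tone': 'negative', 'topics': ['Billing']}
```

The application never needs to parse the language itself; it only consumes the indexed attributes – sentiment, categories and so on – that come back from the API.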

Understanding the ambiguous human

In a world where understanding, insight and prediction are becoming increasingly important, NLP will be critical for generating intelligence from vast volumes of recorded human interactions. Understanding the true voice of the customer is a huge opportunity to improve the customer experience, and NLP can play an instrumental role in meeting this challenge.

With real-time analysis of sentiment, subjects and meaning all now a reality, data-driven organisations can begin to extract new insights from whole new areas of unstructured and textual information, whether held privately or in the public domain, such as digitised library books or public records.

The future of natural language processing

Looking ahead, the increasingly personalised and interactive nature of applications may mean that NLP becomes an invisible service, only noticeable when it stops working. Alongside this integration, the keyboard and mouse may finally give way to more human interactions, allowing for whole new hardware and software platforms to evolve.

NLP could even help save lives, analysing speech to identify worker fatigue, or miscommunication between two parties, such as an airline pilot and control tower.

Ultimately, NLP’s potential usefulness goes a long way beyond the quirky and the novel. However, we may still be some way off from getting a computer to recognise the contexts of the more archaic, colloquial, or peculiar uses of language.

Consider this lyric from the Australian folk classic Waltzing Matilda: ‘And he sang as he watched and waited till his billy boiled/You’ll come a Waltzing Matilda with me’. Will a computer ever be able to know it’s about boiling water in a pot, then travelling on foot with a bedroll? Only time will tell.

 


Contributor

Daintree Peters

Daintree is a director in the Innovation & Disruption practice working with emerging technology.
