The Year NLP Stopped Being a Research Project

A few months ago, I was running late for dinner and asked my phone to "find a quiet Italian place near me that's not too expensive and has outdoor seating." It came back with three options, all reasonable. I picked one, booked a table, and made it with five minutes to spare. Only later did it hit me: I'd just had a genuinely useful conversation with a machine. Not a keyword search. A conversation.

That small moment captures something bigger happening in 2019. Natural language processing, the branch of AI that helps machines understand human language, has quietly crossed a threshold. It's no longer a research project. It's becoming infrastructure.

[Image: Amazon Echo smart speaker with a glowing yellow ring]
The smart speaker installed base is projected to grow 82% in 2019, making voice a default interface for millions. Photo by Smart Home Perfected, CC BY 2.0.

The breakthroughs, in plain English

Two models are driving most of the shift. BERT, released by Google in late 2018, can understand words in context, not just in isolation. The word "bank" in "river bank" versus "bank account" actually registers as two different meanings. It's already being adopted for tasks like sentiment analysis, question answering, and entity recognition, and the implications for how we search and interact with information are enormous.

Then in February, OpenAI released GPT-2, a text generation model so convincing that they initially held back the full version, calling it too dangerous to release. Whether or not you agree with the drama, the underlying capability is real: machines that can write coherent paragraphs, summarize documents, and translate between languages with startling fluency.

What both models share is a technique called transfer learning. In practical terms, it means you no longer need a massive, custom dataset to build something useful with NLP. You can take a pre-trained model and fine-tune it for your specific use case. That change alone has lowered the barrier from "hire a team of PhDs" to "one engineer with a weekend."
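To make the pattern concrete, here is a toy sketch of the transfer-learning idea: a frozen "pre-trained" encoder whose features are reused as-is, plus a small task-specific head that is the only part you train. Everything here is synthetic and illustrative (a random projection stands in for a real pre-trained model like BERT, and the data is made up), but the shape of the workflow is the point: the expensive part is already done, and fine-tuning is small.

```python
import numpy as np

# Toy illustration of transfer learning: reuse frozen "pre-trained"
# features, train only a small task head on a handful of labeled examples.
# All data and weights here are synthetic stand-ins, not a real NLP model.

rng = np.random.default_rng(0)

# Frozen "pre-trained" encoder: a fixed random projection standing in for
# the contextual features a real pre-trained model would provide.
W_pretrained = rng.normal(size=(2, 8))

def encode(x):
    # Features are never updated during fine-tuning.
    return np.tanh(x @ W_pretrained)

# A tiny labeled dataset for the downstream task: two noisy clusters.
X = np.vstack([rng.normal(loc=-1.0, size=(20, 2)),
               rng.normal(loc=+1.0, size=(20, 2))])
y = np.array([0] * 20 + [1] * 20)

# "Fine-tuning": gradient descent on a small logistic-regression head only.
feats = encode(X)
w, b = np.zeros(8), 0.0
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(feats @ w + b)))  # sigmoid predictions
    grad = p - y
    w -= 0.1 * feats.T @ grad / len(y)
    b -= 0.1 * grad.mean()

preds = (1.0 / (1.0 + np.exp(-(feats @ w + b))) > 0.5).astype(int)
accuracy = (preds == y).mean()
print(f"training accuracy: {accuracy:.2f}")
```

In a real system the frozen encoder would be a model like BERT and the head would be fine-tuned with a deep learning framework, but the division of labor is the same: almost all the knowledge comes pre-trained, and your data only has to teach the last mile.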

Voice goes mainstream

The other signal is hardware. The global smart speaker installed base is projected to grow 82.4% this year, reaching roughly 208 million units. Amazon's Alexa now works with around 60,000 smart home devices. People are talking to their devices constantly, for search, for music, for shopping, for questions they'd rather not type.

A colleague of mine recently admitted she'd started talking to her Alexa more than she texts some friends. She was joking, mostly. But the underlying point is real: voice is becoming a default interface for a lot of people, and NLP is the engine underneath.

What this means if you build products

If you're on a product team, the practical takeaway is this: natural language is becoming a product interface, not just a feature. Chatbots that actually understand intent. Search that grasps meaning, not just keywords. Voice experiences that feel conversational rather than robotic.

A year ago, putting NLP on your roadmap meant a significant investment in specialized talent and infrastructure. In 2019, the tools have caught up to the ambition. The question is shifting from "can we do this?" to "where does this create the most value?"

The real shift

The story of NLP in 2019 isn't that machines learned language. Machines have been processing language for decades. The story is that language became accessible as a building block, something any product team can work with, experiment on, and ship. That changes what's possible, and it changes it fast.
