Natural language processing (NLP) applications continue to shape our digital lives. Whether through conversational bots that take our input as text or voice and interact with us in human-like dialogue, or through applications such as mobile translation apps that bridge communication between people speaking different languages around the world, NLP products are revolutionizing the way we communicate. In this article, I will share a broad overview of the NLP space from an industrial perspective, without diving deep into the technical details. My purpose is to offer brief insights into the field for entrepreneurs and the general public. First, I will share a brief history of the field.
Natural language processing is concerned with the interactions between computers and human languages. Early signs of research in the field date back to the seventeenth century, when philosophers thought of ways to translate between languages. Yet it was only in the 1950s that NLP started making headlines, when Alan Turing published his famous paper “Computing Machinery and Intelligence,” which led to the Turing Test: a machine impersonates a human, and another human must determine, from the conversational content alone, whether they are interacting with a human or a machine. Another key event occurred in 1957, when Noam Chomsky published his book Syntactic Structures, in which he proposed a universal grammar for languages based on a rule-based system of syntactic structures. Chomsky’s work revolutionized the study of linguistics with rule-based grammars until the 1980s, when statistical machine learning algorithms, backed by powerful yet affordable computing, became more prominent. Statistical techniques, which rely on probability theory and the analysis of large corpora, proved more promising in real-world settings such as speech recognition, where human input is often unpredictable and cannot always match a predefined set of rules. Research and development in NLP using machine learning has grown even more significantly over the last decade, driven by advances in artificial neural networks and the popularity of products we all use today, including Amazon Alexa and Google Translate.
There seems to be no shortage of ideas and opportunities to innovate in the field of NLP. There are over 600 books on NLP on Amazon.com, over 2,500 e-prints on arxiv.org, and a search for “natural language processing” on Google Scholar returns more than three and a half million results. Google Trends also shows that searches for “NLP” have doubled in the last five years. In my opinion, the popularity of NLP can be attributed in large part to the following:
- Massive adoption of social media tools, including Facebook, Twitter, WhatsApp, and Instagram, across generations. People nowadays text rather than talk with one another, and the resulting data on these platforms is feeding research such as analyzing people’s sentiment on social media.
- Digital assistant devices such as Amazon Alexa and Google Assistant keep invading our homes, and voice-enabled mobile products such as Siri have become one of those familiar “voices” we talk to. The popularity of such tools is driving developers to build more voice-enabled services for the home, for business, and for everyday tasks.
- Chatbots, whether voice or text, are continuously being embedded in consumer products and business services. Bots can be found in e-commerce self-service websites, customer service automation for call centers, operating system assistants such as Cortana on Windows and Siri on macOS, and development platforms such as DevOps chat tools in Slack channels.
Many such technologies were first made available for English, but we are witnessing a continuous expansion to additional languages in chatbots, digital assistants, machine translation, and various other NLP tools.
I expect the next generation of NLP products to accommodate more natural dialogues between humans and machines across many more languages. Algorithms are becoming more capable of identifying and mimicking a wider variety of dialects and accents. Recent publications on neural networks for speech synthesis and natural language understanding show promising results for languages beyond English. Nowadays, you can converse with Alexa in 14 popular languages, let Google Translate speak on your behalf in more than 100 languages, or build a conversational bot using Microsoft Cognitive Toolkit for 12 popular languages or dialects. Recent NLP product demos from major firms, including Microsoft, Google, and Amazon, along with the innovative NLP code publicly available on Github.com and the open-access research papers on ArXiv.org, strongly support the argument that NLP technologies are becoming far more intelligent and prominent in our lives. But there is much more to be done for NLP technologies to accommodate the full range of human languages and dialects across the world. I believe that further research and development of NLP products that encompass the world will help connect more people. It may also generate better opportunities for disadvantaged communities and help us all head into the fourth industrial revolution.
See Also: