Natural Language Processing, or NLP, is a catch-all term for making sense of unstructured text data. Google search recommendations, chatbots, and grammar checkers are all forms of NLP.
The field has many years of research behind it, but for the last 5-7 years, machine learning has reigned supreme.
Five years ago, machine learning approaches to NLP were labor-intensive. Success meant having access to large amounts of clean, labeled training data for each model, and a text summarization model looked quite different from one that did sentiment analysis.
The development of large language models, or LLMs, has revolutionized the field. Models like GPT-3 are general-purpose tools that can handle many different tasks with very little training, often just a prompt containing a handful of examples.
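To make the "very little training" point concrete, here is a minimal sketch of few-shot prompting for sentiment analysis. It assumes the pre-1.0 `openai` Python package and the `text-davinci-003` completion model; the labeled reviews in the prompt are invented for illustration.

```python
import os

import openai

openai.api_key = os.environ["OPENAI_API_KEY"]

# A few labeled examples in the prompt stand in for a training set.
prompt = """Classify the sentiment of each review as Positive or Negative.

Review: The battery lasts all day and the screen is gorgeous.
Sentiment: Positive

Review: It stopped working after a week and support never replied.
Sentiment: Negative

Review: Setup took five minutes and it just works.
Sentiment:"""

response = openai.Completion.create(
    model="text-davinci-003",  # GPT-3 family completion model
    prompt=prompt,
    max_tokens=5,
    temperature=0,  # deterministic output for classification
)

print(response["choices"][0]["text"].strip())  # e.g. "Positive"
```

Swapping the instruction and examples is all it takes to turn the same model into a summarizer, a translator, or an entity extractor.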
To show GPT-3 in action, I built a tiny Slack bot that asks a few questions and uses GPT-3 to generate actions. The video below demos the bot and explains how to prompt GPT-3 to do NLP tasks.
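The video covers the details, but for a rough idea of the plumbing, here is a minimal sketch of a Slack bot that forwards a message to GPT-3 and replies with the completion. It assumes the `slack_bolt` package running in Socket Mode, the pre-1.0 `openai` package, and environment variables `SLACK_BOT_TOKEN`, `SLACK_APP_TOKEN`, and `OPENAI_API_KEY`; it is not the exact bot from the demo.

```python
import os

import openai
from slack_bolt import App
from slack_bolt.adapter.socket_mode import SocketModeHandler

openai.api_key = os.environ["OPENAI_API_KEY"]
app = App(token=os.environ["SLACK_BOT_TOKEN"])


@app.event("app_mention")
def handle_mention(event, say):
    """Send the user's message to GPT-3 and post the completion back."""
    user_text = event["text"]
    response = openai.Completion.create(
        model="text-davinci-003",
        prompt=f"Answer the user's request.\n\nUser: {user_text}\nAssistant:",
        max_tokens=150,
        temperature=0.2,
    )
    say(response["choices"][0]["text"].strip())


if __name__ == "__main__":
    # Socket Mode avoids exposing a public HTTP endpoint for Slack events.
    SocketModeHandler(app, os.environ["SLACK_APP_TOKEN"]).start()
```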