Bidirectional Encoder Representations from Transformers (BERT) is an NLP pre-training technique released by Google. BERT's key innovation is its ability to pre-train deep bidirectional, contextual language representations on a large text corpus. The pre-trained model can then be fine-tuned for downstream NLP tasks such as Natural Language Understanding (NLU) and question answering. Named Entity Recognition (NER) is an NLU subtask that identifies and classifies entities in a given text into pre-defined categories such as names, places, organizations, currencies, and quantities. An NER model can be trained using BERT. Integrating BERT-based NER into Rasa through a custom NLU pipeline produced highly performant NLP and more engaging conversations between humans and Rasa agents.

WHAT YOU'LL LEARN
- How to build a custom Rasa NLU pipeline with BERT integration
- How to build highly performant NLP

Mady Mantha is a machine learning architect and AI platform leader with nearly 9 years of experience in the technology industry, focused on conversational AI, natural language processing, and deep learning. She graduated from Georgetown University and is a space enthusiast.
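As a rough illustration of the kind of pipeline the session covers, one way to wire BERT features into Rasa NLU is via Rasa's built-in language-model components. This is a minimal sketch, assuming a Rasa version that ships HFTransformersNLP and LanguageModelFeaturizer (with the transformers library installed); the speaker's actual custom pipeline may differ:

```yaml
# config.yml — hypothetical Rasa NLU pipeline with a BERT featurizer.
# Component names and options assume Rasa's language-model components;
# adjust model_weights and epochs for your own data and Rasa version.
language: en
pipeline:
  - name: HFTransformersNLP          # loads the pre-trained BERT model
    model_name: bert
    model_weights: bert-base-uncased
  - name: LanguageModelTokenizer     # tokenizes using the model's vocabulary
  - name: LanguageModelFeaturizer    # emits contextual BERT embeddings
  - name: DIETClassifier             # joint intent classification and NER
    epochs: 100
```

Because the featurizer supplies contextual embeddings, the downstream entity extractor can disambiguate entities from context rather than surface form alone.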