Hands-on Question Answering Systems with BERT: Applications in Neural Networks and Natural Language Processing
Authors: Navin Sabharwal, Amit Agrawal · Language: English · Paperback – 13 January 2021
Get hands-on knowledge of how BERT (Bidirectional Encoder Representations from Transformers) can be used to develop question answering (QA) systems by using natural language processing (NLP) and deep learning.
The book begins with an overview of the technology landscape behind BERT. It takes you through the basics of NLP, including natural language understanding with tokenization, stemming, lemmatization, and bag of words. Next, you’ll look at neural networks for NLP, starting with variants such as recurrent neural networks, encoders and decoders, bi-directional encoders and decoders, and transformer models. Along the way, you’ll cover word embeddings and their types, along with the basics of BERT.
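The listing doesn’t reproduce any of the book’s code, but the bag-of-words idea mentioned above can be sketched in a few lines of plain Python. This is an illustrative sketch, not code from the book; the function name is hypothetical:

```python
from collections import Counter


def bag_of_words(text):
    """Lowercase the text, split on whitespace, strip surrounding
    punctuation, and count how often each token appears."""
    tokens = [t.strip(".,!?;:").lower() for t in text.split()]
    return Counter(t for t in tokens if t)


bow = bag_of_words("The cat sat on the mat. The mat was flat.")
# "the" appears 3 times, "mat" twice, "cat" once
```

Real systems would normally add stemming or lemmatization (e.g. via NLTK or spaCy) before counting, as the book’s NLP chapter describes.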
After this solid foundation, you’ll be ready to take a deep dive into BERT algorithms such as masked language models and next sentence prediction. You’ll see different BERT variations followed by a hands-on example of a question answering system.
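As a rough illustration of the masked language model objective mentioned above: BERT-style pretraining hides a fraction of the input tokens (15% in the original BERT paper) and trains the model to predict the hidden originals. A minimal, model-free sketch of just the masking step follows; the helper name and simplified masking scheme are this sketch’s assumptions, not the book’s code:

```python
import random


def mask_tokens(tokens, mask_prob=0.15, mask_token="[MASK]", seed=0):
    """Replace roughly `mask_prob` of the tokens with [MASK].
    Returns the masked sequence plus a dict mapping each masked
    position to its original token (the prediction targets)."""
    rng = random.Random(seed)  # fixed seed for reproducibility
    masked, labels = [], {}
    for i, tok in enumerate(tokens):
        if rng.random() < mask_prob:
            masked.append(mask_token)
            labels[i] = tok  # the model learns to recover this token
        else:
            masked.append(tok)
    return masked, labels


masked, labels = mask_tokens("the quick brown fox jumps over the lazy dog".split())
```

BERT’s actual scheme is slightly richer (80% of selected tokens become [MASK], 10% a random token, 10% unchanged), which the book’s algorithms chapter covers alongside next sentence prediction.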
Hands-on Question Answering Systems with BERT is a good starting point for developers and data scientists who want to develop and design NLP systems using BERT. It provides step-by-step guidance for using BERT.
What You Will Learn
- Examine the fundamentals of word embeddings
- Apply neural networks and BERT for various NLP tasks
- Develop a question-answering system from scratch
- Train question-answering systems for your own data
Who This Book Is For: AI and machine learning developers and natural language processing developers.
Price: 198.73 lei
Old price: 248.42 lei
-20% New
Express Points: 298
Estimated price in other currencies:
38.04€ • 41.44$ • 31.91£
In stock
Economy delivery: 28 November – 12 December
Phone orders: 021 569.72.76
Specifications
ISBN-13: 9781484266632
ISBN-10: 1484266633
Pages: 184
Illustrations: XV, 184 p. 80 illus.
Dimensions: 155 x 235 mm
Weight: 0.29 kg
Edition: 1st ed.
Publisher: Apress
Series: Apress
Place of publication: Berkeley, CA, United States
Table of Contents
Chapter 1: Introduction to Natural Language Processing
Chapter 2: Introduction to Word Embeddings
Chapter 3: BERT Algorithms Explained
Chapter 4: BERT Model Applications: Question Answering System
Chapter 5: BERT Model Applications: Other Tasks
Chapter 6: Future of BERT Models
About the Authors
Navin Sabharwal is the chief architect for HCL DryICE Autonomics. He is an innovator, thought leader, author, and consultant in the areas of AI, machine learning, cloud computing, big data analytics, and software product development. He is responsible for IP development and service delivery in the areas of AI and machine learning, automation, AIOps, and public cloud (GCP, AWS, and Microsoft Azure). Navin has authored 15+ books in the areas of cloud computing, cognitive virtual agents, IBM Watson, GCP, containers, and microservices.
Amit Agrawal is a senior data scientist and researcher delivering solutions in the fields of AI and machine learning. He is responsible for designing end-to-end solutions and architecture for enterprise products. He has also authored and reviewed books in the area of cognitive virtual assistants.
Back Cover Text
Get hands-on knowledge of how BERT (Bidirectional Encoder Representations from Transformers) can be used to develop question answering (QA) systems by using natural language processing (NLP) and deep learning.
The book begins with an overview of the technology landscape behind BERT. It takes you through the basics of NLP, including natural language understanding with tokenization, stemming, lemmatization, and bag of words. Next, you’ll look at neural networks for NLP, starting with variants such as recurrent neural networks, encoders and decoders, bi-directional encoders and decoders, and transformer models. Along the way, you’ll cover word embeddings and their types, along with the basics of BERT.
After this solid foundation, you’ll be ready to take a deep dive into BERT algorithms such as masked language models and next sentence prediction. You’ll see different BERT variations followed by a hands-on example of a question answering system.
Hands-on Question Answering Systems with BERT is a good starting point for developers and data scientists who want to develop and design NLP systems using BERT. It provides step-by-step guidance for using BERT.
You will:
- Examine the fundamentals of word embeddings
- Apply neural networks and BERT for various NLP tasks
- Develop a question-answering system from scratch
- Train question-answering systems for your own data
Features
- Integrates question answering systems with document repositories from different sources
- Contains an in-depth explanation of the technology behind BERT
- Takes a step-by-step approach to building question answering systems from scratch