Pretrained Transformers for Text Ranking: BERT and Beyond



Overview

The goal of text ranking is to generate an ordered list of texts retrieved from a corpus in response to a query. Although the most common formulation of text ranking is search, instances of the task can also be found in many natural language processing (NLP) applications. This book provides an overview of text ranking with neural network architectures known as transformers, of which BERT (Bidirectional Encoder Representations from Transformers) is the best-known example. The combination of transformers and self-supervised pretraining has been responsible for a paradigm shift in NLP, information retrieval (IR), and beyond.

This book provides a synthesis of existing work as a single point of entry for practitioners who wish to gain a better understanding of how to apply transformers to text ranking problems and researchers who wish to pursue work in this area. It covers a wide range of modern techniques, grouped into two high-level categories: transformer models that perform reranking in multi-stage architectures and dense retrieval techniques that perform ranking directly. Two themes pervade the book: techniques for handling long documents, beyond typical sentence-by-sentence processing in NLP, and techniques for addressing the tradeoff between effectiveness (i.e., result quality) and efficiency (e.g., query latency, model and index size).

Although transformer architectures and pretraining techniques are recent innovations, many aspects of how they are applied to text ranking are relatively well understood and represent mature techniques. However, there remain many open research questions, and thus in addition to laying out the foundations of pretrained transformers for text ranking, this book also attempts to prognosticate where the field is heading.
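To make the first of the two technique families concrete, here is a minimal sketch (not from the book) of cross-encoder reranking in a multi-stage architecture, written with the Hugging Face transformers library. The checkpoint name is an assumption, standing in for any monoBERT-style relevance classifier fine-tuned on a dataset such as MS MARCO.

import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Assumed checkpoint: any monoBERT-style cross-encoder would play the same role.
MODEL = "cross-encoder/ms-marco-MiniLM-L-6-v2"
tokenizer = AutoTokenizer.from_pretrained(MODEL)
model = AutoModelForSequenceClassification.from_pretrained(MODEL)
model.eval()

query = "what is text ranking?"
# In a real multi-stage pipeline, these candidates would come from a
# first-stage retriever (e.g., BM25) over a large corpus.
candidates = [
    "Text ranking orders texts from a corpus by relevance to a query.",
    "Transformers are a neural network architecture based on attention.",
]

# Each (query, candidate) pair is scored jointly in one forward pass;
# this joint modeling is effective but costs one inference per candidate.
inputs = tokenizer([query] * len(candidates), candidates,
                   padding=True, truncation=True, return_tensors="pt")
with torch.no_grad():
    scores = model(**inputs).logits.squeeze(-1)

# Higher score means more relevant; sort candidates by descending score.
for score, text in sorted(zip(scores.tolist(), candidates), reverse=True):
    print(f"{score:.3f}  {text}")

The second family, dense retrieval, instead encodes queries and documents into vectors independently and ranks by vector similarity, trading some of the effectiveness of joint query-document scoring for much lower query latency; this is exactly the effectiveness/efficiency tradeoff the overview highlights.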

Full Product Details

Author:   Jimmy Lin, Rodrigo Nogueira, Andrew Yates
Publisher:   Morgan & Claypool Publishers
Imprint:   Morgan & Claypool Publishers
Dimensions:   Width: 15.20cm , Height: 1.70cm , Length: 22.90cm
Weight:   0.562kg
ISBN 13:   9781636392288
ISBN 10:   1636392288
Pages:   325
Publication Date:   30 October 2021
Audience:   Professional and scholarly, Professional & Vocational
Format:   Paperback
Publisher's Status:   Active
Availability:   Manufactured on demand (this item is ordered from a manufactured-on-demand supplier)

Table of Contents

Preface
Acknowledgments
Introduction
Setting the Stage
Multi-Stage Architectures for Reranking
Refining Query and Document Representations
Learned Dense Representations for Ranking
Future Directions and Conclusions
Bibliography
Authors' Biographies


Author Information

Jimmy Lin holds the David R. Cheriton Chair in the David R. Cheriton School of Computer Science at the University of Waterloo. Prior to 2015, he was a faculty member at the University of Maryland, College Park. Lin received his Ph.D. in Electrical Engineering and Computer Science from the Massachusetts Institute of Technology in 2009.

Rodrigo Nogueira is a post-doctoral researcher at the University of Waterloo, an adjunct professor at the University of Campinas (UNICAMP), and a senior research scientist at NeuralMind, a startup focused on applying deep learning to document and image analysis. Nogueira received his Ph.D. in Computer Science from New York University in 2019.

Andrew Yates is an assistant professor in the Informatics Institute at the University of Amsterdam. Prior to 2021, he was a post-doctoral researcher and then senior researcher at the Max Planck Institute for Informatics. Yates received his Ph.D. in Computer Science from Georgetown University in 2016.
