Upgrading Search Applications in the Era of LLMs: A Demonstration with Practical Lessons
Shuang Yu, Nirandika Wanigasekara, Jeff Tan, Kent Fitch, Jaydeep Sen
Proceedings of the Thirty-Third International Joint Conference on Artificial Intelligence
Demo Track. Pages 8843-8846.
https://doi.org/10.24963/ijcai.2024/1045
While traditional search systems have mostly relied, satisfactorily, on lexical sparse retrievers such as BM25, recent advances in neural models, and in present-day large language models (LLMs) in particular, hold promise for practical search applications as well. In this work, we describe a collaboration between IBM and the National Library of Australia to upgrade an existing search application (referred to as NLA) that operates over terabytes of Australian Web Archive data and serves thousands of daily users. We posit, and demonstrate both empirically and through qualitative user studies, that LLMs and neural models can indeed provide good gains when combined effectively with traditional search. We believe this demonstration will highlight the unique challenges associated with real-world practical deployments and offer valuable insights into how to effectively upgrade legacy search applications in the era of LLMs.
Keywords:
Natural Language Processing: NLP: Information retrieval and text mining
Natural Language Processing: NLP: Applications
Natural Language Processing: NLP: Natural language semantics
Natural Language Processing: NLP: Question answering