Akari Asai
Ph.D. student @ Paul G. Allen School of Computer Science & Engineering, University of Washington
Visiting Student Researcher @ Meta AI
I am in the final year of my Ph.D. in NLP at the Paul G. Allen School of Computer Science & Engineering, University of Washington, where I am fortunate to be advised by Prof. Hannaneh Hajishirzi. I am also spending time at Meta AI Research as a visiting student researcher under the supervision of Dr. Wen-tau Yih. Prior to joining UW, I obtained a B.E. in Electrical Engineering and Computer Science from the University of Tokyo, Japan.
I am on the academic job market this year! Please feel free to reach out if you’d like to discuss opportunities. I am attending NeurIPS!
My research focuses on natural language processing and machine learning, with a particular emphasis on large language models (LLMs). I investigate the core limitations of LLMs—such as hallucinations—that cannot be overcome by scaling alone. To address these challenges, I pioneered Retrieval-Augmented LMs, a novel class of LLMs that integrate large-scale text data via retrieval during inference. More specifically,
- Establishing the Necessity of Retrieval-Augmented LMs – I have systematically studied the failure modes of LLMs and was among the first to develop retrieval-augmented approaches as a solution. My work has shown their effectiveness for (1) reducing hallucinations, e.g., Adaptive Retrieval-Augmented LM (ACL 2023 Oral, Best Video Award); (2) achieving more compute-efficient scaling, e.g., MassiveDS (NeurIPS 2024); and (3) staying updated with real-time knowledge changes, e.g., RealTime QA (NeurIPS D&B 2023). I also taught the first tutorial on retrieval-augmented LMs.
- Building the Foundations of Retrieval-Augmented LMs – I have established foundational components of Retrieval-Augmented LMs, developing better architectures, training strategies, and inference techniques, such as Self-RAG (ICLR 2024, Oral, top 1%), Evidentiality-guided RAG (NAACL 2022, Oral), and FAVA (COLM 2024). I have also advanced retrieval systems to be more versatile, following diverse user instructions (TART, Findings of ACL 2023); more robust to complex queries (Path Retriever, ICLR 2020; LUKE, EMNLP 2020, with 1.5 million downloads on Hugging Face); and more efficient (Binary Passage Retriever, ACL 2021).
- Making Real-world Impact through Retrieval-Augmented LMs – I advance the frontier of Retrieval-Augmented LMs by addressing critical real-world challenges, particularly in contexts where information is scarce or fragmented. My work focuses on (1) developing reliable LMs for expert-domain tasks such as scientific research (OpenScholar, 2024) and code generation (CodeRAG-Bench, 2024), and (2) improving information access across the world's languages (e.g., CORA, NeurIPS 2021; XOR QA, NAACL 2021; AfriQA, Findings of EMNLP 2022).
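At a high level, the retrieval-during-inference loop behind these systems can be sketched in a few lines. The sketch below is a generic toy illustration only (bag-of-words similarity in place of a learned dense retriever, and prompt assembly in place of an actual LM call); it is not the implementation of any system listed above:

```python
import math
from collections import Counter

def embed(text):
    # Toy bag-of-words "embedding": lowercase token counts.
    # Real retrieval-augmented LMs use learned dense encoders instead.
    return Counter(text.lower().split())

def cosine(a, b):
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query, datastore, k=2):
    # Rank every passage in the datastore by similarity to the query.
    q = embed(query)
    return sorted(datastore, key=lambda p: cosine(q, embed(p)), reverse=True)[:k]

def rag_prompt(query, datastore, k=2):
    # Prepend the top-k retrieved passages to the question; a real system
    # would feed this augmented prompt to the LM for generation.
    passages = retrieve(query, datastore, k)
    context = "\n".join(f"[{i+1}] {p}" for i, p in enumerate(passages))
    return f"Context:\n{context}\n\nQuestion: {query}\nAnswer:"

# Hypothetical three-passage datastore for illustration.
datastore = [
    "The Allen School is the computer science school of the University of Washington.",
    "Retrieval-augmented LMs fetch passages from a large datastore at inference time.",
    "Tokyo is the capital of Japan.",
]

print(rag_prompt("What do retrieval-augmented LMs fetch at inference time?", datastore, k=1))
```

The key design point this sketch captures is that knowledge enters through the datastore at inference time rather than being baked into the model's parameters, so the datastore can be updated, scaled, or swapped without retraining.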
My work has received multiple paper awards at venues such as ACL and NeurIPS workshops, and has been featured in major media outlets such as Forbes and MIT Technology Review. I am honored to have been named among the EECS Rising Stars (2022), the IBM Global Ph.D. Fellows (2022-2023), and the MIT Technology Review Innovators Under 35 from Japan (2024). My work is integrated into major libraries such as Hugging Face, LlamaIndex, and LangChain, and is used in multiple real-world systems, such as COVID-19 Research Search. Most recently, we released the Ai2 OpenScholar public demo, which helps scientists synthesize scientific literature more effectively and efficiently.
I am also passionate about teaching and mentoring, especially helping students from underrepresented groups get started in research. I have been the Head TA for CSE 473: Introduction to AI (undergraduate) and CSE 599J: Data-centric ML (graduate) at UW. To lower barriers to starting research or pursuing a Ph.D. in this area, I host weekly office hours open to everyone (please sign up via Calendly!) and served as a mentor for the UW CSE Ph.D. Pre-Application Mentorship Service (PAMS).
news
- Nov 18, 2024: I’m SUPER excited to release OpenScholar, my latest collaborative project with amazing co-authors from UW, Ai2, Meta, CMU, Stanford, UIUC, and UNC. Try out our public demo, and learn more about the project in the paper and the Ai2 blog.
- Oct 31, 2024: I’m honored to be chosen as one of the MIT Technology Review Innovators Under 35 from Japan! See the MIT Technology Review article about my work on retrieval-augmented LMs for building more reliable LM-based systems.
- Oct 22, 2024: We released Pangea, a new state-of-the-art multilingual and multimodal LLM! Check out our demo!
- Sep 25, 2024: Our work on scaling retrieval datastores (MassiveDS) has been accepted at NeurIPS!
- Sep 19, 2024: CopyBench has been accepted to the EMNLP main conference!
selected publications
See my full publication list on the publications page!