Akari Asai
Ph.D. student @ Paul G. Allen School of Computer Science & Engineering, University of Washington
Visiting Student Researcher @ Meta AI
I am currently in the final year of my Ph.D. in NLP at the Paul G. Allen School of Computer Science & Engineering, University of Washington, where I am fortunate to be advised by Prof. Hannaneh Hajishirzi. I am also spending time at Meta AI Research as a visiting student researcher, under the supervision of Dr. Wen-tau Yih. Prior to joining UW, I obtained a B.E. in Electrical Engineering and Computer Science from The University of Tokyo, Japan.
I am on the academic job market this year! Please feel free to reach out if you’d like to discuss opportunities.
My research interests center on natural language processing and machine learning. My recent work focuses on Retrieval-Augmented Language Models, which address many of the inherent limitations of large language models (LLMs) by dynamically retrieving and incorporating external knowledge at inference time (a minimal code sketch of this pattern follows the list below). More specifically:
- Understanding the limitations of LLMs and the effectiveness of retrieval-augmented LMs – I have extensively studied failure modes of LLMs that scaling alone cannot mitigate, and was among the first to develop retrieval-augmented approaches as a solution. My work has shown that retrieval-augmented LMs can (1) reduce hallucinations, as demonstrated in Adaptive Retrieval-augmented LM (ACL 2023 Oral, Best Video Award); (2) achieve more compute-efficient scaling, as seen in MassiveDS (NeurIPS 2024); and (3) stay up to date with real-time knowledge changes, as illustrated by RealTime QA (NeurIPS D&B 2023). I co-taught the first tutorial on retrieval-augmented LMs, and in our position paper, Reliable, Adaptive and Attributable LMs with Retrieval (2024), we advocate for a community shift towards retrieval-augmented LMs.
- Developing foundations for modern retrieval-augmented LMs – Modern retrieval-augmented LMs are now applied in diverse and complex scenarios well beyond specific tasks like QA. I have worked to establish the foundational components that support these advanced uses, developing better architectures, training strategies, and inference techniques such as Self-RAG (ICLR Oral, top 1%, 2024), Evidentiality-guided RAG (NAACL Oral, 2022), and FAVA (COLM 2024). I have also advanced retrieval systems to be more versatile (TART, Findings of ACL 2023), more robust to complex queries (Path Retriever, ICLR 2020; LUKE, EMNLP 2020, with over one million downloads on Hugging Face), and more efficient (Binary Passage Retriever, ACL 2021). These methods are now integrated into major libraries such as Hugging Face Transformers, LlamaIndex, and LangChain, and are used in real-world systems such as COVID-19 Research Search.
- Translating state-of-the-art retrieval-augmented LMs into real-world applications – I push the boundaries of retrieval-augmented LMs by tackling real-world challenges such as unequal information access across languages and the unreliability of LLMs in expert domains. I developed the first end-to-end multilingual retrieval-augmented LM, CORA (NeurIPS 2021), and created multilingual open-retrieval datasets, including XOR QA (NAACL 2021) and AfriQA (Findings of EMNLP), which cover underrepresented languages such as several African languages. As the lead organizer of the NAACL 2022 Workshop on Multilingual Information Access, I also hosted the first shared task on cross-lingual retrieval and open-domain QA. More recently, I have been applying retrieval-augmented LMs to expert domains, including code generation (CodeRAG-Bench, 2024) and scientific research.
I am also passionate about teaching, mentoring, and helping students learn to do research, especially students from underrepresented groups. I have been the Head TA for CSE473: Intro to AI (undergraduate) and CSE599J: Data-centric ML (graduate) at UW. To lower the barrier to starting research or a Ph.D. in this area, I host weekly office hours open to everyone (please sign up via Calendly!), and I am a mentor for the UW CSE Ph.D. Pre-Application Mentorship Service (PAMS).
Update (September 2024): I have temporarily paused my public office hours. If you are seeking feedback on your Ph.D. application materials or have questions about the UW CSE Ph.D. program, I highly recommend applying to the UW CSE Ph.D. Pre-Application Mentorship Service (PAMS) and exploring similar programs at other institutions. Unfortunately, I will not be able to mentor new students during the 2024-2025 academic year. If you are interested in collaborating with students from H2Lab, please submit an inquiry through our group website.
news
- Sep 25, 2024: Our work on scaling retrieval datastores has been accepted at NeurIPS!
- Sep 19, 2024: CopyBench has been accepted at EMNLP as a main conference paper!
- Sep 17, 2024: I gave a lecture, "Retrieval-augmented LMs: Past, Present and Future," at CMU (Large Language Models: Methods and Applications).
- Jul 15, 2024: Our new preprints CodeRAG-Bench, Scaling of retrieval datastores, and CopyBench are out!
- Jul 10, 2024: Our fine-grained hallucination paper (FAVA) has been accepted at the first COLM!
selected publications
See my full publication list on the publications page!