Mixture-of-Experts (MoE) is a powerful approach that leverages the strengths of multiple specialized models to tackle complex problems, offering improved performance and efficiency in a range of machine learning applications. In this talk, Dr. Lin will cover the basics of MoE and its recent developments in large language models. He will also briefly introduce the theory behind MoE under different setups, shedding light on its performance in practice.
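For readers new to the topic, below is a minimal sketch of the idea the abstract describes: a router selects a few specialized expert networks per input, so only a fraction of the model's parameters are active at once. The layer sizes, expert count, and top-k routing shown are illustrative assumptions in PyTorch, not material from the talk.

```python
# Minimal sketch of a Mixture-of-Experts layer with top-k gating.
# All dimensions and the number of experts are illustrative choices.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MoELayer(nn.Module):
    def __init__(self, d_model=64, d_hidden=128, n_experts=4, top_k=2):
        super().__init__()
        self.top_k = top_k
        # Each expert is a small feed-forward network.
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, d_hidden), nn.ReLU(),
                          nn.Linear(d_hidden, d_model))
            for _ in range(n_experts)
        )
        # The router scores each input against every expert.
        self.router = nn.Linear(d_model, n_experts)

    def forward(self, x):                      # x: (batch, d_model)
        scores = self.router(x)                # (batch, n_experts)
        weights, idx = scores.topk(self.top_k, dim=-1)
        weights = F.softmax(weights, dim=-1)   # normalize over chosen experts
        out = torch.zeros_like(x)
        # Only the selected experts run for each input; this sparsity is
        # the source of MoE's efficiency at large scale.
        for slot in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = idx[:, slot] == e
                if mask.any():
                    out[mask] += weights[mask, slot:slot+1] * expert(x[mask])
        return out

layer = MoELayer()
print(layer(torch.randn(8, 64)).shape)  # torch.Size([8, 64])
```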
Wednesday, June 26, 2024
4:00 PM – 5:00 PM (UTC)
Professor, University of Houston
Graduate Researcher, Rice University
Graduate Researcher, Rice University