Yizhou Liu (刘逸舟)
PhD student @ MIT MechE
Email: liuyz@mit.edu
Hello! My name is Yizhou (⛵) and I am a PhD student in MechE at MIT, working with Prof. Jeff Gore. I am fascinated by the concept of “emergence”: how complex phenomena like life and intelligence arise from simpler components. My approach is to use methods from physics to turn these intuitive ideas into something solid. Currently, my research focuses on two areas:
- Ecology and Evolution: I study how species form communities and maintain their diversity and stability. My work emphasizes mechanistic understanding, examining how specific interaction mechanisms (like competition for food or space) shape community dynamics and evolution.
- Physics of AI: I investigate the fundamental elements of artificial intelligence (optimizers, data, and architectures) to understand how they generate observed phenomena like neural scaling laws. I hope to use this understanding to design more powerful AI systems.
Prior to my graduate studies, I received my B.E. from the Tsien Excellence in Engineering Program at Tsinghua University. During my time there, I worked with Professors Tongyang Li, Weijie Su, Shunlong Luo, and Min Chen on various topics.
While my research spans diverse topics, the methods remain similar. In the sense of F. Dyson’s essay Birds and Frogs, I would classify myself as a “bird”. More thoughts are presented below:
- Topics: University departments are mainly organized by subject matter, but methodology, taste, and analytical approach can bridge disciplines. My work is unified by methodology rather than field boundaries.
- Relevance: Although it is hard to see whether a work will be immediately useful, it is relatively easy to see whether it can ever be useful, which defines its relevance to reality. In general, I want to do more useful work.
- Elegance: I seek results and principles that are both simple and non-trivial.
- Methods: One Two Three (dynamical systems)…Infinity (statistical mechanics). In One Two Three…Infinity by G. Gamow, there is a story that Hottentot tribes have no words for numbers larger than three and simply say “many”. Similarly, our best analytical understanding comes from either simple systems or thermodynamic limits.
You may see how these principles have gradually shaped my research through the selected publications below. For a complete list, please visit my Google Scholar profile.
news
- Sep 18, 2025: Our paper, Superposition Yields Robust Neural Scaling, has been accepted at NeurIPS 2025 as an oral. See you in San Diego!
- Sep 16, 2025: Presenting our work on ecosystem stability at the CMT Kids Seminar, Harvard University
- Aug 22, 2025: Going to NEMI 2025
selected publications
- Superposition Yields Robust Neural Scaling. The Thirty-Ninth Annual Conference on Neural Information Processing Systems (Oral), Dec 2025. Superposition means that models represent more features than they have dimensions, which is true for LLMs since there are too many things in language to represent. We find that superposition leads to a power-law decay of loss with width, explaining the observed neural scaling law.
- Ecosystem stability relies on diversity difference between trophic levels. Proceedings of the National Academy of Sciences, Dec 2024. Ecologists continue to debate how biodiversity influences ecosystem stability. We found that, after accounting for the trophic-level structure of species interactions, stability depends not on absolute diversity but on the diversity difference between trophic levels.