About me
I’m Luning Wang (王麓宁), currently an M.S. student at the University of Michigan.
Before joining U-Mich, I obtained my B.E. degree in 2024 from the Department of Electronic Engineering, Tsinghua University.
🎓 Education
- [09/2020~06/2024] B.E. Department of Electronic Engineering, Tsinghua University
- [08/2024~05/2026] M.S. Department of Electrical and Computer Engineering, University of Michigan
📖 Research
In my past research, I mainly focused on efficient algorithms for large language models, including compression and acceleration techniques for LLMs. I’m currently branching out into multimodal models and diffusion models. See my publications to learn more about my work.
I’m lucky to have worked with these groups:
- [09/2022~06/2024] NICS-EFC, Tsinghua Univ. [website]
I’m open to research collaboration opportunities in the fields of LLMs, multimodal models, and potentially other AI & data-science related areas!
💻 Internship
I have had the honor of working at several organizations, spanning both academia and industry. See my CV for more details.
- [07/2023~08/2023] HKU-IDS, Research Assistant. [Website]
- [09/2023~01/2024] ByteDance Data-TnS, Algorithm Research Intern. [Website]
- [02/2024~06/2024] Infinigence AI, Algorithm Research Intern. [Website]
I’m now actively looking for (research/engineering) intern opportunities in the fields of LLMs, multimodal models, and potentially other AI & data-science related areas!
- Prospective: summer 2025 (May ~ August), full-time, remote or on-site. Positions in either China or the United States work for me. Please contact me if there’s an opportunity!
📝 Publications
- [ENLSP NeurIPS Workshop’24] CSKV: Training-Efficient Channel Shrinking for KV Cache in Long-Context Scenarios. Luning Wang, Shiyao Li, Xuefei Ning, Zhihang Yuan, Shengen Yan, Guohao Dai, Yu Wang. [pdf] [github]
- [(Under review)] A Survey on Efficient Inference for Large Language Models. Zixuan Zhou*, Xuefei Ning*, Ke Hong*, Tianyu Fu, Jiaming Xu, Shiyao Li, Yuming Lou, Luning Wang, Zhihang Yuan, Xiuhong Li, Shengen Yan, Guohao Dai, Xiao-Ping Zhang, Yuhan Dong, Yu Wang. [pdf]
- [ICML’24] Evaluating Quantized Large Language Models. Shiyao Li, Xuefei Ning, Luning Wang, Tengxuan Liu, Xiangsheng Shi, Shengen Yan, Guohao Dai, Huazhong Yang, Yu Wang. [pdf] [github]
- [ENLSP NeurIPS Workshop’23] LLM-MQ: Mixed-precision Quantization for Efficient LLM Deployment. Shiyao Li, Xuefei Ning, Ke Hong, Tengxuan Liu, Luning Wang, Xiuhong Li, Kai Zhong, Guohao Dai, Huazhong Yang, Yu Wang. [pdf]
🌱 More about myself
In my leisure time, I love hanging out in the wild and travelling around (in a casual way). I’m really enthusiastic about exploring more places around the United States over the next two years!
I am a fan of fantasy and epic movies. The Lord of the Rings, Harry Potter, and Pirates of the Caribbean are some of my favourite masterpieces.
Also, I’ve been an avid collector of antiques (especially ancient coins 🪙) for many years. I’m always attracted by things that have a sense of age!