Hi, I'm Zheyu!
I am a first-year PhD student at the Technical University of Munich, advised by Gjergji Kasneci.
Before that, I studied at LMU Munich, where I obtained a bachelor's degree in computer science and mathematics and a master's degree in computational linguistics.
I am broadly interested in natural language generation as well as model robustness and generalization. Currently, my research focuses on synthetic data generation, particularly in data-scarce scenarios.
Email / GitHub / Google Scholar / LinkedIn / Twitter
News
Dec '24 🥳 I started my PhD journey!
Jan '24 🦞 One paper was accepted to EACL 2024, and I will be in Malta as a student volunteer!
Oct '23 🍻 One paper was accepted to the BabyLM Challenge @ CoNLL 2023!
mPLM-Sim: Better Cross-Lingual Similarity and Transfer in Multilingual Pretrained Language Models
Peiqin Lin, Chengzhi Hu, Zheyu Zhang, André F. T. Martins, Hinrich Schütze
EACL 2024 Findings
paper / code
We evaluated how well various mPLMs measure language similarity and used the results to improve zero-shot cross-lingual transfer performance.
Baby’s CoThought: Leveraging Large Language Models for Enhanced Reasoning in Compact Models
Zheyu Zhang, Han Yang, Bolei Ma, David Rügamer, Ercong Nie
BabyLM Challenge Shared Task @ CoNLL-CMCL 2023 (EMNLP 2023 Workshop)
paper / code / poster
We proposed using LLMs to restructure existing data into NLU examples for training compact LMs, demonstrating the effectiveness of synthetic data in training small LMs.
Academic Service
- Volunteering: EACL 2024
- Reviewing: ACL-ARR 2025, BabyLM Challenge 2023