Hi, I'm Zheyu!
I am a first-year PhD student at the Technical University of Munich, advised by Gjergji Kasneci.
Before that, I studied at LMU Munich, where I obtained a bachelor's degree in computer science and mathematics and a master's degree in computational linguistics.
I am broadly interested in natural language generation and in model robustness and generalization. Currently, my research focuses on synthetic data generation, particularly in data-scarce scenarios.
Email / GitHub / Google Scholar / LinkedIn / Twitter
Probabilistic Aggregation and Targeted Embedding Optimization for Collective Moral Reasoning in Large Language Models
Chenchen Yuan, Zheyu Zhang, Shuo Yang, Bardh Prenkaj, Gjergji Kasneci
ACL 2025 Findings
paper
We proposed a framework that aligns multiple LLMs by aggregating their moral judgments and optimizing targeted embeddings, improving the consistency and fidelity of their collective moral reasoning.
mPLM-Sim: Better Cross-Lingual Similarity and Transfer in Multilingual Pretrained Language Models
Peiqin Lin, Chengzhi Hu, Zheyu Zhang, André F. T. Martins, Hinrich Schütze
EACL 2024 Findings
paper / code
We evaluated the effectiveness of various mPLMs in measuring language similarity and improved zero-shot cross-lingual transfer performance based on the results.
Baby’s CoThought: Leveraging Large Language Models for Enhanced Reasoning in Compact Models
Zheyu Zhang, Han Yang, Bolei Ma, David Rügamer, Ercong Nie
BabyLM Challenge Shared Task, CoNLL-CMCL Workshop at EMNLP 2023
paper / code
We proposed using LLMs to restructure existing data into NLU examples for training compact LMs, demonstrating the effectiveness of synthetic data for small-LM training.
Academic Service
Volunteering: EACL 2024
Reviewing: ECML-PKDD 2025, ACL-ARR 2025, BabyLM Challenge 2023