Han Zhou

PhD student in NLP

University of Cambridge

Biography

I am a second-year PhD student in Computation, Cognition, and Language (NLP) at the Language Technology Lab, University of Cambridge, supervised by Prof. Anna Korhonen and Dr. Ivan Vulić. Previously, I interned with the Gemini (RADML) and CAIR teams at Google DeepMind (formerly Google Research).

Before starting my PhD, I read Engineering Science as an undergraduate at the University of Oxford. I then received a second Master's degree, an MSc in Machine Learning, from UCL, where I worked in the UCL NLP Lab and graduated top of my class.

My research interests center on modular, efficient, and reward-driven approaches to responsible intelligence.

Academic Services

Reviewer / program committee member for ACL (2023–24), EMNLP (2022–23), ICML (2024, external), NeurIPS (2024), and the NeurIPS R0-FoMo workshop (2023).

Interests
  • Large Language Models
  • Parameter-Efficient Fine-Tuning
  • Prompt Optimization
  • Modularity
Education
  • PhD in Computation, Cognition, and Language, Oct 2022 – present

    University of Cambridge

  • MSc in Machine Learning, 2020 - 2021

    University College London

  • BA, MEng in Engineering Science, 2015 - 2019

    University of Oxford

Publications

(2024). TopViewRS: Vision-Language Models as Top-View Spatial Reasoners. arXiv preprint arXiv:2406.02537.

(2024). Aligning with Human Judgement: The Role of Pairwise Preference in Large Language Model Evaluators. arXiv preprint arXiv:2403.16950.

(2024). Batch Calibration: Rethinking Calibration for In-Context Learning and Prompt Engineering. International Conference on Learning Representations (ICLR).

Featured on the Google AI Blog; Spotlight Talk at NeurIPS.

(2024). AutoPEFT: Automatic Configuration Search for Parameter-Efficient Fine-Tuning. Transactions of the Association for Computational Linguistics (TACL).

(2023). Can Large Language Models Achieve Calibration with In-Context Learning?. ICLR 2024 Workshop on Reliable and Responsible Foundation Models.

(2023). Survival of the Most Influential Prompts: Efficient Black-Box Prompt Search via Clustering and Pruning. Findings of the Association for Computational Linguistics (EMNLP).

(2023). A Systematic Study of Performance Disparities in Multilingual Task-Oriented Dialogue Systems. The 2023 Conference on Empirical Methods in Natural Language Processing (EMNLP).

(2023). Multi3WOZ: A Multilingual, Multi-Domain, Multi-Parallel Dataset for Training and Evaluating Culturally Adapted Task-Oriented Dialog Systems. Transactions of the Association for Computational Linguistics (TACL).

(2023). GreenPLM: Cross-Lingual Transfer of Monolingual Large Language Models at Almost No Cost. The 32nd International Joint Conference on Artificial Intelligence (IJCAI).

(2023). XQA-DST: Multi-Domain and Multi-Lingual Dialogue State Tracking. Findings of the Association for Computational Linguistics (EACL).

Experience
  • Student Researcher, Google Research, Responsible AI Team, Jun 2023 – Dec 2023, London, UK
  • Research Intern, Language Technology Lab, Mar 2022 – Sep 2022, Cambridge, UK
  • Research Student, UCL NLP Lab, May 2021 – Oct 2021, London, UK
  • Game Designer, Tencent, Jul 2019 – Aug 2020, Shenzhen, China
  • Research Intern, RACE, Jul 2018 – Sep 2018, Culham, UK
  • Research Intern, Battery Intelligence Lab, Jul 2017 – Sep 2017, Oxford, UK

Accomplishments

  • Awarded to advance research with Google Gemma models.
  • Awarded to present at the International Conference on Learning Representations (ICLR 2024).
  • Fully funded PhD studentship for research towards globally equitable language technologies.
  • Awarded for ranking first in the MSc Machine Learning cohort at UCL.
  • Awarded to a distinguished Engineering Science student at Oxford.

Contact

  • hz416 [at] cam.ac.uk (Cambridge)
  • LTL, University of Cambridge, UK