Zhongyu Ouyang

Ph.D. Student in Computer Science 💻

Cognitive Modeling in AI; Recommender Systems; Graph Learning

Dartmouth College 🌲

zhongyu.ouyang.gr [AT] dartmouth.edu

Biography

Hi 💚! My name is 欧阳忠宇 Zhongyu Ouyang, born and raised in Jiangxi, China 🇨🇳. I am a Ph.D. student in Computer Science at Dartmouth College, advised by Prof. Soroush Vosoughi. My research interests broadly lie in cognitive modeling in artificial intelligence (e.g., agentic memory, preference modeling), recommender systems (e.g., graph-based collaborative filtering, click-through-rate prediction), and graph representation learning across various domains (e.g., healthcare, mobile application promotion).

In the summers of 2024 and 2025, I was fortunate to intern at Microsoft AI as a data scientist in Redmond, WA. In fall 2025, I interned at Amazon AWS as an applied scientist in New York City.

Papers

Non-parametric Graph Convolution for Re-ranking in Recommendation Systems

Zhongyu Ouyang*, Mingxuan Ju*, Soroush Vosoughi, Yanfang Ye

RecSys'25: The 19th ACM Conference on Recommender Systems, *Equal Contribution

Scaled Supervision is an Implicit Lipschitz Regularizer

Zhongyu Ouyang, Chunhui Zhang, Yaning Jia, Soroush Vosoughi

ICWSM'25: The 19th International AAAI Conference on Web and Social Media

How to Improve Representation Alignment and Uniformity in Graph-based Collaborative Filtering?

Zhongyu Ouyang, Chunhui Zhang, Shifu Hou, Chuxu Zhang, Yanfang Ye

ICWSM'24: The 18th International AAAI Conference on Web and Social Media

Symbolic Prompt Tuning Completes the App Promotion Graph

Zhongyu Ouyang, Shifu Hou, Shang Ma, Chaoran Chen, Chunhui Zhang, Toby Li, Xusheng Xiao, Chuxu Zhang, Yanfang Ye

ECML PKDD'24: European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases

On the Influence of Connectivity to CTR Prediction

Zhongyu Ouyang, Chunhui Zhang, Yaning Jia, Soroush Vosoughi

Under Review

What Makes LLMs Effective Sequential Recommenders? A Study on Preference Intensity and Temporal Context

Zhongyu Ouyang*, Qianlong Wen*, Chunhui Zhang, Soroush Vosoughi, Yanfang Ye

Under Review, *Equal Contribution

Pretrained Image-Text Models are Secretly Video Captioners

Chunhui Zhang, Yiren Jian, Zhongyu Ouyang, Soroush Vosoughi

NAACL'25: Annual Conference of the North American Chapter of the Association for Computational Linguistics (Main Conference)

Working Memory Identifies Reasoning Limits in Language Models

Chunhui Zhang, Yiren Jian, Zhongyu Ouyang, Soroush Vosoughi

EMNLP'24: The Conference on Empirical Methods in Natural Language Processing

Disentangled Dynamic Heterogeneous Graph Learning for Opioid Overdose Prediction

Qianlong Wen, Zhongyu Ouyang, Jianfei Zhang, Yiyue Qian, Yanfang Ye, Chuxu Zhang

KDD'22: SIGKDD Conference on Knowledge Discovery and Data Mining

When Sparsity Meets Contrastive Models: Less Graph Data Can Bring Better Class-Balanced Representations

Chunhui Zhang, Chao Huang, Yiyue Qian, Qianlong Wen, Zhongyu Ouyang, Youhuan Li, Yanfang Ye, Chuxu Zhang

ICML'23: International Conference on Machine Learning

From Coarse to Fine: Enable Comprehensive Graph Self-supervised Learning with Multi-granular Semantic Ensemble

Qianlong Wen, Mingxuan Ju, Zhongyu Ouyang, Chuxu Zhang, Yanfang Ye

ICML'24: International Conference on Machine Learning

Graph Contrastive Learning with Cross-View Reconstruction

Qianlong Wen, Zhongyu Ouyang, Chunhui Zhang, Yiyue Qian, Chuxu Zhang

UAI'24: The 40th Conference on Uncertainty in Artificial Intelligence

YAGG:Ce transparent ceramics with high luminous efficiency for solid-state lighting application

Hui Hua, Shaowei Feng, Zhongyu Ouyang, Hezhu Shao, Haiming Qin, Hui Ding, Qiping Du, Zhijun Zhang, Jun Jiang, Haochuan Jiang

Journal of Advanced Ceramics (2019)

Vitæ

  • Amazon Sep 2025 - Dec 2025
    Applied Scientist Intern
    AWS Agentic AI
  • Microsoft June 2025 - Sep 2025
    Data Scientist Intern
    Microsoft AI
  • Dartmouth College Fall 2024 - Present
    Ph.D. Student in Computer Science
    Cognitive Modeling in AI; Recommender Systems
  • Microsoft May 2024 - Aug 2024
    Data Scientist Intern
    Microsoft AI
  • University of Notre Dame Spring 2022 - Spring 2024
    Research Assistant in Computer Science
    Recommender Systems
  • Case Western Reserve University Fall 2021
    Research Assistant in Computer Science
    Opioid overdose prediction over PDMP data
  • Qeexo June 2020 - Sep 2020
    Machine Learning Engineer
    Backend support for Qeexo AutoML
  • Georgia Institute of Technology Jan 2020 - Aug 2022
    Master's Degree
    Computer Science, minor in Machine Learning
  • AsiaInfo June 2019 - Aug 2019
    Data Scientist Intern
    Anomaly detection over health insurance claims
  • University of Pittsburgh Aug 2018 - Apr 2020
    Master's Degree
    Industrial Engineering
  • Tianjin University 2014 - 2018
    Bachelor's Degree
    Materials Forming & Control Engineering

Miscellaneous

I have a tuxedo cat named Yaya (牙牙, 'tooth' in Mandarin). He was born around Christmas 🎄 2021, and I adopted him in March 2023 in South Bend, IN. I originally named him Toothless after the dragon in How to Train Your Dragon, one of my favorite animated films.

My favorite director is Tim Burton, and I dressed up as Wednesday Addams last Halloween. My favorite artist is Taylor Swift, and I have been to her concerts in Shanghai (1989), Pittsburgh (Reputation), and Chicago (The Eras). I am also a fan of K-pop. Shout out to NewJeans, aespa, Twins, and other wonderful girl groups 📣!

Website Design

This website theme was designed by Prof. Martin Saveski (big gratitude!). The code used to build the website is in this GitHub repo.