Ting-Yun (Charlotte) Chang
Hi, I'm Ting-Yun Chang 張婷雲.
I am a 4th-year PhD student at USC CS,
co-advised by Jesse Thomason
and Robin Jia.
I am interested in (1) understanding large language models' behavior in order to control it,
and (2) multimodal learning of language and vision.
Previously, I earned my bachelor's and master's degrees in Computer Science in Taiwan.
I was advised by Yun-Nung (Vivian) Chen at National Taiwan University
and Chi-Jen Lu at Academia Sinica.
Publications
When Parts Are Greater Than Sums: Individual LLM Components Can Outperform Full Models
Ting-Yun Chang, Jesse Thomason, and Robin Jia
EMNLP 2024 (main)
[Paper] [Code] [Blog] [Video]
Do Localization Methods Actually Localize Memorized Data in LLMs? A Tale of Two Benchmarks
Ting-Yun Chang, Jesse Thomason, and Robin Jia
NAACL 2024 (main)
[Paper] [Code] [Slides] [Video]
CLiMB: A Continual Learning Benchmark for Vision-and-Language Tasks
Tejas Srinivasan, Ting-Yun Chang, Leticia Pinto Alva, Georgios Chochlakis, Mohammad Rostami, and Jesse Thomason
NeurIPS 2022 Datasets and Benchmarks Track
[Paper] [Code] [Video]
Go Beyond Plain Fine-tuning: Improving Pretrained Models for Social Commonsense
Ting-Yun Chang, Yang Liu, Karthik Gopalakrishnan, Behnam Hedayatnia, Pei Zhou, and Dilek Hakkani-Tür
IEEE SLT 2021
[Paper] [Slides]
Incorporating Commonsense Knowledge Graph in Pretrained Models for Social Commonsense Tasks
Ting-Yun Chang, Yang Liu, Karthik Gopalakrishnan, Behnam Hedayatnia, Pei Zhou, and Dilek Hakkani-Tür
DeeLIO Workshop @ EMNLP 2020 (Best Paper Award)
[Paper] [Slides]
TinyGAN: Distilling BigGAN for Conditional Image Generation
Ting-Yun Chang and Chi-Jen Lu
Asian Conference on Computer Vision (ACCV) 2020
[Paper] [Code] [Demo] [Video]
What Does This Word Mean? Explaining Contextualized Embeddings with Natural Language Definition
Ting-Yun Chang and Yun-Nung Chen
EMNLP 2019
[Paper] [Thesis] [Code]
TA
- USC CS544 Applied Natural Language Processing (Fall 2024)
- USC CS467 Introduction to Machine Learning (Spring 2023)
- NTU CSIE Applied Deep Learning (Spring 2019)