Ting-Yun (Charlotte) Chang
Hi, I'm Ting-Yun Chang 張婷雲.
I am a 4th-year PhD student at USC CS,
co-advised by Jesse Thomason
and Robin Jia.
I am interested in understanding, and ultimately controlling, large language models' behavior, e.g., studying the internals of LLMs to improve their few-shot learning capabilities. Recently, I have been working on reducing LLM quantization errors by studying what causes them.
Previously, I completed my bachelor's and master's degrees in Taiwan, both in Computer Science,
advised by Yun-Nung (Vivian) Chen at National Taiwan University
and Chi-Jen Lu at Academia Sinica.
Experience
Applied Scientist Intern, Amazon Alexa AI (Spring 2020)
Publications
Why Do Some Inputs Break Low-Bit LLM Quantization?
Ting-Yun Chang, Muru Zhang, Jesse Thomason, and Robin Jia
Preprint, 2025
[Paper]

When Parts Are Greater Than Sums: Individual LLM Components Can Outperform Full Models
Ting-Yun Chang, Jesse Thomason, and Robin Jia
EMNLP 2024 (main)
[Paper] [Code] [Blog] [Video]

Do Localization Methods Actually Localize Memorized Data in LLMs? A Tale of Two Benchmarks
Ting-Yun Chang, Jesse Thomason, and Robin Jia
NAACL 2024 (main)
[Paper] [Code] [Slides] [Video]

Go Beyond Plain Fine-tuning: Improving Pretrained Models for Social Commonsense
Ting-Yun Chang, Yang Liu, Karthik Gopalakrishnan, Behnam Hedayatnia, Pei Zhou, and Dilek Hakkani-Tür
IEEE SLT 2021
[Paper] [Slides]

Incorporating Commonsense Knowledge Graph in Pretrained Models for Social Commonsense Tasks
Ting-Yun Chang, Yang Liu, Karthik Gopalakrishnan, Behnam Hedayatnia, Pei Zhou, and Dilek Hakkani-Tür
DeeLIO Workshop @ EMNLP 2020 (best paper award)
[Paper] [Slides]

TinyGAN: Distilling BigGAN for Conditional Image Generation
Ting-Yun Chang and Chi-Jen Lu
Asian Conference on Computer Vision 2020
[Paper] [Code] [Demo] [Video]

What Does This Word Mean? Explaining Contextualized Embeddings with Natural Language Definition
Ting-Yun Chang and Yun-Nung Chen
EMNLP 2019
[Paper] [Thesis] [Code]
Teaching Assistant
USC CS544 Applied Natural Language Processing (Fall 2024)
USC CS467 Introduction to Machine Learning (Spring 2023)