Alumni

Past Graduate Students


Seungwoo Son

Compress Gigantic Transformers, but Efficiently 🤑
M.S. @ POSTECH EE (22.03–24.06)
Keywords: Model Compression
webpage, mail


Hagyeong Lee

Data Compression, but for more than what we see 🔮
M.S. @ POSTECH EE (22.09–24.06)
Keywords: Data Compression, Model Bias, Vision-Language Model
webpage, mail, twitter

Jiwoon Lee

Efficient ML in the Wild 🐊
M.S. @ POSTECH EE (22.03–24.02)
Keywords: Model Merging, Federated Learning, Knowledge Distillation
webpage, mail


Junwon Seo

Blazing-Fast Neural Field Generation đŸ”Ĩ
M.S. @ POSTECH EE (22.03–24.02)
Keywords: Neural Field, Training Efficiency, Implicit Bias of SGD
webpage, mail


Past Interns

Jeonghyun Choi (Winter ‘23)
Properties of Data Augmentation

Minjae Park (Winter ‘23; now at EffL)
Faster State-Space Models

Minyoung Kang (Fall ‘23–Winter ‘23)
Neural Cellular Automata

Yousung Roh (Fall ‘23–Winter ‘23)
Byte-Processing Neural Networks

Jiyun Bae (Summer ‘23–Fall ‘23; now at EffL)
Visual Prompt Tuning

Sangyoon Lee (Summer–Fall ‘23; now at EffL)
Fast Neural Field Generation

Jegwang Ryu (Summer ‘23; now at Samsung)
Test-time Training with Masked Modeling

Dohyun Kim (Summer ‘23; now at đŸĢĄ)
Zeroth Order Optimization

Juyun Wee (Spring ‘23 → EffL)
Time-Series Modeling with Transformers

Soochang Song (Winter ‘22–Spring ‘23; now exchange student at đŸ‡Ģ🇷)
Model Interpolation with SIRENs

Jeonghun Cho (Winter ‘22)
Pruning Models under Challenging Scenarios

Seyeon Park (Winter ‘21 → Yonsei)
Efficient Attentions for Language Models

Hagyeong Lee (Winter ‘21 → EffL)
Data Compression with Implicit Neural Representations