News

09.23  BugComp is accepted to NeurIPS 2023.

05.23  Joined UCSF as a Postdoctoral Scholar!

03.23  Defended my Ph.D. Thanks to Kangwook and everyone!

10.22  Happy to receive the NeurIPS Scholar Award.

10.22  BugComp is accepted to AMLC (ML for Code).

10.22  WALIP is accepted to EMNLP 2022 (Findings).

09.22  LIFT is accepted to NeurIPS 2022.

08.22  PROS is accepted to MobiCom 2022.

06.22  Started an internship at AWS AI Research and Education.

06.22  LIFT is available on arXiv.

06.22  WALIP is available on arXiv.

05.21  Coded-InvNet is accepted to ICML 2021.

04.20  I’m fortunate to be supervised by Prof. Kangwook Lee.


Deep Learning with Foundation Models

Link | Topic | Title | Summary | GitHub
NeurIPS'23 | LLM | Large Language Models of Code Fail at Completing Code with Potential Bugs | summary | code
  TL;DR: LLMs may fail drastically at completing functional code when potential bugs (a.k.a. anti-flow patterns) exist in the context.
EMNLP'22 (Findings) | Multimodal | Utilizing Language-Image Pretraining for Efficient and Robust Bilingual Word Alignment | summary | code
  TL;DR: Text-image correlation (via CLIP embeddings) can be efficiently combined with static word embeddings for robust word translation.
NeurIPS'22 | LLM | LIFT: Language-Interfaced Fine-Tuning for Non-Language Machine Learning Tasks | summary | code
  TL;DR: Pretrained LLMs, via a language interface, can be useful for learning non-language tasks, e.g., tabular data classification (a minimal sketch follows this table).
ICMLW'22 | GAN, PEFT | Improved Input Reprogramming for GAN Conditioning | summary | code
  TL;DR: Pretrained GANs can be efficiently repurposed (without modification) to conditionally generate samples in their support.
ICML'21 (Oral) | MLSys, GAN | Coded-InvNet for Resilient Prediction Serving Systems | summary | code
  TL;DR: Coded-InvNet is a coded-computation method combined with image-to-image translation that improves the resilience of ML prediction serving systems.
TPAMI'20 | GAN, Medical Imaging | Performing Group Difference Testing on Graph Structured Data from GANs: Analysis and Applications in Neuroimaging | code
  TL;DR: Analyzing when GAN-generated data yields conclusions similar to those from real training data in scientific or biomedical studies.
AAAI'20 (Oral) | Optimization, GAN | The Promise of Conditional Gradient Methods for Training Deep Models | code
  TL;DR: Conditional gradient methods can be used to train deep networks faster, with provably better generalization guarantees.
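
The LIFT entry above describes turning non-language tasks into text so that a pretrained LLM can be fine-tuned on them. Below is a minimal, hypothetical Python sketch of that serialization step; the feature names, prompt wording, and Iris-style example are illustrative assumptions, not the paper's exact templates.

# Toy sketch of the language-interfaced idea behind LIFT. The feature names,
# prompt template, and example values below are illustrative assumptions,
# not the exact prompts used in the paper.

def row_to_prompt(features, label=None):
    """Serialize one tabular row into a text prompt (plus a completion if labeled)."""
    parts = [f"{name} is {value}" for name, value in features.items()]
    prompt = ", ".join(parts) + ". What is the class?"
    completion = None if label is None else f" {label}"
    return prompt, completion

if __name__ == "__main__":
    # A labeled row becomes a (prompt, completion) pair for fine-tuning a
    # pretrained LLM; an unlabeled row becomes a query at inference time.
    train = {"sepal length": 5.1, "sepal width": 3.5, "petal length": 1.4, "petal width": 0.2}
    print(row_to_prompt(train, label="setosa"))
    test = {"sepal length": 6.3, "sepal width": 2.8, "petal length": 5.1, "petal width": 1.5}
    print(row_to_prompt(test))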


AI for Science and Healthcare

Link | Topic | Title | GitHub
MobiCom'22 | Healthcare | PROS: an Efficient Pattern-Driven Compressive Sensing Framework for Low-Power Biopotential-based Wearables with On-chip Intelligence | code
MobiSys'21 | Healthcare | WAKE: A Behind-the-ear Wearable System for Microsleep Detection
IEEE TMC'21 | Healthcare | Detection of Microsleep Events with a Behind-the-ear Wearable System
Oxford Journal'18 | Epidemiology | Forecasting Dengue Incidences: Statistical and Dynamic Models
CTAD'17 | Medical Imaging | Graph Imputation techniques for estimating amyloid positivity from longitudinal cognitive and MRI measurements for efficient secondary prevention trials
ACIIDS'16 (Oral) | Epidemiology | Forecasting the Magnitude of Dengue in Southern Vietnam


Patents

Link | Topic | Title
US 11087525 | AI Framework, Inverse Graphics | Unsupervised learning of three dimensional visual alphabet
US 16186121 | Algorithm, Training Framework | Training System for Artificial Neural Networks Having a Global Weight Constrainer