
AI Research Engineer
Working on deep learning model optimization so that everyone can enjoy the convenience of AI.
E-mail
[email protected]
[email protected]
LinkedIn / GitHub / Google Scholar
✨ Research Interests / Languages
- Computer Vision, Lightweight Deep Learning, Image Recognition, Model Compression, Image Compression, Vision Transformer, CNN
- Python, PyTorch, MATLAB, C++
📚 Education
Seoul National University of Science and Technology,
Department of Electrical and Information Engineering, Seoul, Korea.
Master of Science (21.03 ~ 23.02)
Thesis: Vision Transformer Performance Improvement and Pruning Techniques using Discrete Cosine Transform
Bachelor of Science (17.03 ~ 21.02)
Advisor: Hyun Kim
📝 Publications
arXiv Preprint
- Taesu Kim, Jongho Lee, Daehyun Ahn, Sarang Kim, Jiwoong Choi, Minkyu Kim, and Hyungjun Kim, “QUICK: Quantization-aware Interleaving and Conflict-free Kernel for efficient LLM inference”, arXiv:2402.10076, 2024
Conference Publications
- Nam Joon Kim*, Jong Ho Lee*, and Hyun Kim, “HyQ: Hardware-Friendly Post-Training Quantization for CNN-Transformer Hybrid Networks”, 33rd International Joint Conference on Artificial Intelligence (IJCAI 2024). (IF: 4, NRF BK21+) *Equally contributed authors