
http://slsp.kaist.ac.kr/xe/index.php?mid=mlt2016

※ A hands-on programming session is scheduled during Prof. 오혜연's lecture on Sunday, March 27 (shown with an orange background), so anyone interested in participating should bring a laptop.

Machine Learning Tutorial 2016 Program 
IEIE's Distinguished Lecturer Series for Machine Learning
Date / Time / Topic / Lecturer
2016-03-24 (Thursday)
09:00~11:00
11:00~11:10
11:10~12:00 Deep Learning for Natural Language Processing
1. Introduction
2. Word embedding and compositionality
3. RNNs and LSTMs for modeling sequential data
4. RNNs and LSTMs for NLP applications: sequence labeling, neural machine translation, question answering (QA), etc.
5. Recursive neural networks for parsing sequential data
6. Memory-based LSTMs: neural Turing machines, etc.
7. Discussion and conclusion
Lecturer: Prof. 나승훈, Chonbuk National University
12:00~13:00 Lunch
13:00~13:50 Deep Learning for Natural Language Processing
13:50~14:00 Break
14:00~14:50 Deep Learning for Natural Language Processing
14:50~15:10 Break    
15:10~16:30 Computational Genomics
1. Introduction to computational genomics
2. Gene prediction (based on HMMs) and motif discovery (based on the EM algorithm)
3. Transcriptome (microarray, RNA-seq, gene expression analysis)
4. Integrative approaches to genome-scale data sets
5. Applications in cancer genomics
Lecturer: Prof. 이현주, GIST
16:30~16:50 Break
16:50~18:10 Computational Genomics
2016-03-25 (Friday)
09:00~10:20 Bayesian Nonparametrics for Machine Learning
1. Introduction
2. Dirichlet process prior and its applications
3. Variants of the Dirichlet process prior
4. Beta process prior and its applications
Lecturer: Prof. 김용대, Seoul National University
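
For reference only (this is not part of the lecture materials): topic 2 of the outline above refers to the Dirichlet process prior, whose standard stick-breaking construction can be written as

G = \sum_{k=1}^{\infty} \pi_k \, \delta_{\theta_k}, \qquad
\theta_k \overset{\text{iid}}{\sim} H, \qquad
\pi_k = \beta_k \prod_{j<k} (1 - \beta_j), \qquad
\beta_k \overset{\text{iid}}{\sim} \mathrm{Beta}(1, \alpha),

which gives a random discrete measure G \sim \mathrm{DP}(\alpha, H); using G as a mixing distribution yields Dirichlet process mixture models, the kind of application the outline points to.
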
10:20~10:40 Break
10:40~12:00 Bayesian Nonparametrics for Machine Learning
12:00~13:00 Lunch    
13:00~14:20 Using Neural Networks for Modelling and Representing Natural Languages
1. Introduction
2. Basic machine learning applied to NLP
3. Introduction to neural networks
4. Distributed representations of words
5. Neural network based language models
6. Future research
7. Resources
Lecturer: Tomas Mikolov, Facebook
14:20~14:40 Break
14:40~16:00 Using Neural Networks for Modelling and Representing Natural Languages
16:00~16:20 Break    
16:20~17:40 Machine Learning for Biomedical Informatics
1. Introduction
2. Current research problems
3. Non-parametric Bayesian models for joint inference of multiple related tasks
4. Sparse structured regression models for knowledge-guided prediction
5. Summary
Lecturer: Prof. 손경아, Ajou University
17:40~18:00 Break
18:00~19:20 Machine Learning for Biomedical Informatics
2016-03-26 (Saturday)
09:00~10:20 Deep Learning in Vision
1. Introduction to deep learning
2. Convolutional neural networks
3. Image classification
4. Understanding and visualizing convolutional neural networks
5. Image captioning
Lecturer: Prof. 김준모, KAIST
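
For reference only, a minimal sketch of the kind of convolutional image classifier that topics 2 and 3 above refer to; this is not the lecturer's code, and it assumes PyTorch with dummy data.

# Minimal convolutional image classifier (illustrative sketch only).
import torch
import torch.nn as nn

class SmallCNN(nn.Module):
    def __init__(self, num_classes: int = 10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1),   # 3x32x32 -> 16x32x32
            nn.ReLU(),
            nn.MaxPool2d(2),                              # -> 16x16x16
            nn.Conv2d(16, 32, kernel_size=3, padding=1),  # -> 32x16x16
            nn.ReLU(),
            nn.MaxPool2d(2),                              # -> 32x8x8
        )
        self.classifier = nn.Linear(32 * 8 * 8, num_classes)

    def forward(self, x):
        x = self.features(x)
        return self.classifier(x.flatten(1))

# One training step on a random batch, just to show the shape of the loop.
model = SmallCNN()
opt = torch.optim.SGD(model.parameters(), lr=0.01, momentum=0.9)
images = torch.randn(8, 3, 32, 32)    # dummy batch of 8 RGB images
labels = torch.randint(0, 10, (8,))   # dummy class labels
loss = nn.CrossEntropyLoss()(model(images), labels)
opt.zero_grad()
loss.backward()
opt.step()
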
10:20~10:40 Break
10:40~12:00 Deep Learning in Vision
12:00~13:00 Lunch    
13:00~14:20 SNS Analysis and Recommendation System
1. Sentiment analysis
2. Sentiment analysis tools
3. The rumor detection problem in online social networks (OSNs)
4. Rumor detection analysis
5. Recommendation systems
6. Crowdsourcing systems
Lecturer: Prof. 정교민, Seoul National University
14:20~14:40 Break
14:40~16:00 SNS Analysis and Recommendation System
16:00~16:20 Break    
16:20~17:40 Large-scale Optimization for Deep Learning
1. Introduction and background
2. First-order methods for convex optimization
3. Second-order methods for convex optimization
4. Methods for non-convex optimization
5. Applications in machine learning
Lecturer: Prof. 신진우, KAIST
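
For reference only, a tiny illustration of the "first-order methods" topic above: gradient descent with momentum on a two-dimensional convex quadratic. This is not lecture material; it only assumes NumPy.

# Gradient descent with momentum on f(x) = 0.5 * x^T A x - b^T x.
import numpy as np

A = np.array([[3.0, 0.5], [0.5, 1.0]])   # symmetric positive definite
b = np.array([1.0, -2.0])
x_star = np.linalg.solve(A, b)           # exact minimizer, for comparison

x = np.zeros(2)
v = np.zeros(2)
lr, momentum = 0.1, 0.9
for t in range(200):
    grad = A @ x - b                     # gradient of f at the current point
    v = momentum * v + grad
    x = x - lr * v

print("distance to optimum:", np.linalg.norm(x - x_star))
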
17:40~18:00 Break
18:00~19:20 Large-scale Optimization for Deep Learning
2016-03-27 (Sunday)
10:00~11:20 Programming Exercises for Word2vec and Latent Dirichlet Allocation
1. Introduction
2. The Elice programming platform
3. Word2vec
4. Latent Dirichlet Allocation
5. Programming exercises
Lecturer: Prof. 오혜연, KAIST
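
Since this session is a hands-on exercise, the sketch below shows, for reference only, the kind of code involved. The actual exercises run on the Elice platform; this sketch merely assumes the gensim library (version 4 or later) and a toy corpus.

# Minimal sketch of training Word2vec and LDA on a toy corpus with gensim.
# This is NOT the official exercise code for the tutorial; it only
# illustrates the two models named in the outline. Assumes gensim >= 4.0.
from gensim.models import Word2Vec, LdaModel
from gensim.corpora import Dictionary

sentences = [
    ["machine", "learning", "tutorial"],
    ["deep", "learning", "for", "vision"],
    ["bayesian", "nonparametrics", "and", "machine", "learning"],
    ["word2vec", "learns", "word", "embeddings"],
]

# Word2vec: skip-gram word embeddings.
w2v = Word2Vec(sentences, vector_size=50, window=2, min_count=1, sg=1)
print(w2v.wv.most_similar("learning", topn=3))

# Latent Dirichlet Allocation: a 2-topic model over the same toy corpus.
dictionary = Dictionary(sentences)
corpus = [dictionary.doc2bow(doc) for doc in sentences]
lda = LdaModel(corpus=corpus, id2word=dictionary, num_topics=2, passes=10)
print(lda.print_topics())
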
11:20~11:40 Break
11:40~13:00 Programming Exercises for Word2vec and Latent Dirichlet Allocation
13:00~14:00 Lunch    
14:00~15:20 Then, Now and the Future of Deep Learning
1. Foundations of machine learning
2. The start of deep learning
3. Graphical models and the rise of deep learning
4. Recent research results
5. The future of deep learning
6. Summary and references
Lecturer: Prof. 유창동, KAIST
15:20~15:40 Break
15:40~17:00 Then, Now and the Future of Deep Learning

※ Due to copyright restrictions on the lecture content, recording or photographing all or any part of the lectures with mobile phones, cameras, or similar devices is not permitted. We appreciate your understanding.