
ECE656(00) Ubiquitous Networks

▣ Lecture outline

  In this semester, we will cover two different topics: IoT and deep learning. The first part of the lecture focuses on wireless ad hoc network technologies that provide low-energy, low-latency, lightweight communication for wireless sensor networks (WSNs) and mobile ad hoc networks (MANETs). The second part focuses on deep learning techniques in artificial intelligence. Deep learning is now applied not only to machine learning problems but also to other areas of information technology, and to engineering and science in general. We will discuss the basic concepts of deep learning, such as CNNs and RNNs, by studying the classical papers of machine learning and deep learning. We will then move on to more recent topics, such as reinforcement learning (RL) and generative adversarial networks (GANs), by reviewing recent research papers in deep learning.
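
  To make the CNN portion of the outline concrete, the short example below is a minimal NumPy sketch (not taken from the course materials) of the three building blocks of a convolutional layer: a 2-D convolution, a ReLU activation, and max pooling. The image, kernel, and sizes are arbitrary toy values chosen only for illustration.

import numpy as np

def conv2d(image, kernel):
    # Valid 2-D convolution (implemented as cross-correlation) of a single-channel image.
    kh, kw = kernel.shape
    oh, ow = image.shape[0] - kh + 1, image.shape[1] - kw + 1
    out = np.zeros((oh, ow))
    for i in range(oh):
        for j in range(ow):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

def relu(x):
    # Element-wise rectified linear unit.
    return np.maximum(x, 0.0)

# Toy 8x8 "image" and a 3x3 vertical-edge kernel (illustrative values only).
rng = np.random.default_rng(0)
image = rng.random((8, 8))
kernel = np.array([[1.0, 0.0, -1.0],
                   [1.0, 0.0, -1.0],
                   [1.0, 0.0, -1.0]])

feature_map = relu(conv2d(image, kernel))                  # 6x6 feature map
pooled = feature_map.reshape(3, 2, 3, 2).max(axis=(1, 3))  # 2x2 max pooling -> 3x3
print(feature_map.shape, pooled.shape)                     # (6, 6) (3, 3)

  In a real CNN, many such kernels are learned from data by backpropagation and the layers are stacked; frameworks such as TensorFlow or PyTorch provide these operations directly.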

 Professor : Lynn Choi (lchoi@korea.ac.kr, Engineering Bldg, #411, 3290-3249)

 Assistant : WonJoon Son (swj8905@korea.ac.kr, Engineering Bldg, #236, 3290-3896) 

 Time (Place) : Wednesday (1-2), Engineering Building #366

 Textbook :
"Deep Learning: Adaptive Computation and Machine Learning", Ian Goodfellow, MIT Press, 2016 

 Reference book : A Collection of Research Papers 

▣ Class notices

 

1. Lecture Note 1 was updated on September 3rd.

2. Lecture Note 2 was updated on September 10th.

3. Lecture Note 3 was updated on September 18th.

4. Lecture Note 4 was updated on October 1st.

5. Lecture Note 5 was updated on October 16th.

6. Lecture Note 6 was updated on October 16th.

7. Lecture Note 7 was updated on October 30th.

8. Lecture Note 8 was updated on November 6th.


▣ Lecture slide

 

  1.IoT and WSN.pdf

  2.WLAN_WPAN.pdf

  3.WSN_MAC_and_Power_Management.pdf

  4.Wakeup Scheduling.pdf

  5.Clock_Synchronization.pdf

  6.WSN_Routing_Protocols.pdf

  7.MANET.pdf

  8.Deep Learning.pdf

            



▣ Paper Presentation
 

    

 



 Reference
 

 Paper List : Paper List.docx

 

 

 Convolutional Neural Network (12 papers)

 

 - Rethinking the inception architecture for computer vision (2016), C. Szegedy et al.pdf

 - Inception-v4, inception-resnet and the impact of residual connections on learning (2016), C. Szegedy et al.pdf

 - Identity Mappings in Deep Residual Networks (2016), K. He et al.pdf

 - Deep residual learning for image recognition (2016), K. He et al.pdf

 - Spatial transformer network (2015), M. Jaderberg et al.pdf

 - Going deeper with convolutions (2015), C. Szegedy et al. - GoogLeNet.pdf

 - Very deep convolutional networks for large-scale image recognition (2014), K. Simonyan and A. Zisserman - VGGNet.pdf

 - Return of the devil in the details, delving deep into convolutional nets (2014), K. Chatfield et al.pdf

 - OverFeat Integrated recognition, localization and detection using convolutional networks (2013), P. Sermanet et al.pdf

 - Maxout networks (2013), I. Goodfellow et al.pdf

 - Network in network (2013), M. Lin et al.pdf

 - ImageNet classification with deep convolutional neural networks (2012), A. Krizhevsky et al.pdf

 

 

 Natural Language Processing / Recurrent Neural Network (12 papers)

 

 - Conditional random fields as recurrent neural networks (2015), S. Zheng and S. Jayasumana.pdf

 - Effective approaches to attention-based neural machine translation (2015), M. Luong et al.pdf

 - Generating sequences with recurrent neural networks (2013), A. Graves.pdf

 - Learning phrase representations using RNN encoder-decoder for statistical machine translation (2014), K. Cho et al.pdf

 - Memory networks (2014), J. Weston et al.pdf

 - Neural Architectures for Named Entity Recognition (2016), G. Lample et al.pdf

 - Sequence to sequence learning with neural networks (2014), I. Sutskever et al.pdf

 - Teaching machines to read and comprehend (2015), K. Hermann et al.pdf

 - Training and analysing deep recurrent neural networks (2013), M. Hermans and B. Schrauwen.pdf

 

 

 Optimization / Training Techniques (10 papers) 

 

 - Adam: A method for stochastic optimization (2014), D. Kingma and J. Ba.pdf

 - Batch normalization: Accelerating deep network training by reducing internal covariate shift (2015), S. Ioffe and C. Szegedy.pdf

 - Delving deep into rectifiers, Surpassing human-level performance on imagenet classification (2015), K. He et al.pdf

 - Dropout: A simple way to prevent neural networks from overfitting (2014), N. Srivastava et al.pdf

 - Improving neural networks by preventing co-adaptation of feature detectors (2012), G. Hinton et al.pdf

 - Learning long-term dependencies with gradient descent is difficult (1994), Y. Bengio et al.pdf

 - Learning representations by back-propagating errors (1986), D. Rumelhart et al.pdf

 - Random search for hyper-parameter optimization (2012) J. Bergstra and Y. Bengio.pdf

 - Training very deep networks (2015), R. Srivastava et al.pdf

 - ADADELTA: An adaptive learning rate method (2012), M. Zeiler.pdf

 

 

 Understanding / Generalization / Transfer (7 papers)

 

 - Distilling the knowledge in a neural network (2015), G. Hinton et al.pdf

 - Deep neural networks are easily fooled, High confidence predictions for unrecognizable images (2015), A. Nguyen et al.pdf

 - How transferable are features in deep neural networks? (2014), J. Yosinski et al.pdf

 - CNN features off-the-Shelf, An astounding baseline for recognition (2014), A. Razavian et al.pdf

 - Learning and transferring mid-Level image representations using convolutional neural networks (2014), M. Oquab et al.pdf

 - Visualizing and understanding convolutional networks (2014), M. Zeiler and R. Fergus.pdf

 - Decaf, A deep convolutional activation feature for generic visual recognition (2014), J. Donahue et al.pdf

 



 Reading List

 

 

