2019 Fall Deep Learning Seminar

Undergraduate deep learning seminar, Room 102, Jingyuan 5th Courtyard, Peking University, 2019

Seminar overview

  • Each session will have 1~3 presenters.
  • Each presenter should present a set of papers on a related topic.
  • All topics related to machine learning are welcome.

Homework: Choose 3~5 of your favorite papers, present them, and explain why you chose them.

If you want to attend, just send me an email.


Schedule

30/9 2019
Presenters: Chence Shi, Tianyuan Zhang
Slides: N/A, N/A
Related papers (Zhang):

Image-to-image translation
  • Image-to-image translation with conditional adversarial networks
  • Photographic image synthesis with cascaded refinement networks
  • Unpaired image-to-image translation using cycle-consistent adversarial networks
  • Unsupervised image-to-image translation networks
  • Multimodal unsupervised image-to-image translation

Style transfer & conditional normalization
  • A neural algorithm of artistic style
  • Perceptual losses for real-time style transfer and super-resolution
  • Arbitrary style transfer in real-time with adaptive instance normalization
  • A style-based generator architecture for generative adversarial networks
  • Semantic image synthesis with spatially-adaptive normalization
  • Few-shot unsupervised image-to-image translation
6/10 2019
Presenters: Baifeng Shi, Ziqi Pang
Slides: Shi: ppt, N/A
Related papers (Shi):
  • Internal Statistics of a Single Natural Image
  • "Zero-Shot" Super-Resolution using Deep Internal Learning
  • Blind Deblurring Using Internal Patch Recurrence
  • Semi-parametric Image Synthesis
13/10 2019
Presenters: Ziqi Pang, Chence Shi
Slides: Pang: Distillation ppt
Related papers (Pang):
  • Distilling the Knowledge in a Neural Network
  • Label Refinery: Improving ImageNet Classification through Label Progression
  • A Gift from Knowledge Distillation: Fast Optimization, Network Minimization and Transfer Learning
  • Paying More Attention to Attention: Improving the Performance of Convolutional Neural Networks via Attention Transfer
  • Structured Knowledge Distillation for Semantic Segmentation
  • Learning without Forgetting
  • Progress & Compress: A scalable framework for continual learning