• Deep Neural Networks

    I have been playing around with deep learning since 2011, switched to neuroscience (optogenetics and fMRI) and kernel methods from 2012 to 2014, and returned to deep learning in 2015.

    1. Bidirectional Backpropagation: Towards Biologically Plausible Error Signal Transmission in Neural Networks, PDF, GitHub project, 2017.

    2. Incremental Training of Neural Network with Knowledge Distillation, PDF, GitHub project. Done by David Heryanto (now at KPMG) as a Final-Year Project under my supervision, 2015.

    3. Learning Hierarchical Sparse Filters for Feature Matching, PDF, Master's thesis on deep learning for feature matching, 2012.
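    The knowledge-distillation project above trains a smaller student network to mimic a teacher's softened output distribution. A minimal sketch of the distillation loss itself (function names and the temperature value are my own choices for illustration, not taken from the project's code):

    ```python
    import numpy as np

    def softmax(z, T=1.0):
        """Temperature-scaled softmax along the last axis."""
        z = np.asarray(z, dtype=float) / T
        z = z - z.max(axis=-1, keepdims=True)  # subtract max for numerical stability
        e = np.exp(z)
        return e / e.sum(axis=-1, keepdims=True)

    def distillation_loss(student_logits, teacher_logits, T=2.0):
        """Cross-entropy between the teacher's softened distribution
        and the student's (the classic soft-target distillation term)."""
        p_teacher = softmax(teacher_logits, T)
        p_student = softmax(student_logits, T)
        return -np.sum(p_teacher * np.log(p_student + 1e-12), axis=-1).mean()
    ```

    A student whose logits match the teacher's incurs a lower loss than a mismatched one, since cross-entropy is minimized when the two distributions coincide; in full training this term is usually mixed with the ordinary hard-label loss.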

  • Hyperparameter Optimization

    We study both gradient-free (Bayesian optimization) and gradient-based (automatic differentiation and DQN) methods. This work was funded by Microsoft Azure Research.

    1. DrMAD: Distilling Reverse-Mode Automatic Differentiation for Optimizing Hyperparameters of Deep Neural Networks, GitHub project, IJCAI 2016.

    2. Tuning Hyperparameters of Deep Neural Networks using Bayesian Deep Neural Networks, PDF, GitHub project. Done by Yumeng Yin (now at IBM) as a Final-Year Project under my supervision, 2016.
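    The gradient-free branch centers on Bayesian optimization: fit a probabilistic surrogate (typically a Gaussian process) to the hyperparameter-to-loss pairs observed so far, then pick the next trial by optimizing an acquisition function over the surrogate. A toy one-dimensional sketch under my own assumptions (RBF kernel, lower-confidence-bound acquisition, grid search over the interval; none of this is taken from the listed projects):

    ```python
    import numpy as np

    def rbf(a, b, ls=0.3):
        """Squared-exponential kernel between two 1-D point sets."""
        d = a.reshape(-1, 1) - b.reshape(1, -1)
        return np.exp(-0.5 * (d / ls) ** 2)

    def gp_posterior(X, y, Xq, noise=1e-6):
        """GP posterior mean and std at query points Xq (zero prior mean)."""
        K = rbf(X, X) + noise * np.eye(len(X))
        Ks = rbf(X, Xq)
        Kinv = np.linalg.inv(K)
        mu = Ks.T @ Kinv @ y
        var = 1.0 - np.sum(Ks * (Kinv @ Ks), axis=0)
        return mu, np.sqrt(np.maximum(var, 1e-12))

    def bayes_opt(f, bounds=(0.0, 1.0), n_init=3, n_iter=10, seed=0):
        """Minimize f on an interval: GP surrogate + lower-confidence bound."""
        rng = np.random.default_rng(seed)
        X = rng.uniform(*bounds, size=n_init)   # random initial trials
        y = np.array([f(x) for x in X])
        grid = np.linspace(*bounds, 200)
        for _ in range(n_iter):
            mu, sd = gp_posterior(X, y, grid)
            x_next = grid[np.argmin(mu - 2.0 * sd)]  # LCB acquisition
            X = np.append(X, x_next)
            y = np.append(y, f(x_next))
        return X[np.argmin(y)], y.min()
    ```

    In practice `f` would be a full training run returning validation loss; the gradient-based alternative (as in DrMAD) instead differentiates through the training procedure itself to obtain hypergradients.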

  • Neural Abstract Machines, Probabilistic Reasoning & Program Induction

    Work in progress.