After this pre-training, the network can be fine-tuned with the available labelled data.
These techniques have already been used successfully in image recognition, where they
represent the current state of the art.
Work Description:
1. Literature study on MSI and Deep Neural Networks.
2. Create an artificial MSI dataset suited for benchmarking the classification method.
3. Recreate and adjust a Deep Neural Network in Python (Theano, Pylearn2) or Lua
(Torch7).
4. Pre-train the deep network in an unsupervised setting.
5. Fine-tune the network with the labelled data.
6. Test the trained network on the MSI dataset.
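Steps 4 and 5 above can be sketched in miniature: pre-train a one-layer autoencoder (with tied weights) on unlabelled data, then reuse its encoder to initialise a logistic classifier that is fine-tuned on the small labelled set. This is a minimal NumPy sketch with toy random data standing in for MSI spectra; the data, layer sizes, and labels are all hypothetical, and a real implementation would use a deep network in Theano/Pylearn2 or Torch7 as the work description specifies.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy data standing in for MSI spectra:
# many unlabelled spectra, few labelled ones (2 classes).
n_features = 16
X_unlab = rng.normal(size=(200, n_features))
X_lab = rng.normal(size=(20, n_features))
y_lab = (X_lab[:, 0] > 0).astype(float)  # hypothetical labels

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# --- Step 4: unsupervised pre-training (tied-weight autoencoder) ---
n_hidden = 8
W = rng.normal(scale=0.1, size=(n_features, n_hidden))
b = np.zeros(n_hidden)   # hidden bias
c = np.zeros(n_features)  # reconstruction bias
lr = 0.01
for _ in range(200):
    H = sigmoid(X_unlab @ W + b)   # encode
    X_rec = H @ W.T + c            # decode with tied weights
    err = X_rec - X_unlab          # reconstruction error
    dH = (err @ W) * H * (1 - H)   # gradient through the encoder
    W -= lr * (X_unlab.T @ dH + err.T @ H) / len(X_unlab)
    b -= lr * dH.mean(axis=0)
    c -= lr * err.mean(axis=0)

# --- Step 5: supervised fine-tuning of a logistic layer on top ---
v = np.zeros(n_hidden)
v0 = 0.0
for _ in range(500):
    H = sigmoid(X_lab @ W + b)     # features from the pre-trained encoder
    p = sigmoid(H @ v + v0)
    grad = p - y_lab               # cross-entropy gradient
    v -= lr * H.T @ grad / len(X_lab)
    v0 -= lr * grad.mean()
    # a full fine-tune would also backpropagate into W and b

acc = ((sigmoid(sigmoid(X_lab @ W + b) @ v + v0) > 0.5) == y_lab).mean()
```

The key point of the sketch is the hand-off: the weights W and b learned without labels become the initialisation for the supervised stage, which is exactly the pre-train/fine-tune split described in references [1] and [2].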
Literature (40%), Programming (30%), Writing (30%)
Profile:
Basic Python (or Lua) knowledge.
Courses in Machine Learning, Artificial Neural Networks, or Computer Vision are a plus.
Promotor:
Prof. Bart De Moor (bart.demoor@esat.kuleuven.be)
Department of Electrical Engineering (ESAT)
STADIUS
Daily supervisors:
Yousef El Aalamat (yousef.elaalamat@esat.kuleuven.be)
Nico Verbeeck (nico.verbeeck@esat.kuleuven.be)
Peter Roelants (peter.roelants@esat.kuleuven.be)
Number of students: 1
Suitable for: Bioi, WIT, AI, CS
References:
[1] Dumitru Erhan, Yoshua Bengio, Aaron Courville, Pierre-Antoine Manzagol, Pascal Vincent,
and Samy Bengio. Why Does Unsupervised Pre-training Help Deep Learning? Journal of Machine
Learning Research, 11(Feb):625–660, 2010.
[2] Honglak Lee, Roger Grosse, Rajesh Ranganath, and Andrew Y. Ng. Convolutional deep belief
networks for scalable unsupervised learning of hierarchical representations. In Proceedings
of the 26th Annual International Conference on Machine Learning (ICML '09), pages 609–616,
New York, NY, USA, 2009. ACM.