UIINFNP016 Deep learning
Faculty of Philosophy and Science in Opava, Winter 2022
- Extent and Intensity
- 2/2/0. 6 credit(s). Type of Completion: zk (examination).
- Teacher(s)
- doc. Ing. Petr Sosík, Dr. (lecturer)
Mgr. Tomáš Filip (seminar tutor)
doc. Ing. Petr Sosík, Dr. (seminar tutor)
- Guaranteed by
- doc. Ing. Petr Sosík, Dr.
- Institute of Computer Science – Faculty of Philosophy and Science in Opava
- Timetable
- Tue 8:55–10:30 B2
- Timetable of Seminar Groups:
- Prerequisites
- Elements of probability theory; differential multivariable calculus, partial derivatives and the gradient; basic knowledge of Python is an advantage.
- Course Enrolment Limitations
- The course is also offered to students of fields other than those with which it is directly associated.
- fields of study / plans the course is directly associated with
- Informatics (programme FPF, INFOR-nav)
- Course objectives
- Today's most successful branch of machine learning is loosely inspired by brain neurophysiology: it designs "neural" algorithms capable of learning from examples, generalizing knowledge, and finding approximate solutions to difficult problems. These algorithms typically run on farms of graphics cards (GPUs). The most common applications include classification tasks, image analysis and recognition, text comprehension and generation, and strategic decision-making.
- Learning outcomes
- The student will become acquainted with the basic mathematical and structural principles of deep learning and will be able to design and test deep networks for a variety of tasks such as classification, image analysis, text comprehension and generation, or strategic decision-making.
- Syllabus
- 1. Motivation and principles. Mathematical model of a neuron. The ability of artificial neural networks to learn from examples and to generalize learned data. Active, adaptive and organizational dynamics, types of training. The loss function and its role in network training. Overfitting and underfitting.
2. Perceptron - a basic model of a neural network for supervised learning. Minimization of the loss function, use of gradient methods. The backpropagation algorithm, its description and mathematical derivation (see the equations and the perceptron sketch after this list).
3. Hyperparameters, regularization, and optimizers to increase the speed of training and to improve the quality of the training results (illustrated together with the convolutional-network sketch below).
4. Convolutional networks for computer vision - principles, graphical representation, recent results. Deep architectures with special types of layers: convolutional layers and max-pooling layers (a minimal Keras sketch follows this list).
5. Recurrent networks for sequence data - texts, sequences of images (video), music recordings and the like. The principle of recurrent network layers, unrolling in time. Special layer types: LSTM (Long Short-Term Memory) and GRU (Gated Recurrent Unit) (a minimal Keras sketch follows this list).
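As a pointer for topics 1 and 2, the equations below sketch the standard formal neuron model and the gradient-descent update at the core of backpropagation; the notation (inputs x_i, weights w_i, bias b, activation σ, learning rate η) is illustrative and not taken from the course materials.

```latex
% Formal neuron: weighted sum of inputs passed through an activation \sigma
y = \sigma(z), \qquad z = \sum_{i=1}^{n} w_i x_i + b

% Gradient of the loss L w.r.t. a weight (chain rule, the core of backpropagation)
\frac{\partial L}{\partial w_i} = \frac{\partial L}{\partial y}\,\sigma'(z)\,x_i

% Gradient-descent update with learning rate \eta
w_i \leftarrow w_i - \eta\,\frac{\partial L}{\partial w_i}
```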
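The following minimal Python sketch shows a single-layer perceptron trained by gradient descent, the simplest case of the update above; the toy data, learning rate and epoch count are invented for illustration, not taken from the course.

```python
# Single-layer perceptron trained by gradient descent on a toy
# linearly separable binary-classification task (illustrative sketch).
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 2-D points in [0,1)^2, label 1 iff x0 + x1 > 1.
X = rng.random((200, 2))
t = (X.sum(axis=1) > 1.0).astype(float)

w = np.zeros(2)
b = 0.0
eta = 0.5  # learning rate

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for epoch in range(100):
    y = sigmoid(X @ w + b)             # forward pass
    grad = y - t                       # dL/dz for sigmoid + cross-entropy loss
    w -= eta * (X.T @ grad) / len(X)   # gradient-descent update of weights
    b -= eta * grad.mean()             # gradient-descent update of bias

print("training accuracy:", ((sigmoid(X @ w + b) > 0.5) == t).mean())
```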
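Topics 3 and 4 can be illustrated together in Keras, the library used in Chollet's required textbook. The sketch below is a hypothetical minimal example rather than the course's own code: convolutional and max-pooling layers handle the vision part, while dropout (a regularizer) and the explicit optimizer choice are examples of hyperparameters.

```python
# A small convolutional network for 28x28 grayscale images (e.g. MNIST),
# with dropout regularization and an explicitly chosen optimizer.
from tensorflow import keras
from tensorflow.keras import layers

model = keras.Sequential([
    keras.Input(shape=(28, 28, 1)),
    layers.Conv2D(32, (3, 3), activation="relu"),  # convolutional layer
    layers.MaxPooling2D((2, 2)),                   # max-pooling layer
    layers.Conv2D(64, (3, 3), activation="relu"),
    layers.MaxPooling2D((2, 2)),
    layers.Flatten(),
    layers.Dropout(0.5),                           # regularization
    layers.Dense(10, activation="softmax"),        # 10-class output
])

# Hyperparameters such as the optimizer and its learning rate are set here.
model.compile(optimizer=keras.optimizers.Adam(learning_rate=1e-3),
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```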
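For topic 5, a correspondingly minimal recurrent model for token sequences; again a sketch with placeholder vocabulary and layer sizes, where layers.GRU could replace layers.LSTM as the gated alternative.

```python
# A minimal recurrent model for sequence data, e.g. binary sentiment
# classification of tokenized text (sizes are placeholder values).
from tensorflow import keras
from tensorflow.keras import layers

model = keras.Sequential([
    keras.Input(shape=(None,)),                    # variable-length token sequences
    layers.Embedding(input_dim=10000, output_dim=64),
    layers.LSTM(32),                               # recurrent LSTM layer
    layers.Dense(1, activation="sigmoid"),         # binary classification head
])

model.compile(optimizer="rmsprop",
              loss="binary_crossentropy",
              metrics=["accuracy"])
model.summary()
```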
- Literature
- required literature
- Chollet, F. Deep learning v jazyku Python. Grada, Praha, 2019.
- Šíma, J., Neruda, R. Teoretické otázky neuronových sítí. 1996.
- recommended literature
- Goodfellow, I., Bengio, Y., Courville, A. Deep Learning. MIT Press, 2016. Available online.
- Teaching methods
- Interactive lecture
Lecture with video analysis
- Assessment methods
- Individual projects and exercises to be solved at home.
- Language of instruction
- Czech
- Teacher's information
- 1. Theoretical and practical exercises assigned at the seminar.
2. A final practical deep-learning project.
3. At least 50% of the points from theoretical problems covering the entire content of the course.
- Permalink: https://is.slu.cz/course/fpf/winter2022/UIINFNP016