FPF:UIN3042 Artificial Neural Networks - Course Information
UIN3042 Artificial Neural Networks
Faculty of Philosophy and Science in Opava, Winter 2020
- Extent and Intensity
- 2/2/0. 6 credit(s). Type of Completion: zk (examination).
- Teacher(s)
- doc. Ing. Petr Sosík, Dr. (lecturer)
doc. Ing. Petr Sosík, Dr. (seminar tutor)
Mgr. Tomáš Filip (seminar tutor)
- Guaranteed by
- doc. Ing. Petr Sosík, Dr.
Institute of Computer Science – Faculty of Philosophy and Science in Opava
- Timetable
- Tue 9:45–11:20 B2
- Timetable of Seminar Groups:
- Prerequisites
- TYP_STUDIA(N)
- elements of probability theory
- multivariable differential calculus, partial derivatives, gradient
- basic knowledge of Python is an advantage
- Course Enrolment Limitations
- The course is also offered to students of fields other than those with which it is directly associated.
- fields of study / plans the course is directly associated with
- Computer Science and Technology (programme FPF, N1801 Inf)
- Course objectives
- Today's most successful branch of machine learning draws loose inspiration from brain neurophysiology to design "neural" algorithms capable of learning from examples, generalizing knowledge, and finding approximate solutions to difficult problems. These algorithms typically run on farms of graphics cards (GPUs). The most common applications include classification tasks, image analysis and recognition, text comprehension and generation, and strategic decision making.
- Learning outcomes
- The student will become acquainted with the basic mathematical and structural principles of deep learning and will be able to design and test deep networks for a variety of tasks such as classification, image analysis, text comprehension and generation, and strategic decision making.
- Syllabus
- 1. Motivation and principles. Mathematical model of a neuron. The ability of artificial neural networks to learn from examples and to generalize beyond the training data. Active, adaptive, and organizational dynamics; types of training. The loss function and its role in network training. Overfitting and underfitting.
- 2. The perceptron: a basic neural network model for supervised learning. Minimization of the loss function using gradient methods. The backpropagation algorithm: description and mathematical derivation (see the backpropagation sketch after this list).
- 3. Hyperparameters, regularization, and optimizers for speeding up training and improving the quality of the results (see the regularization sketch after this list).
- 4. Convolutional networks for computer vision: principles, graphical representation, recent results. Deep architectures with special layer types: convolutional layers and max-pooling layers (see the convolutional sketch after this list).
- 5. Recurrent networks for sequence data: texts, image sequences (video), music recordings, and the like. The principle of recurrent layers and their unrolling in time. Special layer types: LSTM (Long Short-Term Memory) and GRU (Gated Recurrent Unit) (see the LSTM sketch after this list).
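To make topics 1–2 concrete, here is a minimal sketch (not official course material) of a two-layer network trained by hand-derived backpropagation, minimizing a mean-squared-error loss by gradient descent; the XOR data, layer sizes, learning rate, and iteration count are arbitrary choices for illustration.

```python
# Sketch only: a tiny two-layer network trained with manually derived
# backpropagation (syllabus topics 1-2). The XOR data, layer sizes, and
# learning rate are arbitrary example values, not course-prescribed ones.
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)  # inputs
y = np.array([[0], [1], [1], [0]], dtype=float)              # XOR targets

W1, b1 = rng.normal(size=(2, 8)), np.zeros(8)  # hidden-layer parameters
W2, b2 = rng.normal(size=(8, 1)), np.zeros(1)  # output-layer parameters

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 0.5
for step in range(10000):
    h = sigmoid(X @ W1 + b1)        # forward pass (active dynamics)
    out = sigmoid(h @ W2 + b2)
    loss = np.mean((out - y) ** 2)  # mean-squared-error loss

    # Backward pass: the chain rule yields the gradient of the loss
    # with respect to every weight (adaptive dynamics).
    d_out = 2 * (out - y) / len(X) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= lr * h.T @ d_out
    b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h
    b1 -= lr * d_h.sum(axis=0)

print(f"final loss {loss:.4f}, outputs {out.ravel().round(2)}")
```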
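For topic 3, the following sketch shows typical regularization and optimizer choices in Keras, the library used in the required Chollet text; the L2 strength, dropout rate, and learning rate are arbitrary example hyperparameters, not values mandated by the course.

```python
# Sketch of regularization and optimizer choices (syllabus topic 3):
# an L2 weight penalty, dropout, and the Adam optimizer with an explicit
# learning rate. All numeric values are arbitrary examples.
from tensorflow import keras
from tensorflow.keras import layers, regularizers

model = keras.Sequential([
    layers.Input(shape=(20,)),
    layers.Dense(64, activation="relu",
                 kernel_regularizer=regularizers.l2(1e-4)),  # L2 penalty
    layers.Dropout(0.5),                                     # dropout regularization
    layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer=keras.optimizers.Adam(learning_rate=1e-3),
              loss="binary_crossentropy", metrics=["accuracy"])
```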
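For topic 4, a sketch of a small convolutional architecture alternating convolutional and max-pooling layers, again in Keras; the 28×28 grayscale input shape (MNIST-style) and the layer widths are assumptions made for the example.

```python
# Sketch of a small convolutional network (syllabus topic 4) with the two
# special layer types named in the syllabus: convolution and max-pooling.
# The architecture and input shape are example choices only.
from tensorflow import keras
from tensorflow.keras import layers

model = keras.Sequential([
    layers.Input(shape=(28, 28, 1)),                      # 28x28 grayscale images
    layers.Conv2D(32, kernel_size=3, activation="relu"),  # convolutional layer
    layers.MaxPooling2D(pool_size=2),                     # max-pooling layer
    layers.Conv2D(64, kernel_size=3, activation="relu"),
    layers.MaxPooling2D(pool_size=2),
    layers.Flatten(),
    layers.Dense(10, activation="softmax"),               # 10-class output
])
model.compile(optimizer="rmsprop",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```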
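For topic 5, a sketch of a recurrent model with an LSTM layer for sequence data; the vocabulary size, sequence length, and binary classification head are illustrative assumptions, not part of the syllabus.

```python
# Sketch of a recurrent network with an LSTM layer (syllabus topic 5),
# e.g. for classifying text sequences. Vocabulary size, sequence length,
# and layer widths are arbitrary example values.
from tensorflow import keras
from tensorflow.keras import layers

model = keras.Sequential([
    layers.Input(shape=(100,), dtype="int32"),   # sequences of 100 token ids
    layers.Embedding(input_dim=10000, output_dim=64),
    layers.LSTM(64),                             # recurrent layer with gated memory
    layers.Dense(1, activation="sigmoid"),       # e.g. binary text classification
])
model.compile(optimizer="rmsprop", loss="binary_crossentropy",
              metrics=["accuracy"])
model.summary()
```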
- Literature
- required literature
- Chollet, F. Deep learning v jazyku Python. Grada, Praha, 2019.
- Neruda, R., Šíma, J. Teoretické otázky neuronových sítí. Matfyzpress, Praha, 1996.
- recommended literature
- Goodfellow, I., Bengio, Y., Courville, A. Deep Learning. MIT Press, 2016. Available online.
- Teaching methods
- Interactive lecture
Lecture with video analysis
- Assessment methods
- Individual projects and exercises to be solved at home.
- Language of instruction
- Czech
- Further comments (probably available only in Czech)
- Study Materials
The course can also be completed outside the examination period.
- Teacher's information
- 1. Ongoing theoretical and practical exercises assigned in the seminars.
2. A final practical project in deep learning.
3. At least 50% of the points from the theoretical exercises covering the entire content of the course.
- Enrolment Statistics (Winter 2020, recent)
- Permalink: https://is.slu.cz/course/fpf/winter2020/UIN3042