FPF:UINA342 Artificial Neural Networks - Course Information
UINA342 Artificial Neural Networks
Faculty of Philosophy and Science in Opava, Winter 2018
- Extent and Intensity
- 2/2/0. 6 credit(s). Type of Completion: zk (examination).
- Teacher(s)
- doc. Ing. Petr Sosík, Dr. (lecturer)
doc. Ing. Petr Sosík, Dr. (seminar tutor)
- Guaranteed by
- doc. Ing. Petr Sosík, Dr.
Institute of Computer Science – Faculty of Philosophy and Science in Opava
- Prerequisites
- elementary propositional logic, logical connectives
- multivariable differential calculus, partial derivatives, gradient
- elements of object-oriented programming (Java, C#, etc.)
- Course Enrolment Limitations
- The course is also offered to students of fields other than those with which it is directly associated.
- fields of study / plans the course is directly associated with
- Computer Science and Technology (programme FPF, N1801 Inf)
- Course objectives
- This classical branch of Artificial Intelligence draws on mathematical models of the behavior of neural cells in living organisms. The result is a family of "neural" algorithms capable of learning from examples, generalizing knowledge, and searching for approximate solutions to intractable problems. These algorithms can be run on special parallel machines as well as on classical computers.
- Syllabus
- 1. The structure of a biological neuron, the mathematical model of a simple neuron and of a multi-layer neural network. Features and applications of artificial neural networks.
2. Active, adaptive and organizational dynamics, neural training schemes (supervised/unsupervised/reinforcement). Training and testing sets, the training process, the overfitting problem.
3. The perceptron and its training algorithm (see the illustrative sketch after this syllabus). Implementation of simple logic functions. Limited capabilities of the single-layer perceptron.
4. Multilayer networks and the Backpropagation (BP) algorithm. Modifications and improvements of the BP algorithm (training speed adjustment, the momentum term, gain adaptation).
5. Efficient methods for training the multilayer perceptron: conjugate-gradient methods, resilient propagation and further methods.
6. Hetero- and auto-associative networks, topology and training, synchronous and asynchronous models. The Hopfield model, stability and energy, storage capacity.
7. Radial Basis Function networks, organization and active dynamics. Three phases of training, properties, applications, a comparison with multilayer perceptron.
8. Competitive networks, the vector quantization problem, Lloyd's algorithm. The Kohonen training rule, the UCL, DCL and SCL variants of training.
9. Self-organizing maps (SOM), description and applications, the neighbourhood function, examples of one- and two-dimensional maps.
10. The ART networks, principles and properties, the vigilance function.
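As an orientation aid for syllabus item 3, the following is a minimal, hypothetical sketch (not part of the official course materials) of the classic perceptron training rule applied to a simple logic function. It is written in Java, since the prerequisites assume Java or C#; all class and variable names are illustrative only.

```java
// Illustrative sketch: a single threshold neuron trained with the classic
// perceptron rule to compute the logical AND function.
public class PerceptronDemo {

    // Threshold (Heaviside) activation of a simple neuron: 1 if w.x >= 0, else 0.
    static int activate(double[] x, double[] w) {
        double sum = 0;
        for (int j = 0; j < x.length; j++) sum += w[j] * x[j];
        return sum >= 0 ? 1 : 0;
    }

    public static void main(String[] args) {
        // Training set: a constant bias input of 1 followed by x1, x2; target = x1 AND x2.
        double[][] inputs = { {1, 0, 0}, {1, 0, 1}, {1, 1, 0}, {1, 1, 1} };
        int[] targets = { 0, 0, 0, 1 };

        double[] w = new double[3]; // weights (including the bias weight), initialised to zero
        double eta = 0.1;           // learning rate

        // Perceptron rule: w <- w + eta * (target - output) * input,
        // repeated over the training set until no example is misclassified.
        for (int epoch = 0; epoch < 100; epoch++) {
            int errors = 0;
            for (int i = 0; i < inputs.length; i++) {
                int delta = targets[i] - activate(inputs[i], w);
                if (delta != 0) {
                    errors++;
                    for (int j = 0; j < w.length; j++) w[j] += eta * delta * inputs[i][j];
                }
            }
            if (errors == 0) break; // converged: AND is linearly separable
        }

        for (double[] x : inputs) {
            System.out.printf("AND(%d, %d) = %d%n", (int) x[1], (int) x[2], activate(x, w));
        }
    }
}
```

Replacing the targets with those of XOR makes the loop exhaust its epoch limit without converging, which illustrates the limited capabilities of the single-layer perceptron mentioned in item 3.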
- Literature
- required literature
- KRIESEL, D. A Brief Introduction to Neural Networks, Zeta version. 2007.
- NERUDA, R., ŠÍMA, J. Teoretické otázky neuronových sítí. Matfyzpress, Praha, 1996.
- recommended literature
- SACKS, O. Muž, který si pletl manželku s kloboukem. Praha: Mladá Fronta, 1993.
- HERTZ, J. et al. Introduction to the Theory of Neural Computation. Addison-Wesley, New York, 1991.
- not specified
- MARČEK, D. Neuronové sítě a fuzzy časové řady. Opava: SU Opava, 2002. ISBN 80-7248-157-6.
- Teaching methods
- Interactive lecture
Lecture with video analysis
- Assessment methods
- Exam
- Language of instruction
- English
- Further comments (probably available only in Czech)
- The course can also be completed outside the examination period.
- Teacher's information
- 1. An individual programming project on training and testing artificial neural networks.
2. A score of at least 50% in a written exam covering all topics of the course.
- Enrolment Statistics (Winter 2018, recent)
- Permalink: https://is.slu.cz/course/fpf/winter2018/UINA342