
Neural Networks 2-AIN-132

Course information sheet

The aim of the course is to become acquainted with the basic concepts and learning algorithms of artificial neural networks and their use in solving various problems. Theoretical lectures are combined with practical modeling exercises in Python.
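As a flavor of the practical part, below is a minimal sketch of the kind of model built in the Python exercises: a binary perceptron trained with the classic perceptron learning rule, one of the first topics of the course. The toy data, names, and hyperparameters are illustrative assumptions, not actual course material.

 import numpy as np

 # Minimal binary perceptron sketch (illustrative assumption, not course code).
 rng = np.random.default_rng(0)

 # Toy linearly separable data: two Gaussian blobs with labels 0 and 1.
 X = np.vstack([rng.normal(-2.0, 1.0, (50, 2)), rng.normal(2.0, 1.0, (50, 2))])
 y = np.array([0] * 50 + [1] * 50)

 w = np.zeros(2)   # weight vector
 b = 0.0           # bias
 lr = 0.1          # learning rate

 for epoch in range(20):
     errors = 0
     for x, t in zip(X, y):
         out = int(np.dot(w, x) + b > 0)   # binary threshold unit
         if out != t:                      # misclassified: apply the learning rule
             w += lr * (t - out) * x
             b += lr * (t - out)
             errors += 1
     if errors == 0:                       # converged on the training set
         break

 print(f"stopped after epoch {epoch}, weights {w}, bias {b:.2f}")

The exercises define their own tasks and datasets; this sketch only illustrates the style of work.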

News

The lectures and exercises were partially revised last year: some older parts have been shortened and newer topics added. The syllabus has been updated, as has the grading of course activities.


Schedule

Type Day Time Location Teacher
Lecture Tuesday 09:50 - 11:20 online Igor Farkaš
Exercise Thursday 16:30 - 18:00 online Endre Hamerlik, Štefan Pócoš

Syllabus

No. Date Topic References
01 16.2. Conditions for passing the course. Introduction, inspiration from neurobiology, brief history of NNs, basic concepts. NNs with logical neurons. slides-L01 [U1/1][U3/1][U4/1][U5/1]
02 23.2. Binary and continuous perceptron: supervised learning, error functions, binary classification and regression, linear separability. Relation to the Bayesian classifier. slides-L02 [U1/1-3][U4/2]
03 02.3. Single-layer NNs: linear autoassociation, General Inverse model. Classification into n classes. Error functions, relation to information theory. slides-L03 Note: we start at 8:10 [U4/3][U5/4]
04 09.3. Multilayer perceptron: error back-propagation algorithm. Training, validation, testing. Model selection. [U1/4][U4/4]
05 16.3. Modifications of gradient methods, second order optimization, regularization. Optimization problems. [U1/15][U4/11]
06 23.3. Unsupervised learning, feature extraction, neural PCA model. Data visualization: self-organizing map (SOM) model. [U1/8-9][U5/7]
07 30.3. Sequential data modeling: feedforward NNs, relation to n-grams, partially and fully recurrent models, SRN model, BPTT and RTRL algorithms. [U4/8][U5/6]
08 06.4. Expansion of hidden representation: NS with radial basis functions (RBF), echo state network (ESN). [U1/5][U2]
09 13.4. Deep learning. Convolutional neural networks: introduction. [U3/6,9][U4/6]
10 20.4. More recent models: autoencoders, GRU, LSTM. [U3/14][U4/9.1-2]
11 27.4. Hopfield model: deterministic dynamics, attractors, autoassociative memory, sketch of the stochastic model. [U1/13][U5/9]
12 04.5. Stochastic recurrent models: basics of probability theory and statistical mechanics, Boltzmann machine, RBM model, Deep Belief Network. [U1/11][U3/16]
13 11.5. Recent advances in the field.

References

  • Farkaš I. (2016). Neural networks. Knižničné a edičné centrum FMFI UK v Bratislave. Slides for the lectures (not updated).
  • Haykin S. (2009). Neural Networks and Learning Machines (3rd ed.). Upper Saddle River, Pearson Education (available for study in the FMFI library, and also downloadable from the web). [U1]
  • Jaeger H. (2007). Echo-state network. Scholarpedia, 2(9):2330. [U2]
  • Goodfellow I., Bengio Y., Courville A. (2016). Deep Learning. MIT Press. [U3]
  • Zhang A. et al. (2020). Dive into Deep Learning. An interactive deep learning book with code, math, and discussions, based on the NumPy interface. [U4]
  • Kvasnička V., Beňušková Ľ., Pospíchal J., Farkaš I., Tiňo P. and Kráľ A. (1997). Úvod do teórie neurónových sietí (Introduction to the Theory of Neural Networks). Iris: Bratislava. [U5]

Conditions and grading

  • Submission of at least two (out of three) functioning projects during the semester (max. 3x5 = 15 points). The deadlines will be announced on the webpage. The projects will offer bonuses (max. 2 points).
  • The exercises will consist of small graded tasks (max. 17 points during the semester). You must earn at least 7 points from the exercises.
  • Passing the final oral exam (3 questions, 5 points each, pseudorandom choice). To register for the exam, you must have at least two functioning projects graded. The exam is compulsory; you must score at least 6 points on it.
  • The lectures are not compulsory, but you can get up to 3 points for participation.
  • Overall grading: A (50-46), B (45-41), C (40-36), D (35-31), E (30-26), Fx (25-0).

Project during the semester

  • The project, together with the source code, is to be submitted before the deadline. Late submissions are penalized by 1 point per day. A successful project (i.e. one with a well-functioning model) submitted more than 5 days after the deadline still counts toward the passing requirements, but earns no points.
  • The projects are graded mainly on content, but form (readability) is considered too. The content should be comprehensible, i.e. graphical outputs combined with explanatory text.
  • The model is to be implemented in Python and the project must be submitted as a PDF (no title page is required; the title and your name are enough).
  • If plagiarism is detected, the student automatically receives zero points for the project and will not be admitted to the exam.