Latest revision as of 15:02, 7 June 2022
Neural Networks 2-AIN-132
The aim of the course is to introduce the basic concepts and algorithms for learning in artificial neural networks and their use in solving various problems. Theoretical lectures are combined with practical Python modeling exercises.
News
Lectures and exercises were partially revised last year: some older parts were shortened and newer topics added. The syllabus and the grading of course activities have been updated accordingly.
Schedule
| Type | Day | Time | Location | Teacher |
|---|---|---|---|---|
| Lecture | Tuesday | 12:20 - 13:50 | Lecture hall B | Igor Farkaš |
| [Exercise](https://moodle.uniba.sk/moodle/inf11/course/view.php?id=734) | Thursday | 16:30 - 18:00 | online | Štefan Pócoš, Iveta Bečková |
Note: The first week will be online in MS Teams.
Syllabus
| No. | Date | Topic | References |
|---|---|---|---|
| 01 | 15.2. | Conditions for passing the course. Introduction, inspiration from neurobiology, brief history of NNs, basic concepts. NNs with logical neurons. | [U1/1][U3/1][U4/1][U5/1] |
| 02 | 22.2. | Binary and continuous perceptron: supervised learning, error functions, binary classification and regression, linear separability. Relation to the Bayesian classifier. | [U1/1-3][U4/2] |
| 03 | 01.3. | Single-layer NNs: linear autoassociation (generalized inverse model). Classification into n classes. Error functions, relation to information theory. | [U4/3][U5/4] |
| 04 | 08.3. | Multilayer perceptron: error back-propagation algorithm. Training, validation, testing. Model selection. | [U1/4][U4/4] |
| 05 | 15.3. | Modifications of gradient methods, second-order optimization, regularization. Optimization problems. | [U1/15][U4/11] |
| 06 | 22.3. | Unsupervised learning, feature extraction, neural PCA model. Data visualization: self-organizing map (SOM) model. | [U1/8-9][U5/7] |
| 07 | 29.3. | Sequential data modeling: feed-forward NNs, relation to n-grams, partially and fully recurrent models, SRN model, BPTT and RTRL algorithms. | [U4/8][U5/6] |
| 08 | 05.4. | Expansion of the hidden representation: radial basis function (RBF) networks, echo state network (ESN). | [U1/5][U2] |
| 09 | 12.4. | Deep learning. Convolutional neural networks: introduction. | [U3/6,9][U4/6] |
| 10 | 19.4. | Holiday (dean's order). | |
| 11 | 26.4. | More recent models: autoencoders, GRU, LSTM. | [U3/14][U4/9.1-2] |
| 12 | 03.5. | Hopfield model: deterministic dynamics, attractors, autoassociative memory, sketch of the stochastic model, modern versions. | [U1/13][U5/9] |
| 13 | 10.5. | Stochastic recurrent models: basics of probability theory and statistical mechanics, Boltzmann machine, RBM model, deep belief network. | [U1/11][U3/16] |
References
- Farkaš I. (2016). [Neural networks](http://dai.fmph.uniba.sk/courses/NN/neural-networks.2016.pdf). Knižničné a edičné centrum FMFI UK v Bratislave. Lecture slides (not updated).
- Haykin S. (2009). Neural Networks and Learning Machines (3rd ed.). Upper Saddle River: Pearson Education (available for study in the FMFI library, also downloadable from the web). [U1]
- Jaeger H. (2007). [Echo state network](http://www.scholarpedia.org/article/Echo_state_network). Scholarpedia, 2(9):2330. [U2]
- Goodfellow I., Bengio Y., Courville A. (2016). [Deep Learning](http://www.deeplearningbook.org). MIT Press. [U3]
- Zhang A. et al. (2020). [Dive into Deep Learning](https://d2l.ai/). An interactive deep learning book with code, math, and discussions, based on the NumPy interface. [U4]
- Kvasnička V., Beňušková, Pospíchal J., Farkaš I., Tiňo P. and Kráľ A. (1997). [Úvod do teórie neurónových sietí](http://dai.fmph.uniba.sk/courses/NN/UvodDoTeorieNS.pdf.zip). Iris: Bratislava. [U5]
Conditions and grading
- Submission of at least two (out of three) functioning projects during the semester (max. 3×10 = 30 points). Deadlines will be announced on the course page. The projects also offer bonuses (max. 4 points).
- The exercises consist of small graded tasks (32 points over the semester). You must earn at least 15 points from the exercises.
- Passing the final oral exam (3 questions chosen pseudorandomly, 30 points in total). To register for the exam, you must have at least two functioning projects graded. The exam is compulsory, and you must earn at least 12 points on it.
- Lectures are not compulsory, but you can earn up to 6 points for participation.
- **Overall grading:** A (100-91), B (90-81), C (80-71), D (70-61), E (60-51), Fx (50-0).
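The grade bands above can be read as a simple threshold lookup. As a minimal illustration (the function name and signature are hypothetical, not part of the course materials):

```python
def overall_grade(points: int) -> str:
    """Map a total point score (0-100) to the grade bands listed above.

    Illustrative sketch only: A (100-91), B (90-81), C (80-71),
    D (70-61), E (60-51), Fx (50-0).
    """
    bands = [(91, "A"), (81, "B"), (71, "C"), (61, "D"), (51, "E")]
    for threshold, grade in bands:
        if points >= threshold:
            return grade
    return "Fx"  # 50 points or fewer
```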
Projects during the semester
- The project, together with its source code, must be submitted before the deadline. Late submissions are penalized by 1 point per day. A successful project (i.e. with a well-functioning model) submitted more than 5 days after the deadline still counts toward the requirement, but earns no points.
- Projects are graded mainly on content, but form (readability) also counts. The content should be comprehensible, i.e. graphical outputs combined with explanatory text.
- The model must be implemented in Python, and the project must be submitted as a PDF (no title page is required; the title and your name are enough).
- If plagiarism is detected, the student automatically receives zero points for the project and will not be admitted to the exam.
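The late-submission rule above amounts to a small piece of arithmetic. A minimal sketch, assuming a per-project maximum of 10 points as stated in the grading section (the helper name is hypothetical):

```python
def project_points(base_points: int, days_late: int) -> int:
    """Points awarded for a successful project under the late rule above.

    Illustrative sketch: each late day costs 1 point; more than 5 days
    late, the project still counts toward the requirement but earns 0.
    """
    if days_late > 5:
        return 0
    return max(0, base_points - days_late)
```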