|
BIL5050 | Artificial Neural Systems | 3+0+0 | ECTS:7.5 | Year / Semester | Fall Semester | Level of Course | Second Cycle | Status | Elective | Department | DEPARTMENT of COMPUTER ENGINEERING | Prerequisites and co-requisites | None | Mode of Delivery | Face to face | Contact Hours | 14 weeks - 3 hours of lectures per week | Lecturer | -- | Co-Lecturer | None | Language of instruction | | Professional practice (internship) | None | | The aim of the course: | To introduce artificial neural systems (ANS): their fundamental concepts, models, and training methods. |
Programme Outcomes | CTPO | TOA | Upon successful completion of the course, the students will be able to: | | | PO - 1 : | understand what an ANN is and how it works | 1,4,5,8,9,10,15 | 1,3 | PO - 2 : | design and train feedforward networks | 1,4,5,8,15 | 1,3 | PO - 3 : | design and train feedback networks | 1,3,5,8,13,15 | 1,3 | PO - 4 : | explain how multi-layer ANNs work and how they are trained | 1,3,4,5,8,9,14,15 | 1,3 | PO - 5 : | design and train associative memory networks | 1,3,5,8,15 | 1,3 | PO - 6 : | design and apply convolutional neural networks | | | CTPO: Contribution to programme outcomes, TOA: Type of assessment (1: Written exam, 2: Oral exam, 3: Homework assignment, 4: Laboratory exercise/exam, 5: Seminar/presentation, 6: Term paper), PO: Learning outcome | |
Fundamental concepts and models of ANS. Single-layer perceptron classifiers. Multilayer feedforward networks. Single-layer feedback networks. Associative memories. Convolutional Neural Networks: architectures, convolution/pooling layers; case studies: AlexNet, VGGNet, DarkNet, ResNet, DenseNet. Recurrent Neural Networks (RNN) and Long Short-Term Memory (LSTM) networks.
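The single-layer perceptron classifiers listed above can be illustrated with a minimal sketch of the perceptron learning rule on a linearly separable problem. This is an illustrative example only, not course material: the bipolar AND data, learning rate, and function names are assumptions.

```python
# Minimal sketch of the single-layer perceptron learning rule, assuming a
# bipolar hard-limiter activation and a fixed learning rate. The AND-gate
# data set is an illustrative choice, not taken from the course.

def step(net):
    """Bipolar hard-limiter activation: +1 if net >= 0, else -1."""
    return 1 if net >= 0 else -1

def train_perceptron(samples, lr=0.5, epochs=20):
    """Perceptron rule: w <- w + lr * (d - o) * x, with the bias folded
    into x[0] = 1 so no separate bias term is needed."""
    w = [0.0, 0.0, 0.0]               # [bias weight, w1, w2]
    for _ in range(epochs):
        for x, d in samples:
            o = step(sum(wi * xi for wi, xi in zip(w, x)))
            if o != d:                # update only on misclassification
                w = [wi + lr * (d - o) * xi for wi, xi in zip(w, x)]
    return w

# Linearly separable AND problem in bipolar encoding; x[0] is the bias input.
data = [([1, -1, -1], -1), ([1, -1, 1], -1), ([1, 1, -1], -1), ([1, 1, 1], 1)]
w = train_perceptron(data)
correct = all(step(sum(wi * xi for wi, xi in zip(w, x))) == d for x, d in data)
print(correct)  # the perceptron converges on this separable data set
```

Because the data are linearly separable, the perceptron convergence theorem guarantees the rule finds a separating weight vector in finitely many updates.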
|
Course Syllabus | Week | Subject | Related Notes / Files | Week 1 | Fundamental concepts and models of ANS: Biological neurons, models of artificial neural networks (ANN), neural processing, learning and adaptation | | Week 2 | Neural network learning rules. Single-layer perceptron classifiers: Classification models, features, and decision regions | | Week 3 | Discriminant functions, linear machines and minimum-distance classification | | Week 4 | Nonparametric training concept. Single-layer continuous perceptron networks for linearly separable classification. Examples | | Week 5 | Multilayer feedforward networks: Linearly nonseparable pattern classification, delta learning rule for the multiperceptron layer, feedforward recall and error back-propagation training | | Week 6 | Learning factors | | Week 7 | Classifying and expert layered networks | | Week 8 | Single-layer feedback networks: Basic concepts of dynamical systems, mathematical foundations of discrete-time Hopfield networks, transient response of continuous-time networks | | Week 9 | Mid-term examination | | Week 10 | Relaxation modeling in single-layer feedback networks, example solutions of optimization problems | | Week 11 | Associative memories | | Week 12 | Convolutional Neural Networks: Architecture, convolution/pooling layers | | Week 13 | Case studies: AlexNet, VGGNet, ResNet, DenseNet | | Week 14 | DarkNet YOLOv3/v4 architecture and coding with C/C++ | | Week 15 | Recurrent and Long Short-Term Memory (LSTM) neural networks | | Week 16 | End-of-term exam | | |
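The discrete-time Hopfield network and associative-memory topics of Weeks 8 and 11 can be sketched as follows. This is an illustrative sketch, not course material: it assumes Hebbian outer-product storage with zeroed self-connections and asynchronous bipolar updates, and the 4-bit pattern is a made-up example.

```python
# Hedged sketch of a discrete-time Hopfield network used as an associative
# memory: store prototypes with the Hebbian rule W[i][j] = sum_k p_k[i]*p_k[j]
# (zero diagonal), then recall by repeated asynchronous sign updates.

def store(patterns):
    """Hebbian outer-product storage with self-connections zeroed out."""
    n = len(patterns[0])
    W = [[0.0] * n for _ in range(n)]
    for p in patterns:
        for i in range(n):
            for j in range(n):
                if i != j:
                    W[i][j] += p[i] * p[j]
    return W

def recall(W, probe, sweeps=5):
    """Asynchronous update x_i <- sgn(sum_j W[i][j] * x_j), a few sweeps."""
    x = list(probe)
    for _ in range(sweeps):
        for i in range(len(x)):
            net = sum(W[i][j] * x[j] for j in range(len(x)))
            x[i] = 1 if net >= 0 else -1
    return x

stored = [1, 1, -1, -1]            # single stored prototype (bipolar)
W = store([stored])
noisy = [1, -1, -1, -1]            # probe with one bit flipped
result = recall(W, noisy)
print(result == stored)            # the noisy probe relaxes to the prototype
```

Since the probe is closer (in Hamming distance) to the stored pattern than to its negation, the energy-minimizing dynamics settle on the stored prototype.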
1 | Jacek M. Zurada, Artificial Neural Systems, West Publishing Company | | |
1 | Simon Haykin, Neural Networks and Learning Machines, Pearson International Edition | | 2 | Mohamad H. Hassoun, Fundamentals of Artificial Neural Networks, The MIT Press | | |
Method of Assessment | Type of assessment | Week No | Date | Duration (hours) | Weight (%) | Project | 15 | 25/01/2021 | 10 | 50 | End-of-term exam | 16 | 17/01/2021 | 2.0 | 50 | |
Student Work Load and its Distribution | Type of work | Duration (hours pw) | No of weeks / Number of activity | Hours in total per term | Face-to-face education | 3 | 14 | 42 | Out-of-class study | 3 | 14 | 42 | Preparation for mid-term exam | 4 | 1 | 4 | Mid-term exam | 2 | 1 | 2 | Project | 25 | 1 | 25 | Preparation for end-of-term exam | 5 | 1 | 5 | End-of-term exam | 2 | 1 | 2 | Total work load | | | 122 |
|