Fall 2014, Computational Cognition, Prof. Manevitz. Posted Feb 1, 2015 (1:41 PM)


If you are looking for the old review page from 2012, look here. This year's requirements are similar, and the kinds of questions suggested there are still reasonable.


Exam Requirements


No outside materials are allowed.


I. Know the models we have discussed: their main aspects, what tasks they are good for, and their learning methods.


Neurons:

McCulloch-Pitts Neuron and its Sigmoid Approximation (a short sketch follows this list)

Leaky Integrate and Fire Neuron

Hodgkin-Huxley Neuron (not on Moed Aleph, the first sitting)
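For quick review, here is a minimal Python sketch of a McCulloch-Pitts unit and its sigmoid approximation (the AND-gate weights and thresholds below are illustrative numbers, not values from the lectures):

import math

def mcculloch_pitts(inputs, weights, threshold):
    # Binary threshold unit: outputs 1 iff the weighted sum reaches the threshold.
    s = sum(w * x for w, x in zip(weights, inputs))
    return 1 if s >= threshold else 0

def sigmoid_unit(inputs, weights, bias):
    # Smooth approximation: the hard threshold is replaced by a logistic function.
    s = sum(w * x for w, x in zip(weights, inputs)) + bias
    return 1.0 / (1.0 + math.exp(-s))

# Example: an AND gate (both inputs must be on).
print(mcculloch_pitts([1, 1], [1, 1], 2))      # 1
print(mcculloch_pitts([1, 0], [1, 1], 2))      # 0
print(sigmoid_unit([1, 1], [5.0, 5.0], -7.5))  # ~0.92, i.e. close to 1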


Networks:


Feedforward

Kohonen (equiprobable maps)

Hopfield (see the sketch after this list)

Sparse Distributed Memory (Slides from the SDM/TSDM lecture) (Paper on Hebrew-English)

Counter-Propagation

Liquid State Machine
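To make the contrast among these models concrete, here is a minimal NumPy sketch of a Hopfield network: Hebbian (outer-product) storage and asynchronous recall. The pattern and sizes are made-up illustrative values:

import numpy as np

def train_hopfield(patterns):
    # Hebbian outer-product rule; patterns are +/-1 vectors, no self-connections.
    n = patterns.shape[1]
    W = np.zeros((n, n))
    for p in patterns:
        W += np.outer(p, p)
    np.fill_diagonal(W, 0)
    return W / patterns.shape[0]

def recall(W, state, steps=100, seed=0):
    # Asynchronous updates: one randomly chosen unit at a time moves toward a stored attractor.
    rng = np.random.default_rng(seed)
    state = state.copy()
    for _ in range(steps):
        i = rng.integers(len(state))
        state[i] = 1 if W[i] @ state >= 0 else -1
    return state

pattern = np.array([1, -1, 1, -1, 1, -1, 1, -1])
W = train_hopfield(pattern[None, :])
noisy = pattern.copy()
noisy[0] = -noisy[0]      # corrupt one bit
print(recall(W, noisy))   # converges back to the stored pattern

Note that storage is one-shot (no iterative training), while recall is a dynamical settling process.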


One can find information on the basic neurons and networks in standard introductory neural network texts, e.g., Fausett or Haykin. Wikipedia is also a good source.


Notions of Time


Leaky Integrate and Fire (simulated in the sketch after this list)

Liquid State Machine

Temporal SDM
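What these three have in common is an explicit clock. As one concrete example, a minimal Euler-step simulation of a leaky integrate-and-fire neuron (all constants below are illustrative, not fitted to anything from the course):

# Leaky integrate-and-fire: tau * dV/dt = -(V - V_rest) + R * I(t),
# with a spike and a reset whenever V crosses the threshold.
tau, R = 10.0, 1.0                            # membrane time constant (ms), resistance
V_rest, V_th, V_reset = -65.0, -50.0, -65.0   # mV (illustrative values)
dt, T = 0.1, 100.0                            # Euler step and total duration (ms)

V, spikes = V_rest, []
for step in range(int(T / dt)):
    t = step * dt
    I = 20.0                                  # constant input current (made-up value)
    V += (dt / tau) * (-(V - V_rest) + R * I)
    if V >= V_th:
        spikes.append(t)
        V = V_reset                           # fire and reset
print(len(spikes), "spikes; first at t =", round(spikes[0], 1), "ms")

The point to notice is that the state advances in discrete time steps; time is part of the model itself, unlike in a static feedforward network.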


Understand the importance of Feature Selection.
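As one hedged illustration (not the specific method used in the course): with many more features than examples, as in fMRI voxel data, one commonly scores each feature against the labels and keeps only the top k before training a classifier:

import numpy as np

def top_k_features(X, y, k):
    # Rank features by |Pearson correlation with the label| and keep the k best.
    Xc = X - X.mean(axis=0)
    yc = y - y.mean()
    scores = np.abs(Xc.T @ yc) / (np.linalg.norm(Xc, axis=0) * np.linalg.norm(yc) + 1e-12)
    return np.argsort(scores)[::-1][:k]

rng = np.random.default_rng(0)
y = rng.integers(0, 2, size=100).astype(float)
X = rng.normal(size=(100, 50))
X[:, 3] += 2 * y                  # plant one informative feature
print(top_k_features(X, y, 5))    # feature 3 should rank first

With irrelevant features removed, the classifier has fewer parameters to fit and generalizes better from small samples, which is why feature selection matters in the fMRI applications below.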


II. Know the following applications (what each is trying to do, and how the networks are set up to do it)


NetTalk (Paper by Sejnowski) (Slides)

Reading the Mind (Paper by Boehm, Hardoon, Manevitz) (fMRI on cognitive tasks) (Slides)

Brain Modeling of Human Declarative Memory (Paper by Kolis, Gilboa, Manevitz) (Slides – first section of slides)

Topology and Robustness in LSM (Paper by Hazan and Manevitz) (Slides 1) (Slides 2)

Silent Reading Paper (Hazan, Manevitz, Peleg, Eviatar) (“Two Hemispheres – two models”) (Slides)


III. Slides and Other Material from the Course. Some of this may help with the above subjects.

You are not required to know all of it.




Below are materials from the course. Some links may appear more than once (and may duplicate links listed above), and you are not responsible for all the material here; however, it may be useful for looking up information.

You can also look in Wikipedia or in a standard neural network book (such as Fausett, Introduction to Neural Networks) for background on the computational neural networks.



1. Background

2. Background in Kinds of Neurons and Networks

3. NetTalk and Feedforward net

4. FEM and Feedforward net

5. Reading the Mind, one-class learning and feature representation

6. Hopfield

7. Sparse Distributed Memory (Associative Memory) and TSDM

8. EE and FM

9. Kohonen Algorithm and Self Organizing Maps

10. Liquid State Machines

11. Virtual Reality

12. Disambiguation in Silent Reading


1. Here are the slides from the first lecture

2. Here are the slides of the general overview of “Classical Neural Networks”

NetTalk Original Paper

NetTalk Data Set

Interview with Mitchell and Just

Hardoon-Manevitz-Boehm Lecture on One Class Reading the Brain

For work on Hopfield associative memory, consult any text on neural networks.

Sparse Distributed Memory

Temporal Sparse Distributed Memory

Talk on Using fMRI to discover alternative declarative associative memory