neural-networks.io

MIT open course on artificial intelligence, part 2 of 3

This page references the MIT 6.034 Artificial Intelligence open course, taught by Patrick Winston in Fall 2010. You can view the complete course at http://ocw.mit.edu/6-034F10. It is released under a Creative Commons BY-NC-SA license.

 

# 11. Learning: Identification Trees, Disorder

In this lecture, we build an identification tree based on yes/no tests. We start by arranging the tree based on tests that result in homogeneous subsets. For larger datasets, this is generalized by measuring the disorder of subsets.
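
As a rough illustration, here is a minimal Python sketch of the disorder measure, assuming the entropy-style formula from the lecture: the disorder of a subset is -sum(p_c * log2(p_c)) over its classes, and a test's quality is the size-weighted average disorder of the subsets it produces (lower is better).

```python
import math

def disorder(labels):
    """Disorder of one subset: -sum p_c * log2(p_c) over the classes present.
    A homogeneous subset has disorder 0; a 50/50 split has disorder 1."""
    n = len(labels)
    if n == 0:
        return 0.0
    total = 0.0
    for c in set(labels):
        p = labels.count(c) / n
        total -= p * math.log2(p)
    return total

def test_quality(subsets):
    """Average disorder of a test's subsets, weighted by subset size.
    The best test to put at the top of the tree is the one with the
    lowest value."""
    n = sum(len(s) for s in subsets)
    return sum(len(s) / n * disorder(s) for s in subsets)

# A test that splits 8 samples into one homogeneous subset and one mixed one:
print(test_quality([["yes", "yes", "yes"], ["yes", "no", "no", "no", "no"]]))
```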

 

# 12a: Neural Nets

In this video, Prof. Winston introduces neural nets and back propagation.
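
For concreteness, here is a minimal sketch of one backpropagation rule on a single sigmoid neuron with squared error. The weights, learning rate, and training pair are made-up values for illustration, not anything from the lecture.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# One sigmoid neuron with two inputs; the starting values are arbitrary.
w = [0.5, -0.3]
b = 0.1
rate = 0.5          # learning rate (an assumption, not from the lecture)
x = [1.0, 0.0]      # training input
target = 1.0        # desired output

for step in range(100):
    # Forward pass
    z = w[0] * x[0] + w[1] * x[1] + b
    out = sigmoid(z)
    # Backward pass: d(error)/dz for error = (out - target)^2 / 2,
    # using the convenient derivative sigmoid'(z) = out * (1 - out)
    delta = (out - target) * out * (1 - out)
    w = [wi - rate * delta * xi for wi, xi in zip(w, x)]
    b -= rate * delta

print(out)  # approaches the target as training proceeds
```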

 

# 12b: Deep Neural Nets

In this lecture, Prof. Winston discusses deep neural nets and modern breakthroughs in neural net research.

 

# 13. Learning: Genetic Algorithms

This lecture explores genetic algorithms at a conceptual level. We consider three approaches to how a population evolves towards desirable traits, ending with ranks of both fitness and diversity. We briefly discuss how this space is rich with solutions.
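
As a sketch of the simplest of those approaches, fitness-proportional selection (the lecture goes further, ranking by both fitness and diversity), here is one generation of selection, crossover, and mutation. The bitstring encoding, toy fitness function, and mutation rate are assumptions for illustration.

```python
import random

def fitness(genome):
    """Toy fitness: count of 1 bits (a stand-in for any real measure)."""
    return sum(genome)

def evolve(population, mutation_rate=0.05):
    """Produce the next generation: pick parents with probability
    proportional to fitness, cross them over, then mutate."""
    weights = [fitness(g) for g in population]
    next_gen = []
    for _ in range(len(population)):
        mom, dad = random.choices(population, weights=weights, k=2)
        cut = random.randrange(1, len(mom))          # single-point crossover
        child = mom[:cut] + dad[cut:]
        child = [bit ^ 1 if random.random() < mutation_rate else bit
                 for bit in child]                   # occasional mutation
        next_gen.append(child)
    return next_gen

pop = [[random.randint(0, 1) for _ in range(10)] for _ in range(20)]
for _ in range(30):
    pop = evolve(pop)
print(max(fitness(g) for g in pop))  # drifts toward all-ones genomes
```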

 

# 14. Learning: Sparse Spaces, Phonology

Why do "cats" and "dogs" end with different plural sounds, and how do we learn this? We can represent this problem in terms of distinctive features, and then generalize. We end this lecture with a brief discussion of how to approach AI problems.
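
As a toy illustration of the distinctive-feature idea, here is a sketch of the English plural rule: the suffix agrees in voicing with the stem's final sound, with an extra vowel after sibilants. The feature sets below are a simplified fragment for illustration, not the lecture's full machinery.

```python
# A tiny fragment of a distinctive-feature table (illustrative only).
VOICED = {"b", "d", "g", "v", "z", "m", "n", "l", "r", "w"}
SIBILANT = {"s", "z", "sh", "ch", "zh", "j"}

def plural_sound(final_sound):
    """Choose the plural suffix from features of the stem's last sound."""
    if final_sound in SIBILANT:
        return "iz"   # "horses", "judges"
    if final_sound in VOICED or final_sound in "aeiou":
        return "z"    # "dogs", "bees"
    return "s"        # "cats", "books"

print(plural_sound("g"), plural_sound("t"), plural_sound("s"))  # z s iz
```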

 

# 15. Learning: Near Misses, Felicity Conditions

To determine whether three blocks form an arch, we use a model which evolves through examples and near misses; this is an example of one-shot learning. We also discuss other aspects of how students learn, and how to package your ideas better.
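
Here is a minimal sketch of that evolve-by-examples loop in Python, assuming two of Winston's heuristics: require-link and forbid-link for near misses, plus generalization on positive examples. The relation tuples are illustrative stand-ins for a real structural description.

```python
def refine(model, example, is_positive):
    """One learning step. `model` maps relations to a status
    ('require', 'forbid', or 'optional'); `example` is a set of relations."""
    model = dict(model)
    if is_positive:
        # Generalize: a relation absent from a positive example can't be
        # essential, so demote it from 'require' to 'optional'.
        for rel, status in model.items():
            if status == "require" and rel not in example:
                model[rel] = "optional"
    else:
        # A near miss should differ from the model in exactly one relation.
        missing = [r for r, s in model.items()
                   if s != "forbid" and r not in example]
        extra = [r for r in example if r not in model]
        if len(missing) == 1 and not extra:
            model[missing[0]] = "require"     # require-link heuristic
        elif len(extra) == 1 and not missing:
            model[extra[0]] = "forbid"        # forbid-link heuristic
    return model

arch = {("supports", "left", "top"): "optional",
        ("supports", "right", "top"): "optional"}
# Near miss: same structure, except the two posts touch each other.
near_miss = {("supports", "left", "top"), ("supports", "right", "top"),
             ("touch", "left", "right")}
arch = refine(arch, near_miss, is_positive=False)
print(arch[("touch", "left", "right")])  # 'forbid': the posts must not touch
```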

 

# 16. Learning: Support Vector Machines

In this lecture, we explore support vector machines in some mathematical detail. We use Lagrange multipliers to maximize the width of the street given certain constraints. If needed, we transform vectors into another space, using a kernel function.
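
For concreteness, here is the resulting decision rule in a minimal Python sketch: classify an unknown u by the sign of sum_i(alpha_i * y_i * K(x_i, u)) + b. The support vectors, multipliers, and bias below are made-up numbers, not a trained model; finding the alphas is the optimization step the Lagrange-multiplier derivation sets up.

```python
import math

def rbf_kernel(x, u, sigma=1.0):
    """A radial-basis kernel: one way to work in a transformed space
    without ever computing the transformation explicitly."""
    dist2 = sum((a - b) ** 2 for a, b in zip(x, u))
    return math.exp(-dist2 / (2 * sigma ** 2))

support_vectors = [([1.0, 1.0], +1, 0.8),     # (x_i, y_i, alpha_i)
                   ([-1.0, -1.0], -1, 0.8)]
bias = 0.0

def classify(u):
    """Decision rule: sign of sum_i alpha_i * y_i * K(x_i, u) + b."""
    score = sum(alpha * y * rbf_kernel(x, u)
                for x, y, alpha in support_vectors) + bias
    return +1 if score >= 0 else -1

print(classify([0.9, 1.2]), classify([-1.1, -0.8]))  # +1 -1
```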

 

# 17. Learning: Boosting

Can multiple weak classifiers be used to make a strong one? We examine the boosting algorithm, which adjusts the weight of each classifier, and work through the math. We end with how boosting doesn't seem to overfit, and mention some applications.
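
Here is a minimal sketch of one round of that reweighting, assuming the standard AdaBoost formulas: the classifier's vote weight is alpha = 0.5 * ln((1 - error) / error), and sample weights are scaled by exp(+alpha) on mistakes and exp(-alpha) on correct calls, then renormalized. The labels and predictions are made-up.

```python
import math

def boost_step(weights, predictions, labels):
    """One AdaBoost round; assumes the weak classifier's weighted error
    is strictly between 0 and 1."""
    # Weighted error rate of this weak classifier.
    error = sum(w for w, p, y in zip(weights, predictions, labels) if p != y)
    alpha = 0.5 * math.log((1 - error) / error)   # classifier's vote weight
    # Reweight: misclassified samples gain weight, correct ones lose it,
    # so the next weak classifier focuses on the hard cases.
    weights = [w * math.exp(alpha if p != y else -alpha)
               for w, p, y in zip(weights, predictions, labels)]
    total = sum(weights)
    return [w / total for w in weights], alpha

labels      = [+1, +1, -1, -1]
predictions = [+1, -1, -1, -1]          # the weak learner misses sample 2
weights = [0.25] * 4
weights, alpha = boost_step(weights, predictions, labels)
print(weights)  # the missed sample now carries half the total weight
```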

 

# 18. Representations: Classes, Trajectories, Transitions

In this lecture, we consider the nature of human intelligence, including our ability to tell and understand stories. We discuss the most useful elements of our inner language: classification, transitions, trajectories, and story sequences.

 

# 19. Architectures: GPS, SOAR, Subsumption, Society of Mind

In this lecture, we consider cognitive architectures, including General Problem Solver, SOAR, Emotion Machine, Subsumption, and Genesis. Each is based on a different hypothesis about human intelligence, such as the importance of language and stories.