Thinking as a Hobby

The Neocortex: Modeling

In the last three entries I've given a broad overview of the neocortex and introduced a hypothesis about how it might work. To test these ideas, I want to simulate the neocortex in software.

The purpose of a model is to try to gain insights into the target system, the thing being modeled. The modeler has many, many decisions to make about what goes into the model. Ideally, you want to include those aspects that are going to give you useful insights, and you want to choose a level of detail that is appropriate. This is a bit of a catch-22, since if you knew which features were going to be important to gain the insights you want, you'd already have a very good understanding of the target system. So what you do is make educated guesses.

Jeff Hawkins' new company, Numenta, is trying to model cortical function using Bayesian modules. Bayesian inference is a mathematical method for updating a system's belief state based on evidence, or input. By modeling the neocortex at this level, they abstract away a lot of biological detail. Who knows? Maybe it will actually work.
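To make the idea of updating a belief state concrete, here's a minimal sketch of a discrete Bayesian update. The hypotheses and likelihood numbers are invented for illustration; they aren't from Numenta's actual system.

```python
# Hypothetical sketch of Bayesian inference: update a belief state
# (a probability distribution over hypotheses) given new evidence.
# The hypothesis names and numbers here are made up for illustration.

def bayes_update(prior, likelihood):
    """Return the posterior given priors and P(evidence | hypothesis)."""
    unnormalized = {h: prior[h] * likelihood[h] for h in prior}
    total = sum(unnormalized.values())
    return {h: p / total for h, p in unnormalized.items()}

# Before seeing the input, the module is agnostic about whether
# its receptive field contains an edge.
prior = {"edge": 0.5, "no_edge": 0.5}

# The observed input is much more probable if an edge is present.
likelihood = {"edge": 0.9, "no_edge": 0.2}

posterior = bayes_update(prior, likelihood)
# Belief shifts strongly toward "edge" (0.45 / 0.55, about 0.82).
```

Each module in such a hierarchy would repeat this kind of update as evidence flows in, with the posterior becoming the prior for the next step.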

Personally, I'm interested in what's going on at a finer-grained level. Some people simulate neural systems in extreme detail, modeling properties of individual cells, including the branching of dendrites and axons, the dispersal of neurotransmitters, and the function of ion channels. But there's a trade-off: models at that level of detail are so computationally expensive that they can only simulate a small number of cells, which makes it hard to model a system with billions of cells.

My model will fall somewhere between these two extremes. I'm using artificial neurons, but they are far, far simpler than biological neurons. They capture some of the salient features of real neurons, though. Each neuron receives input from a number of other neurons. At each moment, the neuron updates its internal state based on all this input, and sends its signal to all the other neurons that it is connected to.

I'm using artificial neurons from the class of "leaky integrators", which basically means they don't update their state instantaneously, but incrementally over time. It also means that if their activity has built up over time, they still produce residual output when you take away all input. One way to think of it is as a leaky bucket. Say you have a number of small hoses pouring into the bucket. It's going to take a while to fill up. But even when you turn off all the incoming hoses, it still has water, which slowly leaks out. In terms of electrical circuits, it's like a capacitor that builds up a charge, which then decays over time in the absence of current.
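The leaky bucket can be sketched in a few lines of code. This is a minimal leaky integrator, not my actual implementation; the decay and gain constants are illustrative assumptions.

```python
# A minimal leaky-integrator neuron, matching the bucket analogy:
# state rises incrementally while input flows in, and slowly
# decays ("leaks") once the input is removed.
# The decay and gain values below are illustrative assumptions.

class LeakyNeuron:
    def __init__(self, decay=0.9, gain=0.1):
        self.decay = decay  # fraction of state retained each step (the leak)
        self.gain = gain    # how strongly new input drives the state
        self.state = 0.0

    def step(self, total_input):
        # Leak a little of the old state, add a little of the new input.
        self.state = self.decay * self.state + self.gain * total_input
        return self.state

n = LeakyNeuron()
for _ in range(20):       # hoses on: state builds up gradually
    n.step(1.0)
filled = n.state
n.step(0.0)               # hoses off: residual activity remains,
                          # and it slowly leaks away on later steps
```

After twenty steps of constant input the state has climbed toward its ceiling, and after the input is cut it is still nonzero, just smaller, which is exactly the residual-output behavior described above.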

This gives the artificial neuron a crude form of memory. If you hear a musical note at one instant, the neuron responding to it keeps firing even after the note goes away, though not as strongly as when the input was first presented. So when you hear a second note, the activations of the two neurons overlap. In my model, this kind of overlapping activation is important for associating stimuli that occur close together in time.

Anyway, that's at the neuron level. I plan to explicitly model minicolumns as groups consisting of about 100 artificial neurons. Minicolumns will be organized into simulated cortical sheets, which will be connected hierarchically.
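A rough structural sketch of that organization, under the assumptions stated above (about 100 neurons per minicolumn, minicolumns grouped into sheets, sheets stacked into a hierarchy). The sheet sizes are placeholders, not figures from the actual model.

```python
# Hypothetical sketch of the planned organization: neurons grouped
# into minicolumns (~100 each), minicolumns into simulated cortical
# sheets, and sheets connected hierarchically. Sizes are placeholders.

class Minicolumn:
    def __init__(self, size=100):
        # Placeholder neuron states; in the real model these would be
        # leaky-integrator neurons with their own connectivity.
        self.neurons = [0.0] * size

class Sheet:
    def __init__(self, n_columns):
        self.columns = [Minicolumn() for _ in range(n_columns)]

# A two-level hierarchy: a large lower sheet feeding a smaller upper
# sheet, mimicking the convergence found in cortical hierarchies.
hierarchy = [Sheet(64), Sheet(16)]
```

The point of the sketch is just the nesting: neuron inside minicolumn inside sheet inside hierarchy, with each level a candidate unit of organization for learning.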

The model will also be evolutionary and developmental. Because it is an evolutionary model, there will be a genetic representation which will be used to build the phenotype (this is like the DNA/body relationship).

I think that's enough for one entry, so next time I'll talk about the genotype/phenotype encoding scheme I plan to use, and then the entry after that will discuss how learning and development work in the model.

Powered by JournalScape © 2001-2010 All rights reserved.