Where are These Intelligent Robots of Which You Speak?

The brain's cerebellum (illustration not reproduced here) has between 50 and 69 billion neurons.

Note that the rest of the brain has more like 16 billion neurons.
We are close to simulating those 16 billion neurons, given that robots can already fly an airplane from Boston to LA without any human intervention.
We are not as close to simulating the 69 billion neurons in the cerebellum, which would allow us to have a robot that could reach into "his" pants pockets and retrieve a set of keys.
Since we can handle the 16 billion neurons now, that gives us an idea of how close we are to also accomplishing those functions that are more remote from our conscious minds.
It looks like we may have preliminary versions working by 2020-2025.

Full-blown versions should be working well and costs should be coming down by 2030 or
so.


With more recent estimates of 21–26 billion neurons in the cerebral cortex (Pelvig et al., 2008) and 101 billion neurons in the cerebellum (Andersen et al., 1992), however, the total number of neurons in the human brain would increase to over 120 billion neurons. (Nov 9, 2009)
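
A quick arithmetic check of those figures, as a minimal Python sketch using only the numbers quoted above:

# Sanity check of the neuron counts quoted above (all figures in billions).
cortex_low, cortex_high = 21, 26   # cerebral cortex (Pelvig et al., 2008)
cerebellum = 101                   # cerebellum (Andersen et al., 1992)

print(f"Cortex + cerebellum alone: {cortex_low + cerebellum}-{cortex_high + cerebellum} billion neurons")
# Prints 122-127 billion, already consistent with the "over 120 billion" total quoted above.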



King Of The Chatbots, Agence France-Presse

Excerpts: George, who is 39, single and light-hearted, is looking for friends on the Internet.
He has gifts: the ability to speak in 40 languages and with 2000 people at the same time.
There's just one quirk: he doesn't really exist.

George is a piece of software, arguably the best of the speaking "chatbots" or talking robots,
and he's recently received the Loebner prize in Britain, a scientific award recognising the
machines best capable of matching the most realistic human dialogues with their own.

King Of The Chatbots, 2006/09/25, Agence France-Presse

http://www.news.com.au/story/0,23599,20471364-13762,00.html
- - - - - - - - - - -

Audience, Inc.
Lloyd Watts founded Audience, Inc., which is simulating the human ear: the cochlea and the auditory neural processing mechanisms of the brain. He described this work in a 23-minute talk on May 3, 2007:
http://www-bisc.eecs.berkeley.edu/CognitiveComputing07/CognitiveComputing2007Video.htm
Lloyd Watts' first slide is not reproduced here.

http://www.audience.com/
"Audience is at the leading edge of the next generation of audio for telecommunications.
Using revolutionary core technology which duplicates human hearing biology, we are
radically changing
the voice communications experience."
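
Audience's actual technology is proprietary and is not described here; as a rough illustration of the general idea of cochlea-style processing (splitting sound into many overlapping frequency channels, roughly as the basilar membrane does, and rectifying each channel as a crude stand-in for hair-cell output), here is a minimal Python sketch assuming NumPy and SciPy. The function names and parameters are illustrative assumptions, not Audience's product:

# Illustrative sketch only, not Audience's proprietary technology: a crude cochlea-like
# filter bank that splits sound into ERB-spaced frequency bands and half-wave rectifies
# each band (a toy stand-in for hair-cell output). Assumes NumPy and SciPy.
import numpy as np
from scipy.signal import butter, lfilter

def erb_space(low_hz, high_hz, n_bands):
    """Center frequencies spaced on the ERB scale, roughly mimicking cochlear spacing."""
    ear_q, min_bw = 9.26449, 24.7  # Glasberg & Moore constants
    lo = np.log(low_hz / ear_q + min_bw)
    hi = np.log(high_hz / ear_q + min_bw)
    return ear_q * (np.exp(np.linspace(lo, hi, n_bands)) - min_bw)

def cochlea_like_filterbank(signal, fs, n_bands=16):
    """Split a waveform into bandpass channels, one per simulated cochlear place."""
    channels = []
    for fc in erb_space(100.0, 0.8 * fs / 2, n_bands):
        bw = 24.7 + 0.108 * fc                       # approximate ERB bandwidth at fc
        lo = max(fc - bw, 10.0) / (fs / 2)
        hi = min(fc + bw, 0.95 * fs / 2) / (fs / 2)
        b, a = butter(2, [lo, hi], btype="band")     # simple bandpass per channel
        band = lfilter(b, a, signal)
        channels.append(np.maximum(band, 0.0))       # half-wave rectification
    return np.stack(channels)

if __name__ == "__main__":
    fs = 16000
    t = np.arange(fs) / fs
    tone = np.sin(2 * np.pi * 440.0 * t)             # one second of a 440 Hz test tone
    channels = cochlea_like_filterbank(tone, fs)
    print(channels.shape)                            # (16, 16000)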
- - - - - - - -

Evolved Machines, Inc.

Paul Rhodes (Stanford and Evolved Machines, Inc.) presented his company's simulation of growing neurons; the talk can be found at:
http://www-bisc.eecs.berkeley.edu/CognitiveComputing07/CognitiveComputing2007Video.htm
His company can be found at: http://www.evolvedmachines.com/

From Evolved Machines:
http://www.evolvedmachines.com/
"Visual object recognition systems
In the mammalian visual system, different retinal images of the same object may have no overlap, yet trigger a largely constant representation in higher stations of the hierarchy of cortical visual areas.  This representation arises in concert with the report of
perception of object identity, both occurring in 140 ms in primates.  Since this time period
is adequate for no more than 10 neuronal input/output cycles it is likely that the invariant
representation must be triggered by a largely feedforward cascade of wiring-defined
activation, rather than “computed” in an algorithmic sense.  Thus it is the wiring which
must embed this computation, so that when an image of an object is presented, the
representation is triggered, not computed.   Clearly the process by which the visual cortex is
wired is integral to any system of invariant object recognition based upon reverse
engineering the brain.

"We simulate the self-organization of wiring patterns in a hierarchy of cortical areas by
driving a filter bank (either designed to be a center-surround or edge-detection array) with
libraries of moving images, and then employ a host of neural local mechanisms to drive the
wiring process in an interconnected cascade of arrays of electrically active branched
neurons.  The intrinsic local neural mechanisms which we found to be required for
synthetic olfactory system function are employed in this more elaborate hierarchical and
somewhat topographic representation cascade, just as the mammalian visual cortex
circuitry is fashioned from neural components which evolved from the simpler and
evolutionarily more ancient olfactory cortex.  In each successive area the activity triggered
by an object, its “representation”, becomes more invariant to the classes of movements to
which the system is exposed during the wiring process, with activity in the later arrays in
this “cortical” hierarchy furnishing the desired invariant representation of its identity."
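
Evolved Machines' wiring self-organizes from exposure to moving images, as the passage above describes; the sketch below is only a loose illustration of the feedforward idea, with hard-coded edge filters and max pooling standing in for the learned wiring (all names and parameters are illustrative assumptions, assuming Python with NumPy). It shows how a representation that tolerates small image shifts can be "triggered" by a few purely feedforward stages:

# Loose illustration only (not Evolved Machines' system): a fixed feedforward cascade of
# oriented edge filtering, rectification, and max pooling. Deeper stages are more tolerant
# to small image shifts, so an object's representation is "triggered" by the wiring rather
# than computed iteratively. Assumes NumPy.
import numpy as np

# Four oriented edge detectors: a tiny stand-in for an edge-detection filter bank.
EDGE_FILTERS = [
    np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], float),   # vertical edges
    np.array([[-1, -2, -1], [0, 0, 0], [1, 2, 1]], float),   # horizontal edges
    np.array([[0, 1, 2], [-1, 0, 1], [-2, -1, 0]], float),   # one diagonal
    np.array([[2, 1, 0], [1, 0, -1], [0, -1, -2]], float),   # the other diagonal
]

def filter_valid(image, kernel):
    """Plain 'valid' 2-D correlation using only NumPy."""
    kh, kw = kernel.shape
    h, w = image.shape[0] - kh + 1, image.shape[1] - kw + 1
    out = np.empty((h, w))
    for i in range(h):
        for j in range(w):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

def max_pool(m, pool=2):
    """Max-pool with zero padding; pooling is what buys tolerance to small shifts."""
    ph, pw = -(-m.shape[0] // pool) * pool, -(-m.shape[1] // pool) * pool
    padded = np.zeros((ph, pw))
    padded[:m.shape[0], :m.shape[1]] = m
    return padded.reshape(ph // pool, pool, pw // pool, pool).max(axis=(1, 3))

def representation(image, n_stages=2):
    """Feedforward cascade: each stage filters, rectifies, and pools every incoming map."""
    maps = [image.astype(float)]
    for _ in range(n_stages):
        maps = [max_pool(np.maximum(filter_valid(m, k), 0.0))
                for m in maps for k in EDGE_FILTERS]
    return np.array([m.max() for m in maps])     # a crude shift-tolerant descriptor

if __name__ == "__main__":
    img = np.zeros((32, 32))
    img[8:24, 8:24] = 1.0                        # a square "object"
    shifted = np.roll(img, 2, axis=1)            # the same object, shifted two pixels
    print(np.allclose(representation(img), representation(shifted)))   # True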

- - - - - - -

DARPA Grand Challenge November 3, 2007
Race to Las Vegas
http://www.darpa.mil/grandchallenge/index.asp
This time it is an urban race.

- - - - - - -

R2 Technology, Inc. was purchased by Hologic.  Their computers have the ability to read breast X-rays for cancer more accurately than any doctor.
http://www.r2tech.com/main/company/news_one_up.php?prID=127&offset=2

About R2 Technology, Inc.
As a medical software company, R2 Technology is developing CAD systems for a variety of imaging modalities and disease states to aid in the early detection of breast cancer, actionable lung nodules and other lung abnormalities. R2 was given the 2004 Frost and Sullivan Product of the Year award. For information, visit www.r2tech.com.


Film showing R2's Computer-Aided Detection technology:
http://www.r2tech.com/main/info/tech_closeup.php
Note that with CT scans producing multiple "slices" through the breast, lung and colon, the radiologist's workload has grown.  The computer, meanwhile, is getting faster and has more experience: it is trained on 6,000 known breast-cancer cases, while a typical radiologist sees 10,000 patients and only 40 cases of breast cancer.
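
A back-of-the-envelope comparison of those figures, using only the numbers quoted above (illustrative Python arithmetic, not R2's actual training methodology):

# Comparison of positive-case exposure, using the figures quoted above.
cad_training_cancers = 6000     # known breast-cancer cases in the CAD training set
radiologist_patients = 10000    # patients a typical radiologist sees
radiologist_cancers = 40        # breast-cancer cases among those patients

print(f"Cancer prevalence in a radiologist's caseload: "
      f"{radiologist_cancers / radiologist_patients:.2%}")       # 0.40%
print(f"Confirmed cancers seen by the CAD system vs. the radiologist: "
      f"{cad_training_cancers / radiologist_cancers:.0f}x")      # 150x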

R2 ImageChecker CT Lung CAD Featured on FOX KTVU 2 News
http://www.ktvu.com/video/8810657/index.html


- - - - - - -
Next: Key Points

- I agree that compute capacity and memory of affordable machines will approach human-scale
levels by 2020-25.

- I remain optimistic that neuroscience knowledge will continue to advance over the next 10-15 years as needed to support neuroscientifically realistic algorithm development (cortical structure/function: Callaway, Schüz, etc.)
