Nordic Track Book Club Review: The Singularity is Near, Part 3

It’s time for another installment of Ray Kurzweil’s The Singularity is Near, Part 3 from the Nordic Track ski machine.  We’re about halfway through the book, if you’re curious.  Previous Singularity excerpt posts are located here.

Chapter Three: Achieving the Computational Capacity of the Human Brain

P.116 Future circuits will continuously monitor their own performance and route information around sections that are unreliable (like the Internet).
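
A rough way to picture this self-routing (my own sketch, not from the book): treat the circuit as a graph of sections and search for any path that avoids the sections flagged as unreliable, much as Internet routers steer packets around failed links.  The section names and the breadth-first search are illustrative assumptions, not anything Kurzweil specifies.

    import java.util.*;

    // Illustrative sketch: find a path through "circuit sections" while
    // skipping any section that has been flagged as unreliable.
    public class RouteAround {

        // adjacency list: section -> neighboring sections
        static Map<String, List<String>> links = Map.of(
            "A", List.of("B", "C"),
            "B", List.of("A", "D"),
            "C", List.of("A", "D"),
            "D", List.of("B", "C"));

        static List<String> route(String from, String to, Set<String> failed) {
            Deque<List<String>> queue = new ArrayDeque<>();
            Set<String> visited = new HashSet<>(failed);   // never enter failed sections
            visited.add(from);
            queue.add(List.of(from));
            while (!queue.isEmpty()) {
                List<String> path = queue.poll();
                String last = path.get(path.size() - 1);
                if (last.equals(to)) return path;
                for (String next : links.getOrDefault(last, List.of())) {
                    if (visited.add(next)) {
                        List<String> longer = new ArrayList<>(path);
                        longer.add(next);
                        queue.add(longer);
                    }
                }
            }
            return List.of();   // no route survives the failures
        }

        public static void main(String[] args) {
            // With section B marked unreliable, traffic from A to D detours through C.
            System.out.println(route("A", "D", Set.of("B")));   // prints [A, C, D]
        }
    }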

Emulating Biology.  Self-replicating and self-organizing electronic and mechanical systems are inspired by biology.

P.124 To “upload” a personality (capture knowledge, skills, personality), we may need to simulate neural processes at the level of individual neurons and portions of neurons: the soma (cell body), the axon (output connection), the dendrites (trees of incoming connections), and the synapses (regions connecting axons and dendrites).
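
As a very loose sketch of what simulating at that level might look like (my illustration; the structure and the numbers are assumptions, not Kurzweil’s): the dendrites collect inputs weighted by their synapses, the soma integrates them, and the axon fires when a threshold is crossed.

    import java.util.*;

    // Illustrative sketch of the anatomy listed above: dendrites collect
    // weighted inputs through synapses, the soma integrates them, and the
    // axon carries the output when a threshold is crossed. The threshold
    // and weights are made-up numbers for demonstration only.
    public class NeuronSketch {

        static class Synapse {
            final double weight;          // strength of the connection
            Synapse(double weight) { this.weight = weight; }
        }

        static class Neuron {
            final List<Synapse> dendrites = new ArrayList<>();  // incoming connections
            final double threshold;                             // soma firing threshold
            Neuron(double threshold) { this.threshold = threshold; }

            // Integrate the inputs at the soma and report the axon's output.
            double fire(double[] inputs) {
                double somaPotential = 0.0;
                for (int i = 0; i < dendrites.size(); i++) {
                    somaPotential += dendrites.get(i).weight * inputs[i];
                }
                return somaPotential >= threshold ? 1.0 : 0.0;  // axon output
            }
        }

        public static void main(String[] args) {
            Neuron n = new Neuron(1.0);
            n.dendrites.add(new Synapse(0.6));
            n.dendrites.add(new Synapse(0.5));
            System.out.println(n.fire(new double[] {1.0, 1.0}));  // 1.1 >= 1.0, so it fires: prints 1.0
        }
    }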

Each device will function as a node, sending information to and receiving information from every other device.

P.127 While human neurons are wondrous creations, we wouldn’t (and don’t) design computing circuits using the same slow methods.  Despite the ingenuity of the designs evolved through natural selection, they are many orders of magnitude less capable than what we will be able to engineer.

Most of the complexity of the human neuron is devoted to maintaining its life-support functions, not its information-processing capabilities.  Ultimately we will be able to port our mental processes to a more suitable computational substrate.  Then our minds won’t have to stay so small.

Chapter Four: Achieving the Software of Human Intelligence

P.145 The computational capacity needed to emulate human intelligence will be available in less than two decades...  The hardware computational capacity is necessary but not sufficient.  Understanding the organization and content of these resources (the software of intelligence) is even more critical and is the objective of the brain reverse-engineering undertaking.

P.157 Imagine we were trying to reverse engineer a computer without knowing anything about it (the “black box” approach).  We might start by placing arrays of magnetic sensors around the device.  We would notice that during operations that updated a database, significant activity was taking place in a particular circuit board.

The hypothetical situation described above mirrors the sort of efforts that have been undertaken to scan and model the human brain with the crude tools that have historically been available.  Most models based on contemporary brain-scanning research are only suggestive of the underlying mechanisms.

P.165 Nanobots used to monitor sensory signals are important for reverse engineering the inputs to the brain and creating full-immersion virtual reality from within the nervous system.

In order to reverse engineer the brain, we only need to scan the connections in a region sufficiently to understand their basic pattern.  We do not need to capture every single connection.

Based on these projections, we can conservatively anticipate the requisite nanobot technology to implement these types of scenarios during the 2020s.  Once nanobot-based scanning becomes a reality, we will finally be in the same position that circuit designers are in today: we will be able to place highly sensitive and very high-resolution sensors (in the form of nanobots) at millions or even billions of locations in the brain and thus witness in breathtaking detail living brains in action.

P.173 San Diego researchers asked whether electronic neurons could engage in this chaotic dance alongside biological ones.  They connected artificial neurons with real neurons from spiny lobsters in a single network... the result was chaotic interplay followed by a stable emergent pattern in both, with the biological neurons accepting their electronic peers.

P.184 Lloyd Watts’s model of the human auditory-processing system is an example of neuromorphic modeling of a region of the brain.

Cerebral cortex: 5 billion+ synapses of different shapes and sizes; handles perception, planning, and decision making, i.e., conscious thinking.

Recursion is the key capability in a new theory of linguistic competence: the ability to put small parts together into a larger chunk, use that chunk as a part in yet another structure, and continue the process iteratively.  In that way, we are able to build elaborate structures of sentences and paragraphs from a limited set of words.
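
The chunking idea is easy to see in code.  In this sketch (mine, not from the book), a chunk is either a single word or a sequence of smaller chunks, and any chunk can be reused as a part of a still larger structure.

    import java.util.List;

    // Illustrative sketch of recursive chunking: a Chunk is either a word
    // or a sequence of smaller Chunks, and the same Chunk can be nested
    // inside a larger one, indefinitely.
    public class Chunking {

        interface Chunk {
            String render();
        }

        record Word(String text) implements Chunk {
            public String render() { return text; }
        }

        record Sequence(List<Chunk> parts) implements Chunk {
            public String render() {
                StringBuilder sb = new StringBuilder();
                for (Chunk part : parts) {
                    if (sb.length() > 0) sb.append(' ');
                    sb.append(part.render());    // recursion: each part renders itself
                }
                return sb.toString();
            }
        }

        public static void main(String[] args) {
            Chunk nounPhrase = new Sequence(List.of(
                new Word("the"), new Word("small"), new Word("parts")));
            Chunk sentence = new Sequence(List.of(
                new Word("we"), new Word("assemble"), nounPhrase,
                new Word("into"), new Word("larger"), new Word("structures")));
            System.out.println(sentence.render());
            // prints: we assemble the small parts into larger structures
        }
    }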

Another key feature of the brain is the ability to make predictions.

The most complex capability of the human brain is our emotional intelligence, our ability to respond appropriately to emotion, to interact in social situations, to have a moral sense, to get the joke.

Deep interconnectedness means that certain neurons provide connections across numerous regions.

These findings are consistent with a growing consensus that our emotions are closely linked to areas of the brain that contain maps of the body.

P.194 Interfacing the Brain and Machines.

Researchers at Duke implanted sensors in the brains of monkeys, enabling them to control a robot through thought alone.

The process of understanding the principles of operation of the brain is proceeding through a series of increasingly sophisticated models derived from increasingly accurate and high-resolution data.

We will have the data-gathering and computational tools needed by the 2020s to model and simulate the entire brain, which will make it possible to combine the principles of operation of human intelligence with the forms of intelligent information processing that we have derived from other AI research.  We will also benefit from the inherent strength of machines in storing, retrieving, and quickly sharing massive amounts of information.  We will then be in a position to implement these powerful hybrid systems on computational platforms that greatly exceed the capabilities of the human brain’s relatively fixed architecture.


[tags: Singularity, Nordic Track Ski]
