

Jeff Hawkins Develops a Brainy Big Data Company

November 28, 2012, 12:13 pm

Jeff Hawkins has been a pioneer of mobile devices, a distinguished lecturer in neuroscience, and a published author of a revolutionary theory of how the brain works. If he’s right about Big Data, a lot of people are going to wish he’d never gone into that field.

Mr. Hawkins, who helped develop the technology in the Palm Pilot, an early and successful mobile device, is a co-founder of Numenta, a predictive software company. Numenta’s technology is based on Mr. Hawkins’s theories of how the brain works, a subject he has studied and published on intensively. Perhaps most important for the technology industry, the product works off streams of real-time information from sensors, not the trillions of bytes of data that companies are amassing.

“It only makes sense to look at old data if you think the world doesn’t change,” said Mr. Hawkins. “You don’t remember the specific muscles you just used to pick up a coffee cup, or all the words you heard this morning; you might remember some of the ideas.”

If no data needs to be saved over the long term and real-time streams can supply all the information that is needed, a big part of the tech industry has a problem. Data storage companies like EMC and Hewlett-Packard thrive on storing massive amounts of data cheaply. Data analysis companies including Microsoft, I.B.M. and SAS fetch that data and crunch the history to find patterns. They and others rely on traditional relational databases from Oracle, newer “unstructured” databases like MongoDB, and batch-processing frameworks like Hadoop.

Much of this will be a relic within a few years, according to Mr. Hawkins. “Hadoop won’t go away, but it will manage a lot less stuff,” he said in an interview at Numenta’s headquarters in Redwood City, Calif. “Querying databases won’t matter as much, as people worry instead about millions of streams of real-time data.” In a sensor-rich world of data feeds, he is saying, we will model ourselves more closely on the constant change that is the real world.

Mr. Hawkins thinks that the human neocortex, the part of the brain responsible for perception and reasoning, itself works as a kind of pattern-seeking and predictive system. Brain cells, starting at some of their most elemental components, work together to build expectations, initially about things like light and dark, or near and far, that they gather from sensory organs.

Patterns of one or the other are reinforced over time. As new data streams in, the brain figures out whether it is capturing more complexity, which requires either modifying the understanding of the original pattern or splitting it into two patterns, making for new knowledge. Sometimes, particularly if it is not repeated, the data is discarded as irrelevant information. Thus, over time, sounds become words, words occupy a grammatical structure, and ideas are conveyed.
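The reinforce-or-discard loop Mr. Hawkins describes can be sketched as a toy streaming learner: repeated transitions between symbols are strengthened, rarely seen ones are forgotten, and the strongest association becomes the prediction. This is purely illustrative of the idea, not Numenta’s algorithm; the class and parameter names below are invented for the example.

```python
from collections import defaultdict

class StreamingPatternLearner:
    """Toy online learner: tracks which symbol tends to follow which,
    reinforcing repeated transitions and pruning rare ones."""

    def __init__(self, prune_below=1, prune_every=100):
        self.counts = defaultdict(lambda: defaultdict(int))
        self.prev = None
        self.seen = 0
        self.prune_below = prune_below
        self.prune_every = prune_every

    def observe(self, symbol):
        # Reinforce the transition from the previous symbol to this one.
        if self.prev is not None:
            self.counts[self.prev][symbol] += 1
        self.prev = symbol
        self.seen += 1
        # Periodically discard transitions seen too rarely to matter,
        # loosely analogous to forgetting unrepeated input.
        if self.seen % self.prune_every == 0:
            for ctx in list(self.counts):
                for nxt, c in list(self.counts[ctx].items()):
                    if c <= self.prune_below:
                        del self.counts[ctx][nxt]

    def predict(self, symbol):
        # Predict the most frequently observed successor, if any.
        followers = self.counts.get(symbol)
        if not followers:
            return None
        return max(followers, key=followers.get)

learner = StreamingPatternLearner()
for ch in "the cat sat on the mat " * 20:
    learner.observe(ch)
print(learner.predict("h"))  # 'e' always follows 'h' in this stream
```

Note that nothing is stored except the running counts: the raw stream itself is never kept, echoing the point about not remembering every word heard this morning.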

“The key to artificial intelligence has always been the representation,” he says. “You and I are streaming data engines.”

It is a model of consciousness that Mr. Hawkins has promoted not just in the tech world, but to neuroscientists. While some have questioned the idea, he published a popular book on the rudiments of the subject, “On Intelligence.” Last spring he was invited to present the work at the Charles M. and Martha Hitchcock Lectures, at the University of California, Berkeley. Previous lecturers include Martin Rees, Britain’s Astronomer Royal, and Steven Chu, the Nobel Prize winner and Energy Department secretary.

Numenta’s product, called Grok, is a cloud-based service that works much the same way. Grok takes steady feeds of data from things like thermostats, Web clicks, or machinery. From initially observing the data flow, it begins making guesses about what will happen next. The more data, the more accurate the predictions become.
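Grok’s observe-predict-update cycle can be suggested with a deliberately simple stand-in: one reading arrives at a time, the model guesses the next, and the error drives learning. The sketch below uses a plain exponentially weighted average, which is far simpler than Numenta’s actual algorithms; the names and the simulated thermostat feed are hypothetical.

```python
import random

class StreamPredictor:
    """Minimal stand-in for a stream-prediction service: take one reading
    at a time, predict the next, and learn from the error."""

    def __init__(self, learning_rate=0.2):
        self.lr = learning_rate
        self.estimate = None

    def predict(self):
        return self.estimate  # best current guess for the next reading

    def update(self, reading):
        if self.estimate is None:
            self.estimate = reading
        else:
            # Move the estimate a fraction of the way toward the new reading.
            self.estimate += self.lr * (reading - self.estimate)

# Simulated thermostat feed hovering around 21.0 degrees
random.seed(0)
model = StreamPredictor()
for step in range(500):
    reading = 21.0 + random.gauss(0, 0.5)
    model.update(reading)
print(f"learned estimate after 500 readings: {model.estimate:.2f}")
```

As with the article’s larger point, the model keeps only a tiny running state, not the historical readings, yet its guesses track the feed as it streams in.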

It has been much more difficult to engineer than that sounds. Modeling itself on 40 sensory receptors feeding over 128 information-sensing dendrites on each cell of the brain, Mr. Hawkins put into Grok a mathematical algorithm that he says approximates the way brain cells work together, even sometimes canceling out one another’s signals to refine a sense of what’s going on.

“There are the equivalent of 60,000 neurons, each one fairly sophisticated, in each Grok,” he said. That model of 300 million connections, he notes, is about one millionth the actual capability of the neocortex.
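The “canceling out” of signals resembles competitive inhibition, in which only the strongest responders in a population of cells stay active and the rest are suppressed. A toy sketch of that idea, with invented names and a population far smaller than Grok’s 60,000 neurons:

```python
import random

def sparse_activation(inputs, weights, k):
    """Toy competitive inhibition: each cell sums its weighted input,
    then only the k strongest stay active; weaker responses are
    suppressed, like signals being canceled out."""
    overlaps = [sum(w * x for w, x in zip(row, inputs)) for row in weights]
    cutoff = sorted(overlaps, reverse=True)[k - 1]
    return [score >= cutoff for score in overlaps]

random.seed(1)
n_cells, n_inputs, k = 200, 50, 4   # only 2% of cells may fire at once
weights = [[random.random() for _ in range(n_inputs)] for _ in range(n_cells)]
signal = [random.random() for _ in range(n_inputs)]
active = sparse_activation(signal, weights, k)
print(sum(active), "of", n_cells, "cells active")
```

Keeping activity sparse this way means each input ends up represented by a small, distinctive set of cells, which makes overlapping patterns easier to tell apart.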

Grok is still in limited release, with just a few customers in the fields of energy, media, and video processing. So far, the company claims, Grok has delivered results that are 10 percent to 20 percent better than various benchmarks, like revenue, optimal purchasing mixes, and machine servicing. The company expects to start selling Grok more broadly in the first half of 2013.

As more companies use the product, and Grok feeds on more streams of data, the world will be in a better position to judge whether Mr. Hawkins is correct. He evinces few doubts, however.

“This is the future of machine intelligence,” he said. “Twenty years from now the computer industry will be driven by this, I’m certain of it.”

A version of this article appears in print on 12/03/2012, on page B8 of the New York edition with the headline: Big Data’s Role Is Still Evolving.

