Why your cat is 83 times smarter than your computer, or how government funding can bridge the gap.
This month, io9 editor Annalee Newitz approached a roster of leading science journalists and independent writers with an ambitious project: Public Science Triumphs — an ongoing series of articles highlighting the importance of publicly funded scientific research, in an effort to influence Congress’s budget supercommittee, which on November 23 will present a proposal for $1.2 trillion in cuts to government spending. For more on what public science actually is and why it matters, see Annalee’s excellent primer on the importance and urgency of the issue.
Brain Pickings is participating, so today we’re highlighting the promise of public science through an intersection of two of our running themes — biomimicry and the future of computing — which converge in the work of University of Michigan researcher Wei Lu. In 2010, Lu led a biologically inspired computing research project in which his team used the face-recognition circuitry of the cat brain to model a new kind of machine, one that promises to outsmart a supercomputer in learning and recognizing faces, making complex decisions, and performing more simultaneous tasks than current computers can manage. For comparison, a typical supercomputer with 140,000 CPUs and a dedicated power supply performs such tasks 83 times more slowly than a cat’s brain. The research was made possible through DARPA funding.
“We are building a computer in the same way that nature builds a brain. The idea is to use a completely different paradigm compared to conventional computers. The cat brain sets a realistic goal because it is much simpler than a human brain but still extremely difficult to replicate in complexity and efficiency.” ~ Wei Lu
To mimic how feline brains perform higher-level computations — their synapses linking thousands of neurons into complex pathways that store past interactions — Lu’s team connected two electronic circuits with one memristor. This created a system capable of a process called “spike-timing-dependent plasticity,” in which a connection strengthens or weakens depending on the relative timing of the signals crossing it, and which is thought to be the foundation of memory and learning in mammalian brains.
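To make the mechanism concrete, here is a minimal sketch of the classic spike-timing-dependent plasticity learning rule in Python. This is an illustration of the general principle, not of Lu’s memristor circuit itself; the parameter values are hypothetical, and real implementations vary.

```python
import math

# Hypothetical STDP parameters (illustrative values, not from Lu's work)
A_PLUS, A_MINUS = 0.1, 0.12       # maximum potentiation / depression
TAU_PLUS, TAU_MINUS = 20.0, 20.0  # exponential decay time constants (ms)

def stdp_delta_w(t_pre, t_post):
    """Weight change for one pre/post spike pair (spike times in ms).

    If the presynaptic spike arrives just before the postsynaptic one
    (dt > 0), the connection is strengthened; if it arrives just after
    (dt < 0), the connection is weakened. The effect decays
    exponentially as the spikes move further apart in time.
    """
    dt = t_post - t_pre
    if dt > 0:
        return A_PLUS * math.exp(-dt / TAU_PLUS)   # potentiation
    if dt < 0:
        return -A_MINUS * math.exp(dt / TAU_MINUS) # depression
    return 0.0

# In a memristor-based synapse, the device's conductance plays the role
# of the weight: causally ordered spikes nudge it up, reversed order
# nudges it down, so the hardware itself "remembers" past activity.
```

The appeal of the memristor here is that this update happens in the physics of the device rather than in software, which is part of why such circuits promise far better efficiency than simulating synapses on conventional CPUs.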
Lu’s research not only exemplifies the cross-disciplinary exploration at the heart of some of the most groundbreaking innovation, but also holds remarkable promise for everything from supercomputer performance to memory and facial-recognition technology to sophisticated artificial-intelligence decision-making. It’s also a reminder that, rather than a glamorous Eureka! headline, the most valuable research — the kind that serves as a wayfinding signpost for future study — is often just a small but significant step in the incremental process of innovation: a step that takes countless person-hours in the lab, a great deal of technical and intellectual resources and, yes, public science funding to make it all possible.
This article is part of the Public Science Triumphs series. Do you have a story about how publicly funded science is making the world a better place? Share it at [email protected] with the subject line “public science triumphs.”