Brain Pickings


22 SEPTEMBER, 2011

The Communist Threat: A Trip Through America’s Ideological Wayback Machine


From Walt Disney to Stalin, or how 1952 America interpreted the Soviet regime.

During World War II, some of the West’s greatest filmmakers — including Frank Capra, John Huston, John Ford, and Alfred Hitchcock — put their Hollywood films on hiatus and started producing propaganda films on behalf of the U.S. government. Even Walt Disney did his part. Eventually, when the war drew to a close, these iconic filmmakers went back to making commercial films. But propaganda films kept right on going. The Cold War was getting underway, and because the danger was more potential than actual, the U.S. government felt an extra need to paint a picture for its citizens.

Just what was the existential threat coming out of the Soviet Union? A series of films made it clear. Some, like Communism (1952), offered a brief overview of the historical and ideological foundations of Communism and its point men — Marx, Lenin, Stalin, and the rest. Others, like the famous Duck and Cover educational film, gave young Americans and their parents every reason to fear the atomic bomb. And others still talked about the superiority of capitalism and the American way of life.

No one doubts that the Soviet regime (which produced its own Cold War propaganda) was repressive. But whether it truly posed an existential threat to the U.S. has remained somewhat open to debate. Just watch Noam Chomsky speaking on the matter in 1985.

Dan Colman edits Open Culture, which brings you the best free educational media available on the web — free online courses, audio books, movies and more. By day, he directs the Continuing Studies Program at Stanford University. You can find Open Culture on Twitter and Facebook.

Brain Pickings has a free weekly newsletter and people say it’s cool. It comes out on Sundays and offers the week’s best articles. Here’s an example. Like? Sign up.

21 SEPTEMBER, 2011

Culturomics: What We Can Learn from 5 Million Books


How to put your “beft” foot forward, or what the algorithm of censorship has to do with 1950.

We’ve already established that we could learn a remarkable amount about language from these 5 essential books, but imagine what we could learn from 5 million books. In this excellent talk from TEDxBoston, Harvard scientists Jean-Baptiste Michel and Erez Lieberman Aiden reveal fascinating insights from their computational tool that inspired Google Labs’ addictive Ngram Viewer, which pulls from a database of 500 billion words culled from 5 million books spanning many centuries — roughly 4% of all the books ever published.

They call their approach Culturomics — “the application of massive scale data collection and analysis to the study of human culture.” From advising you on the best career choices for early success to figuring out when an artist is being censored to proving that we’re forgetting the past exponentially more quickly than ever before, the data speaks volumes when queried with intelligence and curiosity.
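At its core, a culturomics-style query reduces to computing a term’s relative frequency per year and scanning the resulting time series for anomalies, such as the sharp dips that can signal censorship. The sketch below illustrates the idea on an invented four-document toy corpus — it uses none of the real Google Books data or the Ngram Viewer itself:

```python
# Toy culturomics query (illustrative only; the corpus is invented,
# not the real Google Books data): compute a term's relative
# frequency per year -- the quantity the Ngram Viewer plots -- and
# flag years with a sharp drop, the kind of dip the speakers
# describe as a signature of censorship.
from collections import Counter

# (year, text) pairs standing in for a digitized corpus
corpus = [
    (1948, "the painter exhibited widely the painter was praised"),
    (1949, "critics discussed the painter and the painter's work"),
    (1950, "the exhibition continued without mention of him"),
    (1951, "the painter reappeared and the painter was shown again"),
]

def relative_frequency(term, year):
    """Occurrences of `term` divided by total words in that year."""
    words = dict(corpus)[year].split()
    return Counter(words)[term] / len(words)

series = {year: relative_frequency("painter", year) for year, _ in corpus}

# Flag any year whose frequency falls below half the prior year's
dips = [y for y in series if y - 1 in series and series[y] < 0.5 * series[y - 1]]
print(series)
print(dips)  # the 1950 dip stands out
```

The real corpus is of course vastly larger and the statistics subtler, but the shape of the query — frequency over time, then anomaly detection — is the same.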

“[The database pulls from] a collection of 5 million books. 500 billion words. A string of characters a thousand times longer than the human genome. A text which, when written out, would stretch from here to the moon and back ten times over. A veritable shard of our cultural genome.”


21 SEPTEMBER, 2011

Public Science Triumphs: Cat-Inspired Computing of the Future


Why your cat is 83 times smarter than your computer, or how government funding can bridge the gap.

This month, io9 editor Annalee Newitz approached a roster of leading science journalists and independent writers with an ambitious project: Public Science Triumphs — an ongoing series of articles highlighting the importance of publicly funded scientific research, in an effort to influence Congress’s budget supercommittee, which on November 23 will present a proposal for $1.2 trillion in cuts to government spending. For more on what public science actually is and why it matters, see Annalee’s excellent primer on the importance and urgency of the issue.

Brain Pickings is participating, so today we’re highlighting the promise of public science through an intersection of two of our running themes — biomimicry and the future of computing, which converge in the work of University of Michigan researcher Wei Lu. In 2010, Lu led a biologically-inspired computing research project, in which his team used the face-recognition circuitry of the cat brain to model a new kind of machine that promises to outsmart a supercomputer in learning and recognizing faces, making complex decisions, and performing more simultaneous tasks than current computers can manage. For comparison, a typical supercomputer with 140,000 CPUs and a dedicated power supply performs 83 times slower than a cat’s brain. The research was made possible through DARPA funding.

“We are building a computer in the same way that nature builds a brain. The idea is to use a completely different paradigm compared to conventional computers. The cat brain sets a realistic goal because it is much simpler than a human brain but still extremely difficult to replicate in complexity and efficiency.” ~ Wei Lu

To mimic how feline brains perform higher-level computations as their synapses link thousands of neurons into complex pathways that store past interactions, Lu’s team connected two electronic circuits with one memristor. This created a system capable of a process called “spike timing dependent plasticity,” which is thought to be the foundation of memory and learning in mammalian brains.
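For intuition, spike timing dependent plasticity can be sketched in a few lines: a synapse is strengthened when the presynaptic neuron fires just before the postsynaptic one, and weakened when the order is reversed. The snippet below is a generic textbook pairwise STDP rule, not Lu’s memristor circuit, and every parameter value in it is invented for illustration:

```python
# Illustrative sketch of a pairwise STDP update rule (a standard
# textbook formulation, NOT Lu's memristor model; all parameter
# values here are made up for demonstration).
import math

def stdp_delta(dt, a_plus=0.1, a_minus=0.12, tau=20.0):
    """Weight change for one pre/post spike pair.

    dt = t_post - t_pre in milliseconds. If the presynaptic spike
    precedes the postsynaptic one (dt > 0), the synapse is
    strengthened; if it follows (dt < 0), it is weakened. The
    effect decays exponentially with the timing gap.
    """
    if dt > 0:
        return a_plus * math.exp(-dt / tau)   # potentiation
    elif dt < 0:
        return -a_minus * math.exp(dt / tau)  # depression
    return 0.0

# A synapse that repeatedly sees "pre fires 5 ms before post"
# grows stronger -- the substrate of Hebbian learning.
w = 0.5
for _ in range(10):
    w += stdp_delta(dt=5.0)
print(round(w, 3))
```

In Lu’s hardware, the memristor plays the role of `stdp_delta`: its conductance, which stands in for the synaptic weight, shifts as a function of the relative timing of voltage spikes across it.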

Lu’s research is not only a model of the cross-disciplinary exploration behind some of the most groundbreaking innovation; it also holds remarkable promise for everything from supercomputer performance to memory and facial-recognition technology to sophisticated artificial intelligence. It’s a reminder, too, that the most valuable research, the kind that serves as a wayfinding signpost for future study, is rarely a glamorous Eureka! headline. More often it is a small but significant step in the incremental process of innovation, a step that takes countless person-hours in the lab, a great deal of technical and intellectual resources and, yes, public science funding to make it all possible.

This article is part of the Public Science Triumphs series. Do you have a story about how publicly-funded science is making the world a better place? Share it at [email protected], with the subject line “public science triumphs.”
