Brain Pickings

Posts Tagged ‘psychology’

29 OCTOBER, 2009

East Meets West: An Infographic Portrait


German punctuality, the Western ego, and how to stand in line like the Chinese.

What’s not to love about minimalist infographics? Such an elegant way to depict complex concepts with brilliant simplicity. We also have a longtime love affair with social psychology, some of which deals with the fascinating cultural differences between Eastern and Western mentality — from the individualistic tendencies of the West versus the pluralism of Asian societies, to how differently Westerners and Easterners read the emotions of others. Naturally, we’re head-over-heels with designer Yang Liu’s ingenious East Meets West infographic series, tackling everything from differences in self-perception to the evolution of transportation.

Born in China but living in Germany since she was 14, Liu has a unique grasp of this cultural duality — and she channels it with great wit and eloquent minimalism in graphics that say so much by showing so little.

Lifestyle: Independent vs. dependent

Attitude towards punctuality

Problem-solving approach

Size of the individual's ego

Perception: How Germans and the Chinese see one another

How to stand in line

Complexity of self-expression

The evolution of transportation over the last three decades

The volume of sound in a restaurant

Catch an interview with Liu about the project over at the always-excellent NOTCOT. The book is still findable online and an absolute delight.


09 SEPTEMBER, 2009

Robots In Our Image


How a biblical creation myth replays itself from bone and muscle to jazz improvisation.

In labs around the world, a new breed is arising: a friendly breed of intelligent machines designed to look like us, move like us, behave like us, and interact with us in ways that seem less and less distinguishable from our own.

Cutting-edge AI projects aim, with an impressive degree of success, to embed human social and cognitive skills in intelligent machines that will eventually be able to seamlessly integrate into our social and cultural fabric. These machines will soon be able to read and understand our mental and emotional states and respond to them. They will be able to engage with us, to learn, adapt and share, challenging us to redefine our very conception of human identity and essence.

Here are 6 compelling beacons.

ASIMO

Robots walking among us has been a science-fiction dream for many years. Recently, however, science has been rapidly catching up, bringing this dream into reality. Japan’s technological giant Honda has been building an experimental anthropomorphic robot since the 1980s — ASIMO. His name stands for Advanced Step in Innovative Mobility, but it’s hard to avoid associating it with Asimov, the iconic science-fiction author who envisioned intelligent humanoid robots in his stories and was the first to lay down the then-fictional Three Laws of Robotics, regulating human-machine interactions.

Since as early as 2000, ASIMO’s advanced models have been capable, among other things, of demonstrating complex body movements, navigating complex environments, recognizing objects and faces, reacting to human postures, gestures and voice commands, and much more. ASIMO can safely conduct himself among us (not bumping into people) and perform an impressive set of complex tasks, like taking an order and serving coffee. Recent models even have limited autonomous learning capabilities.

KISMET

Social interaction, usually taken for granted in our everyday life, is a very complex system of signaling. We all use such signaling to share our rich mental and emotional inner lives. It includes voice, language production and understanding, facial expressions and many additional cues such as gesturing, eye contact, conversational distance, synchronization and more.

The Sociable Machine project at MIT has been exploring this complex system with Kismet, a “sociable machine” that engages people in natural and expressive face-to-face interaction.

The project integrates theories and concepts from infant social development, psychology, ethology and evolution that enable Kismet to enter into natural and intuitive social interaction.

The most significant achievement with Kismet is its ability to learn through direct interaction, the way infants learn from their parents — a skill previously inherent only to biological species, and thus a major paradigm shift in robotics.

NEXI

Developed at MIT Media Lab’s Personal Robots Group, Nexi combines ASIMO’s mobility with Kismet’s social interactivity skills. Nexi presents itself as an MDS robot, which stands for Mobile, Dexterous, and Social.

The purpose of this platform is to support research and education goals in human-robot interaction, teaming, and social learning. In particular, the small footprint of the robot (roughly the size of a 3-year-old child) allows multiple robots to operate safely within a typical laboratory floor space.

Nexi’s design adds advanced mobility and object manipulation skills to Kismet’s social interactivity. Nexi’s facial expressions, though basic, are engaging and rather convincing. It’s also hard to overlook the “cute” factor at play, reminiscent of human babies.

While still slow and very machine-like in appearance, Nexi demonstrates today what was science fiction just a few years ago.

HANSON ROBOTICS

Hardly anything is more essential to the recognition of humanity than facial expressions, which modulate our communication with cues about our feelings and emotional states. Hanson Robotics combines art with cutting-edge materials and technologies to create extremely realistic robotic faces capable of mimicking human emotional expressions, conversing quite naturally, recognizing and responding to faces, and following eye contact.

We feel that these devices can serve to help to investigate what it means to be human, both scientifically and artistically.

Jules, a Conversational Character Robot designed by David Hanson, has a remarkably expressive face and is equipped with natural language artificial intelligence that realistically simulates human conversational intelligence. This, together with his/her rich nonverbal interaction skills, offers a glimpse of how fast robots are becoming virtually indistinguishable from us — social, interactive, eerily affective.

The team is also working on a futuristic project aiming to develop machine empathy and a machine value system, based on human culture and ethics, that will allow robots to bond with people.

ECCEROBOT

In the quest to create machines in our image, of particular interest is ECCEROBOT — a collaborative project coordinated and funded by the EU Seventh Framework Programme.

ECCE stands for Embodied Cognition in a Compliantly Engineered Robot. Simply put, it means that while other humanoid robots are currently designed to mimic human form but not its anatomy and physiological mechanisms, ECCEROBOT is anthropomimetic — specifically designed to replicate human bone, joint and muscle structure and their complex movement mechanism.

The project leaders believe that human-like cognition and social interaction are intimately connected to the robot’s embodiment. A robot designed according to a human body plan should thus engage more fluently and naturally in human-like behavior and interaction. Such an embodiment would also help researchers build robots that learn to engage with their physical environment the way humans do — an interesting concept that brings us a step closer to creating human-like robotic companions.

SHIMON

Music, many of us believe, makes us distinctively human — playing music together, especially improvising, is perhaps one of the most impressive and complex demonstrations of human collaborative intelligence where the whole becomes much more than the sum of its parts.

But extraordinarily skillful music-playing robots are already challenging this very belief. Earlier this year, we saw the stage debut of Shimon — a robotic marimba artist developed at the Georgia Tech Center for Music Technology. Shimon doesn’t look the least bit human and entirely lacks mobility and affective social skills, yet it is capable of something long considered exclusively human: playing real-time jazz improvisation with a human partner.

Shimon isn’t merely playing a pre-programmed set of notes; it can intelligently respond to fellow human players and collaborate with them, producing surprising variations on the played theme. The robot’s head (not shown in the video), currently implemented in software animation, provides fellow musicians with visual cues that represent social-musical elements, from beat detection and tonality to attention and spatial interaction.

Spaceweaver is a thinker, futurist and writer living in future tense, mostly on the web. Check out his blogs at Space Collective and K21st, and follow him on Friendfeed and Twitter.


10 AUGUST, 2009

Subway Personality: The MBTI Map


What your subway station has to do with your propensity for extroversion.

We love psychology. We love data visualization. So we’re all over the MBTI Map, a visualization of the relationships between the personality descriptors in the Myers-Briggs Type Indicator test, a tool designed to make iconic Swiss psychologist Carl Jung’s theory of psychological types more digestible. The map uses subway lines as a metaphor for the connections between the different representative words and personality types.

A product of the Integrated Design Laboratory at Korea’s Ajou University, the map is a rare application of information design to the fields of psychology and sociology — and a brave effort to capture something as vague and abstract as personality visually and concretely.

Starting with the 161 words in the MBTI test, the team conducted a survey asking respondents to rate the relative closeness of pairs of words. Using cluster analysis, they extracted a total of 39 representative words. These were then arranged spatially using multidimensional scaling (MDS), a technique that maps the similarities and dissimilarities in data onto spatial distances, and wrapped in a subway metaphor.

Each subway line represents one of the 16 MBTI personality types, with stations positioned according to the semantic distance between the 39 word descriptors, as determined by the MDS analysis. The outer circle contains the 161 original word descriptors from the test, grouped in 8 layers based on their hierarchical order. Finally, the colors of the words intuitively represent their meaning — so “calm” sits in the blue spectrum and “passionate” in the red.
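For the technically curious, here is a minimal sketch of that kind of pipeline in Python, assuming SciPy and scikit-learn are available. The six words, their pairwise dissimilarities and the cluster count are invented purely for illustration; the lab’s actual survey data and parameters aren’t reproduced here.

import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage
from scipy.spatial.distance import squareform
from sklearn.manifold import MDS

# Toy descriptors standing in for the 161 MBTI words (illustrative only).
words = ["calm", "quiet", "reserved", "passionate", "outgoing", "talkative"]

# Symmetric dissimilarity matrix (0 = very close in meaning, 1 = unrelated),
# standing in for averaged "how close are these two words?" survey ratings.
D = np.array([
    [0.0, 0.2, 0.3, 0.9, 0.8, 0.9],
    [0.2, 0.0, 0.2, 0.9, 0.9, 0.8],
    [0.3, 0.2, 0.0, 0.8, 0.9, 0.9],
    [0.9, 0.9, 0.8, 0.0, 0.3, 0.4],
    [0.8, 0.9, 0.9, 0.3, 0.0, 0.2],
    [0.9, 0.8, 0.9, 0.4, 0.2, 0.0],
])

# Step 1: cluster analysis groups similar descriptors; keeping one word per
# cluster is how a long list (161 words) gets reduced to representatives (39).
labels = fcluster(linkage(squareform(D), method="average"),
                  t=2, criterion="maxclust")

# Step 2: multidimensional scaling embeds the words in 2-D so that plotted
# distances approximate the surveyed dissimilarities; those coordinates
# become the "station" positions on the map.
coords = MDS(n_components=2, dissimilarity="precomputed",
             random_state=0).fit_transform(D)

for word, label, (x, y) in zip(words, labels, coords):
    print(f"{word:>10s}  cluster {label}  ({x:+.2f}, {y:+.2f})")

Plotting those coordinates and drawing lines through related stations is, in essence, what turns the MDS output into the subway map.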

Pore over the brilliantly crafted map in this high-res PDF. And why not kill a few hours by taking one of these Jung-inspired tests, each resulting in a four-letter personality type, then finding yourself on the map? They aren’t the real MBTI deal, but they’re free and a ton of fun.


07 AUGUST, 2009

Notes & Neurons: Music, Emotion and the Brain


From axons to a cappella, or why music gives us chills and thrills.

Music is easily the widest-reaching, most universal emotional facilitator. Anecdotally, it shapes so many of life’s everyday experiences: an epic movie would fall flat without a cinematic soundtrack, a party without dance music is unthinkable, and a run without an upbeat playlist feels somehow much more tiresome. Scientifically, music has been shown to affect everything from our alertness and relaxation to our memory and our physical and emotional well-being.

Today, we take a look at just how music affects our brain and emotion, with Notes & Neurons: In Search of a Common Chorus — a fascinating event from the 2009 World Science Festival.

But before we launch into the geekier portion, here’s a quick improvised treat from phenomenal jazz and a cappella performer Bobby McFerrin, who embodies the intimate relationship between music and the human element.

The panel — hosted by John Schaefer and featuring psychologist Jamshed Bharucha, neuroscientist Daniel Levitin, Professor Lawrence Parsons and Bobby McFerrin — takes us through a series of live performances and demonstrations that illustrate music’s interaction with the brain and our emotions, exploring some of the most interesting questions about this incredible phenomenon.

Is our response to music hard-wired or culturally determined? Is the reaction to rhythm and melody universal or influenced by environment?

We encourage you to see the full Notes & Neurons: In Search of a Common Chorus program, or snack on some more digestible bites over at World Science Festival’s Vimeo channel.

And while we’re at it, we highly recommend neuroscientist Oliver Sacks’ Musicophilia: Tales of Music and the Brain — an utterly fascinating read about the extreme effect music can have on our cognitive and emotional lives.
