Brain Pickings

Posts Tagged ‘software’

15 JUNE, 2011

Brian X. Chen on How the iPhone Changed Everything

Business advice from Steve Jobs, or why everything you knew about multitasking might be wrong.

Last month, we took a look at how Shakespeare changed everything. It turns out the great bard may have some stiff competition from another cultural agent: the iPhone. At least that’s the premise of Always On: How the iPhone Unlocked the Anything-Anytime-Anywhere Future — and Locked Us In, a fascinating new addition to this list of essential books on the future of the internet, in which Wired contributor Brian X. Chen explores how the “Jesus phone” transcended its status as a mere gadget to become a powerful force of cultural change.

Today, I sit down with Brian to chat about the secret to Apple’s success, open experiences amidst closed platforms, and what we can do to be smart information omnivores.

What was it about the iPhone that transformed it from a personal technology to a conduit of cultural change?

There are two pieces to the iPhone zeitgeist: the product itself and the App Store business platform. Somehow, Steve Jobs negotiated with AT&T to carry the iPhone without even allowing the carrier to touch or see the device; the handset’s hardware and software were designed entirely by Apple. This was a significant turning point in the wireless industry, because previously carriers told the manufacturers what features to put in their handsets.

The second piece is just as significant: the App Store, which opened in 2008. The App Store let any programmer put an app up for sale, and for the customer it was an extremely simple way to purchase apps with the tap of a button. The store opened the floodgates for hundreds of thousands of “apps” — 400,000 to date.

Now the iPhone isn’t just a smartphone but also a medical device, a musical instrument, an education tool and thousands of other things. A single app has the potential to compete with an entire industry and impact our culture.

How has Apple managed to find and retain success in a vertical, closed business model in the age of sharing, open-source and collaborative consumption?

It’s interesting that Apple is the most valuable technology company in the world today thanks largely to its vertically integrated business model, whereas in the past it was a niche player in the PC industry with the same approach. One broad reason is that times have changed, and now that computers have become a mainstream staple, the iPhone entered the picture to offer something fresh, new and more convenient for customers than ever before.

The fundamental reason the iPhone is so convenient is because its design and app ecosystem are tightly controlled by one company, Apple.

Furthermore, despite being closed and exclusive to Apple hardware, the iPhone, and now the iPad, are succeeding thanks to the gigantic army of developers providing apps. Many of these apps do enable people to share and collaborate (we still have Twitter apps, a Dropbox app, Facebook, etc.). Even though this is a “closed” platform, we still get more from the iPhone experience than we do from other platforms, because there are more programmers contributing to the App Store than to competing stores.

A lot has been said about how social technology is changing the way we think. Where do you stand on this, as it pertains to the iPhone?

Many journalists have already concluded that the “multitasking” we do in this always-on lifestyle is bad for the brain. However, little research backs these claims. One Stanford study on “media multitasking” found that people who juggled a lot of media (e-mail, videos, music) were worse at concentrating than those who didn’t consume much media. But a University of Utah study found that a small number of people are incredibly good at multitasking, which challenges the theory that multitasking is bad for the brain. I urge people to be cautious about drawing hasty conclusions.

I’d say rather than live in fear of smartphones, we can be more productive by asking ourselves what we can do NOW with this technology to make ourselves more powerful individuals.

What apps can I download to be better at my job, or help improve my health, or contribute to a community? In my book I tell stories about people using always-on technology in incredible ways, like a blind man who is able to use apps to “see” and take pictures, and scientists using smartphones to diagnose malaria in Africa. This is the future at our fingertips.

Ed. note: Always On is out this month and a must-read for smartphone-slingers and cultural scholars alike.

24 MAY, 2011

The Interface is the Message: Aaron Koblin on Visual Storytelling at TED

What 10,000 sheep have to do with Johnny Cash, Marshall McLuhan and the evolution of storytelling.

I was thrilled to see my friend Aaron Koblin, presently of Google Creative Labs, take the TED stage earlier this year. I’m an enormous data viz geek, deeply interested in the evolution of storytelling, and a longtime supporter of Aaron’s work. This talk is not only an excellent primer on both the discipline itself and Aaron’s stellar projects within it, but also an insight-packed treasure chest even for those already immersed in the world of data visualization. Perhaps most interestingly, Aaron revises iconic media theorist Marshall McLuhan’s revered catchphrase, “The medium is the message,” into a thought-provoking, culture-appropriate modernization: “The interface is the message.”

“An interface can be a powerful narrative device, and as we collect more and more personally and socially relevant data, we have an opportunity and maybe even an obligation to maintain the humanity and tell some amazing stories as we explore and collaborate together.” ~ Aaron Koblin

Aaron mentions a number of projects previously featured on Brain Pickings: The Sheep Market, A Bicycle Built for 2,000 and The Johnny Cash Project, if you’d like to take a closer look.

For more on the kind of magic Aaron is making, you won’t go wrong with Data Flow 2: Visualizing Information in Graphic Design — easily the most comprehensive compendium on data visualization candy around.

12 MAY, 2011

The Filter Bubble: Algorithm vs. Curator & the Value of Serendipity

How the web gives us what we want to see, and why that’s not necessarily a good thing.

Most of us are aware that our web experience is somewhat customized by our browsing history, social graph and other factors. But this sort of information-tailoring takes place on a much more sophisticated, deeper and far-reaching level than we dare suspect. (Did you know that Google takes into account 57 individual data points before serving you the results you searched for?) That’s exactly what Eli Pariser, board president of the public policy advocacy group MoveOn.org, explores in his fascinating and, depending on where you fall on the privacy spectrum, potentially unsettling new book, The Filter Bubble — a compelling deep-dive into the invisible algorithmic editing on the web, a world where we’re being shown more of what algorithms think we want to see and less of what we should see.

I met Eli in March at TED, where he introduced the concepts from the book in one of this year’s best TED talks. Today, I sit down with him to chat about what exactly “the filter bubble” is, how much we should worry about Google, and what our responsibility is as content consumers and curators — exclusive Q&A follows his excellent TED talk:

“The primary purpose of an editor [is] to extend the horizon of what people are interested in and what people know. Giving people what they think they want is easy, but it’s also not very satisfying: the same stuff, over and over again. Great editors are like great matchmakers: they introduce people to whole new ways of thinking, and they fall in love.” ~ Eli Pariser

What, exactly, is “the filter bubble”?

EP: Your filter bubble is the personal universe of information that you live in online — unique and constructed just for you by the array of personalized filters that now power the web. Facebook contributes things to read and friends’ status updates, Google personally tailors your search results, and Yahoo News and Google News tailor your news. It’s a comfortable place, the filter bubble — by definition, it’s populated by the things that most compel you to click. But it’s also a real problem: the set of things we’re likely to click on (sex, gossip, things that are highly personally relevant) isn’t the same as the set of things we need to know.
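To make the mechanism concrete, here is a toy Python sketch (purely illustrative, not any real site’s ranking code) of a click-driven personalized filter: candidate stories are ordered by how often the reader has clicked on their topic before, so an important but never-clicked topic quietly sinks out of view.

```python
# Toy illustration of a click-driven personalized filter (not any real
# site's algorithm). Stories are ranked by how often the reader clicked
# on their topic before, so "needed" but never-clicked topics drop out.
from collections import Counter

# Hypothetical click history for one reader.
click_history = ["celebrity gossip", "gadgets", "celebrity gossip", "sports"]

# Candidate stories the personalized front page could show: (headline, topic).
candidates = [
    ("New phone rumored for fall", "gadgets"),
    ("Actor spotted at cafe", "celebrity gossip"),
    ("Championship recap", "sports"),
    ("Aid crisis deepens abroad", "world news"),
]

topic_clicks = Counter(click_history)

# Rank by predicted likelihood of a click (past clicks on the topic),
# rather than by importance.
ranked = sorted(candidates, key=lambda story: topic_clicks[story[1]], reverse=True)

for headline, topic in ranked:
    print(f"{topic_clicks[topic]} prior clicks -> {headline} ({topic})")
# The world-news story has zero prior clicks and lands at the bottom of the
# feed: the filter optimizes for what we click, not for what we need to know.
```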

How did you first get the idea of investigating this?

EP: I came across a Google blog post declaring that search was personalized for everyone, and it blew my mind. I had no idea that Google was tailoring its search results on an individual basis at all — the last I’d heard, it was showing everyone the same “authoritative” results. I got out my computer and tried it with a friend, and the results were almost entirely different. And then I discovered that Google was far from the only company that was doing this. In fact, nearly every major website is, in one way or another. (Wikipedia is a notable exception.)

In an age of information overload, algorithms are certainly more efficient at finding the most relevant information about what we’re already interested in. But it’s human curators who point us to the kinds of things we didn’t know we were interested in until, well, until we are. How does the human element fit into the filter bubble, and what do you see as the future of striking this balance between algorithmic efficiency and curatorial serendipity?

EP: The great thing about algorithms is that, once you’ve got them rolling, they’re very cheap. Facebook doesn’t have to pay many people to edit the News Feed. But the News Feed also lacks any real curatorial values — what you’re willing to Like is a poor proxy for what you’d actually like to see or especially what you need to see. Human curators are way better at that, for now — knowing that even though we don’t click on Afghanistan much we need to hear about it because, well, there’s a war on. The sweet spot, at least for the near future, is probably a mix of both.

One interesting place this comes up is at Netflix — the basic math behind the Netflix recommendation code tends to be conservative. Netflix tunes its algorithm against a measure called Root Mean Squared Error (RMSE, to geeks), which basically calculates the “distance” between the ratings it predicts and the ratings you actually give. The problem with optimizing for RMSE is that while it’s very good at predicting what movies you’ll like — generally it’s under one star off — it makes the system conservative. It would rather be right and show you a movie that you’ll rate a four than show you a movie that has a 50% chance of being a five and a 50% chance of being a one. Human curators are often more likely to take these kinds of risks.
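As a back-of-the-envelope illustration of that conservatism (with invented ratings, not Netflix’s actual code), squared error punishes big misses so harshly that the sure four beats the coin-flip five:

```python
# Minimal sketch of why an RMSE-judged recommender plays it safe.
# The ratings below are invented for illustration.
import math

def rmse(predicted, actual):
    """Root mean squared error between predicted and actual ratings."""
    return math.sqrt(sum((p - a) ** 2 for p, a in zip(predicted, actual)) / len(actual))

# Safe movie: the system predicts a 4 and the viewer indeed rates it a 4.
safe = rmse([4.0], [4.0])             # 0.0

# Risky movie: half of viewers rate it a 5, half a 1. The error-minimizing
# prediction is the middling average (3), and the miss is large either way.
risky = rmse([3.0, 3.0], [5.0, 1.0])  # 2.0

print(f"safe pick RMSE:  {safe:.1f}")
print(f"risky pick RMSE: {risky:.1f}")
# Because squared error punishes big misses, an RMSE-tuned system steers
# toward movies it can predict reliably; a human curator is freer to gamble
# on the possible five-star pick.
```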

How much does Google really know about us, in practical terms, and — more importantly — how much should we care?

EP: That depends on how much you use Google — about me, it knows an awful lot. Just think: it’s got all of my email, so it not only has everything I’ve written to friends, it has a good sense of who I’m connected to. It knows everything I’ve searched for in the last few years, and probably how long I lingered between searching for something and clicking the link. There are 57 signals that Google tracks about each user, one engineer told me, even if you’re not logged in.

Most of the time, this doesn’t have much practical consequence. But one of the problems with this kind of massive consolidation is that what Google knows, any government that is friends with Google can know, too. And companies like Yahoo have turned over massive amounts of data to the US government without so much as a subpoena.

“Companies like Yahoo have turned over massive amounts of data to the US government without so much as a subpoena.” ~ Eli Pariser

I’d also argue there’s a basic problem with a system in which Google makes billions off of the data we give it without giving us much control over how it’s used or even what it is.

Do you think that we, as editors and curators, have a certain civic responsibility to expose audiences to viewpoints and information outside their comfort zones in an effort to counteract this algorithmically-driven confirmation bias, or are people better left unburdened by conflicting data points?

EP: In some ways, I think that’s the primary purpose of an editor — to extend the horizon of what people are interested in and what people know. Giving people what they think they want is easy, but it’s also not very satisfying: the same stuff, over and over again. Great editors are like great matchmakers: they introduce people to whole new ways of thinking, and they fall in love.

Is it possible to reconcile personalization and privacy? What are some things we could do in our digital lives to strike an optimal balance?

EP: Well, personalization is sort of privacy turned inside out: it’s not the problem of controlling what the world knows about you, it’s the problem of what you get to see of the world. We ought to have more control over that — one of the most pernicious things about the filter bubble is that mostly it’s happening invisibly — and we should demand it of the companies we use. (They tend to argue that consumers don’t care — we should let them know we do.)

On an individual level, I think it comes down to varying your information pathways. There was a great This American Life episode which included an interview with the guy who looks at new mousetrap designs at the biggest mousetrap supply company. As it turns out, there’s not much need for a better mousetrap, because the standard trap does incredibly well, killing mice 90% of the time.

The reason is simple: Mice always run the same route, often several times a day. Put a trap along that route, and it’s very likely that the mouse will find it and become ensnared.

So, the moral here is: don’t be a mouse. Vary your online routine, rather than returning to the same sites every day. It’s not just that experiencing different perspectives and ideas and views is better for you — serendipity can be a shortcut to joy.

Ed. note: The Filter Bubble is out today and one of the timeliest, most thought-provoking books I’ve read in a long time — required reading as we embrace our role as informed and empowered civic agents in the world of web citizenship.
