Intel has assembled a team to figure out what technologies will be like 20 years from now
Dr Genevieve Bell is concerned that we aren’t bored enough. For the exuberant and energetic Australian anthropologist and Intel fellow, who has spent over a decade at Intel considering the ways in which we use (and don’t use) technologies and why, curious side journeys into surprisingly complex topics like boredom often start and end with technology.
In this case, mobile phones.
She had been talking to a colleague about the evolution of mobile phones from feature phones (which she thinks were more about sending text messages than phone calls) to smart phones (awash in apps, games, calendars and other diversions), and they were wondering what might come next.
“Neither of them really make a promise about phone calls,” she says. Many of today’s smart phones, such as iPhones and BlackBerrys, don’t even look like phones at first glance. She holds up a BlackBerry. “What this promises, is that you never have to be bored again. There will never be those moments where you’ll have nothing to do.”
And that led to months of study about the nature of boredom – a word, she says, that was coined only in 1852 by Charles Dickens in his novel Bleak House. Dickens led her to the philosopher Martin Heidegger's writings on boredom, and then to medical studies.
It turns out that when researchers have used magnetic resonance imaging (MRI) to see what happens inside people’s brains when they are bored, those brains are not inactive. Instead, the centres involved in creativity and in synthesising information are quite active, much as they are during sleep.
“It turns out boredom is a different way to go about re-knitting information in our brains,” she says. “We actually need it to be creative.” Constantly fending off boredom with phone apps and iTunes downloads may not be a good thing, when boredom used to mean that we went out and figured out something interesting to do. “Maybe the problem is that we haven’t clawed back enough time to do things.”
And maybe – or maybe not – the mobile phones of the future will reflect a realisation that we don’t need to be quite so constantly and instantly entertained by our devices.
Thinking about what technologies will be like 20 years from now is her new brief at Intel, as director of interaction and experience research, a new division set up a year ago and run out of Intel's Hillsboro, Oregon operations (she has also worked out many ideas in a new book, Divining a Digital Future, from MIT Press).
The lab has about 100 people with a mix of disciplines and skills: anthropologists, sociologists, communication theorists, user experience designers, interface design experts, a team of hardware and software engineers, and even futurists.
“Our charter is both simple and deeply daunting – what will computing be like in 2020? What are the experiences people will value with technology, and how do we make sure that we’re building technology that helps people have the experiences they want? And of course, how to make people love their technology,” she says.
The last point is critical and may seem obvious, but often isn’t. She notes that there are almost invisible technologies that people do tend to love – televisions or mobile phones, which most people operate without a second thought. By contrast, she recalls interviewing a woman in China who said that she felt that carrying her laptop and other devices in her backpack was like lugging around a pack full of baby birds all screaming “Feed me! Feed me!”.
She says she knows exactly what the woman meant. “What people want is something simple, straightforward, that doesn’t demand a great deal of it – that doesn’t demand that you take care of it.” The best technologies, she says, are those that manage issues like security and connectivity themselves, where the software engineering is almost invisible to the user.
One of the more interesting Intel projects for imagining what our relationship will be like with the technologies of the future has been to look to science fiction for prototyping, she says.
The company has commissioned some leading writers in the area – people like Bruce Sterling and Cory Doctorow – to start with the science fact of today and then produce fiction that pushes today’s factual limits to imagine what might be coming tomorrow.
Her group has also been doing global ethnographic work – one of her specialities – looking at cars. “Not only do they have lots of technology in them, but we take a lot of technology to them,” she says. To try to better understand how people use their cars, she and her researchers travelled the world photographing the contents of people’s cars in Australia, Asia, and the Americas. “It’s been fascinating. Cars are totally cultural items. As soon as you start opening glove compartments, you can tell where people are from.”
She has been thinking about the car as a space in which technology is consumed, but also as a piece of mobile technology in itself. Realising how much we hide away in our cars – she likens them to garden sheds on wheels – got her thinking about how our devices, such as mobile phones, also hide away much of what we use them for.
A phone contains all sorts of invisible functions, services, and applications. “You never quite confront the mess of it, and I wondered if that would change what interactions we had with technology.” She hasn’t come to any conclusions yet and is not quite sure what to do with the research, but thinks useful insights will come from it.
Ask how businesses will interact with technology in 2020, and she starts by laughing off a phrase currently being bandied about: “the post-PC world”. That, she says, is a phrase that is increasingly being used to describe the future for businesses, thanks to the current focus on smart phones, iPads, and similar devices.
“When I hear that phrase, all I think of is ‘the paperless office’ and ‘the cashless society’,” she laughs. What she thinks people are really signalling when they talk about a post-PC world is that “we are finally willing to see that computing is everywhere. It’s not about the desktop or laptop. It doesn’t mean the end of the PC, but the end of the PC as the only digital device we’ll really own.”
Instead, “your employer and company will be working in a world of high device density”, which includes everything from cars to PCs to tablet computers to smart buildings. So companies will need to start thinking about things like information security, device management, and applications and services “in the cloud”.
Cloud computing will bring concerns about latency, about which devices will be used and how, about adequate storage space for data, and more. “These are real issues and raise some really interesting questions.”
She thinks we probably underestimate how long it takes to adjust to such major technology shifts. Some theorists have said humans need 10 years or so to adjust to a new innovation, but she says it is often longer – 20 or 30 years.
The mark of a really significant technology, she says, is one that changes both our perceptions and behaviour – technologies that “rewrite time, space, and other people. Those are the ones that are truly powerful.”