DUBLIN—Brian David Johnson is always looking ahead. As a principal engineer and futurist at Intel, the world's largest chip manufacturer, he forecasts how humans and computers will interact, drawing on insights from a variety of scientific fields to guide the company's engineers and product developers. He says he's currently focused on the year 2020.
ScienceNOW talked to Johnson after his talk on 12 July at the Euroscience Open Forum, in which he sketched a world full of computers that will develop relationships with the humans they serve. Questions and answers have been edited for brevity and clarity.
Q: What does your work as a futurist look like?
B.D.J.: The day job is to help design the chips 10 years out. I create models that help us find out how people will interact with computers, based on social science, computer science, statistical data, and even some science fiction. I am a principal engineer. I go to the designers and say: “This is what the chip will have to contain and will need to do.”
Q: Can you give an example of the impact of your work on chip design?
B.D.J.: We have two chip types that go into set-top boxes, which are cable boxes for television. Intel did not know anything about television before designing these chips a couple of years ago. For years, companies have tried to convert television into television on demand. That has failed. At Intel we said: “Let’s not turn the television into anything, but let’s send social scientists into the field to find out how people watch television and what they love about it.”
Q: What did the scientists find out?
B.D.J.: Television turns out to be a social actor in people’s lives. Watching television is not just watching television.
You can watch it by yourself or with your family; sometimes it keeps you company. Television the way people like it is a mix of television, games, and films, everything outside of people’s work, actually, layered over each other. We then designed prototype chips that can bring the internet and television onto one screen at the same time. Technically speaking, the chip enables between two and four layers of high-definition video, which it can overlay with HTML and web content. We created the capabilities that allowed this to happen.
Q: How does science fiction help?
B.D.J.: Science fiction based on science fact gives us a language to talk about the future. It triggers people to think about this future, whether they like it or not, what they are afraid of or excited about. To give you an example, I am not a synthetic biologist, you are not, but if we read a story about synthetic biology, we can talk about what its applications would mean.
Q: So you build a scenario?
B.D.J.: Yes, but it is a human-based scenario. It is about characters interacting. With this story I can go to the engineers and say: “This is what it could look like.”
Q: You study the interaction between humans and computers. What do you foresee in the next 10, 15 years?
B.D.J.: Looking at the past, technology has been about command and control. In the future it will be about relationships. Our technologies will get to know us and we'll become more tightly connected. That has an impact on what we do productivity-wise, but even more it connects us to the things and people we love. Siri, the personal assistant built into your iPhone, is an early example of that. You literally talk with your phone and it can talk back to you.
Q: In what way does the development of chips play a role in this?
B.D.J.: As we move closer to 2020, the size of computational chips is becoming so small that it is approaching zero. This means we could literally turn anything into a computer. Your tea glass, the table, you name it. There is a switch coming, where we no longer have to ask: “Can we turn that into a computer?” We know we can, and instead we wonder: Is there a use for doing it? That is what we have the social scientists for. We do not study markets, we study people.
Q: Have the technical challenges to do that been overcome?
B.D.J.: They will be. Siri is far from perfect. Currently, if you have a different accent than people like me who live around Silicon Valley, it does not work well. With more computational power, things will improve, but there is a lot more to be done.
Q: There are downsides to the rapid development of technology. People are said to develop attention deficit hyperactivity disorder (ADHD), due to the information and interaction overload. Do you study that as well?
B.D.J.: Yes, we do. That project is called The Future of Fear. The thing is: These communication technologies are very new. We do not have rules and norms on how to handle them. Those norms will take shape in the coming years. I have already talked to people who tell me they have set themselves a maximum screen time.
Q: And what about the ability to think deep or to remember? Do you consider these issues?
B.D.J.: Yes, we do that. An interesting study by the University of British Columbia published in Science magazine last July showed that we are indeed off-loading our memory to our devices. It is already happening, they said. We have lower recall rates for the information itself but higher recall rates for where to access it. And they also pointed out that this is not new: We have been off-loading our oral history to books. That is not bad—it’s progression.
Q: Can you make some predictions about how our relationship with computers will evolve further?
B.D.J.: No, I do not predict! The number one reason: Anyone who makes predictions is underestimating the complexity of developments, and is probably trying to sell you something. It is my job to look ahead and anticipate—we roughly think that this is going to be it. I don't want to be right, I want to get it right, and work towards it.
Q: Have you ever been wrong in these 10 years?
B.D.J.: Well, we had to correct things. We had forecast television on demand as well, but what we found is that people wanted the internet on their television. We went back to the social scientists and learned that what people wanted was personalization. Your internet is very different from my internet; it totally fits my demands. You can call it hyperpersonalization.