
A head-only MRI scanner, under construction at the Victoria University of Wellington in New Zealand, is one of several portable imaging devices that could raise ethical questions. (Photo: Ben Parkinson)

Cheap, portable scanners could transform brain imaging. But how will scientists deliver the data?

The machines that scan our brains are usually monstrous contraptions, locked away in high-end research centers. But smaller, cheaper technologies may soon enter the field, like an MRI scanner built for the battlefield and a lightweight, wearable magnetoencephalography system that records magnetic fields generated by the brains of people in motion.

If such devices become widespread, they’ll raise new ethical questions, says Francis Shen, a law professor and neuroethicist at the University of Minnesota (UMN) in Minneapolis and Massachusetts General Hospital in Boston. How should researchers share results with the far-flung populations they may soon be able to study? Could direct-to-consumer neuroimaging become an industry alongside personal genetic testing?

With a grant from the federal Brain Research through Advancing Innovative Neurotechnologies (BRAIN) Initiative, Shen has teamed up with three UMN colleagues, including MRI physicist Michael Garwood, to start a conversation about the ethical implications of portable neuroimaging. Garwood is part of a multicenter team building an MRI machine powerful enough to be used in medical diagnostic tests that weighs just 400 kilograms—less than a tenth of traditional scanners. He expects the new scanner to take its first images in 3 years. And if market demand can bring down the cost of a key component, he thinks it could eventually cost $200,000 or less, versus millions of dollars for current scanners. Shen and Garwood discussed the ethical issues at play with Science, after presenting their work at a meeting of BRAIN Initiative investigators last week in Washington, D.C. This interview has been edited for brevity and clarity.

Q: What is your dream scenario for deploying this new MRI machine?

M.G.: We have several dreams. [One is that MRI] becomes cheap and accessible enough that everybody can get a scan starting around age 30, and then get a scan maybe every other year. Then … artificial intelligence evaluates the images and can come back and tell somebody when there’s been some change in the brain.

More importantly to me … about 90% of the world’s population doesn’t have access to MRI, and it’s the most important technique for detecting and diagnosing brain disease. The brain is behind this hard skull. So many other diseases, you can poke needles in or do a physical exam, but the brain needs imaging. And it’s just a shame that [so much of] the world’s population can’t have it right now because it’s too expensive, or it’s too far away.

Q: What ethical issues come with more portable neuroimaging?

F.S.: Right now, “mobile MRI” means that you put a big scanner on a flatbed truck and you take it from one medical facility to another. [If you find a big tumor while you’re doing research,] all of the ethical infrastructure, like the [institutional review board] … is all in the same ecosystem. If you need to get a radiologist, they’re right there, and the patient is right there.

This breaks that mold in a lot of different ways. You could scan one place Tuesday, drive over to another place Wednesday. … Let’s imagine you’ve taken a brain scan in the Appalachians. The research team is back in Minnesota, the clinician is in Boston. Who is actually going to get back to this person? Are you going to set up Skype? And then when you say, “You know what, you need to go to a hospital,” where are you going to send them? You’ll have to rewrite your code and your informed consent and your incidental findings [policy] to account for all this.

Another issue is that you’re not sending the experts with [the machine]. You’re sending probably a grad student or a technician around. The rest of the expertise [may] rely on artificial intelligence, both to store and then to analyze your data, so it’s unclear who’s going to return the results.

Q: What other issues do you foresee?

F.S.: There’s never been direct-to-consumer brain imaging, and now the vision of this group is that there will be. So you could walk into CVS, and you know how they have that machine to take your blood pressure? In the corner, imagine that there’s just a technician there, just like with the photo [department]. They press a few buttons … and you get your brain data. That raises questions akin to what’s happening around 23andMe. … It’s not that science and society have never dealt with these issues, but it’s that neuroscience has not. And there is something, I think, special about brain images.

Q: What’s so special about them?

F.S.: It’s hard to put one’s finger on it exactly. … I do a lot of work with law and neuroscience, and there’s some evidence to suggest that the invocation of brain language or brain images can be more persuasive than comparable behavioral data for would-be jurors. Reference to the brain carries some special weight. … I suspect that [if] you put that brain scanner out there and you say folks can get their brains scanned, a lot of people will and won’t necessarily know what to do with the results.

Q: What are you hoping will come out of this grant?

F.S.: [Through] these administrative awards, NIH [the National Institutes of Health] has been very good about trying to embed ethicists [in research labs] so that there is this kind of rich dialogue, and it’s not just me sitting back, thinking, “Oh yeah, I read a story about portable MRI, now I’m going to write about what I think it will do.” … Our grant is initially just to ask the questions. And then [with] a further grant we’ll have to figure out, alright, how do we actually answer these in a way that still allows the technology to get out there?

M.G.: I’ve learned so much from Francis about all the issues. To me, it’s extremely valuable, because I don’t want to go through building this and then be highly disappointed when I can’t use it.