
Today, a black hole observed … tomorrow, a pothole avoided?

On Wednesday, researchers with the Event Horizon Telescope project released the first images ever taken of a black hole — a gravitational sinkhole in space powerful enough to suck in even light itself. Capturing those images was an amazing feat. But the technologies developed to produce images of a supermassive void 55 million light years away could end up having far-reaching impacts back on Earth. So could the people who developed them.

First predicted by Albert Einstein's general theory of relativity, black holes had previously been recorded only by the gaps they left in our data. In 2001, for instance, scientists announced that the Hubble Space Telescope had observed ultraviolet light becoming dimmer and, eventually, disappearing altogether as it fell into the black hole Cygnus XR-1.

To get a picture of the black hole itself, the EHT project used a network of eight Earthbound radio telescopes, linked together to function as a single system. The telescopes collected high-frequency radio waves from space, and four independent teams of scientists used algorithms to convert the radio signals into visual images.

While the task of coordinating telescopes and collecting radio signals was complex and impressive, it’s the algorithm development that is likely to have long-term impacts on technology, said Jonathan Weintroub, an electrical engineer who developed physical instrumentation for the EHT project. His team used off-the-shelf products and existing telescopes to essentially build a globe-spanning telescope like a kid might build a Lego model. That’s no small feat. The final system was able to collect and store 5 petabytes of data. If 1 byte were a 2-foot-by-2-foot tile, then 1 petabyte would cover the whole Earth. But the job of converting that data into an image required the creation of entirely new software tools.

The problem: That global megatelescope (while obviously awesome) is still producing data as holey as a slice of Swiss cheese. The telescopes are collecting photons — packets of light — that fall from space like the proverbial pennies from heaven. But even working together, they can only catch a tiny sampling of those photons. Reconstructing an image from that sparse data set represents a challenge as massive as the black hole itself, Weintroub told me. The algorithms EHT researchers developed were crucial to solving that challenge, and their solution could have wide-ranging implications.

Imagine trying to put together a puzzle with 90 percent of the pieces missing. Not only is it hard to assemble the image correctly, it's hard to even know what image you're trying to make. “Since we have such sparse measurements, there tends to be an infinite number of images that could match the data,” said Lindy Blackburn, a researcher at the Center for Astrophysics who works as a data scientist on the EHT project.
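Blackburn's point can be seen in a toy linear-algebra sketch. This is an illustration, not the EHT pipeline: the measurement matrix `A`, the pixel and measurement counts, and all the variable names here are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for sparse telescope coverage: a 16-"pixel" sky observed
# through only 6 linear measurements (far fewer measurements than pixels).
n_pixels, n_measurements = 16, 6
A = rng.standard_normal((n_measurements, n_pixels))  # measurement operator
image = rng.standard_normal(n_pixels)                # the "true" sky
data = A @ image

# Rows of Vh past the matrix rank span A's null space: directions the
# instrument simply cannot see. Adding any of them changes the image
# but leaves every measurement untouched.
_, _, Vh = np.linalg.svd(A)
invisible = Vh[-1]                  # one direction A cannot see
other_image = image + 5.0 * invisible

print(np.allclose(A @ image, A @ other_image))  # same measurements...
print(np.allclose(image, other_image))          # ...different image
```

Because whole families of images produce identical measurements, the data alone cannot pick a winner; something extra has to break the tie.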

The algorithms the EHT scientists built help constrain that infinite space of possible images by sorting out which results are physically plausible and which are wildly unlikely. For example, Blackburn told me, the algorithms all tended to favor images that could explain the telescopes' measurements in the simplest possible way, weeding out images with lots of fine detail or complicated features. Applying a list of constraints like that made putting the puzzle together correctly (or, at least, realistically) a little less hard. It's not perfect: the image of the black hole is blurry, Blackburn told me, partly because each of the four teams produced a slightly different image and the researchers were conservative about which details made it into the final, representative image. But it was enough to turn radio wave data into a picture.
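One generic way to encode that preference for simple images is Tikhonov (ridge) regularization, a standard tool for underdetermined inverse problems. The EHT teams' actual methods are more sophisticated, so treat this as a hedged stand-in; the matrix, sizes, and penalty weight below are all invented for the sketch.

```python
import numpy as np

rng = np.random.default_rng(1)

# Same toy setup: far fewer measurements than pixels.
n_pixels, n_measurements = 16, 6
A = rng.standard_normal((n_measurements, n_pixels))
true_image = rng.standard_normal(n_pixels)
data = A @ true_image

# Ridge (Tikhonov) regularization: among all images consistent with the
# data, prefer the "simplest" -- here, the one with the smallest overall
# norm. lam controls how strongly simplicity is favored over exact fit.
lam = 0.1
recon = np.linalg.solve(A.T @ A + lam * np.eye(n_pixels), A.T @ data)

# The reconstruction still reproduces the measurements closely.
print(np.linalg.norm(A @ recon - data))
```

The regularizer is what turns "infinitely many answers" into one answer; different choices of penalty (smoothness, sparsity, entropy) bake in different notions of what a plausible sky looks like.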

And that matters, Blackburn told me, because astronomy isn’t the only field facing the problem of converting sparse data into images. It comes up in medical imaging, for example, when doctors use MRIs to convert radio waves into pictures of your body. It’s also a key part of self-driving cars, which rely on computer vision to “see” everything from potholes to people. The kinds of algorithms developed to photograph a black hole built on research from those other fields and, in turn, could help improve the way computers see life on Earth. This blurry image of a dark whirlpool in space could end up as a chapter in the story of how technological developments allowed humans to ride safely in 2-ton hunks of metal and plastic propelled by computers alone.

If that happens, Weintroub said, it won’t just be the technology that changes the future, it’ll also be the people who made it. Many of the people working on imaging technology for the EHT project are, like Blackburn, graduate students. Taking a picture of a black hole didn’t just mean developing some cool tech — it meant empowering a bunch of early-career scientists to come up with different ideas and get really good at creating new tools, right before they disperse throughout academia and industry. Blackburn’s colleague, Katie Bouman, for example, worked in MIT’s computer vision lab as part of a postdoctoral fellowship, collaborating on algorithm improvement across many different fields and computer vision applications. Part of the team that developed the EHT’s “eyes,” she’s set to start her first professorship at Caltech this fall. Years from now, when we think back about what being able to see a black hole for the first time did for humanity, the project’s role as an incubator of scientific talent could end up being its biggest contribution.

CORRECTION (April 12, 2019, 9:07 a.m.): An earlier version of this article misstated Lindy Blackburn’s professional affiliation. He is a researcher at the Center for Astrophysics and no longer a graduate student at MIT.


Maggie Koerth-Baker is a senior science writer for FiveThirtyEight.
