Humans 2.0: Seeing Ourselves Anew in ‘Algorithmic Cascades of Data’

Sensors are cheap and abundant. They’re already in our devices, and soon enough, many of us may elect to carry sensors in and on our bodies, and embed them in our homes, offices, and cities. This terrifies people, Jason Silva says in a new video.

Who hasn’t heard of Big Brother or feared the rise of the surveillance state? But Silva says there’s an upside.

As the world is reduced to “algorithmic cascades of data” he thinks we’ll get what Steven Johnson calls the “long view,” like a microscope or telescope for previously invisible information and datasets.

Billions of sensors measuring location, motion, orientation, pressure, temperature, vital signs and more—each of these will be like a pixel. Seen up close, a modestly flashing primary color. But at a distance, individual pixels dissolve. Discrete points will smooth out into a contiguous image no one could have guessed by looking at each pixel alone.

Exactly what image will our sensors reveal?

Silva thinks it will be like looking in a mirror: seeing ourselves individually and collectively for the first time will spark a feedback loop, one in which information leads to new behaviors, and new behaviors to new information.

“Who knows what we might learn about ourselves?” Silva asks. “How might we be able to take that data and use those insights to feed them back into us?”

Jason Dorrier
Jason is editorial director of Singularity Hub. He researched and wrote about finance and economics before moving on to science and technology. He's curious about pretty much everything, but especially loves learning about and sharing big ideas and advances in artificial intelligence, computing, robotics, biotech, neuroscience, and space.