Recap #42: mind-controlled objects & mind-reading technologies

This week's Recap covers mind-controlled objects, mind-reading technologies, the new data economy, and more.


By Caroline Barrueco and Michell Zappa

AI-generated illustration by Caroline Barrueco, made with the prompt "Mind-Reading Technologies".

Welcome to the Envisioning Community Recap, a weekly publication in which we curate the best stories about emerging technology and futures literacy.

Thoreau claimed that “our inventions are […] but improved means to an unimproved end” and that we are in great haste to construct things without necessarily having anything important to do with such things. This can be said about technology at large, which too often strengthens existing power structures without sufficiently considering its downstream effects. Whether systemic change is possible remains to be seen – I have as many moments of hope as ones of frustration. Maybe that’s our role: to prepare a future where everybody thrives. ~MZ

Emerging Technology

🪐 The New Data Economy
Community member Itay Katz published an introduction to web3 approaches to cloud storage, with a focus on the InterPlanetary File System (IPFS) and Filecoin. Don't miss Part 2.

🦠 New Type of Ultraviolet Light Makes Indoor Air as Safe as Outdoors
A different type of UV light, known as far-UVC light, took less than five minutes to reduce the level of indoor airborne microbes by more than 98%. It could be used to disinfect indoor environments.

⚛️ The Quantum Technology Ecosystem – Explained
Author Steve Blank presents an overview of quantum technology, starting with key concepts and expanding into several emerging use cases.

🕶️ Snapchat Thinks Brain-Controlled AR Glasses Are the Future
Snapchat just bought NextMind, a company that develops neural controllers which let users interact with AR and VR objects using only their minds.

💬 Completely Locked-in Man Uses Brain-Computer Interface to Communicate
An ALS patient, who had lost the ability to talk and move, had microelectrode arrays implanted on the surface of his motor cortex. The patient generated brain signals by attempting to move; these were picked up by the implanted microelectrodes and decoded by a machine learning model, allowing him to communicate "yes" or "no".
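The decoding step in a system like this can be illustrated with a minimal sketch: a classifier maps a trial's neural features (here, per-channel firing rates) to a "yes" or "no" label. All data, channel counts, and rates below are hypothetical stand-ins, not values from the study, and a nearest-centroid rule is used purely as a simple illustrative decoder.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical training data: mean firing rates from 4 recording channels.
# Assume attempted movement ("yes") drives higher rates than rest ("no").
yes_trials = rng.normal(loc=12.0, scale=2.0, size=(50, 4))
no_trials = rng.normal(loc=6.0, scale=2.0, size=(50, 4))

# Nearest-centroid decoder: summarize each class by its mean feature vector.
yes_centroid = yes_trials.mean(axis=0)
no_centroid = no_trials.mean(axis=0)

def decode(trial):
    """Classify a vector of channel firing rates as 'yes' or 'no'."""
    d_yes = np.linalg.norm(trial - yes_centroid)
    d_no = np.linalg.norm(trial - no_centroid)
    return "yes" if d_yes < d_no else "no"

print(decode(np.full(4, 11.5)))  # a high-rate trial decodes as "yes"
print(decode(np.full(4, 5.5)))   # a low-rate trial decodes as "no"
```

Real BCI decoders work on far richer features (spike counts, band power over time) and use calibrated models, but the core idea is the same: turn attempted-movement signals into a discrete communication output.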

Future Perspective: Imagine a world in which brain-machine interfaces are mainstream. Information could be gathered from neural communication and transferred to the cloud, allowing for direct conversation between humans, animals, and machines without the need for physical gestures or sound. Individual thoughts could become accessible publicly and identities would be diluted in a post-privacy society.

Tech & Culture

🥸 Gadget Lab: When Facial Recognition Tech Is Wrong
Wired podcast Gadget Lab identifies the main blind spots of facial recognition technology today, especially when it comes to racist bias in crime-solving.

Caroline Barrueco
Research Fellow

