Americans spend more than 10 hours every day staring at screens. But researchers are developing cutting-edge interfaces that could change the way we interact with digital media, and many of them don't require screens at all. Instead, your skin is the interface. Objects around you are the interface. Architecture itself is the interface.
The evolution, or devolution, of the interface was prominently on display this week at the Association for Computing Machinery's 2018 CHI Conference on Human Factors in Computing Systems (ACM CHI, for short). The conference is a hub where the world's top minds share the latest breakthroughs in human-computer interaction. The papers and projects presented at ACM CHI tend to act as a barometer for what the future of computing might look like, and this year, they suggested that our computers will be increasingly embedded in the world around us.
Here are six of the conference’s most fascinating prototypes.
Who needs a screen when your house can do double duty? This project, a collaboration between Disney Research and Carnegie Mellon's Future Interfaces Group, transforms a simple wall into a touchscreen using inexpensive materials: layers of conductive paint and copper tape that cost about $20 per 10.75 square feet (roughly one square meter). So what can you do with a smart wall, anyway? You could turn certain spots on your bedroom wall into buttons that switch your lights on or off, for instance, or program a specific gesture to serve as a password that unlocks a door.
A new project out of MIT Media Lab's Tangible Media Group uses a technique called "electrowetting" to move water droplets around on a surface, essentially creating a water-based computer interface. They call it a "calm interface." The team's video, which won the conference's award for best video demo, shows how the water drops could be used in art and interactive gaming. In one demo, the user manipulates a droplet by tilting the game board, and the other droplets are programmed to flee when it comes near. The team also imagines how the droplets could be used for communication: A woman writes the message "have a nice day" on her phone, and it appears in the mirror of her home, where her partner is brushing his teeth. That's way sweeter than a text message.
Who said health-tracking has to be serious? BioFidget is an augmented fidget spinner that also happens to have a heart rate variability sensor and a respiration sensor. Researchers from the department of industrial design at Eindhoven University of Technology designed it to help reduce stress: you blow on the protrusions of the spinner to make it spin, and the slow repetition of this practice is meant to calm your physiological stress response. White and red lights on the device let you know when your heart rate and pulse have slowed down. Most importantly, the experience doesn't require any sensors stuck to the user's skin, unlike many medical tests and health-tracking devices. BioFidget provides more evidence that today's smartest interfaces are in the most unexpected places.
Speaking of smartwatches, another project out of the Future Interfaces Group at Carnegie Mellon offers a solution for the small size of a smartwatch screen: why not just project onto the skin of your arm? As the researchers acknowledge in their paper, on-skin projection has been "a long-standing yet elusive goal, largely written off as science fiction." But the LumiWatch actually delivers it, projecting a working touchscreen onto your skin. It's an exciting development, bringing some of the more far-flung ideas about turning your skin into an interface closer to something you could buy in a store.