AI And The Dancing Mushroom
It sounds like the title of a Roald Dahl story, but researchers have devised a robot that moves in response to the wishes of a mushroom.
OK, so a shroom might not desire to jump or walk across a room, but it possesses neuron-like, branching filaments called hyphae that transmit electrical impulses in response to changes in light, temperature, and other stimuli.
These impulses can vary in amplitude, frequency, and duration, and mushrooms can share them with one another in a quasi-language that one researcher believes yields at least 50 words that can be organized into sentences.
Still, to call that thinking is probably too generous, though a goodly portion of our own daily cognitive activity is no more, er, thoughtful: we answer prompts with the appropriate grunt or simple declaration.
But doesn’t it represent some form of intelligence, informed by some type of awareness?
The video of the dancing mushroom robot suggests that the AI sensed the mushroom’s intention to move. That isn’t necessarily what happened, since the researchers had to make some arbitrary decisions about which stimuli would trigger which actions, but the connection between organism and machine is still quite real, and it suggests stunning potential for the further development of an AI that mediates that interchange.
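To make those "arbitrary decisions" concrete, here is a minimal sketch, in Python, of the kind of stimulus-to-action mapping such a system might use. The feature names, thresholds, and commands are illustrative assumptions, not details from the actual research.

```python
# Illustrative sketch only: a hypothetical mapping from features of a
# mushroom's electrical spiking (amplitude, spike rate) to robot commands.
# The thresholds and command names are invented for clarity, not taken
# from the published mushroom-robot research.

from dataclasses import dataclass

@dataclass
class SpikeFeatures:
    amplitude_mv: float    # peak amplitude of recent spikes, in millivolts
    spikes_per_min: float  # spiking frequency over a short window

def choose_action(features: SpikeFeatures) -> str:
    """Translate spiking activity into one of a few robot behaviors."""
    if features.spikes_per_min < 2:
        return "hold_still"        # quiet signal: do nothing
    if features.amplitude_mv > 1.0:
        return "walk_forward"      # strong spikes: big movement
    return "shuffle_in_place"      # weak but frequent spikes: small movement

# Example: a burst of low-amplitude spikes produces a small "dance" step.
print(choose_action(SpikeFeatures(amplitude_mv=0.4, spikes_per_min=12)))
```

Anything more nuanced, of the kind the rest of this essay imagines, would have to replace those hand-picked thresholds with something learned from the organism itself.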
Much is written about the race to make AI sentient so that we can interact with it as if we were talking to one another, and so that it could go on to resolve questions as we would, only better, faster, and more reliably.
Yet, like our own behavior, a majority of what happens around the world doesn’t require such higher-level conversation or contemplation.
There are already many billions of sensors in use that capture changes in light, temperature, and other stimuli, and then prompt programmed responses.
Thermostats trigger HVAC units to start or stop. Weather radar in airplanes tells pilots to steer around storms, and sensors in your car ping when it drifts over a lane divider. My computer turned on this morning because the button I pushed sensed my intention and translated it into action.
Big data reads minds, of a sort, by analyzing enough external data that a predictive model can suggest what we might internally plan to do next. It’s what powers those eerily prescient ads, or the social media content that somehow has a bull’s-eye focus on the topics you love to get angry about.
The mushroom robot research suggests ways to make these connections — between observation and action, between internal states of being and the external world — more nuanced and direct.
Imagine farms where each head of lettuce manages its own feeding and water supply. House pets that articulate how they feel beyond a thwapping tail or sullen quiet. Urban lawns that can flash a light or shoot a laser to keep dogs from peeing on them.
AI as a cross-species Universal Translator.
It gets wilder after that. Imagine the complex systems of our bodies being able to better manage their interactions, starting with a bespoke vitamin prescribed every morning and leading to more real-time regulation of water intake and the like (or microscopic AIs that literally get inside us and encourage our organs and glands to up their game).
Think of how such an AI could be used by people whose infirmities impede their movement or even block their interaction with the outside world. Faster, more responsive exoskeletons. Better hearing and sight augmentation. Active sensing and responses to counter the frustrating, misfiring commands of MS or other neurological diseases.
Then, how about looking beyond living things and applying AI models to sense the “intention” of, say, a building or bridge to stay upright or resist catching fire, and then empowering them to “stay healthy” by adjusting how weight and stress are distributed.
It’s all a huge leap beyond a dancing mushroom robot, but it’s not impossible.
Of course, there’s a downside to such imagined benefits: The same AI that can sense when a mushroom wants to dance will know, by default, how to trigger that intention. Tech that better reads us will be equally adept at reading to us.
The Universal Translator will work both ways.
There are profound ethical questions here, worthy of spirited debate, but I doubt we’ll ever have that debate. AI naysayers will rightly point out that a dancing mushroom robot is a far cry from an AI that reads the minds of inanimate objects, let alone people.
But AI believers will continue their development work.
The dance is going to continue.
[This essay appeared originally at Spiritual Telegraph]