New York City has deployed a robot to patrol its Times Square subway station. The city will retire it within weeks, if not sooner.
The gizmo, called "K5," is shaped like a giant Weeble and is stuffed with cameras and other sensors. It has no arms, can't navigate stairs, and has no capacity for communication (other than a mic connecting it to a human operator somewhere offsite).
But it’s covered in large NYPD stickers. The idea is that it will deter crime simply by its presence.
If that’s the goal, they should have hired a more threatening robot. Maybe something that looked like the fictional Robocop or its purported replacement, the ED-209. Guns and claws are scary. Eggs, not so much.
Heck, even a version of a Dalek, which is little more than a metallic salt shaker with an indeterminate proboscis, would be more likely to frighten away would-be criminals.
A real robot called Spot looks like a dog and moves like a stiff horse, and it can't do much more than K5 can. But I've seen it live, and it's just scary, as if it's ready to pounce, or as if a laser beam might shoot out of the area where its head should be (like I said, scary). The LA Police Department is testing it.
So, why put a goofy giant rolling white egg in Times Square Station? I can think of at least three reasons:
First, the bureaucrats behind the scheme don’t know a thing about technology.
It takes no special skill to see that K5's appearance all but dares people to laugh out loud. Inside, it's effectively a bucket of cameras on wheels, so it's technically redundant to the cameras already peppered throughout the station.
As a robot, it’s bringing no new tech to work. Even its capacity for returning to a charging station to get another jolt was pioneered years ago by Roomba vacuum cleaners.
Second, the bureaucrats don’t know a thing about human behavior.
Privacy advocates are already worried that the surveillance eggs could use facial recognition or other as-yet uninvented tools for monitoring people. Most folks don't like having machines watch and judge their behavior. There are extremely difficult questions about the future uses of AI not just in detecting criminal behavior but in predicting criminal intent.
The only way the K5 makes a difference is if people assume it’s already doing such nefarious…