A student at Kenwood High School in Baltimore County didn’t know what he was inviting when he munched on Doritos after football practice.
“They made me get on my knees, put my hands behind my back, and cuffed me,” Taki Allen said of the police in about “eight cop cars” who surged to his location.
“They searched me, and they figured out I had nothing,” Allen recalled. “Then, they went over to where I was standing and found a bag of chips on the floor. I was just holding a Doritos bag — it was two hands and one finger out, and they said it looked like a gun. . . .
“The first thing I was wondering was, was I about to die? Because they had a gun pointed at me.”
The school’s security system is “AI-powered.”
It “saw” a gun, not Doritos plus finger.
An alert went out before the security system’s finding had been confirmed. The alert was soon cancelled, but the school principal didn’t know this when she called the police, who in turn acted with leap-first/look-afterward brio.
We can’t blame the AI. Insensate artificial intelligence, so-called, can no more be blamed than knives and guns can be blamed for the way these inanimate objects “act.” The humans in this case bungled, big-time. They should reform.
Steps to take include never acting on the basis of unverified AI claims — and never using drunken, hallucination-prone AI as one of your call-the-cops triggers to begin with.
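That first step — no escalation without human verification — can be expressed as a simple gate. A hypothetical sketch (the names, threshold, and structure here are illustrative, not the school’s actual system):

```python
# Hypothetical sketch: gate an AI weapon alert behind a confidence
# threshold AND human review before anything reaches police dispatch.
from dataclasses import dataclass


@dataclass
class Detection:
    label: str         # e.g. "gun"
    confidence: float  # model's score, 0.0 to 1.0


def should_call_police(detection: Detection,
                       human_confirmed: bool,
                       threshold: float = 0.9) -> bool:
    """Escalate only when the model is confident AND a human
    reviewer has confirmed the detection is real."""
    return detection.confidence >= threshold and human_confirmed


# A low-confidence, unreviewed "gun" (really a Doritos bag) never
# triggers dispatch:
alert = Detection(label="gun", confidence=0.62)
print(should_call_police(alert, human_confirmed=False))  # False
```

Under a policy like this, the cancelled alert in the Kenwood case would have died quietly in review instead of summoning eight cop cars.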
This is Common Sense. I’m Paul Jacob.
Illustration created with Krea and Firefly
4 replies on “The Dorito Bandito Threat”
I shall hope for the school district, the principal, and the city each to suffer devastation in court.
That was my first thought. Could get interesting.
Humans programmed the AI that was in use. Computers can only perform the tasks they’re programmed to do. Whoever wrote the code that let AI equate three fingers with a gun needs more education. If the school principal wasn’t informed that the alert was cancelled, that is the fault of whoever was monitoring the security system. The principal would have taken the blame for any harm that resulted from a delay in notifying police. We’ll see if they modify their procedures going forward.
Pat, while better code as such is sometimes the solution, AI learns through training, and often the way to better AI is more training.
However, in either case, the improvement comes at a cost, and is not always worth it. Consider, by analogy, how the other tools that you use (including home appliances and cars) can be made safer, but that safety comes at a cost.
As with other tools, often the best approach is not to make the tool idiot-proof, but to keep the tool out of the hands of idiots.
In this case, plainly the tools were in the hands of idiots.