OK, you may say I’m oversensitive, but a headline today from Google’s blog that others may chuckle about (“Noodle on this: Machine learning that can identify ramen by shop”) left me profoundly worried about some engineers’ tone-deafness to growing public concern about privacy and security.
This is not going to be pleasant for many readers, but bear with me — IMHO, it’s important to the IoT’s survival.
As I’ve written before, I learned during my work on corporate crisis management in the ’80s and ’90s that there’s an all-too-frequent gulf between engineers and the public when it comes to fear. Engineers, as left-brained and logical as they come (or, in Myers-Briggs lingo, ISTJs, “logical, detached and detailed,” the polar opposite of ENFPs such as me, “caring, creative, quick and impulsive”), are ideally suited for the precision their profession demands, but often (though not always, I’ll admit…) clueless about how the rest of us respond to things such as the Russian disruption of our sacred political institutions via Facebook, or any of the numerous violations of personal privacy and security involving IoT devices that lack basic protections.
The situation is bad and getting worse. In one Pew poll, no more than 16% of Americans felt that any of a wide range of institutions, from companies to government, was protecting their information.
Engineers are quick to dismiss the resulting fear because it isn’t logical. But, as I’ve written before, the fact that fear isn’t logical doesn’t mean it isn’t very real for many people, or that it can’t cloud their thinking and decision-making.
Even worse, fear is cumulative, and it can ensnare good companies as well as bad ones. After a while, all the privacy and security violations get conflated in people’s minds.
Exhibit A for this insensitivity? The despicable memo from Facebook VP Andrew Bosworth:
“Maybe someone dies in a terrorist attack coordinated on our tools. And still we connect people. The ugly truth is that we believe in connecting people so deeply that anything that allows us to connect more people more often is *de facto* good.”
Eventually he apologized, begrudgingly, as did Mark Zuckerberg, but IMHO that was just face-saving. Why didn’t anyone at Facebook demand a retraction immediately, and why did some at Facebook get mad not at Bosworth but at whoever leaked the memo? They and the corporate culture are as guilty as Bosworth in my mind.
So why do I bring up the story about identifying the source of your ramen with AI, which was surely written in all innocence by a Google engineer who thought it would be a cute example of how AI can be applied to a wide range of subjects? Because I read it (with my antennae admittedly sharpened by all the recent abuses) as something that might have been funny several years ago but should have gone unpublished now, given all the fears about privacy and security. Think of this fun little project the way a lot of the people I try to counsel on technology fears every day would: you mean they can, and will, find out where I get my noodles? What the hell else do they know about me, and who will they give that information to???
Again, I’m quite willing to admit I may be overreacting because of my own horror at the nonchalance about privacy and security, but I don’t think so.
That’s why I’ll conclude this screed with a call for all IoT engineers to undergo mandatory privacy and security training on a continuing basis. The risk of losing consumer confidence in their products and services is simply too great to let them off the hook with the excuse that it’s not their job. If you do IoT, privacy and security are part of the job description.
End of sermon. Go about your business.