As I wrote last week in my post about data-driven versus legacy business culture, I’ve been researching the issue of big data ethics. Now, I have a confession: big data ethics make my brain hurt. The issues and questions are so big! People, companies, corporations, governments, and global geopolitics are all going to change as a consequence of how we answer those questions.
My brain is hurting right now.
It's Time To Examine Big Data Ethics
But if you’re a marketer planning to leverage big data analytics for brand activation, you’re going to hurt a lot more than your brain if you don’t start examining big data ethics questions, and soon. If you don’t think through the issues and questions that pertain to you, you will end up damaging your brand’s reputation.
Last month’s much-talked-about 60 Minutes segment on the astonishing amount that data brokers know about you--importantly, not just what they collect about you, but what sophisticated algorithms let them infer about you--is just the beginning. As more and more people come to understand the Web-cookied, location-based, sensor-instrumented digital envelope in which they live their lives--and who is watching it--they will begin to ask questions themselves.
And if you’re buying digital ads, they may be coming to you with their questions. As noted angel investor Esther Dyson--the tech industry’s conscience these last three decades--told me in an interview on big data ethics, “The advertising community has been woefully unforthcoming about how much data they’re collecting and what they're doing with it. And it’s going to backfire on them, just as the Snowden revelations backfired on the NSA.”
Companies Need To Openly Discuss Data Privacy Policies
To get you started on the right track, here’s the prescriptive advice emerging from my research so far. Maybe your brain won’t have to hurt as much as mine.
- Explicitly align your corporate values with what you do and don't do with big data.
- Openly discuss your policies relating to data privacy, personally identifiable customer information, and data ownership.
- Be prepared to have lots of internal disagreements because ethics are variable, personal issues.
Dyson says, “Ethics don’t change. Circumstances change, but the same standards apply.” When I tell her about Kord’s idea to connect company values to big data actions, she tells me, “Connecting company values to your big data activities is another way of saying the circumstances have changed but the same standards apply.” Touché!
Transparency Is Key
Another writer and thinker on big data ethics is Jonathan King, VP of cloud strategy & business development at CenturyLink Technology Solutions (formerly Savvis, the cloud, managed services, and data center co-location provider). He and his writing partner, Neil Richards, a law professor and recognized expert in privacy and First Amendment law, advise focusing on four areas:
- Privacy: They say it isn't dead and it's not just about keeping information hidden. “Ensuring privacy of data is a matter of defining and enforcing information rules--not just rules about data collection, but about data use and retention.”
- Shared private information: King and Richards say you can share information and still keep it confidential. Again, this relies on the information rules mentioned in bullet one. And Dyson agrees: “The notion of security by obscurity no longer applies,” she says.
- Transparency: They say, “For big data to work in ethical terms, the data owners (the people whose data we are handling) need to have a transparent view of how our data is being used--or sold.”
- Identity: This is a really big brain-hurter. They say, “Big data analytics can compromise identity by allowing institutional surveillance to moderate and even determine who we are before we make up our own minds.”
That identity issue is the “My TiVo Thinks I’m Gay” problem writ large. I ran into an aspect of it last month. I researched the Broadway play “Tales From Red Vienna” and decided it wasn’t for me. But now I can’t seem to visit any media site anywhere without seeing an ad for it. I’m going to have to delete my cookies just because I wanted to learn about a play!
But seriously, how will we learn the things we don’t know we need to know if big data analytics predict our futures based on past behavior, limiting the real-life serendipity we were all heir to before this digital envelope encapsulated us?
When Data Should And Should Not Be Used
Michael Hickins, the editor of The Wall Street Journal’s CIO Journal, with whom I talked about big data culture in last week’s post, says your CIO is probably thinking hard about these issues.
“CIOs are very aware of the weapon of mass disillusionment that could be in their hands with big data,” says Hickins. “There is no better way to turn off your customers than to be seen to violate their privacy--to use information they consider private without their consent. The really smart CIOs understand that getting people to sign off on terms of service is not enough; it doesn’t mean you have people’s consent to do everything you want to do.”
As in last week's post, I’ll close with wisdom from Dyson. She believes that bullet number three above, transparency, is pretty darn close to a panacea when it comes to big data ethics.
Says Dyson: “Transparency is tremendously useful, because if you don’t know what’s proper, be transparent and somebody will tell you. When in doubt, be transparent. And if you don’t know whether you should tell somebody something, tell them and you’ll find out. Transparency cures most of the problems.”