A Privacy Challenge for the Enterprise

by Dr. Edward Lee Lamoureux
Bradley University

What is the future of privacy in America?

Of the myriad cultural features changing in recent years, few, if any, eclipse the dramatic shift in both the perceptions and the realities of privacy in America. Long regarded as a fundamental right, personal privacy now appears fungible, perhaps even on the verge of extinction. What should we make of this seismic shift, and what roles do you and your enterprise play in the face of today’s privacy challenges?

Attitude Vs. Behavior
Privacy presents an attitude-versus-behavior paradox. While most Americans “say” (by way of surveys) that they care about privacy and want it protected, they seldom act in ways that protect their own privacy. We pay some attention to privacy settings and policies, yet we use technologies for convenience, information and entertainment with little regard for the personal information we give up in the process.

Regardless of the privacy implications, we continue to use credit cards, smartphones and the Internet in ways that share our personal data with an enormous number of (mostly unknown) entities. The next big thing in technology—the so-called “Internet of Everything”—will continue to expand this march of personal data into networks and databanks, both public and private. Because Americans generally act like it doesn’t matter, some maintain that privacy is dead. Yet when asked, Americans across all age ranges say privacy still matters to them: a paradox, for sure.

Two primary reasons for this schizophrenia over privacy (we want to protect it, but we can’t or won’t) are technological complexity and device/system profusion. Virtually every website and device comes with “terms of service” (TOS) or an “end-user license agreement” (EULA) cast in dense legalese for the benefit of the enterprise. TOS and EULAs are legal contracts that bind users, customers and clients (and, to a lesser degree, the enterprises that present them). Jill and Joe Citizen rarely read them, and when they do, they understand little of what’s there. In the current environment, TOS and EULAs are typically set to “opt in” by default; in other words, simply landing on a website, making a cellphone call, or swiping a credit or debit card opts the user into the agreement.
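
For readers who build or buy these systems, a small sketch may make that default concrete. The Python below is purely illustrative; the class names, fields and defaults are hypothetical, invented for this example rather than taken from any real service or platform. It contrasts a consent record that assumes agreement the moment a user shows up with one that shares nothing until the user takes a deliberate, recorded action.

    from dataclasses import dataclass
    from datetime import datetime, timezone
    from typing import Optional

    # Illustrative only: names, fields and defaults are hypothetical,
    # not drawn from any real TOS, platform or service.

    @dataclass
    class AssumedConsent:
        """Today's common default: using the service is treated as agreement,
        so data sharing is on unless the user hunts down a setting."""
        user_id: str
        shares_data_with_partners: bool = True   # opted in by default

    @dataclass
    class ExplicitConsent:
        """The alternative: nothing is shared until the user takes a
        deliberate, recorded action."""
        user_id: str
        shares_data_with_partners: bool = False  # off until the user says yes
        consented_at: Optional[datetime] = None

        def grant(self) -> None:
            self.shares_data_with_partners = True
            self.consented_at = datetime.now(timezone.utc)

    if __name__ == "__main__":
        assumed = AssumedConsent(user_id="jill.citizen")
        explicit = ExplicitConsent(user_id="joe.citizen")
        print(assumed.shares_data_with_partners)   # True, though never chosen
        print(explicit.shares_data_with_partners)  # False until grant() is called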

Faced with these complexities, citizens don’t really have control over their own privacy: there are simply too many considerations and settings for a “normal” person to manage.

Solving the Paradox
Assuming that privacy matters but that citizens cannot (or don’t know how to) protect it, one looks for other entities to close the gap. Could the government solve this paradox? Can we depend on market solutions? I propose that, just as it is inadequate to ask citizen-users to “solve” the problem, neither the government nor market forces will solve it either.

It’s pretty obvious that we cannot expect government intervention to solve the privacy paradox. By and large, the laws we have don’t work well enough, for several reasons. First, most privacy laws apply only to federal and state governments. Second, the privacy laws in place are a patchwork of state and federal regulations covering specific aspects of privacy. Third, we live in a time when there’s little taste for more regulation, even to solve a problem this tough; politically, additional regulation is a nonstarter. Further, in light of revelations from the WikiLeaks and Snowden scandals, combined with a heightened, post-9/11 sense of security needs, few trust the government to act strongly to protect our privacy.

Left to profit alone, the market doesn’t solve the problem either; in fact, it may be more responsible for the loss of privacy than any other factor. When the Defense Department and the federal government stopped funding the Internet, commercial interests took over. Google’s solutions for organizing and searching the Web work only by collecting data about users; the Web works, profitably, only with targeted marketing. Every enterprise is being pushed to “engage” its customers with interactive media, which means collecting information about them, and a huge data economy has grown up around leveraging that information. The information economy reaches across all data and communication industries, market segments and platforms: the Web/Internet (including ISPs), cellphones, cable television, credit and financial services, the retail sector, business-to-business, and more.

The biggest problem now might be that the information collected by private enterprise and the information collected by the government are too often merged into what has effectively become one enormous database, which includes your digital dossier. In this environment, neither the government nor the market can be expected to “solve” a problem on which both thrive and from which both benefit.

So what’s the answer to this paradox? Just give up and admit that privacy is an antiquated idea… D.O.A. in the present and for the future?

An Outline of Principles
In the early 1970s, government regulators saw the advance of computers, networks and databases and knew that changes would be required if privacy were to be protected. In response, the U.S. Department of Health, Education and Welfare developed the Fair Information Practices (1973). Unfortunately, these proposals were put into law only within a very narrow range, primarily covering medical and healthcare information. But these straightforward, common-sense principles provide an exemplary model for enterprises seeking to exhibit social responsibility in their privacy practices. The code proposed five basic standards:

  • There must be no personal-data record-keeping systems whose very existence is secret.
  • There must be a way for an individual to find out what information about him/her is in a record and how it is used.
  • There must be a way for an individual to prevent personal information obtained for one purpose from being used or made available for other purposes without his/her consent.
  • There must be a way for an individual to correct or amend a record of identifiable information about him/her.
  • Any organization creating, maintaining, using or disseminating records of identifiable personal data must assure the reliability of the data for their intended use and must take reasonable precautions to prevent misuse of the data.

These principles aren’t difficult to understand. They were appropriate when regulators and lawmakers first realized that massive amounts of personal data would someday become widely available via technology, and they still make perfect sense. And we can see that they work: written into law for medical and healthcare information (and, to a lesser extent, some aspects of the financial sector), they protect that information well, and most folks are glad they do. But they were never adopted elsewhere, and we have no way to “enforce” them in the broader private sector.
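
For an enterprise wondering what honoring these standards might look like in practice, here is a minimal, hypothetical sketch in Python. Nothing in it comes from the 1973 code or from any real system; the class and method names are invented purely to suggest that the five standards translate into ordinary, buildable features: a disclosed record system, subject access, purpose-limited use that requires consent, correction, and basic safeguards.

    from dataclasses import dataclass, field
    from typing import Dict, List

    # Hypothetical sketch only: the names are invented to show how the five
    # Fair Information Practices standards could map onto ordinary,
    # buildable record-keeping features. This is not the 1973 code itself.

    @dataclass
    class PersonalRecord:
        subject_id: str
        data: Dict[str, str]                        # e.g. {"email": "..."}
        collected_for: str                          # purpose disclosed at collection
        consented_purposes: List[str] = field(default_factory=list)

    class RecordSystem:
        """A record-keeping system whose existence and purpose are disclosed (standard 1)."""

        def __init__(self, name: str, disclosed_purpose: str):
            self.name = name
            self.disclosed_purpose = disclosed_purpose
            self._records: Dict[str, PersonalRecord] = {}

        def add(self, record: PersonalRecord) -> None:
            self._records[record.subject_id] = record

        def access(self, subject_id: str) -> PersonalRecord:
            """The individual can find out what is on file and how it is used (standard 2)."""
            return self._records[subject_id]

        def use_for(self, subject_id: str, purpose: str) -> Dict[str, str]:
            """Secondary uses require the subject's consent (standard 3)."""
            record = self._records[subject_id]
            if purpose != record.collected_for and purpose not in record.consented_purposes:
                raise PermissionError(f"no consent to use this data for '{purpose}'")
            return record.data

        def correct(self, subject_id: str, key: str, value: str) -> None:
            """The individual can correct or amend an identifiable record (standard 4)."""
            self._records[subject_id].data[key] = value

        # Standard 5 (reliability and precautions against misuse) would live here
        # too: validation, retention limits, access controls, audit logging.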

If “the people,” the government and the market cannot solve the privacy paradox, who can?

You can. The owner or proprietor of an enterprise is ultimately responsible for its orientation toward privacy and for how it treats the data the public provides. The buck stops with you.

Here’s the challenge: Does your organization follow the principles outlined in this code… even a little? Or does it hide behind the legal language of the TOS/EULA, language that really only protects you and your organization and ignores the principles articulated in the code? Do your practices offer clients and customers the opportunity to opt into the various ways you use their data, or must they take specific actions to opt out, with no real ability to do so beyond not using your product or service?

Is your excuse for not following the code based on socially responsible reasoning? Or does it simply come down to, “It’s too darn hard to do all that stuff, and nobody really cares anyway”?

It’s certainly true that not following the code is easier and more advantageous to your bottom line than following it. But since we seem unable to protect ourselves, don’t trust the government to protect us, and can’t count on a “bottom-line-only” market to do it, the only solution left seems to be the social responsibility exercised by our community and business leaders.

In the long run, does your enterprise benefit from taking advantage of your customers? Or from not caring about them enough to follow common-sense standards to protect their information from abuse and misuse?

The Pickpocket Scenario
Imagine the following scenario. You own a business. A customer walks in the front door, and you ask them to empty the contents of their pockets onto a table. Once they’ve done so, you take them into another room and provide your product or service. While you are both gone, however, a bunch of folks—not sure who or how many—rifle through the contents on the table, grabbing what they want. After your transaction, the customer returns to collect what is left of their belongings on their way out the door.

Oh, one more thing: the folks who raided the table left a small “tip” for you in an unmarked envelope. Your customer didn’t see that.

We both know what you are saying as you read this: “Absurd. I wouldn’t do that to my customers.” But of course, when you ask them to “like you on Facebook” or “follow you on Twitter,” that’s exactly what you are doing: exposing their personal contents (information) to a vast, data-driven marketplace, out of their conscious awareness and control, for the benefit of your operation. You gain free advertising, public relations, “engagement” with your customers, and maybe even a revenue stream from your social networks. They get… well… we tell them they get an “optimized web (or shopping or cultural) experience,” but what they really get is their pocket picked by data collection practices they don’t understand and cannot control. And this scenario doesn’t even begin to address the nuts and bolts of what you plan to do with the information you collect.

The five standards of the Fair Information Practices code are not complex or difficult to understand. They don’t represent government intrusion or the collapse of the free marketplace. They stand as a symbol of our mutual respect for each other’s fundamental right to control information about ourselves. You are the guardian of that right. And with your actions, the die is cast for the future of privacy in America. iBi

Dr. Edward Lee Lamoureux is a professor in the Department of Interactive Media and Department of Communication at Bradley University.

