Mar 04, 2012

Time for a privacy policy review

Data management and privacy policies must work to benefit company and customers.

Facebook and Google have made damaging errors in the minefield of personal information, privacy and business. Both have been sued for their use of customers’ data and have entered into agreements with the Federal Trade Commission (FTC) to change their practices.

Facebook’s most recent misstep came when it changed its website to include Timeline, a format that displays on a user’s page all the information that user has ever shared on Facebook. Timeline changed users’ privacy settings without their consent, prompting the Electronic Privacy Information Center (EPIC) to write to the FTC asking whether the move violated the company’s privacy agreement.

‘With Timeline, Facebook has once again taken control over the user’s data from the user and has now made information that was essentially archived and inaccessible widely available without the consent of the user,’ the letter stated.

Google recently announced it was overhauling its privacy policy, reducing over 60 policies to one. By integrating the data it gathers from across its many products, Google says it will be able to offer users new services, like a mobile service that could tell a driver to take an exit off the freeway because the traffic is going to make her late for her appointment. Members of Congress and some users have raised privacy issues, but EPIC’s FTC letter cited only antitrust concerns.

These issues – how companies can legally gather, store, use and profit from user data without damaging their brand – are not unique to companies such as Facebook and Google. As Andy Serwin, partner at Foley & Lardner and executive director of privacy and information think tank the Lares Institute, notes, ‘Anyone who has customers and employees faces data and privacy issues.’

Collecting customer information presents a real business opportunity, but, Serwin says, most companies ‘don’t understand all the value that their data can deliver for their customers and business’. For example, Serwin notes, ‘Customer data can be used to spot new customers, reduce costs and improve quality.’

Asked to name some companies that handle information well, Serwin lists GE, American Express and CVS Caremark. He also discusses the strengths of each privacy program and the executive who runs it: ‘GE has probably every form of data in a broad range of sectors, and its security and compliance program under Nuala O’Connor Kelly is consistently great,’ he says. ‘The company is particularly good at assessing third parties before transferring data to them. American Express’s business is heavily regulated, and Andy Ross does a great job at complying – he has excellent business sense about risk and does a very good job of using information to maximize customer value. At CVS Caremark, Ken Mortensen also does a great job of managing a lot of information in a complex environment and generating value from it.’

Defining privacy

Unfortunately, the compliance challenges posed by employee and customer data are complex even when it appears the lines are clear. The rules continue to evolve, often fueled by changes in technology.

As Serwin explains, industries being ‘driven by changes in technology will have data and privacy issues unique to their businesses. And as technology develops and collection and analytics capabilities improve, those issues will continue to arise.’

With social media becoming such a big part of our lives, and with companies relying on automation to interact with customers, the matter of what should remain private during those interactions is still being clarified. Nuala O’Connor Kelly, senior counsel, information governance, and chief privacy leader for GE, points out a particularly tricky area for employers – what she calls ‘the blurring line between the personal and the workspace, caused by the fluidity of technology.’

Employers are finding it harder to know where to draw the line between when employees speak for themselves and when they speak for the company. Policies need to be set because companies risk brand destruction and possible legal liability if they don’t strike the right balance.

For example, the Food and Drug Administration monitored its employees’ personal email accounts on work computers during work hours, and is now being sued for it. The reason: the employees were potential whistleblowers and the emails related to that activity. Even though government computers inform users that there is ‘no reasonable expectation of privacy’ when they log on, advocates are pushing for a base level of privacy in almost all situations.

‘Employees do have a right to take off their company hat and have a private space, even in the online community,’ O’Connor Kelly says. ‘No longer is all work done at a desk on a company-owned computer; no longer is all private speech done in the privacy of one’s living room. The technology and the geography are not completely determinative, and are no longer completely clear.’

Global differences

Another major challenge for companies is helping management understand that the concept of ‘privacy’ differs internationally.

‘The most important aspect for a global company is realizing that privacy means very different things around the world,’ O’Connor Kelly says. ‘Being sensitive and accountable to those differences while maintaining operational efficiency and consistency is a very significant risk area and compliance challenge.’

In the US, use restrictions are sector-based; healthcare and financial services are examples of industries with their own data management laws. This is not the approach every country takes, however. The EU just came out with its latest privacy proposal, which takes a different approach from that of the US. ‘The growing disconnect between the US and the EU on how information should be handled is going to be a key issue going forward – specifically, what disclosures companies must make when customers agree that companies can track their internet usage,’ Serwin explains.

Privacy in the digital era often raises litigation so thorny that even the Supreme Court is struggling to draw lines. In the recently decided US v. Jones case, which dealt with law enforcement’s placement of a GPS device on a suspect’s car without a warrant, the court held that a warrant must be obtained before a GPS device can be used, but the justices did not agree on the reason why. Drawing the lines on this issue was particularly difficult because the case challenges concepts that are core to past decisions.

In her concurrence, Justice Sotomayor explained that the court might have ‘to reconsider the premise that an individual has no reasonable expectation of privacy in information voluntarily disclosed to third parties. This approach is ill suited to the digital age, in which people reveal a great deal of information about themselves to third parties in the course of carrying out mundane tasks.’

As Justice Sotomayor noted, GPS tracking is particularly invasive and could be used to figure out a person’s ‘political and religious beliefs, sexual habits, and so on’. Although the concerns were raised in a criminal case, GPS data-mining is increasingly common. Consider Google’s ability to suggest, without a request, that a user take an exit on the highway to skip traffic. Because of the complexity of the issues, a few of the justices called on Congress to legislate policy. Indeed, O’Connor Kelly thinks that decision ‘may set the stage for even more legislative action than we had expected in the US this year. I think we should expect to see significant legislative proposals that address these issues.’

Taking action

Given the complexity of compliance, and the myriad legitimate revenue opportunities offered by customer data, what’s a company to do?

‘When drafting new policies and products, I always advise my internal clients to think very clearly about what data is needed and why,’ says O’Connor Kelly. ‘Also, I encourage product and program leaders to think about data use today – but also what might be possible in the future. We’ve seen misses with companies that inadequately disclose data use, or when data use or collection changes over time.

‘You want your conversation with your customers – through privacy policies, websites and other notices – to be fair and transparent, and to engender trust. You need to tell people what you’re going to do with their data, and you have to stay true to that promise.’

Serwin advises that ‘the first step for any business is to really understand the data it has so it can build a compliance program that’s appropriate for the risks and sensitivity of the data’. Doing that enables a business ‘to figure out how it can maximize its revenue from the data’.

Next, Serwin says companies need to assess their data and their brand. ‘If the company’s brand is based around trust and it deals with consumers, it might be wise to go significantly beyond what the law requires in terms of data security,’ he explains.

O’Connor Kelly suggests companies look ‘at the entire data life cycle, soup to nuts – how you bring in a piece of data, from where, from whom; how you keep it safe and protected and appropriately accessible while you have it, and then how you dispose of it at the appropriate time. The rules around that life cycle are what we call information governance.’

Serwin agrees, noting that the kind of analysis O’Connor Kelly calls for enables the company to build privacy into everything it does – what Serwin calls ‘privacy by design’.

The bottom line is that as the rules change and the opportunities grow, every business needs to take a serious and thorough look at how it protects its data, and how to better profit from it.

A chief privacy officer’s perspective on governance
Nuala O’Connor Kelly, senior counsel, information governance, and chief privacy leader for GE, sat down with Corporate Secretary and shared her perspective on privacy policy and information governance.

What does a chief privacy officer do?

My job is to be the eyes and ears of the company on the use, handling and protection of personal data across the organization. I have to ensure that our operations are in sync with the expectations of our customers, employees and business partners about how their personal information will be handled by the company. It’s a little bit like serving as the conscience of the company on the responsible use of personal information. It’s a big job when done right.

There’s an ethical component, a legal and compliance component, and an operational component – making sure privacy is embedded in our business processes, in our software, information-handling and product development. It requires an understanding not only of the business imperatives and objectives but also of our external consumers’ expectations and attitudes on privacy. Leaders in this area need to be aware of both.

What are the most challenging aspects of data management?

I would say data security. There are countless threats to systems, including very smart hackers – it’s a constant cat-and-mouse game. Another challenge is matching up the needs of the business and the expectations of the consumer. Fundamentally, allowing individual control and being clear and transparent about how data is managed and protected is not only the right thing to do, but also builds trust in the company.

What does a good information governance program look like?

To me, the core elements are:

• People: ensuring that you have people who are accountable for information governance and privacy at both the leadership and the business/program level.

• Strong policies: not only having strong policies in place on data use, but also actively educating and updating your employees about those policies.

• Clear and simple processes: in order to operationalize data privacy principles, clear processes on data use and access by your employees are necessary, and those processes need to be auditable, scalable and enforceable.

• Technology: all the promises that you make externally and internally need to be embedded in the systems themselves. The policies need to talk the talk; the systems need to walk the walk of data protection and privacy. A big part of my job has been analyzing new technologies for use by employees, and both welcoming their adoption and understanding the risks involved.

One example is the use of social media, either for personal use by employees or as a marketing vehicle for products. We’ve looked very closely at both the risks of inadvertent disclosure of protected information and the profound communications use of such data as a marketing tool. Social media hold both promise and peril, but overall, this is the direction that communications – and our employees – are going in, and we need to be present and embrace it.

Are there any challenges to the job that you want to call attention to?

Emerging technologies and mobile devices offer great opportunities for communication and partnership with our customers and clients, but also create privacy and security challenges when you’re trying to secure the data. As our increasingly mobile 24/7 workforce – think of salespeople working with an iPad or even an iPhone to demo products – uses these devices, we need to challenge ourselves to support their productivity, but also ensure the security and confidentiality of the data and the system. That’s a tall order, but it’s an exciting time to be working on these issues.

The challenge that new data systems present is really one of speed, scale and scope. Data can flow faster and wider, and be aggregated in larger quantities, than ever before, but companies – and individuals – shouldn’t be bamboozled by the technology. There’s still a right thing and a wrong thing to do. We are stewards of personal data – while we may aggregate, and make our systems, our programs and our company better, and add value, at the end of the day we need to be accountable, accessible and transparent to customers about our use of their data.

Privacy and security important to customers

A recent KPMG study of over 9,600 consumers in 31 countries finds that there is a growing level of concern regarding privacy and security, particularly when using new services or technologies. As a result, ‘trust’ has become one of the biggest competitive advantages for products and services across almost all industry groups.

•    90 percent of respondents voiced some level of concern about the security of their personally identifiable information, with almost half saying they were ‘very concerned’.

•    62 percent are still willing to have their online usage tracked by advertisers.

•    When asked who they trust most with their online data, 56 percent of respondents said their financial institutions, 30 percent said secure payment sites such as PayPal, and 7 percent were most likely to trust their retailers.

Source: KPMG, The converged lifestyle study

Abigail Caplovitz Field

Abigail is a freelance writer and lawyer based in New York.