The California Consumer Privacy Act: Why (and How) to Start Preparing Now

By now most companies have heard about the California Consumer Privacy Act (“CCPA”).  Privacy as an inalienable right has been enshrined in the California Constitution since 1972, and California has developed a reputation as being at the forefront of state privacy legislation. California is also known for grassroots-driven legislation through the ballot initiative process known as “Propositions.” The Cambridge Analytica scandal led to a combination of the two – a proposed data privacy law with extremely burdensome obligations and draconian penalties garnered enough signatures to appear on the ballot in November 2018.  To keep that initiative off the ballot, the California legislature partnered with California business and interest groups to introduce and pass the California Consumer Privacy Act. It went from introduction to being signed into law by the governor of California in a matter of days in June 2018, resulting in withdrawal of the ballot initiative. No major privacy legislation had ever been enacted as quickly. The law becomes effective January 1, 2020, and the California Attorney General may not begin enforcement until six (6) months after releasing implementing and clarifying regulations (expected sometime in 2019) or July 1, 2020, whichever comes first. Given the speed at which it was enacted, CCPA has numerous drafting errors and inconsistent provisions which will need to be corrected. In addition, as of the date of this article, the implementing regulations have not yet been released.

Other states have introduced statutes similar to CCPA, and there is some discussion in Congress about a superseding national data privacy law. Because of this, companies may want to look at CCPA compliance from a nationwide, and not just a California, perspective. For companies hoping that CCPA will be scaled back or repealed, that’s not likely to happen.  The clock is ticking for businesses to develop and implement a compliance plan. When determining what compliance approach to take, consider the wisdom of the “herd on the African savanna” approach to compliance – the safest place to be in a herd on the African savanna is right in the center. It’s almost always the ones on the outside which get picked off, not the ones in the center. The ones more likely to be “picked off” through an investigation or lawsuit are the ones at the front of the herd (e.g., those who desire to be viewed as a leader in compliance) and the ones at the back of the herd (e.g., those who start working on compliance too late or don’t make serious efforts to be in compliance). For many companies, being in the center of the herd is the optimal initial position from a compliance perspective. Once additional compliance guidance is released, e.g., through clarifying regulations, press releases, or other guidance from the state Attorney General, companies can adjust their compliance efforts appropriately.

In this article, I’ll talk through steps that companies may want to consider as a roadmap towards CCPA compliance. (This is a good place to note that the information in this article does not constitute legal advice and is provided for informational purposes only. I summarize and simplify some of the CCPA’s provisions for ease of discussion; you should look at the specific language of the statute to determine if a provision applies in your case. Consult your own internal or external privacy counsel to discuss the specifics of CCPA compliance for your own business.)


A Quick CCPA Refresher

The first problem with the “California Consumer Privacy Act” is its name. It applies to personal information collected about any California resident (not just consumers) in either a business-to-consumer (“B2C”) or business-to-business (“B2B”) context. It applies to almost every business entity that collects personal information about California residents, as well as to those entities’ affiliates, service providers, and other third parties with which personal information is shared or disclosed. The terms “service provider” and “third party” are somewhat similar under the CCPA – both are businesses to which a company discloses a person’s personal information for a business purpose pursuant to a written contract. The difference between the two is whether the information is being processed on behalf of the disclosing company. For example, Salesforce would be a service provider – it is processing personal information on behalf of your company. However, if the company with whom you share personal information processes it for its own benefit, not yours, it’s a “third party” under the CCPA.

“Personal Information” is defined extremely broadly under CCPA, in some ways even more broadly than under the EU’s General Data Protection Regulation (“GDPR”).  It is information that identifies, relates to, describes, is capable of being associated with, or could reasonably be linked, directly or indirectly, with a particular person or household. It includes but is not limited to IP address, geolocation data, commercial information, professional and employment-related information, network activity, biometric information, audio/video, and behavioral analytics about a person.  CCPA covers businesses which “collect” a California resident’s personal information, defined as actively or passively obtaining, gathering, accessing or otherwise receiving personal information from a person, or by observing the person’s behavior. It also covers “selling” personal information, which is another bad choice of a defined term – “selling” personal information under the CCPA includes selling, renting, sharing, disclosing, or otherwise communicating (by any means) personal information to another business or third party for monetary or other valuable consideration, with some significant exceptions.

CCPA creates five (5) rights for California residents:

  • The Right to Know – California residents have a general right to know the categories and purposes of personal information collected, sold, or otherwise disclosed about them.
  • The Right to Access and Portability – California residents have a specific right to know how, why and what personal information about them is being collected, sold or disclosed, and if information is provided electronically, to receive it in a portable format.
  • The Right to Deletion – California residents have a right to request the deletion of personal information about them collected by a business, with some exceptions.
  • The Right to Opt Out – California residents have a right to say “no” to a company’s sale, sharing or transfer of their personal information with third parties.
  • The Right to Equal Service & Pricing – California residents have a right to equal service and pricing whether or not they choose to exercise their CCPA rights.

Creating and Implementing a CCPA Compliance Plan

For companies that have not already gone through a GDPR compliance effort, CCPA compliance can seem daunting. However, creating a solid compliance plan and getting started now maximizes the chance your company will be in good shape for CCPA once it becomes effective. Here are some things to consider as you create and implement a CCPA compliance plan for your business. (Please note that if your company has already gone through a GDPR compliance effort, some of these may already be fully or mostly in place.)

1. Identify CCPA Champions within your business

An important initial step towards CCPA compliance is to identify the person(s) within your company who will lead the CCPA compliance effort. CCPA compliance will require a cross-departmental team. This often includes Legal (to advise on and help interpret the statutory and regulatory requirements, to monitor for new developments on both CCPA and similar federal and state legislation, and to create a compliance plan for the business if the company does not have a data governance team); Development and Information Technology (to implement the necessary technical and operational systems, processes, and policies which enable CCPA compliance); Sales leadership (for visibility into CCPA compliance efforts and to help manage inbound requests for CCPA compliance addenda); Customer Support (as CCPA requires customer support personnel be trained on certain aspects of CCPA compliance); Security (if your company has a Chief Information Security Officer or other information security person or team); Data Governance (if your company has a data governance person or team); and an executive sponsor (to support the CCPA compliance efforts within the C-Suite). Depending on your company, there may be other involved groups/parties as well.

2. Determine how CCPA applies to your business

A key early step in CCPA compliance is determining how CCPA applies to your business.  There are different compliance requirements for companies that collect personal information, companies that process personal information as a service provider for other companies, and companies that “sell” personal information or disclose it for a business purpose.

  • Does your business collect information directly from California residents, e.g., through online web forms, through customer support contacts, from employees who are California residents, through creation of an account, etc.? If so, it must comply with CCPA requirements for companies that collect personal information (a “data controller” in GDPR parlance).
  • Does your business receive and process personal information on behalf of customers or other third parties?  If so, it must comply with CCPA requirements for companies acting as a service provider (a “data processor” in GDPR parlance). If you are a service provider, you must ensure your service offerings enable customers to comply with their own obligations under CCPA.  If not, expect a lot of requests for assistance from your customers, which could result in significant manual effort.
  • Does your business (a) “sell” personal information to affiliates, service providers or third parties (other than for the excluded purposes under CCPA), and/or (b) disclose or otherwise share personal information with an affiliate, service provider or third party for operational or other notified purposes?  If so, it must comply with CCPA requirements for companies that “sell” personal information or disclose it for a business purpose. It’s important to note that under Section 1798.140(t)(2) of the CCPA, there are certain exceptions that, when satisfied, mean a company is not “selling” personal information under the CCPA. For example, a business does not “sell” personal information if a person uses the business’s software or online system to intentionally disclose, or directs the business to intentionally disclose, their personal information to a third party, as long as that third party does not also “sell” the personal information. As another example, a business does not “sell” personal information when it shares it with a service provider, as long as certain conditions are met.

3. Inventory your data assets, data elements and data processes

One of the most important steps in CCPA compliance, and data privacy compliance in general, is to conduct a data inventory.  For CCPA purposes, consider inventorying your data assets (programs, SaaS solutions, systems, and service providers in which data elements are stored for use in data processes), data elements (elements of personal and other information stored in data assets), and data processes (business processes for which data elements are stored and processed in data assets).  This inventory should also collect information on service providers and other third parties with whom data elements are shared or disclosed by your business, and the purposes for which information is shared or disclosed. Companies should try to complete this inventory as quickly as possible.  The CCPA compliance team should work to create a list of internal individuals who should complete this inventory; once all responses are received, the compliance team should consolidate the responses into a master table.

The data inventory is a snapshot in time.  It’s also important to refresh the data inventory on a regular basis, e.g., quarterly, to capture changes to data collection and usage over time.
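Even a lightweight structured format makes the inventory easier to consolidate and refresh. Below is a minimal illustrative sketch in Python of how the asset/element/process responses might be modeled and merged into a master table; all class, field, and vendor names here are hypothetical examples of mine, not terms defined by the CCPA.

```python
# Illustrative only: a lightweight data-inventory model. All names and
# fields are hypothetical examples, not terms defined by the CCPA.
from dataclasses import dataclass, field
from typing import List

@dataclass
class DataAsset:
    name: str                  # e.g., a program, system, or SaaS solution
    owner: str                 # internal contact responsible for the asset
    elements: List[str]        # personal-information elements stored here
    processes: List[str]       # business processes the asset supports
    shared_with: List[str] = field(default_factory=list)  # providers/third parties
    sharing_purpose: str = ""  # why the data is shared or disclosed

def master_table(assets: List[DataAsset]) -> List[dict]:
    """Consolidate per-team responses into one flat master table."""
    rows = []
    for a in assets:
        for element in a.elements:
            rows.append({
                "asset": a.name,
                "owner": a.owner,
                "element": element,
                "processes": "; ".join(a.processes),
                "shared_with": "; ".join(a.shared_with),
                "purpose": a.sharing_purpose,
            })
    return rows

if __name__ == "__main__":
    crm = DataAsset("CRM", "sales-ops", ["email", "ip_address"],
                    ["lead management"], ["EmailVendorCo"], "campaign delivery")
    for row in master_table([crm]):
        print(row)
```

A spreadsheet works just as well; the point is that each data element is tied to an asset, an owner, the processes that use it, and any sharing purposes, so the same records can later drive disclosures and access responses.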

4. Cease any unnecessary collection and/or storage of personal information (“data minimization”)

Once the data asset/element/process inventory is created, businesses should use the opportunity to conduct a “data minimization” exercise.  One of the central principles of data privacy is data minimization – limiting the collection and storage of personal information to only what is directly necessary and relevant to accomplish a specified business purpose.  The CCPA compliance team should consider both (a) identifying data elements without an associated data process, which I call “orphaned data elements,” and purging them and ceasing their further collection; and (b) identifying data collection which is not associated with a business purpose, which I call “orphaned data transfers,” and ceasing any further orphaned data transfers and terminating the associated contracts.  Also consider validating that record retention policies are being followed so that data is promptly and irretrievably deleted once it is no longer needed for a business purpose.
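Once the inventory exists, the two “orphan” checks above reduce to simple filters. A minimal sketch, using stand-alone dictionaries with hypothetical names of my own choosing:

```python
# Illustrative only: flag "orphaned" items from a completed inventory.
# element_to_processes maps each stored data element to the business
# processes that use it; transfers maps each outbound data transfer to
# its documented business purpose (empty string = none documented).
def find_orphans(element_to_processes: dict, transfers: dict):
    orphaned_elements = [e for e, procs in element_to_processes.items() if not procs]
    orphaned_transfers = [t for t, purpose in transfers.items() if not purpose]
    return orphaned_elements, orphaned_transfers

elements = {"email": ["billing"], "fax_number": []}            # fax has no process
transfers = {"AnalyticsCo": "product analytics", "OldVendor": ""}
print(find_orphans(elements, transfers))
# (['fax_number'], ['OldVendor']) -> candidates for purging / termination
```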

5. Implement a compliance plan for the 5 CCPA privacy rights

The heart of a CCPA compliance plan is implementing compliance with the 5 privacy rights created by the CCPA.  A solidly constructed plan should cover compliance requirements where the business acts as a data collector, a service provider, or a company selling or otherwise disclosing personal information.

a. The Right to Know

One of CCPA’s core requirements is to publicly provide the necessary disclosure of the CCPA privacy rights, including the specific methods by which a person can submit requests to exercise those rights.  Many companies will likely add this to their privacy policy, alongside any California-specific disclosures already made by the business. Don’t forget this disclosure needs to be added not only to websites, but also to mobile apps.  The disclosures must include a list of the categories of personal information collected, sold or disclosed for business purposes during the previous 12 months, as well as a list of the business purposes for which the categories are used. If this information is collected as part of the data inventory, it can greatly simplify the process of creating this disclosure.  The implementation plan should include a process for updating these disclosures as needed to reflect a rolling 12-month period.
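As one illustration of that rolling-window update, the category disclosure can be derived from a timestamped collection log. This is a hypothetical sketch – the log format and function names are my own, and the CCPA does not prescribe any particular mechanism:

```python
# Illustrative only: derive the category disclosure for a rolling
# 12-month window from a simple collection log. Log format is hypothetical.
from datetime import datetime, timedelta

def categories_last_12_months(log, as_of=None):
    """log: iterable of (timestamp, category, business_purpose) tuples."""
    as_of = as_of or datetime.utcnow()
    cutoff = as_of - timedelta(days=365)
    categories, purposes = set(), set()
    for ts, category, purpose in log:
        if ts >= cutoff:
            categories.add(category)
            purposes.add(purpose)
    return sorted(categories), sorted(purposes)

log = [(datetime(2019, 3, 1), "geolocation", "fraud prevention"),
       (datetime(2017, 1, 1), "biometric", "testing")]  # outside the window
print(categories_last_12_months(log, as_of=datetime(2019, 6, 1)))
```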

b. The Right to Access and Portability

Another key requirement is the implementation of a process to respond to verifiable requests from data subjects for information covering the 12-month period preceding the request date.  Companies will need to provide:

  • The categories of personal information collected about that person
  • The categories of sources from which the personal information is collected
  • The business/commercial purpose for collecting/selling personal information
  • The categories of third parties to which their personal information is shared/disclosed
  • The specific data elements collected about that person for the 12-month period preceding the request date (this could be read to conflict with data destruction policies or data minimization best practices, but I suspect that destruction policies or data minimization best practices will trump this disclosure requirement)
  • If a person’s personal information is sold or is disclosed for a business purpose by your business unit, additional information must be provided

If the data inventory is done right, it can be a source of data for this response (except for the requestor’s specific data elements). Companies must provide at least 2 methods for a person to submit a verifiable request – a toll-free number and website address are both required.  The California Attorney General will release regulations on how to “verify” a request. Information must be disclosed within 45 days of receipt of the request, but there is a process under CCPA to extend the time period to 90 days if necessary. If information is provided electronically, it must be provided in a portable format (e.g., an .xml file). The team responsible for fulfilling verified requests should be trained on how to prepare a response, and should test the process before the CCPA effective date to validate that it works properly. You can’t require someone to have an account with you in order to submit a request. Don’t forget to train your website and customer service personnel on how to handle consumer requests.
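To make the timing and portability requirements concrete, here is a minimal sketch of computing the 45/90-day response deadlines and producing a portable .xml export. The schema and function names are hypothetical; the statute requires a portable, readily usable format rather than any specific file type:

```python
# Illustrative only: produce a portable (.xml) export for a verified
# request and compute the CCPA response deadlines. Schema is hypothetical.
from datetime import date, timedelta
import xml.etree.ElementTree as ET

def response_deadlines(received: date):
    """45-day standard deadline; 90-day outer limit if properly extended."""
    return received + timedelta(days=45), received + timedelta(days=90)

def to_portable_xml(records: dict) -> bytes:
    root = ET.Element("personal_information")
    for name, value in records.items():
        el = ET.SubElement(root, "element", attrib={"name": name})
        el.text = str(value)
    return ET.tostring(root, encoding="utf-8")

standard, extended = response_deadlines(date(2020, 1, 15))
print(standard, extended)
print(to_portable_xml({"email": "jane@example.com", "state": "CA"}).decode())
```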

Also, if you are a service provider, your clients will look to you to ensure they are able to pull information from your systems necessary for them to comply with the right to access and portability. Don’t overlook including in your compliance plan a review of your customer portal and interfaces (e.g., APIs) to ensure customers are able to satisfy their CCPA compliance obligations.

c. The Right to Deletion

Another key requirement is to implement a process to delete collected personal information of a person if that person submits a verified request for deletion, and to direct your service providers to do the same. Note that this does not apply to third parties with whom information has been shared or disclosed who are not service providers.  As with the right to access and portability, you can’t require someone to have an account with you to exercise this right.

There are many important exclusions to the right of deletion.  These include:

  • Completing a transaction with, providing a good or service requested by or reasonably anticipated under a business relationship with, or otherwise needed to perform a contract with, a person
  • Security purposes
  • Debugging and error resolution
  • Conducting formal research (many conditions apply)
  • “Solely internal uses” that are reasonably aligned with a person’s expectations based on the person’s relationship with the business
  • Compliance with a legal obligation
  • Internal uses in a lawful manner that is compatible with the context in which the person provided the personal information
  • Other limited exceptions under CCPA

d. The Right to Opt Out

This one may be the most challenging for many companies to implement.  It applies if a business “sells” personal information to third parties, or otherwise shares personal information with a third party (e.g., a data sharing agreement).  CCPA appears to provide that an opt-out would not apply to information provided to a company’s own service providers to further the company’s own business purposes, as long as there are certain contractual requirements in place with the service provider.

CCPA requires companies to implement a “Do Not Sell My Personal Information” opt-out page linked to from the homepage footer on a website, and from their mobile apps.  (The description of a person’s rights under CCPA in section (a) above should include a description of the right to opt out.) Creating a process to verify requests (pending guidance from the California Attorney General) is especially important here since opt-out requests can be submitted by a person or that person’s “authorized representative.”  Once a request has been verified, the personal information of the data subject cannot be shared with third parties (e.g., by associating an opt-out flag with the personal information) until the person later revokes the opt-out by giving the business permission to share his or her personal information with third parties. However, you cannot ask for that permission again for at least 12 months from the opt-out date.  Companies must train their customer service representatives on how to direct persons to exercise their opt-out rights.
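Below is a minimal sketch of how the opt-out flag and the 12-month re-ask rule described above might be modeled. Class and method names are hypothetical; a production system would persist this state and propagate it to every downstream sharing process:

```python
# Illustrative only: a minimal opt-out flag with the 12-month re-ask rule.
from datetime import date, timedelta

class OptOutRecord:
    def __init__(self):
        self.opted_out = False
        self.opt_out_date = None

    def record_opt_out(self, when: date):
        self.opted_out = True
        self.opt_out_date = when

    def may_share_with_third_parties(self) -> bool:
        return not self.opted_out

    def may_request_opt_in(self, today: date) -> bool:
        # Cannot ask for permission again for at least 12 months.
        if not self.opted_out:
            return True
        return today >= self.opt_out_date + timedelta(days=365)

r = OptOutRecord()
r.record_opt_out(date(2020, 2, 1))
print(r.may_share_with_third_parties())        # False
print(r.may_request_opt_in(date(2020, 8, 1)))  # False (inside 12 months)
print(r.may_request_opt_in(date(2021, 2, 2)))  # True
```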

e. The Right to Equal Service and Pricing

As part of a CCPA compliance plan, businesses should consider ways to make sure that they do not charge more or otherwise “discriminate” against a person who chooses to exercise one of their CCPA rights.  A business can offer financial incentives to persons for the collection, sale or retention of their personal information, as long as that incentive is not unjust, unreasonable, coercive or usurious.

6. Verify you have CCPA-compliant written contracts in place with service providers and third parties receiving personal information

Personal information governed by CCPA may only be disclosed to a service provider or third party under a written contract. Businesses should work with their internal or external Legal resource to validate that written contracts are in place with all service providers and third parties to which personal information is disclosed, and that there is a valid business purpose for disclosing the personal information. If no written agreement exists, work with your Legal resource to negotiate and execute a CCPA-compliant agreement. For existing written agreements, a CCPA contract addendum will likely be required which adds the obligations and commitments required under CCPA into the agreement. Don’t forget to look at any data sharing with your corporate affiliates, which is likely governed by an intercompany agreement.

7. Prepare for compliance requests where your company is a service provider

If your company is a service provider to other businesses, you should expect to start receiving questions about, and contract amendments/addenda related to, CCPA.  It’s the inverse of #6 above. Consider how to most efficiently handle these requests. Some companies may want to consider whether to have a standard CCPA compliance addendum for use with customers, or to have a CCPA compliance statement on a public-facing website that can be referred to as needed.  Work with Sales and account managers to educate them as to why the company cannot accept a customer’s own CCPA addenda, which may include more than just CCPA compliance terms.

8. Take steps to permit continued use of de-identified personal information

A CCPA compliance plan should also include implementation of appropriate steps as needed so your company can continue to use de-identified personal information (an information record which is not reasonably capable of being identified with, relating to, describing, being associated with or being directly/indirectly linked to the source person) and aggregated personal information (information relating to a group or category of persons from which individual identities have been removed, and which is not linked or reasonably linkable to a person or device).  CCPA talks about de-identified data only with respect to the following requirements, but the same safeguards and processes would likely apply to aggregated personal information:

  • Implement technical safeguards to prohibit re-identification of the person to whom de-identified personal information pertains.
  • Implement business processes that specifically prohibit re-identification of de-identified personal information.
  • Implement business processes to prevent the inadvertent release of de-identified personal information.

Your company can only use de-identified personal information as long as it makes no attempt to re-identify the de-identified personal information (whether or not that attempt is successful).  If your company begins re-identifying personal information, cease any use of de-identified personal information immediately.
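For illustration only, the sketch below shows one common de-identification technique – suppressing direct identifiers and generalizing quasi-identifiers such as ZIP code and age. Whether a given output actually qualifies as “de-identified” under the CCPA is a legal and technical judgment that this kind of transformation alone cannot guarantee; the identifier lists and bucketing rules here are hypothetical:

```python
# Illustrative only: suppress direct identifiers and generalize
# quasi-identifiers. This alone does not guarantee CCPA "de-identified"
# status; it simply demonstrates the kind of technical safeguard involved.
DIRECT_IDENTIFIERS = {"name", "email", "ssn", "phone"}

def deidentify(record: dict) -> dict:
    out = {}
    for key, value in record.items():
        if key in DIRECT_IDENTIFIERS:
            continue                            # suppress direct identifiers
        if key == "zip":
            out[key] = str(value)[:3] + "**"    # generalize ZIP to 3 digits
        elif key == "age":
            out[key] = f"{(int(value) // 10) * 10}s"  # bucket age by decade
        else:
            out[key] = value
    return out

print(deidentify({"name": "Jane Doe", "zip": "94105", "age": 34, "plan": "pro"}))
# {'zip': '941**', 'age': '30s', 'plan': 'pro'}
```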

9. Review your security procedures and practices, and consider encryption and data redaction options

Finally, businesses are encouraged to review their security procedures and practices to ensure they are reasonable and appropriate to protect personal information in their possession or control. CCPA creates a private right of action for consumers whose unencrypted or unredacted personal information is subject to an unauthorized “access and exfiltration,” theft, or disclosure as the result of a business’s violation of its duty to implement and maintain reasonable security procedures and practices to protect personal information appropriate to the nature of the information. For this private right of action, CCPA specifically uses a different definition of personal information, the one found in California Civil Code § 1798.81.5(d)(1)(A). Here, “personal information” means a person’s first name or first initial and last name coupled with the person’s (i) social security number, (ii) driver’s license number or California identification card number, (iii) account number, credit or debit card number, in combination with any required security code, access code, or password that would permit access to an individual’s financial account, (iv) medical information, and/or (v) health insurance information, where such information is not encrypted or redacted. Any private right of action is sure to spawn a cottage industry of class action lawsuits. If your company collects and/or receives personal information as defined above, consider a review of your company’s security procedures and practices to ensure that they are reasonable and appropriate to protect such personal information given the nature of the information.

In 2016, the California Attorney General issued a Data Breach Report in which the Attorney General stated that “[t]he 20 security controls in the Center for Internet Security’s Critical Security Controls identify a minimum level of information security that all organizations that collect or maintain personal information should meet. The failure to implement all the Controls that apply to an organization’s environment constitutes a lack of reasonable security.” Given this, all companies are encouraged to review the Center for Internet Security’s Critical Security Controls to ensure that they meet the California AG’s minimum definition of “reasonable security.”

The California Attorney General’s report included other recommendations, such as the use of multi-factor authentication on consumer-facing online accounts containing sensitive personal information, and the consistent use of strong encryption to protect personal information on laptops, portable devices, and desktop computers. Companies may want to evaluate whether implementing encryption at rest on servers, workstations, and removable media, and/or redacting personal information (e.g., through tokenization and deletion of the source data or another data redaction technique), would make sense as part of their security procedures and practices.
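As a concrete example of field-level encryption at rest, the sketch below uses the widely used third-party Python “cryptography” package; this is my illustration, not a tool referenced in the Attorney General’s report. Key management – where the key lives, who can use it, and how it rotates – is the hard part and is out of scope here:

```python
# Illustrative only: field-level encryption of personal information using
# the third-party "cryptography" package (pip install cryptography).
from cryptography.fernet import Fernet

key = Fernet.generate_key()      # in practice, store/retrieve via a KMS or HSM
f = Fernet(key)

ciphertext = f.encrypt(b"Jane Doe, 123 Main St")
print(ciphertext)                # safe to store at rest
print(f.decrypt(ciphertext))     # b'Jane Doe, 123 Main St'
```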


Eric Lambert is counsel for the Transportation division of Trimble Inc., a geospatial solutions provider focused on transforming how work is done across multiple professions throughout the world’s largest industries. He supports the Trimble Transportation Mobility and Trimble Transportation Enterprise business units, leading providers of software and SaaS fleet mobility, communications, and data management solutions for transportation and logistics companies. He is a corporate generalist and proactive problem-solver who specializes in transactional agreements, technology/software/cloud, privacy, marketing and practical risk management. Eric is also a life-long techie, Internet junkie and avid reader of science fiction, and dabbles in a little voice-over work. Any opinions in this post are his own. This post does not constitute, nor should it be construed as, legal advice.

The Why of Privacy: 4 Reasons Privacy Matters to People, and Why Companies Need to Know Them

While almost all companies collect and use their customers’, visitors’ and users’ personal information, primarily online and through in-person customer interactions such as point-of-sale transactions, the privacy landscape is in a near-constant state of turbulence and flux. There is a steady flow of data breach reports affecting companies of almost every size and market segment. New data privacy laws, rules and regulations continue to be introduced and enacted around the world, such as the US-EU Privacy Shield program, the EU General Data Protection Regulation (GDPR), and Argentina’s draft Data Protection Bill, placing new legal obligations and restrictions on the collection and use of personal information. Challenges continue to be raised against laws which are perceived to overreach or conflict with privacy rights, such as the continued challenges to the Privacy Shield program and the EU’s Model Contract Clauses.

The one constant in this turbulent landscape is that consumers’ awareness of data privacy and security continues to grow. Given this, it is important to step back from the day-to-day privacy developments and look at a more fundamental question. It is axiomatic in the world of privacy that privacy matters to people, but why it matters is more complicated. People often argue about why privacy is important to individuals, but there is no “one-size-fits-all” answer. Privacy matters to different people in different ways, so there are many equally valid reasons why privacy is important to individuals.

Understanding the “why of privacy” is also critically important to businesses and other organizations. By now, most companies understand the importance of providing notice of their privacy collection practices and choice with respect to the use of collected information. A company collecting, processing and/or controlling personal information that understands the reasons privacy matters to the data subjects whose data it collects and uses can design more effective privacy practices and policies attuned to the needs of those data subjects, such as by creating customer privacy profiles for use in product design and testing.  This follows the “privacy by design” framework advocated by the Federal Trade Commission and helps increase trust in the company’s commitment to data privacy and security, which is critical to the success of every company in today’s world and can provide a competitive advantage.

The reason why privacy matters differs from person to person. However, I believe these reasons can be grouped into four core categories: (1) privacy is a right, (2) privacy is an entitlement, (3) privacy is an expectation, and (4) privacy is a commodity. I’ll explore each of them in turn.

Privacy is a Right

Persons falling into this first category value privacy as an irrevocable right guaranteed to all. People living in countries with constitutional data privacy protections often fall into this category. For example, the European Union Charter of Fundamental Rights recognizes the right to data protection and the right to privacy as fundamental human rights. In some countries, it has been implied through interpretation of constitutional and legal rights, such as the right to privacy found by the U.S. Supreme Court and the right to privacy recognized under the Canadian Charter of Rights and Freedoms even though it does not specifically mention privacy. In August 2017, a unanimous Supreme Court of India held that privacy is a fundamental right as an integral part of the Right to Life and Personal Liberty guaranteed under Article 21 of the Constitution of India.  The 1948 United Nations’ Universal Declaration of Human Rights states that people have a fundamental human right not to “be subjected to arbitrary interference with their privacy, family, home or correspondence.”

  • People in this category are more likely to take a very rigid view of privacy trumping all other interests, including business interests, and may be less willing to “trade” any of their privacy for other benefits such as increased security.
  • People in this category tend to expect that any consent given to use personal information must be clear, unambiguous, express, and fully revocable and that use of the information must be specifically limited to the grant of rights or as otherwise expressly permitted by law, which creates a significant burden for businesses and other organizations collecting and using personal information.
  • Privacy as a right is an individual view – the rights of individuals to protect their personal information take precedence over almost all other rights of others to use or access that personal information.

Privacy is an Entitlement

Persons falling into this second category value privacy as something to which they are entitled under laws, rules and regulations applicable to them. There are many laws, either comprehensive data privacy laws such as Canada’s PIPEDA or sectoral laws such as the privacy laws enacted in the United States, whose prohibitions or restrictions on privacy practices may be viewed by individuals as creating privacy obligations to which they are entitled. An example is the U.S. Children’s Online Privacy Protection Act, which among other things prohibits the collection of personal information from children under 13 without verifiable parental consent. Some parents view COPPA as creating an entitlement for their children to be left alone unless the parent consents to the collection of personal information from their children.

  • Similar to privacy as a right, people in this category are likely to view privacy as trumping other interests, including business interests, and may be less willing to give up privacy for other benefits.
  • They tend to expect that any consent given to use personal information must be fully compliant with legal requirements, and that use of the information must be specifically limited to those use rights expressly permitted by law, which creates a burden for businesses and other organizations collecting and using personal information.
  • As with privacy as a right, privacy as an entitlement is an individual view, where an individual’s entitlement to privacy outweighs other interests in a person’s personal information.
  • A key differentiator between privacy as a right and privacy as an entitlement is that an entitlement can be revoked, e.g., through changes to the law, whereas a right is irrevocable. While some might argue that a judicially-recognized right to privacy should be an expectation, I believe that the recognition by a country’s supreme court that privacy is a right, which is unlikely to be overturned or legislatively reversed, should be considered a right.

Privacy is an Expectation

Persons falling into this third category value privacy as something they expect to receive, whether or not they have a right or entitlement to it. New technologies (such as drones and biometric identifiers) and practices (such as marketing strategies) tend to be ahead of laws specifically governing them, and people in this category expect to receive privacy protections regardless of whether existing laws or other rights cover the technology or practice. They may also expect societal norms with respect to privacy to be followed by businesses and other organizations, whether or not stricter than applicable legal requirements. There are also certain expectations of privacy that are generally recognized within a given society. For example, in the United States, many people have an expectation of privacy in their own home and other private areas such as a public bathroom stall. If a person or organization interferes with this expectation of privacy, there may be legal liability for invasion of privacy under state laws. There are other expectations of privacy on a per-situation basis, such as a private conversation between two individuals.

  • People in this category believe that third parties, such as companies and government entities, should recognize that their expectation of privacy trumps those third parties’ desire (or rights) to access and use their personal information, but also understand that the expectation of privacy has limits. For example, a person should not have an expectation of privacy in a public place (e.g., a public sidewalk), and there is no right of privacy that extends to a person’s garbage placed on the street for collection.  In the United States, courts have also generally recognized little or no expectation of privacy in the workplace.
  • An expectation of privacy can be superseded by a superior interest of a third party. For example, if a court approves surveillance of someone suspected of engaging in illegal activity, any expectation that person may have that his conversations are private is superseded by the government’s interest in preventing and prosecuting crime.
  • People in this category also generally do not question or challenge the terms of a privacy policy or other agreement granting rights to use or collect their personal information. People in this category also tend to expect businesses and other organizations collecting and/or using their personal information will not unreasonably collect or use their personal information, and will respect usage opt-out requests.
  • Privacy as an expectation is a middle-of-the-road view, in which the individual view of privacy as paramount is tempered with the understanding that in some cases the general or specific value of allowing a third party to receive and use their personal information outweighs the personal interest.

Privacy is a Commodity

Persons falling into this fourth category value privacy as a commodity that they are willing to exchange for other benefits, goods or services. We live in an information economy, where data has been commoditized. For many companies, a core or important part of their product or service offering (i.e., part of the general value of the product or service) or business strategy is the ability to monetize personal, aggregate, and/or anonymous data collected through its use. Companies argue that the value derived from data monetization is factored into the value and cost of the product or service. Other companies offer something of specific value, such as an extended product warranty upon registration, in exchange for personal information such as an email address or demographic information. Many people give businesses some rights to use their personal information simply by visiting a webpage, requesting information, or purchasing goods or services, in the course of which they agree to be bound by the company’s privacy policy or terms of use/terms of sale. We also live in a world where many people are willing to sacrifice some privacy in exchange for increased security against terrorism and other potential physical and cyber threats. People falling into this category have a strong understanding of the trade-off between privacy and other benefits.

  • People in this category are more willing to give third parties the right to use their information as long as the thing they receive in return is valuable enough to them – they view their personal information as currency. If a company or organization offers something of value, they are very likely to agree to share personal information with that company or organization. These are the kind of people who don’t really care that they’re receiving targeted ads while surfing online.
  • Conversely, if they do not believe they are receiving value in return for their personal information, people in this category are more likely not to share their information.
  • Privacy as a commodity is a transactional view, meaning that an individual is willing to allow a third party to receive and use their personal information if the general or specific value of allowing that third party to receive and use the information outweighs their personal interest in keeping it private.
  • It may require a greater transfer of value to convince someone viewing privacy as a right, entitlement or expectation to treat it as a commodity.


As a closing thought, these four reasons why privacy matters to people are not mutually exclusive, meaning that there are additional sub-categories of people for whom two or more of these reasons are important. For example, it is possible for someone to view privacy as both an entitlement and a commodity. Such a person would expect that while they have the ability to exchange their personal information for something of value, it must always be a voluntary exchange – they would reject any requirement to trade away their personal information. Businesses that take the time to understand the “why of privacy” will find themselves better positioned to create sample customer profiles based on their customers’ privacy values, leading to more robust privacy practices, processes and policies and a potential competitive advantage on privacy in the marketplace.

Eric Lambert has spent most of his legal career working in-house as a proactive problem-solver and business partner. He specializes in transactional agreements, technology/software/e-commerce, privacy, marketing and practical risk management. Any opinions in this post are his own. This post does not constitute, nor should it be construed as, legal advice. He is a technophile and Internet evangelist/enthusiast. In his spare time Eric enjoys reading and implementing and integrating connected home technologies and dabbles in voice-over work.

The Augmented World — Legal and Privacy Perspectives on Augmented Reality (AR)

You’ve likely heard that Augmented Reality (AR) is the next technology that will transform our lives. You may not realize that AR has been here for years. You’ve seen it on NFL broadcasts when the first down line and down/yardage appear on the screen under players’ feet. You’ve seen it in the Haunted Mansion ride in Disneyland when ghosts seem to appear in the mirror riding with you in your cart. You’ve seen it in cars and fighter jets when speed and other data are superimposed onto the windshield through a heads-up display. You’re seeing it in the explosion of Pokémon Go around the world. AR will affect all sectors, much as the World Wide Web did in the mid-1990s. Any new technology such as AR brings with it questions about how it fits under the umbrella of existing legal and privacy laws, where it pushes the boundaries and requires adjustments to the size and shape of the legal and regulatory umbrella, and when it leads to a fundamental shift in certain areas of law. This article will define augmented reality and the augmented world, and analyze its impact on the legal and privacy landscape.

What is “augmented reality” and the “augmented world?”

One of the hallmarks of an emerging technology is that it is not easily defined. Similar to the “Internet of Things,” AR means different things to different people, can exist as a group of related technologies instead of a single technology, and is still developing. However, there are certain common elements among existing AR technologies from which a basic definition can be distilled.

I would define “augmented reality” as “a process, technology, or device that presents a user with real-world information, commonly but not limited to audiovisual imagery, augmented with additional contextual data elements layered on top of the real-world information, by (1) collecting real-world audiovisual imagery, properties, and other data; (2) processing the real-world data via remote servers to identify elements, such as real-world objects, to augment with supplemental contextual data; and (3) presenting in real time supplemental contextual data overlaid on the real-world data.” The real world as augmented through various AR systems and platforms can be referred to as the “augmented world.” AR and the augmented world differ from “virtual reality” (VR) systems and platforms, such as the Oculus Rift and Google Cardboard, in that VR replaces the user’s view of the real world with a wholly digitally-created virtual world, whereas AR augments the user’s view of the real world with additional digital data.

“Passive” AR (what I call “first-generation AR”) is a fixed system — you receive augmented information but do not do so interactively, such as going through the Haunted Mansion ride or watching your television set. The next generation of AR is “active,” meaning that AR will be delivered in a changing environment, and the augmented world will be viewed, through a device you carry or wear. Google Glass and the forthcoming Microsoft HoloLens are examples of “active AR” systems with dedicated hardware; when worn, the world is augmented with digital data superimposed on the real-time view of the world. However, AR has found ways to use existing hardware — your smartphone. HP’s Aurasma platform is an early example of an active AR system that uses your smartphone’s camera and screen to create digital content superimposed on the real world. What AR has needed to go fully mainstream was a killer app that found a way for AR to appeal to the masses, and it now has one — Pokémon Go. Within days of its launch in early July, TechCrunch reported that Pokémon Go had an average daily user base of over 20 million users. Some declared it the biggest “stealth health” app of all time as it was getting users out and walking.

Active AR has the capacity to change how people interact with the world, and with each other. It is an immersive and engaging user experience. It has the capacity to change the worlds of shopping, education and training, law enforcement, maintenance, healthcare, gaming, and others. Consider an AR system that shows reviews, product data, and comparative prices while looking at a shelf display; identifies an object or person approaching you and makes it glow, flash, or otherwise stand out to give you more time to avoid a collision; gives you information on an artist, or the ability to hear or see commentary, while looking at a painting or sculpture; identifies to a police officer in real time whether a weapon brandished by a suspect is real or fake; or shows you in real time how to repair a household item (or how to administer emergency aid) through images placed on that item or on a stricken individual. For some, the augmented world will be life-altering, such as a headset as assistive technology which reads road signs aloud to a blind person or announces that a vehicle is coming (and how far away it is) when the user looks in the vehicle’s direction. For others, the ability to collect, process and augment real-world data in real time could be viewed as a further invasion of privacy, or worse, technology that could be used for illegal or immoral purposes.

As with any new technology, there will be challenges from a legal and privacy perspective. A well-known example of this is the Internet when the World Wide Web became mainstream in the mid-1990s. In some cases, existing laws were interpreted to apply to the online world, such as the application of libel and slander to online statements, the application of intellectual property laws to file sharing over peer-to-peer networks, and the application of contract law to online terms of use. In others, new laws such as the Digital Millennium Copyright Act were enacted to address shortcomings of the existing legal and regulatory landscape with respect to the online world. In some instances, the new technology led to a fundamental shift in a particular area of law, such as how privacy works in an online world and how to address online identity theft and breaches of personal information. AR’s collection of data, and presentation of augmented data in real time, creates similar challenges that will need to be addressed. Here are some of the legal and privacy challenges raised by AR.

  • Rethinking a “reasonable expectation of privacy.” A core privacy principle under US law is that persons have a reasonable expectation of privacy, i.e., a person can be held liable for unreasonably intruding on another’s interest in keeping his/her personal affairs private. However, what is a “reasonable expectation of privacy” in a GoPro world? CCTV/surveillance cameras, wearable cameras, and smart devices already collect more information about people than ever before. AR technology will continue this trend. As more and more information is collected, what keeping “personal affairs private” looks like will continue to evolve. If you know someone is wearing an AR device, and still do or say something you intend to keep private, do you still have a reasonable expectation of privacy?

  • Existing Privacy Principles. Principles of notice, choice, and “privacy by design” apply to AR systems. Providers of AR systems must apply the same privacy principles to AR as they do to the collection of information through any other method. Users should be given notice of what information will be collected through the AR system, how long it will be kept, and how it will be used. Providers should collect only information needed for the business purpose, store and dispose of it securely, and keep it only as long as needed.

AR systems add an additional level of complexity — they are collecting information not just about the user, but also third parties. Unlike a cellphone camera, where the act of collecting information from third parties is initiated by the user, an AR system may collect information about third parties as part of its fundamental design. Privacy options for third parties should be an important consideration in, and element of, any AR system. For example, an AR system provider could ensure users have the ability to toggle the blocking of third party personal data from being collected or augmented, so personal information is only augmented when the user wants it to be. AR system providers may also consider an indicator on the outside of the device, such as an LED, to let third parties know that the AR system is actively collecting information.

Additionally, AR may create interesting issues from a free speech and recording of communications perspective. Some, but not all, court rulings have held that the freedom of speech guaranteed by the First Amendment extends to making recordings of matters of public interest. An AR system that is always collecting data will push the boundaries of this doctrine. Even if something is not in the public interest, many states require the consent of both parties to record a conversation between them. An AR system which persistently collects data, including conversations, may run afoul of these laws.

  • Children’s Privacy. It is worth a special note that AR creates an especially difficult challenge for children’s privacy, especially children under 13. The Children’s Online Privacy Protection Act (“COPPA”) requires operators of online services, including mobile apps, to obtain verifiable parental consent before collecting any personal information from children under 13. “Personal information” includes photos, videos, and audio of a child’s image or voice. As AR systems collect and process data in real time, the passive collection of a child’s image or voice (versus collection of children’s personal information provided to a company through an interface such as a web browser) is problematic under COPPA. AR operators will need to determine how to ensure they are not collecting personal information from children under 13. I expect the FTC will amend the COPPA FAQ to clarify its position on the intersection of AR and children’s privacy.
  • Intellectual Property. Aside from the inevitable patent wars that will occur over the early inventors of AR technologies, and patent holders who believe their patent claims cover certain aspects of AR technologies, AR will create some potentially interesting issues under intellectual property law. For example, an AR system that records (and stores) everything it sees will invariably capture some things that are protected by copyright or other IP laws. Will “fair use” be expanded in the augmented world, e.g., where an album cover is displayed to a user when a song from that album is heard? Further, adding content to a copyrighted work in the augmented world may constitute a prohibited derivative work. From a trademark perspective, augmenting a common-law or registered trademark with additional data, or using a competitor’s name or logo to trigger an ad about your product overlaid on the competitor’s name or logo, could create issues under existing trademark law.
  • Discrimination.  AR systems make it easy to supplement real-world information by providing additional detail on a person, place or thing in real time. This supplemental data could intentionally or inadvertently be used to make real-time discriminatory decisions, e.g., using facial or name recognition to provide supplemental data about a person’s arrest history, status in a protected class, or other restricted information which is used in a hiring or rental decision. An AR system that may be used in a situation where data must be excluded from the decision-making process must include the ability to automatically exclude groups of data from the user’s augmented world.

The world of online digital marketing and advertising will expand to include digital marketing and advertising in the augmented world. Imagine a world where anything — and I mean anything — can be turned into a billboard or advertisement in real time. Contextual ads in the augmented world can be superimposed anytime a user sees a keyword. For example, if you see a house, imagine if an ad for a brand of paint appears because the paint manufacturer has bought contextual augmented ads to appear in an AR system whenever the user sees a house through the augmented world.

Existing laws will need to be applied to digital marketing and advertising in the augmented world. For example, when a marketing disclaimer appears in the online world, the user’s attention is on the ad. Will the disclaimer have the same effect in an augmented environment, or will it need to be presented in a way that calls attention to it? Could this have the unintended consequence of shifting the user’s attention away from something they are doing, such as walking, thereby increasing the risk of harm? There are also some interesting theoretical advertising applications of AR in a negative context. For example, “negative advertising” could be used to blur product or brand names and/or to make others more prominent in the augmented world.

  • The Right of Publicity.  The right of publicity — a person’s right to control the commercial use of his or her name, image, and likeness — is also likely to be challenged by digital marketing in the augmented world. Instead of actively using a person’s likeness to promote a product or service, a product or service could appear as augmented data next to a person’s name or likeness, improperly (and perhaps inadvertently) implying an endorsement or association. State laws governing the right of publicity will be reinterpreted when applied to the augmented world.
  • Negligence and Torts. AR has the capacity to further exacerbate the problem of “distracted everything” (paying more attention to your AR device than your surroundings), as some users of Pokémon Go have discovered. Since AR augments the real world in real time, the additional information may cause a user to be distracted, or, if the augmented data is erroneous, could cause a user to cause harm to him/herself or to others. Many have heard the stories of a person dutifully following their GPS navigation system into a lake. Imagine an AR system identifying a mushroom as safe to eat when in fact it is highly poisonous. Just as distracted driving and distracted texting can be used as evidence of negligence, a distracted AR user can find him/herself facing a negligence claim for causing third party harm. Similarly, many tort claims that can arise through actions in the real world or online world, such as libel and slander, can occur in the augmented world. Additionally, if an AR system augments the real world in a way that makes someone think they are in danger, inflicts emotional distress, or causes something to become dangerous, the AR user, or system provider, could be legally responsible.
  • Contract liability. We will undoubtedly see providers of AR systems and platforms sued for damages suffered by their users. AR providers have shifted, and will continue to shift, liability to the user through contract terms. For example, Niantic, the company behind Pokémon Go, states in their Terms of Use that you must “be aware of your surroundings and play safely. You agree that your use of the App and play of the game is at your own risk, and it is your responsibility to maintain such health, liability, hazard, personal injury, medical, life, and other insurance policies as you deem reasonably necessary for any injuries that you may incur while using the Services.” AR providers’ success at shifting liability will likely turn primarily on tried-and-tested principles such as whether an enforceable contract exists.

None of the above challenges is likely to prove insurmountable, and none is expected to slow the significant growth of AR. What will be interesting to watch is how lawmakers choose to respond to AR, and how early hiccups are seized on by politicians and reported in the press. Consider automobile autopilot technology. The recent crash of a Tesla in Autopilot mode is providing bad press for Tesla, and fodder for those who believe the technology is dangerous and must be curtailed. Every new technology brings both benefits and potential risks. If the benefits outweigh the risks on the whole, the public interest is not served when the legal, regulatory and privacy pendulum swings too far in response. Creating a legal, regulatory and privacy landscape that fosters the growth of AR, while appropriately addressing the risks AR creates and exacerbates, is critical.

Safe Harbor Framework for EU to US Personal Data Transfers May Not Be “Adequate” After All

This week, the Advocate General of the European Court of Justice (ECJ) issued a preliminary and non-binding assessment in a case before the ECJ, recommending that the court find the US-EU Safe Harbor Framework to be invalid.

For US companies with European subsidiaries that regularly need to transfer data back to the US home office, one of the primary data privacy considerations is compliance with the EU’s Data Protection Directive. Each EU member state has adopted its own data protection law based on the Directive. The Directive covers personal data in the European Economic Area (“EEA” — the EU plus Iceland, Liechtenstein and Norway).

Under Article 25 of the Directive, the transfer of personal data to a country or territory outside of the EEA is prohibited unless that country or territory can guarantee an “adequate” level of data protection in the eyes of the EU.  In some cases, the EU will declare a country to have “adequate” protections in place (e.g., Canada, based on its national PIPEDA data privacy law).

The US is one of the countries that is not deemed “adequate” by the EU.  (The US does not have a comprehensive national privacy law like Canada or the EU, but instead uses a “sectoral” approach to regulate data privacy.)  Because of this, the EU controller of the personal data must ensure that the US company receiving the data has an adequate level of protection for personal data to permit the data transfer.  This can be achieved in a number of ways, including:

  • The Directive defines a number of situations in which adequacy is presumed statutorily, such as where the data subject consents to the transfer, the transfer is necessary for the performance of, or conclusion of, the contract between the data subject and data controller, or it is necessary to protect the vital interests of the data subject.
  • A company’s Board of Directors can adopt binding corporate rules requiring adequate safeguards within a corporate group to protect personal data throughout the organization.
  • The EU entity and US entity can enter into an approved contract (utilizing model contract terms approved by the EU) with provisions ensuring data is adequately protected.
  • The transfer is to a US entity which participates in the Safe Harbor Framework, a program agreed upon by the US and EU in 2000 under which US companies that self-certify that their data protection policies and practices comply with the requirements of the Framework are deemed to have an “adequate” level of data protection for EU data transfer purposes.  Over 5,000 companies have certified their compliance with the Safe Harbor Framework.

Edward Snowden’s revelations regarding US government surveillance programs and practices raised many questions about whether the Safe Harbor Framework was truly “adequate” for EU purposes, since regardless of a company’s own policies and practices, the US government could access the personal data of EU data subjects stored on US servers.  This week, in a case brought by an Austrian student challenging Facebook’s transfer of his data to the US under the Safe Harbor Framework, the Advocate General of the European Court of Justice (ECJ) issued a preliminary and non-binding assessment recommending that the ECJ find the Safe Harbor Framework to be invalid.  The ECJ can ignore the Advocate General’s recommendation, but does so only rarely.

The language of the decision will be very important, as the potential for US government surveillance of, and access to, personal data of EU data subjects stored in the US goes beyond the Safe Harbor Framework.  A broad decision could create problems for US companies’ ability to achieve adequacy for EU data transfer purposes regardless of the adequacy approach used: in the eyes of the EU, US government surveillance could be determined to trump any adequacy approach taken by US companies. A finding that the US government’s surveillance practices call into question the adequacy of data transfers to US companies in general would cause major headaches and disruptions for US businesses, and would have political and economic ramifications. It will be interesting to see how deep down this rabbit hole the ECJ is willing to go.

Companies that participate in the Safe Harbor Framework should immediately start looking at alternative approaches for achieving “adequacy” in the eyes of the EU to allow for continued data transfers.  Companies should also look at whether any of their vendors rely on Safe Harbor in the performance of their obligations, and contact them regarding their contingency plans if Safe Harbor is found to be invalid. If the ECJ adopts the Advocate General’s recommendation, it is unclear whether the court will provide a grace period for companies to implement an alternative approach.  Public reporting companies participating in the Safe Harbor Framework may also want to consider whether this uncertainty should be cited in their risk factors for SEC reporting purposes.

FTC opens their nationwide tour to promote Start with Security

It’s not the latest group on tour with a band name and album name that needed a lot more thought.  Earlier this year, the FTC announced that they would be releasing guidance for businesses on data security.  In June, they did just that, releasing a guide called Start with Security: A Guide for Business.  It’s subtitled “Lessons Learned From FTC Cases” for a reason: it uses the 50+ FTC enforcement actions on data security to distill ten lessons companies should learn when approaching security, helping them avoid the missteps that led others into enforcement actions, along with practical guidance on reducing risk.  The lessons are:

  1. Start with security.  The FTC has long advocated the concept of “privacy by design,” meaning companies should bake an understanding of and sensitivity to privacy into every part of the business, making it part of the design process for new products and processes.  The FTC is advocating a similar concept of “security by design.” Guidance: don’t collect personal information you don’t need (the RockYou enforcement action); don’t use personal information when it’s not necessary (Accretive and foru International); don’t hold on to information longer than you have a legitimate business need for it (BJ’s Wholesale Club).
  2. Control access to data sensibly.  Keep data in your possession secure by controlling access to it – limit access to those with a need to know for a legitimate business purpose (e.g., no shared user accounts, lock up physical files). Guidance: don’t let employees access personal information unless they need to access it as part of their job (Goal Financial); don’t give administrative access to anyone other than employees tasked with administrative duties (Twitter).
  3. Require secure passwords and authentication.  Use strong password authentication and sensible password hygiene (e.g., suspend an account after a set number of unsuccessful login attempts; prohibit common dictionary words; require at least eight characters, including at least one upper case character, one lower case character, one numerical character, and one special character; prohibit more than two repeating characters; etc.).  A minimal password-validation sketch illustrating rules like these appears after this list.  Guidance: require complex and unique passwords (Twitter); store passwords securely (Guidance Software, Reed Elsevier, Twitter); guard against brute force attacks (Lookout Services, Twitter, Reed Elsevier); protect against authentication bypass, such as predictable resource location (Lookout Services).
  4. Store sensitive personal information securely (“at rest”) and protect it during transmission (“in motion”). Use strong encryption when storing and transmitting data, and ensure the personnel implementing encryption understand how you use sensitive data and can determine the right approach on a situation-by-situation basis.  An encryption sketch also follows the list.  Guidance: keep sensitive information secure throughout the data life-cycle (receipt, use, storage, transmission, disposal) (Superior Mortgage Corporation); use industry-tested and accepted methods (ValueClick); make sure encryption is properly configured (Fandango, Credit Karma).
  5. Segment your network and monitor who’s trying to get in and out.  Be sure to use firewalls to segment your network to minimize what an attacker can access.  Use intrusion detection and prevention tools to monitor for malicious activity.  Guidance: segment your network (DSW); monitor activity on your network (Dave & Buster’s, Cardsystem Solutions).
  6. Secure remote access to your network. Make sure you develop and implement a remote access policy, implement strong security measures for remote access, and put appropriate limits on remote access, such as limiting access by IP address and revoking remote access promptly when it is no longer needed.  (The compromise of a vendor’s system via phishing, leading to remote network access, is how the Target breach started.)  Guidance: ensure remote computers have appropriate security measures in place, e.g., “endpoint security” (Premier Capital Lending, Settlement One, LifeLock); put sensible access limits in place (Dave & Buster’s).
  7. Apply sound security practices when developing new products. Use “security by design” to ensure data security is considered at all times during the product development life-cycle.  Guidance: train engineers in secure coding (MTS, HTC America, TRENDnet); follow platform guidelines for security (HTC America, Fandango, Credit Karma); verify that privacy and security features work (TRENDnet, Snapchat); test for common vulnerabilities (Guess?).
  8. Make sure your service providers implement reasonable security measures. Make sure you communicate your security expectations to your service providers and vendors, and put their feet to the fire through contractual commitments and auditing/penetration testing. Guidance: put it in writing (GMR Transcription); verify compliance (Upromise).
  9. Put procedures in place to keep your security current and address vulnerabilities that may arise.  Data security is a constant game of cat-and-mouse with hackers – make sure to keep your guard up.  Apply updates to your hardware and software as they are issued, and ensure you are spotting vulnerabilities in, and promptly patching, your own software. Have a mechanism to allow security warnings and issues to be reported to IT.  Guidance: update and patch third-party software (TJX Companies); heed credible security warnings and move quickly to fix them (HTC America, Fandango).
  10. Secure paper, physical media, and devices.  Lastly, while the focus these days seems to be on cybersecurity, don’t forget about physical security of papers and physical media.  Guidance: securely store sensitive files (Gregory Navone, Lifelock); protect devices that process personal information (Dollar Tree); keep safety standards in place when data is en route (Accretive, CBR Systems); dispose of sensitive data securely (Rite Aid, CVS Caremark, Goal Financial).
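
To make lesson 3 concrete, below is a minimal password-validation sketch in Python. The specific thresholds (eight characters, no more than two repeating characters, a tiny list of banned dictionary words) are illustrative assumptions drawn from the examples above, not rules the FTC prescribes.

```python
import re

# Illustrative banned dictionary words; a real deployment would use a much
# larger dictionary (this short list is an assumption for the sketch).
COMMON_WORDS = {"password", "letmein", "qwerty", "welcome"}

def validate_password(password: str) -> list[str]:
    """Return a list of rule violations; an empty list means the password passes."""
    problems = []
    if len(password) < 8:
        problems.append("must be at least 8 characters")
    if not re.search(r"[A-Z]", password):
        problems.append("must contain an upper case character")
    if not re.search(r"[a-z]", password):
        problems.append("must contain a lower case character")
    if not re.search(r"[0-9]", password):
        problems.append("must contain a numerical character")
    if not re.search(r"[^A-Za-z0-9]", password):
        problems.append("must contain a special character")
    if re.search(r"(.)\1\1", password):  # three identical characters in a row
        problems.append("must not repeat a character more than twice")
    if any(word in password.lower() for word in COMMON_WORDS):
        problems.append("must not contain a common dictionary word")
    return problems

print(validate_password("aaa"))          # fails several rules
print(validate_password("Tr1cky!pass"))  # passes -> []
```

Account lockout after a set number of failed attempts (the brute-force guidance from the Lookout Services and Twitter actions) would be enforced server-side alongside rules like these, and passwords themselves should be stored only as salted hashes.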

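Lesson 4’s “at rest” guidance can be illustrated the same way. The sketch below uses the Fernet recipe from the widely used Python cryptography package as one example of an industry-tested method; it is a sketch of the concept, not a prescribed approach, and a real system would also need proper key management (the key below lives only in memory, so the encrypted data would be unrecoverable once the program exits).

```python
from cryptography.fernet import Fernet

# Generate a symmetric key. In practice, store and manage the key securely
# (e.g., in a key management service), separately from the encrypted data.
key = Fernet.generate_key()
cipher = Fernet(key)

# Encrypt sensitive data before writing it to storage ("at rest").
record = b"account_number=4111-1111-1111-1111"
token = cipher.encrypt(record)

# Later, an authorized process holding the key can decrypt the stored token.
assert cipher.decrypt(token) == record

# Protection "in motion" is a separate control: use TLS for transport rather
# than inventing your own protocol ("use industry-tested and accepted methods").
```
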
As this guidance is based on what companies did wrong, or didn’t do, that led to FTC enforcement actions, it will be interesting to see how the FTC treats a company that suffers a data breach but demonstrates that it made reasonable efforts to comply with the FTC’s guidance.  I suspect the FTC will take a company’s compliance with this guidance into consideration when determining penalties in an enforcement action. The guidance is very high-level, so companies must rely on their IT and Legal teams to determine what steps, processes and protocols need to be implemented to align with the FTC’s guidance.

In addition to publishing the guide, the FTC has embarked on a conference series aimed at SMBs (small and medium-sized businesses), start-up companies, and developers to provide information on “security by design,” common security vulnerabilities, secure development strategies, and vulnerability response.  The first conference took place September 9 in San Francisco, CA; the second will take place November 5 in Austin, TX.

The FTC also announced a new website at which they’ve gathered all of their data security guidance, publications, information and tools as a “one-stop shop”.  You can find it at http://www.ftc.gov/datasecurity.

Podcast – the in-house perspective on trade secrets, privacy, and other topics

I recently had the privilege of being interviewed for IP Fridays®, a podcast series by Ken Suzan (of counsel and a trademark attorney at the Minneapolis office of Barnes & Thornburg LLP) and Dr. Rolf Claessen (partner at Patent Attorneys Freischem in Cologne, Germany).  We discussed the in-house perspective on a variety of topics, including trade secrets, copyrighting software code, and privacy.  Head to IPFridays.com if you’d like to listen, or click here to head straight to the podcast.

AppChoices – Behavioral Advertising Controls Gone Mobile

Online behavioral advertising (also known as “interest-based” advertising and “targeted” advertising) is the use of information collected about an individual’s online behavior (e.g., web browsing history) to serve online advertisements through ad networks tailored to that individual’s interests. Online behavioral advertising is broken into two categories: first party (online ads served on a website based on an individual’s online behavior on that website) and third party (online ads served on a website based on an individual’s online behavior on other websites). Online behavioral advertising is designed to increase the click-through rate by serving ads of greater interest to consumers.  Studies have shown that a majority of consumers prefer targeted online ads over irrelevant ones.  However, behavioral advertising also raises privacy concerns, as delivering targeted advertising to an individual requires collecting information about that individual (and the scope of collected information could be broad, potentially including sensitive information).

Back in 2009, the FTC released a report on online behavioral advertising recommending industry self-regulation of third party online behavioral advertising (and implying they would step in if industry self-regulation proved ineffective).  In response to the FTC’s report, a group of advertising and marketing trade associations including the Direct Marketing Association, Interactive Advertising Bureau, Better Business Bureau, and Network Advertising Initiative formed the Digital Advertising Alliance (“DAA”).  The DAA developed the “AdChoices” program to provide consumers with the ability to control whether data about them can be used for third party online behavioral advertising purposes.

The primary consumer-facing aspects of the AdChoices program are (1) the DAA Icon, an “i” in a triangle, which companies can use to provide more prominent notice of that company’s interest-based advertising practices; and (2) the Consumer Choice page, a web page introduced in 2010 through which consumers can opt out of the collection and use of web viewing data for online behavioral advertising and other applicable uses.  It’s a good idea for companies to include a link to the Consumer Choice page in their privacy policy.

Since 2010, more and more advertising (including behavioral advertising) has been served through ad-supported mobile apps. As a result, last week the DAA introduced two enhancements to the AdChoices program to extend it to mobile apps:

  • The AppChoices mobile application, available for Android and Apple devices, which gives consumers the ability to opt out of the collection of app usage data for online behavioral advertising and other applicable uses.  The AppChoices app can be downloaded from major app stores; the DAA hosts a page with app store links at http://www.aboutads.info/appchoices.
  • The Consumer Choice page for Mobile Web, an updated and mobile-optimized version of the current Consumer Choice page.

The purpose of the DAA is to demonstrate to the FTC that industry self-regulation of behavioral advertising works.  The industry groups forming the DAA know that if they fail in their mission, the FTC will step in to regulate behavioral advertising.  FTC regulations on behavioral advertising would likely be more onerous than the current self-regulatory principles, and may favor privacy protections over the benefits of targeted advertising to consumers and businesses. This is why businesses should be rooting for the DAA to succeed, and should support its efforts. Look for a major push from the DAA and its member groups to drive increased adoption and usage of both current and new self-regulatory tools in the marketplace.  Companies should consider updating their privacy policies to include information about the AppChoices download page as well as a link to the Consumer Choice page.