The California Consumer Privacy Act: Why (and How) to Start Preparing Now

By now most companies have heard about the California Consumer Privacy Act (“CCPA”). Privacy as an inalienable right has been enshrined in the California Constitution since 1972, and California has developed a reputation as being at the forefront of state privacy legislation. California is also known for grassroots-driven legislation through the ballot initiative process known as “Propositions.” The Cambridge Analytica scandal led to a combination of the two – a proposed data privacy law with extremely burdensome obligations and draconian penalties garnered enough signatures to appear on the ballot in November 2018. To prevent this from happening, the California legislature partnered with California business and interest groups to introduce and pass the California Consumer Privacy Act. It went from introduction to being signed into law by the governor of California in a matter of days in June of 2018, resulting in withdrawal of the ballot initiative. No major privacy legislation had ever been enacted as quickly. The law will become effective as early as January 1, 2020 – the effective date is six (6) months after the California Attorney General releases implementing and clarifying regulations, which are expected sometime in 2019. Given the speed at which it was enacted, CCPA has numerous drafting errors and inconsistent provisions which will need to be corrected. In addition, as of the date of this article, the implementing regulations have not yet been released.

Other states have introduced statutes similar to CCPA, and there is some discussion in Congress about a superseding national data privacy law. Because of this, companies may want to look at CCPA compliance from a nationwide, and not just California, perspective. For companies hoping that CCPA will be scaled back or repealed, that’s not likely to happen. The clock is ticking for businesses to develop and implement a compliance plan. When determining what compliance approach to take, consider the wisdom of the “herd on the African savanna” approach to compliance – the safest place to be in a herd on the African savanna is right in the center. It’s almost always the ones on the outside which get picked off, not the ones in the center. The ones more likely to be “picked off” through an investigation or lawsuit are the ones at the front of the herd (e.g., those who desire to be viewed as a leader in compliance) and the ones at the back of the herd (e.g., those who start working on compliance too late or don’t make serious efforts to be in compliance). For many companies, being in the center of the herd is the optimal initial position from a compliance perspective. Once additional compliance guidance is released, e.g., through clarifying regulations, press releases, or other guidance from the state Attorney General, companies can adjust their compliance efforts appropriately.

In this article, I’ll talk through steps that companies may want to consider as a roadmap towards CCPA compliance. (This is a good place to note that the information in this article does not constitute legal advice and is provided for informational purposes only. I summarize and simplify some of the CCPA’s provisions for ease of discussion; you should look at the specific language of the statute to determine if a provision applies in your case. Consult your own internal or external privacy counsel to discuss the specifics of CCPA compliance for your own business.)

 

A Quick CCPA Refresher

The first problem with the “California Consumer Privacy Act” is its name. It applies to personal information collected about any California resident (not just consumers) in either a business-to-consumer (“B2C”) or business-to-business (“B2B”) context. It applies to almost every business entity that collects personal information about California residents, their affiliates, their service providers, and other third parties with which personal information is shared or disclosed. The terms “service provider” and “third party” are somewhat similar under the CCPA – both are businesses to which a company discloses a person’s personal information for a business purpose pursuant to a written contract. The difference between the two is whether the information is being processed on behalf of the disclosing company. For example, Salesforce would be a service provider – it is processing personal information on behalf of your company. However, if the company with whom you share personal information processes it for its own benefit, not yours, it’s a “third party” under the CCPA.

“Personal Information” is defined extremely broadly under CCPA, in some ways even more broadly than under the EU’s General Data Protection Regulation (“GDPR”). It is information that identifies, relates to, describes, is capable of being associated with, or could reasonably be linked, directly or indirectly, with a particular person or household. It includes but is not limited to IP address, geolocation data, commercial information, professional and employment-related information, network activity, biometric information, audio/video, and behavioral analytics about a person. CCPA covers businesses which “collect” a California resident’s personal information, defined as actively or passively obtaining, gathering, accessing or otherwise receiving personal information from a person, or by observing the person’s behavior. It also covers “selling” personal information, which is another bad choice of a defined term – “selling” personal information under the CCPA includes selling, renting, sharing, disclosing, or otherwise communicating (by any means) personal information to another business or third party for monetary or other valuable consideration, with some significant exceptions.

CCPA creates five (5) rights for California residents:

  • The Right to Know – California residents have a general right to know the categories and purposes of personal information collected, sold, or otherwise disclosed about them.
  • The Right to Access and Portability – California residents have a specific right to know how, why and what personal information about them is being collected, sold or disclosed, and if information is provided electronically, to receive it in a portable format.
  • The Right to Deletion – California residents have a right to request the deletion of personal information about them collected by a business, with some exceptions.
  • The Right to Opt Out – California residents have a right to say “no” to a company’s sale, sharing or transfer of their personal information with third parties.
  • The Right to Equal Service & Pricing – California residents have a right to equal service and pricing whether or not they choose to exercise their CCPA rights.

Creating and Implementing a CCPA Compliance Plan

For companies that have not already gone through a GDPR compliance effort, CCPA compliance can seem daunting. However, creating a solid compliance plan and getting started now maximizes the chance your company will be in good shape for CCPA once it becomes effective. Here are some things to consider as you create and implement a CCPA compliance plan for your business. (Please note that if your company has already gone through a GDPR compliance effort, some of these may already be fully or mostly in place.)

1. Identify CCPA Champions within your business

An important initial step towards CCPA compliance is to identify the person(s) within your company who will lead the CCPA compliance effort. CCPA compliance will require a cross-departmental team. This often includes Legal (to advise on and help interpret the statutory and regulatory requirements, to monitor for new developments both on CCPA and similar federal and state legislation, and to create a compliance plan for the business if the company does not have a data governance team); Development and Information Technology (to implement the necessary technical and operational systems, processes, and policies which enable CCPA compliance); Sales leadership (for visibility into CCPA compliance efforts and to help manage inbound requests for CCPA compliance addenda); Customer Support (as CCPA requires customer support personnel be trained on certain aspects of CCPA compliance); Security (if your company has a Chief Information Security Officer or other information security person or team); Data Governance (if your company has a data governance person or team); and an executive sponsor (to support the CCPA compliance efforts within the C-Suite). Depending on your company, there may be other involved groups/parties as well.

2. Determine how CCPA applies to your business

A key early step in CCPA compliance is determining how CCPA applies to your business. There are different compliance requirements for companies that collect personal information, companies that process personal information as a service provider for other companies, and companies that “sell” personal information or disclose it for a business purpose.

  • Does your business collect information directly from California residents, e.g., through online web forms, through customer support contacts, from employees who are California residents, through creation of an account, etc.? If so, it must comply with CCPA requirements for companies that collect personal information (a “data controller” in GDPR parlance).
  • Does your business receive and process personal information on behalf of customers or other third parties? If so, it must comply with CCPA requirements for companies acting as a service provider (a “data processor” in GDPR parlance). If you are a service provider, you must ensure your service offerings enable customers to comply with their own obligations under CCPA. If not, expect a lot of requests for assistance from your customers, which could result in significant manual effort.
  • Does your business (a) “sell” personal information to affiliates, service providers or third parties (other than for the excluded purposes under CCPA), and/or (b) disclose or otherwise share personal information with an affiliate, service provider or third party for operational or other notified purposes? If so, it must comply with CCPA requirements for companies that “sell” personal information or disclose it for a business purpose. It’s important to note that under Section 1798.140(t)(2) of the CCPA, there are certain exceptions that, when satisfied, mean a company is not “selling” personal information under the CCPA. For example, a business does not “sell” personal information if a person uses that business’s software or online system to direct it to disclose, or consents to the business disclosing, their personal information to a third party, as long as that third party is also obligated not to “sell” the personal information. As another example, a business does not “sell” personal information when it shares it with a service provider, as long as certain conditions are met.

3. Inventory your data assets, data elements and data processes

One of the most important steps in CCPA compliance, and data privacy compliance in general, is to conduct a data inventory. For CCPA purposes, consider inventorying your data assets (programs, SaaS solutions, systems, and service providers in which data elements are stored for use in data processes), data elements (elements of personal and other information stored in data assets), and data processes (business processes for which data elements are stored and processed in data assets). This inventory should also collect information on service providers and other third parties with whom data elements are shared or disclosed by your business, and the purposes for which information is shared or disclosed. Companies should try to complete this inventory as quickly as possible. The CCPA compliance team should work to create a list of internal individuals who should complete this inventory; once all responses are received, the compliance team should consolidate the responses into a master table (a simple sketch of one possible structure appears below).

The data inventory is a snapshot in time.  It’s also important to refresh the data inventory on a regular basis, e.g., quarterly, to capture changes to data collection and usage over time.
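
To make this concrete, here is a minimal sketch (in Python, for illustration only) of how consolidated inventory responses might be structured. The field names mirror the data asset/element/process categories above; the example systems and values are hypothetical, not prescribed by CCPA:

from dataclasses import dataclass, field
from typing import List

@dataclass
class InventoryEntry:
    data_asset: str            # program, SaaS solution, system, or service provider
    data_elements: List[str]   # personal information elements stored in the asset
    data_process: str          # business process/purpose using the elements
    shared_with: List[str] = field(default_factory=list)  # recipients of the data
    sharing_purpose: str = ""

# Hypothetical responses collected from internal owners
responses = [
    InventoryEntry("CRM system", ["name", "email", "IP address"],
                   "sales pipeline management",
                   ["email marketing vendor"], "campaign delivery"),
    InventoryEntry("HR system", ["name", "government ID", "salary"], "payroll"),
]

# Consolidate the responses into a master table keyed by data asset
master_table = {}
for entry in responses:
    master_table.setdefault(entry.data_asset, []).append(entry)

A spreadsheet works just as well; the point is that assets, elements, processes, and sharing recipients live in one linked structure that can be refreshed on a regular cadence.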

4. Cease any unnecessary collection and/or storage of personal information (“data minimization”)

Once the data asset/element/process inventory is created, businesses should be encouraged to use the opportunity to conduct a “data minimization” exercise. One of the central principles of data privacy is data minimization – limiting the collection and storage of personal information to only what is directly necessary and relevant to accomplish a specified business purpose. The CCPA compliance team should consider both (a) identifying data elements without an associated data process, which I call “orphaned data elements,” and purging them and ceasing their further collection; and (b) identifying data transfers which are not associated with a business purpose, which I call “orphaned data transfers,” and ceasing any further orphaned data transfers and terminating the associated contracts. Also consider validating that record retention policies are being followed so that data is promptly and irretrievably deleted once it is no longer needed for a business purpose. A simple illustration of the orphaned-element check appears below.
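
As a toy illustration of that check, assuming an inventory that maps each stored data element to the data processes using it (the element names here are hypothetical):

# Map each stored data element to the data processes that use it
inventory = {
    "email": ["order confirmation", "campaign delivery"],
    "IP address": ["fraud detection"],
    "fax number": [],  # no associated data process -> an "orphaned data element"
}

orphaned = [element for element, processes in inventory.items() if not processes]
print(orphaned)  # ['fax number'] -> candidates to purge and stop collecting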

5. Implement a compliance plan for the 5 CCPA privacy rights

The heart of a CCPA compliance plan is implementing compliance with the 5 privacy rights created by the CCPA. A solidly constructed plan should cover compliance requirements where the business acts as a data collector, a service provider, or a company selling or otherwise disclosing personal information.

a. The Right to Know

One of CCPA’s core requirements is to publicly provide the necessary disclosure of the CCPA privacy rights, including the specific methods by which a person can submit requests to exercise those rights. Many companies will likely add this to their privacy policy, as well as to any California-specific disclosures already made by the business. Don’t forget this disclosure needs to be added not only to websites, but also to mobile apps. The disclosures must include a list of the categories of personal information collected, sold or disclosed for business purposes during the previous 12 months, as well as a list of the business purposes for which the categories are used. If this information is collected as part of the data inventory, it can greatly simplify the process of creating this disclosure. The implementation plan should include a process for updating these disclosures as needed to reflect a rolling 12-month period.

b. The Right to Access and Portability

Another key requirement is the implementation of a process to respond to verifiable requests from data subjects, covering the 12-month period preceding the request date. Companies will need to provide:

  • The categories of personal information collected about that person
  • The categories of sources from which the personal information is collected
  • The business/commercial purpose for collecting/selling personal information
  • The categories of third parties to which their personal information is shared/disclosed
  • The specific data elements collected about that person for the 12-month period preceding the request date (this could be read to conflict with data destruction policies or data minimization best practices, but I suspect that destruction policies or data minimization best practices will trump this disclosure requirement)
  • If a person’s personal information is sold or is disclosed for a business purpose by your business unit, additional information must be provided

If the data inventory is done right, it can be a source of data for this response (except for the requestor’s specific data elements). Companies must provide at least two methods for a person to submit a verifiable request – a toll-free number and website address are both required. The California Attorney General will release regulations on how to “verify” a request. Information must be disclosed within 45 days of receipt of the request, but there is a process under CCPA to extend the time period to 90 days if necessary. If information is provided electronically, it must be provided in a portable format (e.g., an .xml file). The team that is responsible for fulfilling verified requests should be trained on how to prepare a response, and should test the process before the CCPA effective date to validate that it is working properly. You can’t require someone to have an account with you in order to submit a request. Don’t forget to train your website and customer service personnel on how to handle consumer requests. A minimal sketch of such an export appears below.
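
As a rough sketch of what a portable, machine-readable response might look like (JSON is used here for brevity; an .xml export follows the same idea, and every field name and value below is hypothetical rather than statutory):

import json
from datetime import date

# Illustrative response to a verified access request, covering the
# categories listed above plus the requestor's specific data elements
response = {
    "request_date": date.today().isoformat(),
    "period_covered_months": 12,
    "categories_of_personal_information": ["identifiers", "commercial information"],
    "categories_of_sources": ["web forms", "customer support contacts"],
    "business_purposes": ["order fulfillment", "customer support"],
    "third_party_categories": ["payment processors", "shipping providers"],
    "specific_data_elements": {"name": "Jane Doe", "email": "jane@example.com"},
}

with open("access_response.json", "w") as f:
    json.dump(response, f, indent=2)  # portable format delivered to the requestor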

Also, if you are a service provider, your clients will look to you to ensure they are able to pull information from your systems necessary for them to comply with the right to access and portability. Don’t overlook including in your compliance plan a review of your customer portal and interfaces (e.g., APIs) to ensure customers are able to satisfy their CCPA compliance obligations.

c. The Right to Deletion

Another key requirement is to implement a process to delete collected personal information of a person if that person submits a verified request for deletion, and to direct your service providers to do the same. Note that this does not apply to third parties with whom information has been shared or disclosed who are not service providers. As with the right to access and portability, you can’t require someone to have an account with you to exercise this right.

There are many important exclusions to the right of deletion.  These include:

  • Completing a transaction with, providing a good or service requested by or reasonably anticipated under a business relationship with, or otherwise needed to perform a contract with, a person
  • Security purposes
  • Debugging and error resolution
  • Conducting formal research (many conditions apply)
  • “Solely internal uses” that are reasonably aligned with a person’s expectations based on the person’s relationship with the business
  • Compliance with a legal obligation
  • Internal uses in a lawful manner that is compatible with the context in which the person provided the personal information
  • Other limited exceptions under CCPA

d. The Right to Opt Out

This one may be the most challenging for many companies to implement. It applies if a business “sells” personal information to third parties, or otherwise shares personal information with a third party (e.g., under a data sharing agreement). CCPA appears to provide that an opt-out would not apply to information provided to a company’s own service providers to further the company’s own business purposes, as long as there are certain contractual requirements in place with the service provider.

CCPA requires companies to implement a “Do Not Sell My Personal Information” opt-out page linked to from the homepage footer on a website, and from their mobile apps.  (The description of a person’s rights under CCPA in section (a) above should include a description of the right to opt out.) Creating a process to verify requests (pending guidance from the California Attorney General) is especially important here since opt-out requests can be submitted by a person or that person’s “authorized representative.”  Once a request has been verified, the personal information of the data subject cannot be shared with third parties (e.g., by associating an opt-out flag with the personal information) until the person later revokes the opt-out by giving the business permission to share his or her personal information with third parties. However, you cannot ask for permission for at least 12 months from the opt-out date.  Companies must train their customer service representatives on how to direct persons to exercise their opt-out rights.
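
A minimal sketch of the opt-out flag approach mentioned above, including the 12-month waiting period before permission may be requested again (the class and method names are my own, purely for illustration):

from datetime import datetime, timedelta

class OptOutRegistry:
    """Tracks verified 'Do Not Sell' requests and enforces the 12-month rule."""
    def __init__(self):
        self._opt_outs = {}  # person_id -> date of verified opt-out

    def record_opt_out(self, person_id):
        self._opt_outs[person_id] = datetime.utcnow()

    def may_share_with_third_parties(self, person_id):
        return person_id not in self._opt_outs  # suppress sharing once flagged

    def may_request_opt_in(self, person_id):
        opted = self._opt_outs.get(person_id)
        return opted is None or datetime.utcnow() - opted >= timedelta(days=365)

registry = OptOutRegistry()
registry.record_opt_out("person-123")
assert not registry.may_share_with_third_parties("person-123")
assert not registry.may_request_opt_in("person-123")  # must wait 12 months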

e. The Right to Equal Service and Pricing

As part of a CCPA compliance plan, businesses should consider ways to make sure that they do not charge more or otherwise “discriminate” against a person who chooses to exercise one of their CCPA rights. A business can offer financial incentives to persons for the collection, sale or retention of their personal information, as long as that incentive is not unjust, unreasonable, coercive or usurious.

6. Verify you have CCPA-compliant written contracts in place with service providers and third parties receiving personal information

Personal information governed by CCPA may only be disclosed to a service provider or third party under a written contract. Businesses should work with their internal or external Legal resource to validate that written contracts are in place with all service providers and third parties to which personal information is disclosed, and that there is a valid business purpose for disclosing the personal information. If no written agreement exists, work with your Legal resource to negotiate and execute a CCPA-compliant agreement. For existing written agreements, a CCPA contract addendum will likely be required to add the obligations and commitments required under CCPA. Don’t forget to look at any data sharing with your corporate affiliates, which is likely governed by an inter-company agreement.

7. Prepare for compliance requests where your company is a service provider

If your company is a service provider to other businesses, you should expect to start receiving questions about, and contract amendments/addenda related to, CCPA. It’s the inverse of #6 above. Consider how to most efficiently handle these requests. Some companies may want to consider whether to have a standard CCPA compliance addendum for use with customers, or to have a CCPA compliance statement on a public-facing website that can be referred to as needed. Work with Sales and account managers to educate them as to why the company cannot accept a customer’s own CCPA addenda, which may include more than just CCPA compliance terms.

8. Take steps to permit continued use of de-identified personal information

A CCPA compliance plan should also include implementation of appropriate steps as needed so your company can continue to use de-identified personal information (an information record which is not reasonably capable of being identified with, relating to, describing, being associated with or being directly/indirectly linked to the source person) and aggregated personal information (information relating to a group or category of persons from which individual identities have been removed, and which is not linked or reasonably linkable to a person or device). CCPA addresses de-identified data only with respect to the following requirements, but the same safeguards and processes would likely apply to aggregated personal information.

  • Implement technical safeguards to prohibit re-identification of the person to whom de-identified personal information pertains.
  • Implement business processes that specifically prohibit re-identification of de-identified personal information.
  • Implement business processes to prevent the inadvertent release of de-identified personal information.

Your company can only use de-identified personal information as long as it makes no attempt to re-identify the de-identified personal information (whether or not that attempt is successful). If your company begins re-identifying personal information, cease any use of de-identified personal information immediately.
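
One common building block for these safeguards is replacing direct identifiers with a keyed one-way hash. The sketch below is illustrative only – this technique alone does not guarantee the CCPA's de-identification standard is met, which is a facts-and-circumstances question – and all names in it are assumptions:

import hashlib
import secrets

SECRET_KEY = secrets.token_bytes(16)  # stored separately from the data set

def pseudonymize(identifier: str) -> str:
    """One-way keyed hash; without SECRET_KEY, re-linking is impractical."""
    return hashlib.sha256(SECRET_KEY + identifier.encode()).hexdigest()

record = {"email": "jane@example.com", "pages_viewed": 14}
deidentified = {
    "person_ref": pseudonymize(record["email"]),  # direct identifier removed
    "pages_viewed": record["pages_viewed"],
}
# Destroying SECRET_KEY removes the practical ability to re-identify,
# supporting the "no re-identification" business processes described above.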

9. Review your security procedures and practices, and consider encryption and data redaction options

Finally, businesses are encouraged to review their security procedures and practices to ensure they are reasonable and appropriate to protect personal information in their possession or control. CCPA creates a private right of action for consumers whose unencrypted or unredacted personal information is subject to an unauthorized “access and exfiltration,” theft, or disclosure as the result of a business’s violation of its duty to implement and maintain reasonable security procedures and practices appropriate to the nature of the information. For this private right of action, CCPA specifically uses a different definition of personal information, the one found in California Civil Code § 1798.81.5(d)(1)(A). Here, “personal information” means a person’s first name or first initial and last name coupled with the person’s (i) social security number, (ii) driver’s license number or California identification card number, (iii) account number, or credit or debit card number, in combination with any required security code, access code, or password that would permit access to an individual’s financial account, (iv) medical information, and/or (v) health insurance information, where such information is not encrypted or redacted. Any private right of action is sure to spawn a cottage industry of class action lawsuits. If your company collects and/or receives personal information as defined above, pay particular attention to the security procedures and practices protecting that information.

In 2016, the California Attorney General issued a Data Breach Report in which the Attorney General stated that “[t]he 20 security controls in the Center for Internet Security’s Critical Security Controls identify a minimum level of information security that all organizations that collect or maintain personal information should meet. The failure to implement all the Controls that apply to an organization’s environment constitutes a lack of reasonable security.” Given this, all companies are encouraged to review the Center for Internet Security’s Critical Security Controls to ensure that they meet the California AG’s minimum definition of “reasonable security.”

The California Attorney General’s report included other recommendations, such as use of multi-factor authentication on consumer-facing online accounts containing sensitive personal information, and the consistent use of strong encryption to protect personal information on laptops, portable devices, and desktop computers. Companies may want to evaluate whether implementing encryption at rest on servers, workstations, and removable media, and/or redacting personal information (e.g., through tokenization and deletion of the source data or another data redaction technique), would make sense as part of their security procedures and practices.
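
For example, field-level encryption can take the § 1798.81.5 data elements out of "unencrypted" territory. A minimal sketch using the third-party Python cryptography package follows; key management, which is the hard part in practice, is deliberately glossed over:

from cryptography.fernet import Fernet  # pip install cryptography

key = Fernet.generate_key()  # in practice, hold keys in a KMS/HSM, not in code
cipher = Fernet(key)

# Encrypt a sensitive field (e.g., a financial account number) at rest
ciphertext = cipher.encrypt(b"account-number-4111111111111111")
stored_record = {"name": "Jane Doe", "account_number": ciphertext}

# Decrypt only when needed by an authorized process
plaintext = cipher.decrypt(stored_record["account_number"])

Tokenization works similarly, except the stored value is a random token and the real value lives only in a separate, hardened vault.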

 

Eric Lambert is counsel for the Transportation division of Trimble Inc., a geospatial solutions provider focused on transforming how work is done across multiple professions throughout the world’s largest industries. He supports the Trimble Transportation Mobility and Trimble Transportation Enterprise business units, leading providers of software and SaaS fleet mobility, communications, and data management solutions for transportation and logistics companies. He is a corporate generalist and proactive problem-solver who specializes in transactional agreements, technology/software/cloud, privacy, marketing and practical risk management. Eric is also a life-long techie, Internet junkie and avid reader of science fiction, and dabbles in a little voice-over work. Any opinions in this post are his own. This post does not constitute, nor should it be construed as, legal advice.

The Promise of, and Legal Issues and Challenges With, Blockchain and Distributed Ledger Technology

[Originally published in December 2016. Updated on April 7, 2018 to clarify the explanation of blockchain and distributed ledger technology and to add more information on the legal risks and challenges.]

Blockchain and distributed ledger technology is poised to revolutionize many aspects of the world around us. It may prove to be as disruptive and innovative a force as augmented reality. Many people associate “blockchain” with “Bitcoin,” whose meteoric rise as a cryptocurrency has been well reported. However, they are not one and the same. Bitcoin is an application; blockchain and distributed ledger technology are the methods behind it. But what is this technology? How might it change the world? And what legal and other risks does it bring?

What is Distributed Ledger Technology and Blockchain?

The Old – Centralized Ledgers

Centralized ledgers (a database, list, or other information record) have played an important role in commerce for millennia, recording information about things such as physical property, intangible property including financial holdings, and other assets. The most recent innovation in centralized ledgers has been the move from physical ledgers (paper, stone tablets, etc.) to digital ledgers stored electronically. A “centralized ledger” is a ledger maintained and administered in a single, central location (e.g., a computer database stored on a server), accessible by anyone without access controls (public) or through an access control layer by persons or organizations with valid login credentials (permissioned). This is a “hub-and-spoke” system of data access and management. Centralized ledgers have historically had many benefits, such as minimized data redundancy, a limited number of access points to the data for security purposes, centralized administration, and centralized end user access. However, there are also disadvantages, such as greater potential for loss or inaccessibility if the central location suffers a hardware failure or connectivity outage, inability to recover lost data elements, and a dependence on network connectivity to allow access to the ledger by its users.

The New – Distributed Ledgers

Distributed ledgers seek to address these disadvantages by distributing (mirroring) the ledger contents to a network of participants (aka “nodes”) through a software program so that each participant has a complete and identical copy of the ledger, and ensuring all nodes agree on changes to the distributed ledger. Nodes can be individuals, sites, companies/institutions, geographical areas, etc. There is no centralized administrator or “primary node” — if a change is made to one copy of the ledger, that change is automatically propagated to all copies of the ledger in the system based on the rules of the system (called a “consensus algorithm”) which ensures that each distributed copy of the ledger is identical. For example, in Bitcoin, each node uses an algorithm that gives a score to each version of the database, and if a node receives a higher scoring version of the ledger, it adopts the higher scoring version and automatically transmits it to other nodes. Since the distributed ledger software on each node validates each addition to the distributed ledger, it’s extremely difficult to introduce a fraudulent transaction (to put it another way, transactions are audited in real time). Essentially, each node builds an identical version of the distributed ledger using the information it receives from other nodes. The use of distributed models in computing goes back to the origins of the Internet itself — ARPANET, which evolved into what we know today as the Internet, used a distributed model instead of a linear model to manage the transfer of data packets between computer networks.

The software on each node uses cryptographic signatures to verify that it is authorized to view entries in, and make changes to, the distributed ledger. If a participant with rights to modify the ledger (e.g., a digital token giving the participant the right to record a transaction) makes an addition to the ledger using the participant’s secure keys (e.g., a record of a change in ownership of an asset or recording of a new asset), the addition to the ledger is validated by the consensus algorithm and propagated to all mirrored copies of the ledger, which helps to ensure that the distributed ledger is auditable and verifiable. A key difference between centralized and distributed ledgers is that a distributed ledger cannot be forked — if you make a copy of a centralized ledger and store it somewhere else, it will be out of sync with the original copy, whereas each copy of a distributed ledger is kept identical by the client software.

Thus, the five typical characteristics of a distributed ledger are:

  1. distributed copies among nodes via client software;
  2. cryptographic signatures, or “keys,” to allow nodes to view, or add to, the distributed ledger in an auditable and verifiable fashion;
  3. a digital token (better known as a cryptocurrency) used within many distributed ledger networks to allow participants to record ledger entries;
  4. a consensus algorithm to ensure distributed copies of the ledger match among participants without the need for a centralized administrator; and
  5. record permanency, so that a verified entry accepted to the ledger via the consensus algorithm becomes permanent (it can be corrected via a later addition to the ledger but never removed).

Blockchain

While most press reporting around blockchains equates blockchain with distributed ledgers, a “blockchain” is a specific type of distributed ledger. Each record of new value added to the ledger and each transaction affecting entries in the ledger (which we will collectively call a “block“) includes a timestamp and a cryptographic verification code based on a data signature or “hash” from the previous block which “chains” it to the previous block, forming a “chain of blocks,” or “blockchain,” within the nodes hosting the blockchain. Because each block is cryptographically tied to the previous block via one-way hash, the entire chain is secure – a client can verify that a block in the blockchain validates against the previous block, but it does not allow someone to trace the blockchain forward. If a block in the chain is altered, it changes the hash value and no longer matches the hash stored in later blocks, and the alteration will be rejected by the nodes on the blockchain network. In a blockchain, transactions entered into the system during a specified period of time are bundled together and added to the blockchain as a new block.
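
The chaining mechanism can be illustrated in a few lines of Python. This is a deliberately stripped-down sketch – a real blockchain adds consensus, digital signatures, and peer-to-peer networking – but it shows why altering an earlier block breaks every later hash:

import hashlib
import json
import time

def make_block(transactions, prev_hash):
    """Bundle transactions with a timestamp and the previous block's hash."""
    block = {"timestamp": time.time(),
             "transactions": transactions,
             "prev_hash": prev_hash}
    payload = json.dumps(block, sort_keys=True).encode()
    block["hash"] = hashlib.sha256(payload).hexdigest()
    return block

genesis = make_block(["genesis"], prev_hash="0" * 64)
block_1 = make_block(["A pays B 5"], prev_hash=genesis["hash"])

# Verification: recompute the hash from the block's contents and compare
contents = {k: v for k, v in block_1.items() if k != "hash"}
payload = json.dumps(contents, sort_keys=True).encode()
assert hashlib.sha256(payload).hexdigest() == block_1["hash"]
# Tampering with genesis["transactions"] would invalidate block_1's prev_hash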

There are three primary types of blockchain networks – public, private, and permissioned.

  • Public blockchains allow anyone to participate, and therefore rely more heavily on a strong consensus algorithm to ensure the requisite level of trust between blockchain participants.
  • Private blockchains are limited to a discrete and specified group of participants, are usually small, and may not require use of a cryptocurrency given the inherent level of trust among private blockchain participants. Private blockchains often do not require a strong consensus algorithm.
  • Permissioned blockchains function much like public blockchains, but require that participants have permission to access, transact on, or create new blocks within a blockchain.

Tennessee’s recent state law on blockchain, Tenn. Code Ann. § 47-10-201, contains a good summary definition. It defines “blockchain technology” as “distributed ledger technology that uses a distributed, decentralized, shared and replicated ledger, which may be public or private, permissioned or permissionless, or driven by tokenized crypto currencies or tokenless. The data on the ledger is protected with cryptography, is immutable and auditable, and provides an uncensored truth.” Arizona’s statutory definition (which predates Tennessee’s) is almost identical, except that “crypto currencies” is replaced with “crypto economics.”

Bitcoin is an early, and famous, example of a public blockchain application. Nodes on the Bitcoin blockchain network earn new bitcoins as a reward for solving a cryptographic puzzle through computing power, or “mining.” Transactions for the purchase and sale of bitcoins are also recorded in a block in the Bitcoin blockchain – the blockchain is the public ledger of all Bitcoin transactions. In other blockchain applications, the cryptocurrency is used as payment for blockchain transactions.

Blockchain and distributed ledger technology is not intended to fully replace existing centralized ledgers such as databases. If a number of parties using different systems need to track something electronically that changes or updates frequently, a distributed ledger may be a good solution. If those needs are not present, or if there is a continuing need to rely on paper transaction records, a centralized ledger remains the better choice. Companies need to ensure there is a compelling ROI and business case before embarking on a blockchain development and implementation program.

Smart Contracts

An important concept in blockchain technology is the “smart contract.” Tennessee’s blockchain law defines a smart contract as “an event-driven program, that runs on a distributed, decentralized, shared and replicated ledger and that can take custody over and instruct transfer of assets on that ledger.” Arizona’s definition is identical apart from an additional reference to the state. In other words, a smart contract is a computer program encoded into a blockchain that digitally verifies, executes, and/or enforces a contract without the need for human intervention. Where a traditional contract involves the risk that a party will fail to perform (e.g., a shipper delivers products but the recipient fails to make payment for the products), smart contracts are self-executing and self-verifying. In a smart contract for the purchase of goods tracked via blockchain, the seller and buyer would program a smart contract into the blockchain. Once the delivery record is added to the blockchain, the smart contract automatically validates the shipper’s performance and automatically triggers payment from the buyer. Since execution of a smart contract is part of the blockchain, it is permanent once completed. Blockchain protocols such as Ethereum have developed programming languages for smart contracts.
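
As an illustration of that self-executing logic, here is a plain Python simulation (not an actual on-chain language such as Ethereum's Solidity; all names are hypothetical):

class EscrowSmartContract:
    """Event-driven program: releases payment once delivery hits the ledger."""
    def __init__(self, buyer, seller, amount):
        self.buyer, self.seller, self.amount = buyer, seller, amount
        self.ledger = []   # stand-in for blockchain entries
        self.paid = False

    def record_event(self, event):
        self.ledger.append(event)
        self._execute()    # the contract re-runs on every new ledger entry

    def _execute(self):
        delivered = {"type": "delivery", "from": self.seller} in self.ledger
        if delivered and not self.paid:
            self.ledger.append({"type": "payment", "from": self.buyer,
                                "to": self.seller, "amount": self.amount})
            self.paid = True  # permanent once executed

contract = EscrowSmartContract("buyer-co", "seller-co", 1000)
contract.record_event({"type": "delivery", "from": "seller-co"})
assert contract.paid  # payment triggered automatically, no human intervention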

How Might Blockchain and Distributed Ledgers Change the World?


Roy Amara, a former president of the Institute for the Future, said that people overestimate a technology’s effect in the short term and underestimate it in the long run, a statement known as “Amara’s Law.” However, I think a corollary is in order – the impact of new technology presents at first as rapidly disruptive (both positively and negatively), but often manifests organically and transparently to change the world over time, at a rate proportional to the maturity of the commercially available applications, to consensus on technological standards, and to decreasing costs to implement (and increasing ROI from implementing) the technology in practical business and consumer situations. For example, RFID technology was touted early on as a “change the world” technology, and it has — but most prominently through organic and innovative integration of the technology into supply chain and inventory management. Social networking is viewed by many as a “killer app” (a catalyst that accelerates the adoption of a new technology) which helped usher in the third Age of the Internet, and it has changed the world by changing how we connect with others. Both took years to become pervasive in society and industry.

Blockchain and distributed ledger networks have the potential to change the way many systems and business processes work across industries. Financial and currency transactions are a prominent emerging application of distributed ledger networks and blockchain technology. Since blockchain and distributed ledger networks are platform-agnostic, a distributed ledger could be stored in different hardware/software configurations across different nodes, reducing the need for expensive and time-consuming upgrades to support the distributed model. For example, a permissioned blockchain model could help an organization such as the US Veterans Administration better manage appointment scheduling across a large number of hospitals and clinics (in fact, a resolution was recently passed in the US House of Representatives promoting just that, “to ensure transparency and accountability”). Industry groups, such as the Blockchain in Transport Alliance (BiTA), have sprung up to help develop and promote industry-specific blockchain standards and applications.

The technology could also be used in applications such as better and more secure management of governmental records and other services; tracking tax collection and receipts; managing assets; identity verification; decentralized voting; managing and tracking inventory levels and B2B/B2C product fulfillment; tracking the “data supply chain” for the flow of data among systems; managing system access controls; protection of critical public and private infrastructure; tracking royalties due to artists for the use of their works; and use of smart contracts to digitally create, execute, and enforce agreements between parties via blockchain transactions. Distributed ledger networks have the advantage of being more secure, as the consensus algorithm makes it considerably more difficult for a cyber-attacker to successfully alter the distributed ledger. The technology could also allow for greater access transparency, a central tenet of many privacy principles, by allowing individuals to access records in the ledger relating to them or containing their information.

Blockchain and Distributed Ledger Legal Risks and Issues

As with any new technology, blockchain creates some interesting conflicts with existing laws and regulations and raises interesting and complex legal and compliance issues.  These include:

Data privacy issues. Distributed ledger technology such as blockchain is inherently designed to share information among every participant and node. If information in a ledger transaction or block contains private information, such as an account number or company confidential information, it will be visible to every user of every node. This is one of the reasons permissioned and private distributed ledgers are a focus of many companies seeking to innovate in the space. Additionally, as nodes in a distributed ledger network can be geographically disparate, rules and requirements for the transfer of data between geographies may play a major role. It is also possible that at some point in the future decryption technology will evolve to the point where cryptographic signatures used in blockchain and distributed ledgers may no longer be considered safe.

EU personal data and the “Right to be Forgotten.”  In the EU, personal privacy is considered a fundamental human right under the Charter of Fundamental Rights of the European Union. The General Data Protection Regulation (GDPR) is Europe’s new comprehensive data protection framework that as of May 25, 2018 has the force of law in every EU member state.  Under Article 17 of the GDPR, EU data subjects have a “right to be forgotten” which requires companies to erase personal information about that data subject if certain conditions are met (e.g., the personal data is no longer necessary in relation to the purposes for which they were collected or otherwise processed). This right has cropped up in the United States as well, for example, in California for minors under 18 with respect to websites, social media sites, mobile apps, and other online services under Cal. Bus. & Prof. Code § 22580-81.  The “right to be forgotten” creates a direct conflict with the permanency of blockchain.  Companies should factor the “right to be forgotten” into their blockchain development planning, e.g., consider hashing technologies to pseudonymize personal data before encoding it into a blockchain, or other ways to avoid this conflict.  Developments in blockchain and distributed ledger technology may also arise to address this issue.
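
One mitigation pattern discussed in the industry is to keep personal data off-chain and record only a keyed hash on the immutable ledger; deleting the off-chain record (and its key) leaves the on-chain entry practically unlinkable. Whether this satisfies Article 17 remains an open legal question, and the sketch below is illustrative only:

import hashlib
import secrets

off_chain_store = {}  # mutable storage fully under the company's control
chain = []            # append-only stand-in for the blockchain

def record_subject(subject_id, personal_data):
    key = secrets.token_bytes(16)
    off_chain_store[subject_id] = {"data": personal_data, "key": key}
    pointer = hashlib.sha256(key + personal_data.encode()).hexdigest()
    chain.append({"subject_ref": pointer})  # no personal data touches the chain

def forget(subject_id):
    # The chain entry remains, but without the key it cannot be re-linked
    off_chain_store.pop(subject_id, None)

record_subject("jane", "jane@example.com")
forget("jane")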

Jurisdictional issues. The nodes in a blockchain are often in multiple jurisdictions around the country and/or around the world. As each is a perfect copy, this can create issues from a jurisdictional perspective. Legal concepts such as title, contract law, regulatory requirements, etc. differ from jurisdiction to jurisdiction. Does a blockchain network need to comply with the laws of every jurisdiction in which a node is operated? Cross-border enforcement may become an issue – will one jurisdiction seek to impose its laws on all other nodes of a blockchain network? Blockchain network operators should consider how to specify, in a binding manner, a single choice of law and venue to govern disputes arising from the blockchain network and provide specificity as to compliance requirements. This jurisdictional issue will likely lead to races between jurisdictions to establish themselves as a “blockchain and distributed ledger friendly” jurisdiction, just as Delaware established itself as a “corporation-friendly” jurisdiction in which many corporations choose to incorporate. Jurisdictional issues will also impact discovery of data within the digital ledger network, e.g., through subpoenas. The rules regarding document discovery differ from state to state. A company seeking to obtain blockchain data through judicial process may have the ability to engage in “forum shopping” to find the most convenient, and friendly, jurisdiction in which to file a document discovery request.

Record retention risks. One of the features of blockchain and distributed ledger networks is record permanency. This permanency may be incompatible with statutory requirements for data to be destroyed and deleted after a period of time, such as credit/debit card data under PCI rules and HR data under various regulatory requirements, and under privacy frameworks such as the GDPR.  It also likely conflicts with a company’s existing record retention policies.  Given these factors, companies looking to introduce blockchain technology should review their record retention policies and create a separate “permanent” category for data stored in blockchain applications.  At the same time, a blockchain is permanent so long as the blockchain itself still exists.

Service Level Agreements. Many companies include a service level agreement (SLA) in their service agreements, which provides committed minimum service levels at which the service will perform, and often includes remedies for a breach of the SLA. SLAs are relatively easy to offer when they are limited to a company’s own systems and infrastructure. However, a blockchain (other than perhaps a small private blockchain) may by its very nature be distributed beyond a company’s own network. SLAs often exclude from downtime issues outside of the company’s control, e.g., downtime caused by a third party’s hardware or software. Does a third-party node still fit within this exclusion? Many SLAs also address latency, i.e., the time it takes for a system to respond to an instruction. Companies will also need to think about what measure of latency (if any) should apply to transactions via blockchain and other distributed ledgers, and how to address blockchain in their SLAs.

Liability and Force Majeure issues. Companies routinely implement controls (processes and procedures) to manage their systems and operations, which controls may be audited by customers/partners or certified under standards such as SOC 2. But who is accountable for a database distributed across geographies and companies? Use of a distributed ledger system with nodes outside of a company’s systems means ceding some control to an automated process and to a decentralized group of participants in the distributed ledger/blockchain. An error in a record in a distributed ledger becomes permanent and can be corrected but never removed. Is an issue with a third-party node considered a force majeure event which excuses performance under an agreement? Is the type of network (public, private or permissioned) a factor?  Companies will need to think about how blockchain should tie into an agreement’s general force majeure provision, and how to allocate blockchain risk within a contract (through indemnities, limitation of liability, etc.).

Insurance issues.  Any new technology is quickly tested under insurance policies.  Companies will begin to tender claims under their electronic errors and omissions policies, commercial general liability policies, and possibly specialized cyber policies.  As insurance companies build up experience with blockchain claims, companies will likely see new endorsements and exclusions limiting insurance carriers’ liability under standard policies for blockchain-related losses.  This is often closely followed by the emergence of custom policy riders (for additional premium) to provide add-on insurance protection for blockchain-related losses.  Companies implementing blockchain technologies may want to discuss blockchain-related losses with their insurance carriers.

Intellectual property issues. As with any new technology, there has already been a flood of patent applications by companies “staking their claim” in the brave new frontier of blockchain and distributed ledger. While the core technology is open source, companies have created proprietary advancements in which they may assert patent or other intellectual property rights. Dozens of companies have already obtained blockchain patents. Technology and other financial companies have undoubtedly already filed large numbers of blockchain patents that are working their way through the Patent and Trademark Office. As is often the case with new technologies, there will likely be a flurry of patent infringement lawsuits as new patent holders seek to enforce their exclusive rights to their inventions. Adopters of blockchain using custom applications or non-standard implementations should be especially sensitive as to whether their application or implementation could potentially be infringing filed or issued blockchain patents. Consulting external patent counsel knowledgeable in blockchain technology will become more and more important for these types of adopters.

Confidentiality issues. Information placed into a node of a public blockchain – even if that node is within a company’s own servers – is no different from putting code into a public GitHub repository. The result is that the information enters the public domain. Even with a private or permissioned blockchain, information encoded into the blockchain becomes visible to all participants with access rights. A company’s use of a blockchain or distributed ledger to store confidential information, such as information subject to an NDA or the company’s own trade secrets, creates a risk of a breach of confidentiality obligations or loss of trade secret protection. Companies should consider how to prevent confidential and other sensitive company information from being stored in blockchains in a manner that could result in a breach of confidentiality. Additionally, agreements routinely require the return or destruction of the discloser’s confidential information and other provided data and/or materials upon termination or expiration. An exception for data encoded onto a blockchain must be considered.

Discovery and Subpoenas. Information encoded into a public blockchain may be considered in the public domain. When litigation arises, will companies be able to push back on a discovery request encompassing data in a blockchain by stating that it is publicly available? If a person can find the identity of other nodes in a blockchain network, we may see an increase in subpoenas directed to a node for blockchain data within the copy of the blockchain or digital ledger hosted at that node (possibly based on favorable jurisdiction as noted above). Since every node maintains its own copy of a distributed ledger, and no one node owns or controls the data, this may affect the ability of a company to keep information out of third-party hands, as it may not have the ability to quash a subpoena directed at an independent node.

Application of existing legal structures to blockchain, smart contracts, and distributed ledgers. As is often the case, one of the challenges for lawyers and others is determining how existing laws and regulations will likely be interpreted to fit new technologies such as blockchain and distributed ledger technology; what new laws and regulations may be coming and how permissive or restrictive they may be; and how enforcement and penalties in connection with the new technologies under both new and existing laws will play out. “Smart contracts” that rely on computer algorithms to establish the formation and performance of contracts may challenge the nature and application of traditional legal principles of contract law such as contract formation and termination, and the traditional focus of laws on the acts of persons (not automated technologies), making it difficult for courts to stretch traditional contract law principles to the new technology.

Emerging laws.  It is axiomatic that law lags technology. The companies that immediately benefit from a new disruptive business method such as blockchain are those which seek to innovate applications of the method to monetize it, obtain a first mover advantage, and ideally seize significant market share for as long as possible. Industry groups and trade associations form to seek to promote it, and legislators take notice (especially given the meteoric rise of bitcoin prices during 2017). Legislators often jump to regulate something they don’t fully understand and whose potential is not fully realized, which can impede development and proliferation of the new technology.  A handful of states (including Arizona, Nevada, Tennessee, Delaware, Illinois, Vermont, and Wyoming) have already adopted blockchain-specific legislation, and this number will likely grow substantially in the next couple of years. Fortunately, the legislation enacted to date appears to support, rather than inhibit, blockchain technology. Other states have introduced or enacted legislation to study blockchain technology.

Disruptive technologies such as blockchain and distributed ledger technology bring both benefits and potential risks. If the benefits outweigh the risks on the whole, the public interest is not served when the legal, regulatory and privacy pendulum swings too far in response. The spread of blockchain and other distributed ledger technologies and applications will depend on the creation of a legal, regulatory, and privacy landscape that fosters innovation in the space.

Eric Lambert is the Commercial Counsel for the Transportation and Logistics division of Trimble Inc., an integrated technology and software provider focused on transforming how work is done across multiple professions throughout the world’s largest industries. He is counsel for the Trimble Transportation Mobility (including PeopleNet, Innovative Software Engineering, and Trimble Oil and Gas Services) and Trimble Transportation Enterprise (including TMW and 10-4 Systems) business units, leading providers of software and SaaS fleet mobility, communications, and data management solutions for transportation and logistics companies. He is a corporate generalist and proactive problem-solver who specializes in transactional agreements, technology/software/cloud, privacy, marketing and practical risk management. Eric is also a life-long techie, Internet junkie and avid reader of science fiction, and dabbles in a little voice-over work. Any opinions in this post are his own. This post does not constitute, nor should it be construed as, legal advice.

The Why of Privacy: 4 Reasons Privacy Matters to People, and Why Companies Need to Know Them

While almost all companies collect and use their customers’, visitors’ and users’ personal information, primarily online and through in-person customer interactions such as point-of-sale transactions, the privacy landscape is in a near-constant state of turbulence and flux. There is a steady flow of data breach reports affecting companies of almost every size and market segment. New data privacy laws, rules and regulations continue to be introduced and enacted around the world, such as the US-EU Privacy Shield program, the EU General Data Protection Regulation (GDPR), and Argentina’s draft Data Protection Bill, placing new legal obligations and restrictions on the collection and use of personal information. Challenges continue to be raised against laws which are perceived to overreach or conflict with privacy rights, such as the continued challenges to the Privacy Shield program and the EU’s Model Contract Clauses.

The one constant in this turbulent landscape is that consumers’ awareness of data privacy and security continues to grow. Given this, it is important to step back from the day-to-day privacy developments and look at a more fundamental question. It is axiomatic in the world of privacy that privacy matters to people, but why it matters is more complicated. People often argue about why privacy is important to individuals, but there is no “one-size-fits-all” answer. Privacy matters to different people in different ways, so there are many equally valid reasons why privacy is important to individuals.

Understanding the “why of privacy” is also critically important to businesses and other organizations. By now, most companies understand the importance of providing notice of their data collection practices and choice with respect to the use of collected information. A company collecting, processing and/or controlling personal information that understands the reasons privacy matters to the data subjects whose data it collects and uses can design more effective privacy practices and policies attuned to those data subjects’ needs, such as by creating customer privacy profiles for use in product design and testing.  This follows the “privacy by design” framework advocated by the Federal Trade Commission and helps increase trust in the company’s commitment to data privacy and security – trust that is critical to the success of every company in today’s world and can provide a competitive advantage.

The reason why privacy matters differs from person to person. However, I believe these reasons can be grouped into four core categories: (1) privacy is a right, (2) privacy is an entitlement, (3) privacy is an expectation, and (4) privacy is a commodity. I’ll explore each of them in turn.

Privacy is a Right

Persons falling into this first category value privacy as an irrevocable right guaranteed to all. People living in countries with constitutional data privacy protections often fall into this category. For example, the European Union Charter of Fundamental Rights recognizes the right to data protection and the right to privacy as fundamental human rights. In some countries, the right has been implied through interpretation of constitutional and legal rights, such as the right to privacy the U.S. Supreme Court has found implicit in the U.S. Constitution and the right to privacy recognized under the Canadian Charter of Rights and Freedoms, even though neither document specifically mentions privacy. In August 2017, a unanimous Supreme Court of India held that privacy is a fundamental right, as an integral part of the Right to Life and Personal Liberty guaranteed under Article 21 of the Constitution of India.  The 1948 United Nations Universal Declaration of Human Rights states that people have a fundamental human right not to “be subjected to arbitrary interference with their privacy, family, home or correspondence.”

  • People in this category are more likely to take a very rigid view of privacy trumping all other interests, including business interests, and may be less willing to “trade” any of their privacy for other benefits such as increased security.
  • People in this category tend to expect that any consent given to use personal information must be clear, unambiguous, express, and fully revocable and that use of the information must be specifically limited to the grant of rights or as otherwise expressly permitted by law, which creates a significant burden for businesses and other organizations collecting and using personal information.
  • Privacy as a right is an individual view – the rights of individuals to protect their personal information outweigh almost all other parties’ rights to use or access that personal information.

Privacy is an Entitlement

Persons falling into this second category value privacy as something to which they are entitled under the laws, rules and regulations applicable to them. Many laws – whether comprehensive data privacy laws such as Canada’s PIPEDA or sectoral laws such as those enacted in the United States – contain prohibitions or restrictions on privacy practices that individuals may view as creating privacy protections to which they are entitled. An example is the U.S. Children’s Online Privacy Protection Act (“COPPA”), which among other things prohibits the collection of personal information from children under 13 without verifiable parental consent. Some parents view COPPA as creating an entitlement for their children to be left alone unless the parent consents to the collection of personal information from their children.

  • Similar to privacy as a right, people in this category are likely to view privacy as trumping other interests, including business interests, and may be less willing to give up privacy for other benefits.
  • They tend to expect that any consent given to use personal information must be fully compliant with legal requirements, and that use of the information must be specifically limited to those use rights expressly permitted by law, which creates a burden for businesses and other organizations collecting and using personal information.
  • As with privacy as a right, privacy as an entitlement is an individual view, where an individual’s entitlement to privacy outweighs others’ interests in that person’s personal information.
  • A key differentiator between privacy as a right and privacy as an entitlement is that an entitlement can be revoked, e.g., through changes to the law, whereas a right is irrevocable. While some might argue that a judicially-recognized right to privacy should instead be considered an expectation, I believe that when a country’s highest court recognizes privacy as a right, and that recognition is unlikely to be overturned or legislatively reversed, it should be treated as a right.

Privacy is an Expectation

Persons falling into this third category value privacy as something they expect to receive, whether or not they have a right or entitlement to it. New technologies (such as drones and biometric identifiers) and practices (such as marketing strategies) tend to be ahead of the laws specifically governing them, and people in this category expect to receive privacy protections regardless of whether existing laws or other rights cover the technology or practice. They may also expect businesses and other organizations to follow societal norms with respect to privacy, even where those norms are stricter than applicable legal requirements. There are also certain expectations of privacy that are generally recognized within a given society. For example, in the United States, many people have an expectation of privacy in their own home and in other private areas such as a public bathroom stall. If a person or organization interferes with this expectation of privacy, there may be legal liability for invasion of privacy under state laws. There are other expectations of privacy on a per-situation basis, such as a private conversation between two individuals.

  • People in this category believe that third parties, such as companies and government entities, should recognize that their expectation of privacy trumps those third parties’ desire (or rights) to access and use their personal information, but also understand that the expectation of privacy has limits. For example, a person should not have an expectation of privacy in a public place (e.g., a public sidewalk), and there is no right of privacy that extends to a person’s garbage placed on the street for collection.  In the United States, there is also a greatly reduced expectation of privacy in the workplace, particularly on employer-provided systems.
  • An expectation of privacy can be overridden by a superior interest of a third party. For example, if a court approves surveillance of someone suspected of engaging in illegal activity, any expectation that person may have that his or her conversations are private is superseded by the government’s interest in preventing and prosecuting crime.
  • People in this category generally do not question or challenge the terms of a privacy policy or other agreement granting rights to collect or use their personal information. They also tend to expect that businesses and other organizations collecting and/or using their personal information will not do so unreasonably, and will respect usage opt-out requests.
  • Privacy as an expectation is a middle-of-the-road view, in which the individual view of privacy as paramount is tempered with the understanding that in some cases the general or specific value of allowing a third party to receive and use their personal information outweighs the personal interest.

Privacy is a Commodity

Persons falling into this fourth category value privacy as a commodity that they are willing to exchange for other benefits, goods or services. We live in an information economy, where data has been commoditized. For many companies, a core or important part of the product or service offering (i.e., part of its general value) or of the business strategy is the ability to monetize personal, aggregate, and/or anonymous data collected through the product’s or service’s use; these companies argue that the value derived from data monetization is factored into the value and cost of the product or service. Other companies offer something of specific value, such as an extended product warranty, in exchange for personal information such as an email address or demographic information. Many people give businesses some rights to use their personal information simply by visiting a webpage, requesting information, or purchasing goods or services, in each case agreeing to be bound by the company’s privacy policy or terms of use/terms of sale. We also live in a world where many people are willing to sacrifice some privacy in exchange for increased security against terrorism and other potential physical and cyber threats. People falling into this category have a strong understanding of the trade-off between privacy and other benefits.

  • People in this category are more willing to give third parties the right to use their information as long as what they receive in return is valuable enough to them – they view their personal information as currency. If a company or organization offers something of sufficient value, they are very likely to agree to share personal information with it. These are the people who don’t really mind receiving targeted ads while surfing online.
  • Conversely, if they do not believe they are receiving value in return for their personal information, people in this category are more likely not to share their information.
  • Privacy as a commodity is a transactional view: an individual is willing to allow a third party to receive and use their personal information if the general or specific value of allowing that third party to receive and use the information outweighs their personal interest in keeping the information private.
  • It may require a greater transfer of value to convince someone viewing privacy as a right, entitlement or expectation to treat it as a commodity.

 

As a closing thought, these four reasons why privacy matters to people are not mutually exclusive – there are additional sub-categories of people for whom two or more of these reasons are important. For example, it is possible for someone to view privacy as both an entitlement and a commodity. Such a person would expect that while they have the ability to exchange their personal information for something of value, the exchange must always be voluntary – they would reject any requirement to trade away their personal information. Businesses that take the time to understand the “why of privacy” will find themselves better positioned to create sample customer profiles based on their customers’ privacy values, leading to more robust privacy practices, processes and policies and a potential competitive advantage on privacy in the marketplace.

Eric Lambert has spent most of his legal career working in-house as a proactive problem-solver and business partner. He specializes in transactional agreements, technology/software/e-commerce, privacy, marketing and practical risk management. Any opinions in this post are his own. This post does not constitute, nor should it be construed as, legal advice. He is a technophile and Internet evangelist/enthusiast. In his spare time Eric enjoys reading and implementing and integrating connected home technologies and dabbles in voice-over work.

EU-US Privacy Shield Update – Roadworthy (For Now), But All Roads May Be Dead Ends

Last July, the new EU-US Privacy Shield framework became effective. The Privacy Shield replaced the International Safe Harbor Privacy Principles (commonly known as “Safe Harbor”) which had been in place since 2000. Under the EU Data Protection Directive, companies can only transfer data outside of the EU to a country deemed to have an “adequate” level of data protection, and the US (which takes a sectoral approach to data privacy and has no comprehensive national data privacy law) is not one of those countries. Given the importance of EU-US data transfer in the global economy, the Safe Harbor principles were developed as an entity-level, instead of country-level, adequacy mechanism, allowing a US company to achieve a level of adequacy (in the eyes of the EU) that permitted EU-US data transfers with that company to take place. Safe Harbor served as an alternative to two other entity-level adequacy mechanisms: standard contract clauses (SCCs, also known as model contract clauses), which are separately required for each EU company transferring data to a US entity, making them difficult to scale, and binding corporate rules (BCRs), which require Board of Directors approval and significant time and resources and have only been implemented by very large multinational companies. (There is also an individual-level adequacy mechanism – direct consent.)

Everything changed in October 2015, when the European Court of Justice (ECJ) released its decision in a case brought against Facebook by Austrian citizen Max Schrems. The ECJ held that the Safe Harbor framework did not provide adequate privacy protections to EU individuals, and was therefore invalid. Among other reasons for invalidation, the ECJ found that broad US government powers to access data (including data of EU citizens) held by private US companies directly conflicted with the EU’s declaration of data protection as a fundamental human right. Given the importance of the Safe Harbor program in facilitating EU-US data transfers, its invalidation had a far-reaching impact. While the EU agreed to wait a few months before bringing any actions against companies in the Safe Harbor program which did not move to an alternative entity-level adequacy mechanism, US companies faced a difficult choice – switch to an alternative and more difficult/costly approach, such as standard contract clauses, or wait and see whether the EU and US could quickly agree on a Safe Harbor replacement before the EU’s enforcement deadline.

Fortunately, the European Commission and the US government quickly accelerated existing talks on resolving the shortcomings of the Safe Harbor principles, leading to the announcement of the Privacy Shield program in February 2016. The European Commission quickly issued a draft adequacy decision for the Privacy Shield program, and despite some misgivings about the program from certain groups, the European Union gave its final approval on July 12, 2016. The Privacy Shield program is made up of 7 core principles and 15 supplemental principles. Like Safe Harbor before it, it is a self-certification program, and a number of principles are common to both Safe Harbor and Privacy Shield. The Privacy Shield program seeks to address a number of the perceived shortcomings of the Safe Harbor principles, including protection for onward transfer of information by US companies to third parties such as their service providers, multiple ways for individuals to make a complaint about a Privacy Shield-certified company, stronger enforcement mechanisms, and an annual review mechanism. Its intent is to be a replacement entity-level mechanism which addresses the concerns around Safe Harbor cited by the ECJ in the Schrems decision, complies with EU laws, and respects EU citizens’ fundamental rights to privacy and data protection.

Challenges and Headwinds

Since the Privacy Shield program went live in July, over a thousand companies (1,234 as of December 10, 2016, according to the Privacy Shield List) have self-certified under the program. However, the Privacy Shield program, and EU-US data transfers in general, continue to face challenges and headwinds.

  • Legal challenges – déjà vu all over again? After the Privacy Shield program was announced in February 2016, some groups and individuals expressed concerns about the program. When Privacy Shield was approved in July 2016, Max Schrems went on record stating his belief that the Privacy Shield framework was fundamentally flawed and could not survive a legal challenge. Now that the first legal challenges against Privacy Shield have been filed, we will find out how prescient Mr. Schrems’ comments are. In September, the digital rights advocacy group Digital Rights Ireland filed an action in the EU courts arguing that the EU’s finding of adequacy for the Privacy Shield should be annulled on the basis that the Privacy Shield program’s privacy safeguards are not adequate. In November, a similar challenge was brought by La Quadrature du Net, a French privacy advocacy group. The outcome of these challenges could leave the Privacy Shield program very short-lived. Additionally, the ECJ is considering another challenge against Facebook referred to it by the Irish Data Protection Commissioner, this time to standard contract clauses. The challengers in that case argue that the same concerns behind the ECJ’s Safe Harbor decision should apply to standard contract clauses. The forthcoming decision in this challenge has the potential to create a precedent that could bring down the Privacy Shield program as well.
  • Other public and private actions may erode Privacy Shield’s validity. On December 1, 2016, a change to Rule 41 of the Federal Rules of Criminal Procedure became effective. The change was intended to give investigators more power to obtain warrants against cyber criminals using botnets or otherwise masking their identity, such as through secure web browsers or virtual private networks. Under the amended rule, law enforcement seeking to use remote access to search media and obtain electronically stored information can obtain a warrant from a magistrate judge located in a district where “activities related to a crime may have occurred” if the actual location of the media or information has been “concealed through technological means.” Since the rule is not limited on its face to servers in the US, without further clarification of its scope it could be used by law enforcement to have a US magistrate judge issue a warrant to search and seize information from servers located in the EU. This global reach would likely be found to be in direct conflict with the concepts of privacy and data protection as fundamental human rights under the EU’s Charter of Fundamental Rights. Additionally, in early October, reports surfaced that Yahoo! had secretly scanned the email accounts of all of its users at the request of US government officials, which if true would likely be inconsistent with the terms of the Privacy Shield agreement. Opponents of Privacy Shield could use actions such as these as ammunition in their efforts to invalidate the program. In fact, there have already been calls for the European Commission and the EU’s Article 29 Working Party to investigate the Yahoo! scanning allegations, and according to a European Commission spokesperson statement on November 11, 2016, the EC has “contacted the U.S. authorities to ask for a number of clarifications.”
  • Can any EU-US framework be adequate? The legal challenges and public/private actions cited above all lead to one fundamental question that many parties involved in the Privacy Shield program have been hesitant to ask – is there a fundamental, irreconcilable conflict between (1) the United States’ approach to privacy and (2) the EU’s commitment to privacy and data protection as fundamental human rights? If yes, the US’s sectoral approach to data privacy legislation, and the powers of US law enforcement to obtain information from private companies and servers, may mean that no entity-level mechanism to facilitate EU-US data transfers is “adequate” in the eyes of the EU – meaning that EU-US data transfers are approaching a dead end. While the US government has imposed restrictions on its surveillance activities in the post-Snowden world, it remains very unclear whether anything short of concrete legislation protecting the rights of EU citizens (which would run counter to US counter-terrorism activities), or a modification of the EU’s principles, would be sufficient. I suspect there may be a difference between the views of those in the EU seeking a pragmatic approach (those who believe that the importance of EU-US data transfers, including their economic and geopolitical benefits, necessitates some compromise) and those seeking an absolute approach (those who believe that data protection as a fundamental human right must trump all other interests). The forthcoming decisions in the challenges to standard contract clauses and the Privacy Shield program will likely help shed light on whether this fundamental conflict is fatal to any entity-level mechanism.
  • Compliance, not certification, with the Privacy Shield principles is what matters. A number of US companies have chosen to tout their Privacy Shield self-certification via blog post or press release. While a press release on Privacy Shield certification can be useful to demonstrate a company’s global presence and commitment to data privacy, remember that it’s a self-certification process (although some companies are using third-party services such as TRUSTe to help them achieve compliance). A company’s certification of compliance with the Privacy Shield principles is less important than the processes and procedures it has put in place to manage its compliance. If you need to determine whether a company is self-certified under the Privacy Shield program, you can search the database of certified companies at http://www.privacyshield.gov/list and check the company’s website privacy policy, which should contain disclosures affirming and relating to its commitment to the Privacy Shield principles. If you’re a company certified under the Privacy Shield, be prepared to answer questions from EU companies on how you comply with the Privacy Shield principles – you may be asked.

So, what does all this mean? At the moment, Privacy Shield may be a bit rickety, but unless your company can effectively use standard contractual clauses or binding corporate rules, short of direct consent it’s the only game in town for US companies which need to receive data from their EU clients, customers and business partners. Even SCCs may be a short-lived solution, meaning many companies may not want to invest the time, effort and expense required to adopt that entity-level approach. Due to the current state of Privacy Shield and EU-US data transfers in general, US companies may want to consider the wisdom of the “Herd on the African Savanna” approach to compliance – the safest place to be in a herd on the African savanna is in the center. It’s almost always the ones on the outside which get picked off, not the ones in the center. Unless there is a compelling business reason to be on the outside of the herd (desire to be viewed as a market leader, willingness to risk doing nothing until clearer direction is available, etc.), the safest place from a compliance perspective is to stick with the pack. While that approach is not for everyone, many companies may feel that being in the center of the herd of US companies dealing with EU-US data transfers is the safest approach while the fate of the Privacy Shield, and EU-US data transfers in general, plays out.

The Fourth Age of the Internet – the Internet of Things

We are now in what I call the “Fourth Age” of the Internet.  The First Age was the original interconnected network (or “Internet”) of computers using the TCP/IP protocol, with “killer apps” such as e-mail, telnet, FTP, and Gopher mostly used by the US government and educational organizations. The Second Age began with the creation of the HTTP protocol in 1990 and the original static World Wide Web (Web 1.0). The birth of the consumer internet, the advent of e-commerce, and 90’s dot-com boom (and bust in the early 2000’s) occurred during the Second Age. The Third Age began in the 2000’s with the rise of user-generated content, dynamic web pages, and web-based applications (Web 2.0). The Third Age has seen the advent of cloud computing, mobile and embedded commerce, complex e-marketing, viral online content, real-time Internet communication, and Internet and Web access through smartphones and tablets. The Fourth Age is the explosion of Internet-connected devices, and the corresponding explosion of data generated by these devices – the “Internet of Things” through which the Internet further moves from something we use actively to something our devices use actively, and we use passively. The Internet of Things has the potential to dramatically alter how we live and work.

As we move deeper into the Fourth Age, there are three things which need to be considered and addressed by businesses, consumers and others invested in the consumer Internet of Things:

  • The terms consumers associate with the Internet of Things, e.g., “smart devices,” should be defined before “smart device” and “Internet of Things device” become synonymous in the minds of consumers.  As more companies, retailers, manufacturers, and others jump on the “connected world” bandwagon, more and more devices are being labeled as “smart devices.”  We have smart TVs, smart toasters, smart fitness trackers, smart watches, smart luggage tags, and more (computers, smartphones and tablets belong in a separate category). But what does “smart” mean?  To me, a “smart device” is one that not only can collect and process data and take general actions based on the data (e.g., sound an alarm), but also can be configured to take user-configured actions (e.g., send a text alert to a specified email address) and/or can share information with another device (e.g., a monitoring unit which connects wirelessly to a base station). But does a “smart device” automatically mean one connected to the Internet of Things?  I would argue that it does not.

Throughout its Ages, the Internet has connected different types of devices using a common protocol, e.g., TCP/IP for computers and servers, HTTP for web-enabled devices. A smart device must do something similar to be connected to the Internet of Things. However, there is no single standard communications protocol or method for IoT devices. If a smart device uses one of the emerging IoT communications protocols such as Zigbee or Z-Wave (“IoT Protocols”), or has an open API allowing other devices and device ecosystems such as SmartThings, Wink or IFTTT to connect to it (“IoT APIs”), it’s an IoT-connected smart device, or “IoT device.” If a device doesn’t use IoT Protocols or support IoT APIs, it may be a smart device, but it’s not an IoT device. For example, a water leak monitor that sounds a loud alarm if it detects water is a device.  A water leak monitor that sends an alert to a smartphone app via a central hub, but cannot connect to other devices or device ecosystems, is a smart device.  Only if that monitor uses an IoT Protocol or supports IoT APIs, allowing it to interconnect with other devices or device ecosystems, is it an IoT device.
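To make the taxonomy concrete, here is a minimal sketch in Python of the classification logic described above; the device attributes are hypothetical illustrations for this post, not any real product’s API:

    from dataclasses import dataclass, field

    @dataclass
    class Device:
        name: str
        takes_general_actions: bool = False      # e.g., sounds an alarm on its own
        user_configured_actions: bool = False    # e.g., sends an alert you configured
        shares_with_other_devices: bool = False  # e.g., reports to a base station or app
        iot_protocols: list = field(default_factory=list)  # e.g., ["Zigbee", "Z-Wave"]
        open_iot_api: bool = False               # e.g., connects to SmartThings, Wink, IFTTT

    def classify(d: Device) -> str:
        # IoT device: speaks an IoT Protocol or supports IoT APIs
        if d.iot_protocols or d.open_iot_api:
            return "IoT device"
        # Smart device: user-configured actions and/or device-to-device sharing
        if d.user_configured_actions or d.shares_with_other_devices:
            return "smart device"
        return "device"

    # The water leak monitor examples from the text:
    print(classify(Device("alarm-only monitor", takes_general_actions=True)))   # device
    print(classify(Device("hub-alert monitor", user_configured_actions=True,
                          shares_with_other_devices=True)))                     # smart device
    print(classify(Device("connected monitor", iot_protocols=["Z-Wave"])))      # IoT device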

“Organic” began as a term to define natural methods of farming.  However, over time it became overused and synonymous with “healthy.”  Players in the consumer IoT space should be careful not to let key IoT terminology suffer the same fate. Defining what makes a smart device part of the Internet of Things will be essential as smart devices continue to proliferate.

  • Smart devices and IoT devices exacerbate network and device security issues. Consumers embracing the Internet of Things and connected homes may not realize that adding smart devices and IoT devices to a home network can create new security issues and headaches. For example, a wearable device with a Bluetooth security vulnerability could be infected with malware while you’re out using it, and infect your home network once you return and sync it with your home computer or device.  While there are proposals for a common set of security and privacy controls for IoT devices, such as the IoT Trust Framework, none has yet been adopted industry-wide.

Think of your home network, and your connected devices, like landscaping.  You can install a little or a lot, all at once or over time.  Often, you have a professional do it to ensure it is done right. Once it’s installed, you can’t just forget about it – you have to care for it, through watering, trimming, etc. Occasionally, you may need to apply treatments to avoid diseases. If you don’t care for your landscaping, it will get overgrown; weeds, invasive plants (some poisonous) and diseases may find their way in; and you ultimately have a bigger, harder, more expensive mess to clean up later on.

You need to tend your home network like landscaping – only if you don’t tend your home network, the consequences can be much worse than overgrown shrubbery. Many consumers are less comfortable tinkering with computers than they are tinkering with landscaping.  Router and smart device manufacturers periodically update the embedded software (or “firmware”) that runs those devices to fix bugs and to address security vulnerabilities. Software and app developers similarly release updated software periodically. Consumers need to monitor for updates to firmware and software regularly, and apply them promptly once available.  If a device manufacturer goes out of business or stops supporting a device, consider replacing it, as it will no longer receive security updates. Routers need to be properly configured, with usernames and strong passwords set, encryption enabled, network names (SSID) configured, etc.  Consumers with a connected home setup should also consider a router supporting a high-throughput wireless standard such as 802.11ac or 802.11n.

The third-party managed IT services industry has existed since the Second Age. As connected homes proliferate, resulting in complex connected home infrastructure, there is an opportunity for “managed home IT” to become a viable business model.  I expect companies currently offering consumer-focused computer repair and home networking services will look hard at adding connected home management services (installation, monitoring, penetration testing, etc.) as a new subscription-based service.

  • Smart device companies need to think about what they can/can’t, and should/shouldn’t, do with data generated from their devices.  IoT devices and smart devices, and connected home technologies and gateways, generate a lot of data.  Smart/IoT device manufacturers and connected home providers need to think about how to store, process and dispose of this data.  Prior to the Internet of Things, behavioral data was gathered through the websites you viewed, the searches you ran, the links you clicked – “online behavioral data.”  The IoT is a game-changer. Now, what users do in the real world with their connected devices can translate into a new class of behavioral data – “device behavioral data.” Smart/IoT device manufacturers, and connected home providers, will need to understand what legal boundaries govern their use of device behavioral data, and how existing laws (e.g., COPPA) apply to the collection and use of data through new technologies. Additionally, companies must look at what industry best practices, industry guidelines and rules, consumer expectations and sentiment, and other non-legal contours shape what companies should and should not do with the data, even if a use is legal.  Companies must consider how long to keep data, and how to ensure it’s purged out of their systems once the retention period ends (a minimal sketch of such a purge routine follows below).
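For illustration, here is a sketch in Python of what a scheduled retention-period purge job could look like, assuming a hypothetical SQLite table of device behavioral data and a purely illustrative 395-day retention period; real schemas, data stores and retention periods will differ:

    import sqlite3
    from datetime import datetime, timedelta, timezone

    RETENTION_DAYS = 395  # illustrative only; use your documented retention policy

    def purge_expired(conn: sqlite3.Connection) -> int:
        # Delete device behavioral data older than the retention period.
        # Assumes collected_at is stored as a UTC ISO-8601 string, so
        # lexicographic comparison matches chronological order.
        cutoff = datetime.now(timezone.utc) - timedelta(days=RETENTION_DAYS)
        cur = conn.execute(
            "DELETE FROM device_events WHERE collected_at < ?",
            (cutoff.isoformat(),),
        )
        conn.commit()
        return cur.rowcount  # purged-row count, useful for an audit trail

    conn = sqlite3.connect("telemetry.db")
    conn.execute(
        "CREATE TABLE IF NOT EXISTS device_events "
        "(device_id TEXT, event TEXT, collected_at TEXT)"
    )
    print(f"Purged {purge_expired(conn)} expired records")

A job like this only completes the picture if backups and downstream copies of the data are purged on the same schedule.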

IoT and smart device companies, and connected home service and technology providers, should build privacy and data management compliance into the design of their devices and systems by adopting a “security by design” and “privacy by design” mindset. Consumers expect that personal data about them will be kept secure and not misused. Companies must ensure their own privacy policies clearly say what they do with device behavioral data, and must not do anything outside the boundaries of their privacy policy (“say what you do, do what you say”). Consider contextual disclosures to make sure the consumer clearly understands what you do with device behavioral data.  Each new Age of the Internet has seen the FTC, state Attorneys General, and other consumer regulatory bodies look at how companies are using consumer data, and make examples of those they believe are misusing it. The Fourth Age will be no different. Companies seeking to monetize device behavioral data must make sure that they have a focus on data compliance.

Key Security Provisions for Vendor/Partner Contracts

One of the most important lessons from the 2013 Target breach was that hackers will look for the weakest link in a company’s security chain when seeking a point of entry. Often, that weakest link is the vendors and partners which integrate with your IT infrastructure or have login credentials to your systems. Target’s HVAC vendor suffered a phishing attack that resulted in hackers obtaining access credentials to Target’s network which they used as their point of entry. Companies are increasingly doing security diligence on their vendors and partners to ensure that if they have access to the company’s network or systems, they will meet minimum security requirements.  It’s critical that your vendors and partners agree to minimum contractual security commitments as well. I often use a “security addendum” with controlling language to ensure that my standard provisions control over any conflicting provisions in the vendor/partner agreement, but will sometimes embed them directly into the contract.

Here are some of the provisions I like to include in vendor and partner agreements:

  • Definitions of Personal Information and Financial Account Information.  It’s important to define what “personal information” and “financial account information” mean.  In many cases, your vendor/partner’s definition of these terms may differ from yours. Ensuring you’re on the same page (e.g., you may consider IP addresses to be personal information while they do not) can be critical in the event there is an unauthorized release of information.  Be careful using a list of information types, as the list may change over time; instead, consider a broad definition with examples.
  • Credentials. If you are providing credentials to your vendor/partner to access your network or systems, or that of a third party (e.g., a marketing service, a cloud hosting environment, etc.), ensure they will only use them as required by the contract.  Ensure they fall under the contractual definition of Confidential Information and will be treated as such.  Access to credentials should be limited to those with a “need to know.”
  • Safeguards. I like to include a requirement to implement and follow administrative, physical and technical safeguards (no less rigorous than industry standard) designed to protect information and credentials.  This can be a good catch-all that can be leveraged if the vendor/partner has a problem later on and did not use industry-standard security safeguards.  I also like to call out the importance of installing security software patches immediately to reduce the risk of an exploitable security hole.  If the vendor/partner has obtained security certifications (e.g., SSAE 16 or ISO 27001) that you are relying on, ensure they provide evidence of current certification upon request and do not let certifications lapse during the term of the agreement.
  • Anti-Phishing Training.  By many estimates, over 90% of successful hacking attacks start with a “phishing” email. Consider specifically requiring your vendors/partners to provide anti-phishing training to all employees.
  • Payment Account Information.  If the vendor/partner will not be handling payment account information, add an affirmative obligation that the vendor/partner will not access, use, store, or process payment account information. If you are concerned that such information might be inadvertently provided to the vendor/partner, consider adding a provision stating that if any payment account information is inadvertently provided to the vendor/partner, the vendor/partner will not be in breach of that affirmative obligation as long as they destroy the information immediately and notify your company.  If your vendor/partner will handle payment account information, ensure you have appropriate language that covers both current and future PCI-DSS (Payment Card Industry Data Security Standard) versions.  If appropriate, add language making clear that payment account information will be held in active memory only, and not stored or retained on the vendor/partner’s servers (e.g., where the payment information is “tokenized” and/or securely transmitted to your company’s own servers at the time the transaction is processed; see the sketch following this list).
  • Information Security Questionnaire.  Include the right to have the vendor/partner complete a written security questionnaire, signed by a corporate officer, once a year. Requiring an annual questionnaire can help identify whether your vendors/partners are on top of emerging threats and risks. If you have limited resources to conduct audits, the responses to the questionnaires can help you identify which vendors/partners may be best to audit.  As part of the questionnaire, ask for copies of the vendor/partner’s disaster recovery plan and business continuity plan, and a certificate of insurance for the vendor/partner’s cyber security policy if your company is named as an additional insured.
  • Audit Rights.  Include a right to conduct a security audit of the vendor/partner’s information technology and information security controls. This should include the right to conduct penetration testing of the vendor/partner’s network, ideally on an unannounced basis.  Make sure the vendor/partner is obligated to correct any security discrepancies found at their expense; if they don’t make corrections to your reasonable satisfaction, you should be able to exit the contract.  Ensure you can use both internal and third-party resources to conduct the audit. In addition to a right to audit on a regular basis (e.g., once per year), allow the right to audit after a security breach so you can do your own analysis of how well the vendor/partner has bulletproofed their systems in light of the breach.
  • Security Breach.  Define what a “security breach” is (consider a broad definition that includes security incidents as well).  Ensure the vendor/partner promptly notifies your company in the event of a security breach, ideally by email to a “role” mailbox or to your CIO/CTO.  The vendor/partner should take any triage steps necessary to close the immediate security hole and then thoroughly review and bulletproof its systems and networks.  The vendor/partner should agree to work with your company and any government entities in any investigation of the breach.  Ensure that your company, not the vendor/partner, decides whether and how to communicate with affected individuals.  Ensure the vendor/partner bears the costs associated with a security breach.
  • Preservation Notices and E-Discovery.  If the records of the vendor/partner may be important if litigation is brought against your company, consider adding a clause ensuring that the vendor/partner will comply with any document preservation/litigation hold notice you provide, and that the vendor/partner will reasonably assist with electronic discovery requests.  A “friendly” clause like this can help avoid issues and strain on the partnership if litigation occurs.
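To illustrate the “active memory only” pattern mentioned in the Payment Account Information provision above, here is a minimal sketch in Python; the tokenization endpoint and field names are hypothetical stand-ins, not any real payment processor’s API:

    import requests

    TOKENIZER_URL = "https://tokens.example.com/v1/tokenize"  # hypothetical endpoint

    def process_payment(order_id: str, card_number: str, expiry: str, cvv: str) -> str:
        # Raw card data exists only in these local variables ("active memory");
        # it is never logged, written to disk, or stored in a database.
        resp = requests.post(
            TOKENIZER_URL,
            json={"pan": card_number, "expiry": expiry, "cvv": cvv},
            timeout=10,
        )
        resp.raise_for_status()
        token = resp.json()["token"]
        save_order(order_id, token)  # persist the surrogate token, never the PAN or CVV
        return token

    def save_order(order_id: str, token: str) -> None:
        # Stand-in for your order database; it sees only the token.
        print(f"order {order_id}: stored payment token {token}")

Contract language can then mirror the design: the vendor/partner affirms that card data is handled transiently and that only tokens are retained on its systems.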

Once you have these provisions in your agreement, don’t forget to tie them into your risk allocation provisions. If the vendor/partner carries insurance to protect against security breaches, ensure you are an additional insured and ask for a certificate of insurance annually. Ensure your indemnification section fully covers any breach of security obligations, and consider excluding these from your limitation of liability to the greatest extent possible.

Safe Harbor Framework for EU to US Personal Data Transfers May Not Be “Adequate” After All

This week, the Advocate General of the European Court of Justice (ECJ) issued a preliminary and non-binding assessment in an ECJ case recommending that the ECJ find the US-EU Safe Harbor Framework to be invalid.

For US companies with European subsidiaries that regularly need to transfer data back to the US home office, one of the primary data privacy considerations is compliance with the EU’s Data Protection Directive. Each EU member state has adopted its own data protection law based on the Directive. The Directive covers personal data in the European Economic Area (the EU, Iceland, Liechtenstein and Norway).

Under Article 25 of the Directive, the transfer of personal data to a country or territory outside of the EEA is prohibited unless that country or territory can guarantee an “adequate” level of data protection in the eyes of the EU.  In some cases, the EU will declare a country to have “adequate” protections in place (e.g., Canada, based on its national PIPEDA data privacy law).

The US is one of the countries that is not deemed “adequate” by the EU.  (The US does not have a comprehensive national privacy law like Canada or the EU, but instead uses a “sectoral” approach to regulate data privacy.)  Because of this, the EU controller of the personal data must ensure that the US company receiving the data has an adequate level of protection for personal data to permit the data transfer.  This can be achieved in a number of ways, including:

  • The Directive defines a number of situations in which adequacy is presumed statutorily, such as where the data subject consents to the transfer, the transfer is necessary for the performance of, or conclusion of, the contract between the data subject and data controller, or it is necessary to protect the vital interests of the data subject.
  • A company’s Board of Directors can adopt binding corporate rules requiring adequate safeguards within a corporate group to protect personal data throughout the organization.
  • The EU entity and US entity can enter into an approved contract (utilizing model contract terms approved by the EU) with provisions ensuring data is adequately protected.
  • The transfer is to a US entity which participates in the Safe Harbor Framework, a program agreed upon by the US and EU in 2000 under which US companies that self-certify that their data protection policies and practices comply with the requirements of the Framework are deemed to have an “adequate” level of data protection for EU data transfer purposes.  Over 5,000 companies have certified their compliance with the Safe Harbor Framework.

Edward Snowden’s revelations regarding US government surveillance programs and practices created many questions regarding whether the Safe Harbor Framework was truly “adequate” for EU purposes, since regardless of a company’s own policies and practices the US government could access the personal data of EU data subjects stored on US servers.  This week, in a case brought by an Austrian student challenging the transfer of his data to the US by Facebook under the Safe Harbor framework, the Advocate General of the European Court of Justice (ECJ) issued a preliminary and non-binding assessment recommending that the ECJ find the Safe Harbor Framework to be invalid.  The ECJ can ignore the Advocate General’s recommendation, but does so only rarely.

The language of the decision will be very important, as the potential for US government surveillance of, and access to, personal data of EU data subjects stored in the US goes beyond the Safe Harbor framework.  A broad decision could create problems for the ability of US companies to achieve adequacy for EU data transfer purposes regardless of the adequacy approach used – US government surveillance could be determined to trump any adequacy approach taken by US companies in the eyes of the EU. A finding that the US government’s surveillance practices call into question the adequacy of data transfers to US companies in general would cause major headaches and disruptions for US businesses, and would have political and economic ramifications. It will be interesting to see how deep down this rabbit hole the ECJ is willing to go.

Companies which participate in the Safe Harbor Framework should immediately start looking at alternative choices for achieving “adequacy” in the eyes of the EU to allow for continued data transfers.  Companies should also look at whether any of their vendors rely on Safe Harbor in the performance of their obligations, and contact them regarding their contingency plans if Safe Harbor is found to be invalid. If the ECJ adopts the Advocate General’s recommendation, it is unclear whether companies will be given any grace period to implement an alternative approach.  Public reporting companies participating in the Safe Harbor framework may also want to consider whether this uncertainty should be cited in their risk factors for SEC reporting purposes.

FTC opens their nationwide tour to promote Start with Security

It’s not the latest group on tour with a band name and album name that needed a lot more thought.  Earlier this year, the FTC announced that it would be releasing guidance for businesses on data security.  In June, it did just that, releasing a guide called Start with Security: A Guide for Business.  It’s subtitled “Lessons Learned From FTC Cases” for a reason – it uses the 50+ FTC enforcement actions on data security to provide ten lessons companies should learn when approaching security, drawn from others’ missteps that led to enforcement actions, along with practical guidance on reducing risks.  The lessons are:

  1. Start with security.  The FTC has long advocated the concept of “privacy by design,” meaning companies should bake an understanding of and sensitivity to privacy into every part of the business, making it part of the design process for new products and processes.  The FTC is advocating a similar concept of “security by design.” Guidance:  don’t collect personal information you don’t need (the RockYou enforcement action); don’t use personal information when it’s not necessary (Accretive and foru International); don’t hold on to information longer than you have a legitimate business need for it (BJ’s Wholesale Club).
  2. Control access to data sensibly.  Keep data in your possession secure by controlling access to it – limit access to those with a need to know for a legitimate business purpose (e.g., no shared user accounts, lock up physical files). Guidance: don’t let employees access personal information unless they need to access it as part of their job (Goal Financial); don’t give administrative access to anyone other than employees tasked with administrative duties (Twitter).
  3. Require secure passwords and authentication.  Use strong password authentication and sensible password hygiene (e.g., suspend the password after x unsuccessful attempts; prohibit common dictionary words; require at least 8 characters; require at least one upper case character, one lower case character, one numerical character, and one special character; prohibit more than 2 repeating characters; etc.) – see the sketch following this list.  Guidance: require complex and unique passwords (Twitter); store passwords securely (Guidance Software, Reed Elsevier, Twitter); guard against brute force attacks (Lookout Services, Twitter, Reed Elsevier); protect against authentication bypass such as predictable resource location (Lookout Services).
  4. Store sensitive personal information securely (“at rest”) and protect it during transmission (“in motion”). Use strong encryption when storing and transmitting data, and ensure the personnel implementing encryption understand how you use sensitive data and can determine the right approach on a situation-by-situation basis.  Guidance: keep sensitive information secure throughout the data life-cycle (receipt, use, storage, transmission, disposal) (Superior Mortgage Corporation); use industry-tested and accepted methods (ValueClick); make sure encryption is properly configured (Fandango, Credit Karma).
  5. Segment your network and monitor who’s trying to get in and out.  Be sure to use firewalls to segment your network to minimize what an attacker can access.  Use intrusion detection and prevention tools to monitor for malicious activity.  Guidance: segment your network (DSW); monitor activity on your network (Dave & Buster’s, Cardsystem Solutions).
  6. Secure remote access to your network. Make sure you develop and implement a remote access policy, implement strong security measures for remote access, and put appropriate limits on remote access, such as limiting by IP address and revoking remote access promptly when no longer needed.  (The compromise of a vendor’s system via phishing, leading to remote network access, is how the Target breach started.)  Guidance: ensure remote computers have appropriate security measures in place, e.g., “endpoint security” (Premier Capital Lending, Settlement One, LifeLock); put sensible access limits in place (Dave & Buster’s).
  7. Apply sound security practices when developing new products. Use “security by design” to ensure data security is considered at all times during the product development life-cycle.  Guidance: train engineers in secure coding (MTS, HTC America, TRENDnet); follow platform guidelines for security (HTC America, Fandango, Credit Karma); verify that privacy and security features work (TRENDnet, Snapchat); test for common vulnerabilities (Guess?).
  8. Make sure your service providers implement reasonable security measures. Make sure you communicate your security expectations to your service providers and vendors, and put their feet to the fire through contractual commitments and auditing/penetration testing. Guidance: put it in writing (GMR Transcription); verify compliance (Upromise).
  9. Put procedures in place to keep your security current and address vulnerabilities that may arise.  Data security is a constant game of cat-and-mouse with hackers – make sure to keep your guard up.  Apply updates to your hardware and software as they are issued, and ensure you are spotting vulnerabilities in, and promptly patching, your own software. Have a mechanism to allow security warnings and issues to be reported to IT.  Guidance: update and patch third-party software (TJX Companies); heed credible security warnings and move quickly to fix them (HTC America, Fandango).
  10. Secure paper, physical media, and devices.  Lastly, while the focus these days seems to be on cybersecurity, don’t forget about physical security of papers and physical media.  Guidance: securely store sensitive files (Gregory Navone, LifeLock); protect devices that process personal information (Dollar Tree); keep safety standards in place when data is en route (Accretive, CBR Systems); dispose of sensitive data securely (Rite Aid, CVS Caremark, Goal Financial).
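The password hygiene rules in lesson 3 above translate almost directly into code. Here is a minimal sketch in Python of that example policy (at least 8 characters; upper case, lower case, numerical and special characters; no more than 2 repeating characters; no common dictionary words). The word list is illustrative only, and suspending a password after x unsuccessful attempts would be enforced server-side at login time rather than in a validator:

    import re

    COMMON_WORDS = {"password", "letmein", "qwerty", "welcome"}  # illustrative only

    def password_meets_policy(pw: str) -> bool:
        if len(pw) < 8:
            return False  # require at least 8 characters
        if not (re.search(r"[A-Z]", pw) and re.search(r"[a-z]", pw)
                and re.search(r"[0-9]", pw) and re.search(r"[^A-Za-z0-9]", pw)):
            return False  # require upper case, lower case, numerical and special characters
        if re.search(r"(.)\1\1", pw):
            return False  # prohibit more than 2 repeating characters in a row
        if any(word in pw.lower() for word in COMMON_WORDS):
            return False  # prohibit common dictionary words
        return True

    print(password_meets_policy("Tr0ub4dor&3"))    # True
    print(password_meets_policy("aaaPassword1!"))  # False – repeats and a dictionary word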

As this guidance is based on what companies did wrong, or didn’t do, that led to FTC enforcement actions, it will be interesting to see how the FTC treats a company that suffers a data breach but demonstrates that it used reasonable efforts to comply with the FTC’s guidance.  I suspect the FTC will take a company’s compliance with this guidance into consideration when determining penalties in an enforcement action. The guidance is very high-level, so companies must rely on their IT and Legal teams to determine what steps, processes and protocols need to be implemented in alignment with the FTC’s guidance.

In addition to publishing the guide, the FTC has embarked on a conference series aimed at SMBs (small and medium-sized businesses), start-up companies, and developers to provide information on “security by design,” common security vulnerabilities, secure development strategies, and vulnerability response.  The first conference took place September 9 in San Francisco, CA; the second will take place November 5 in Austin, TX.

The FTC also announced a new website at which they’ve gathered all of their data security guidance, publications, information and tools as a “one-stop shop”.  You can find it at http://www.ftc.gov/datasecurity.

Podcast – the in-house perspective on trade secrets, privacy, and other topics

I recently had the privilege of being interviewed for IP Fridays®, a podcast series by Ken Suzan (of counsel and a trademark attorney at the Minneapolis office of Barnes & Thornburg LLP) and Dr. Rolf Claessen (partner at Patent Attorneys Freischem in Cologne, Germany).  We discussed the in-house perspective on a variety of topics, including trade secrets, copyrighting software code, and privacy.  Head to IPFridays.com if you’d like to listen, or click here to head straight to the podcast.

Don’t Overlook Law Firms as Third-Party Data Storage Vendors

There are countless articles providing companies with tips and advice on what to look for, and what to look out for, when engaging with a vendor who will store, process and/or use company data and/or network credentials. Given recent high-profile data breaches attributable to vendors of major companies, there has been a focus on tightening controls on vendors. Many companies have put procedures and requirements in place to ensure that vendors storing company data and network credentials are properly vetted, meet IT and security standards, and commit contractually to protect the company’s valuable information.

Despite this, there is one group of vendors storing data that are overlooked by a large number of companies – law firms. Here are a few reasons why:

  • Engagements don’t follow the usual vendor procurement process. Law firms are generally engaged directly by the General Counsel, other senior attorneys, or senior management. They are usually engaged due to their specialized expertise in a particular area of law in which there is an immediate need, an existing relationship with a member of the legal or management team, or a recommendation by a trusted resource. Law firm engagements often happen at the same time there is a pressing need for their services (e.g., a pending response to a complaint) with little time for a selection process. Quite often, companies don’t use a formal bid process at all when engaging outside counsel.
  • Law firms don’t think of themselves as just another vendor. Law firms generally do not consider themselves to be like other vendors, given their specialized role and partnership with companies to provide legal advice and counsel. They are like other service companies in some respects (for example, law firms need to comply with the federal, state and local laws, rules and regulations applicable to other companies). Unlike other service companies, however, the lawyers providing services at a law firm are also bound by rules of professional responsibility, with disciplinary measures for noncompliance. These rules include obligations to keep client information confidential. The ABA Model Rules of Professional Conduct were recently changed to add obligations for lawyers to use reasonable efforts to protect client data, and to keep abreast of the benefits and risks associated with relevant technology involved in the practice of law.

When a law firm suffers a major breach exposing customer data and notifies clients in compliance with state breach notification statutes, it will be interesting to see whether lawyers in that firm face disciplinary action under rules of professional responsibility for exposure of client data. If lawyers face discipline as the result of a security breach, it will bring security to the forefront of client-lawyer relationships overnight.

  • Other teams within a company consider law firm relationships as “off limits.” Legal often only reaches out to IT for assistance arranging secure transfer of files to and from law firms, and in connection with discovery requests. It’s very rare that procurement and IT teams reach out to Legal to ask them to run law firms through the same vetting process as other vendors handling company data or system credentials, and it’s equally rare for Legal to proactively request this review of the law firms it engages.

Things You Should Do. When your company engages a law firm, consider the following:

  • Proactively develop internal vetting requirements. Your Legal, IT, Security and Procurement teams should proactively develop a checklist of questions/action items/contractual requirements when engaging counsel. If engaging counsel in a hurry, make sure the firm realizes that your company will do this diligence as soon as possible following engagement.
  • Ask the firm about their security safeguards. When discussing an engagement with prospective counsel, ask them what their technical, administrative and procedural safeguards are for protecting your information (and, if you give them network access, your network credentials). Find out how big their information security team is, and what kind of systems they use. You’re relying on their security safeguards to keep your data safe, so it’s appropriate for you to ask questions about how they secure your data.

Law firms have historically been reluctant to talk about their information security practices. If a firm can’t give you solid information about their information security practices, or can’t give you the name of a person who can answer your IT and security questions, strongly consider looking for alternative counsel.

  • Ask about cyber insurance. Ask whether the firm carries cyber insurance to cover security breaches (more and more firms have it). If they do, ask them to add you as an additional insured as you would with other vendors holding your data.
  • Add a security rider to your law firm engagement letter, security language to your outside counsel guidelines, or both. Add a short rider to your law firm engagement letter with the security language you came up with in advance with your IT and security teams. Consider addressing topics such as security and confidentiality safeguards, requirements to rapidly deploy security patches to their hardware and software, and confidentiality of login credentials to your network. Ensure they are protecting you if there is an unauthorized disclosure of your company data stored through a third-party system or provider they use.

Companies often ask counsel to comply with their outside counsel guidelines, and many ask firms to agree to compliance as part of the retainer letter. Include core security language in your engagement letter, and include a paragraph in the retainer letter requiring the law firm to follow the terms of your outside counsel guidelines (and resolving conflicts in favor of the guidelines).

It’s a matter of when, not if, a law firm announces a major security breach. Once that happens, it will cause a seismic shift in how law firms approach the data they hold, and how prospective clients engage with them. Law firms that take a proactive approach, make their commitment to data security part of their core client values, and are willing to share that commitment with prospective clients will find themselves with a leg up on the competition.