The Why of Privacy: 4 Reasons Privacy Matters to People, and Why Companies Need to Know Them

While almost all companies collect and use their customers’, visitors’ and users’ personal information, primarily online and through in-person customer interactions such as point-of-sale transactions, the privacy landscape is in a near-constant state of turbulence and flux. There is a steady flow of data breach reports affecting companies of almost every size and market segment. New data privacy laws, rules and regulations continue to be introduced and enacted around the world, such as the US-EU Privacy Shield program, the EU General Data Protection Regulation (GDPR), and Argentina’s draft Data Protection Bill, placing new legal obligations and restrictions on the collection and use of personal information. Challenges continue to be raised against laws that are perceived to overreach or conflict with privacy rights, such as the ongoing challenges to the Privacy Shield program and the EU’s Model Contract Clauses.

The one constant in this turbulent landscape is that consumers’ awareness of data privacy and security continues to grow. Given this, it is important to step back from the day-to-day privacy developments and look at a more fundamental question. It is axiomatic in the world of privacy that privacy matters to people, but why it matters is more complicated. There is no “one-size-fits-all” answer: privacy matters to different people in different ways, and there are many equally valid reasons why it is important to individuals.

Understanding the “why of privacy” is also critically important to businesses and other organizations. By now, most companies understand the importance of providing notice of their data collection practices and choice with respect to the use of collected information. A company collecting, processing and/or controlling personal information that understands the reasons privacy matters to the data subjects whose data it collects and uses can design more effective privacy practices and policies attuned to the needs of those data subjects, such as by creating customer privacy profiles for use in product design and testing. This follows the “privacy by design” framework advocated by the Federal Trade Commission and helps increase trust in the company’s commitment to data privacy and security, which is critical to the success of every company in today’s world and can provide a competitive advantage.

The reason why privacy matters differs from person to person. However, I believe these reasons can be grouped into four core categories: (1) privacy is a right, (2) privacy is an entitlement, (3) privacy is an expectation, and (4) privacy is a commodity. I’ll explore each of them in turn.

Privacy is a Right

Persons falling into this first category value privacy as an irrevocable right guaranteed to all. People living in countries with constitutional data privacy protections often fall into this category. For example, the European Union Charter of Fundamental Rights recognizes the right to data protection and the right to privacy as fundamental human rights. In some countries, the right has been implied through interpretation of constitutional and legal rights, such as the right to privacy found by the U.S. Supreme Court and the right to privacy recognized under the Canadian Charter of Rights and Freedoms, even though neither document specifically mentions privacy. In August 2017, a unanimous Supreme Court of India held that privacy is a fundamental right, integral to the Right to Life and Personal Liberty guaranteed under Article 21 of the Constitution of India. The 1948 United Nations Universal Declaration of Human Rights states that people have a fundamental human right not to “be subjected to arbitrary interference with their privacy, family, home or correspondence.”

  • People in this category are more likely to take a rigid view that privacy trumps all other interests, including business interests, and may be less willing to “trade” any of their privacy for other benefits such as increased security.
  • People in this category tend to expect that any consent given to use personal information must be clear, unambiguous, express, and fully revocable and that use of the information must be specifically limited to the grant of rights or as otherwise expressly permitted by law, which creates a significant burden for businesses and other organizations collecting and using personal information.
  • Privacy as a right is an individual view – the right of individuals to protect their personal information takes precedence over almost all rights of others to use or access that personal information.

Privacy is an Entitlement

Persons falling into this second category value privacy as something to which they are entitled under the laws, rules and regulations applicable to them. Many laws, whether comprehensive data privacy laws such as Canada’s PIPEDA or sectoral laws such as those enacted in the United States, contain prohibitions or restrictions on data practices that individuals may view as privacy protections to which they are entitled. An example is the U.S. Children’s Online Privacy Protection Act, which among other things prohibits the collection of personal information from children under 13 without verifiable parental consent. Some parents view COPPA as creating an entitlement for their children to be left alone unless the parent consents to the collection of personal information from their children.

  • Similar to privacy as a right, people in this category are likely to view privacy as trumping other interests, including business interests, and may be less willing to give up privacy for other benefits.
  • They tend to expect that any consent given to use personal information must be fully compliant with legal requirements, and that use of the information must be specifically limited to those use rights expressly permitted by law, which creates a burden for businesses and other organizations collecting and using personal information.
  • As with privacy as a right, privacy as an entitlement is an individual view, where an individual’s entitlement to privacy outweighs others’ interests in that person’s personal information.
  • A key differentiator between privacy as a right and privacy as an entitlement is that an entitlement can be revoked, e.g., through changes to the law, whereas a right is irrevocable. While some might argue that a judicially-recognized right to privacy is therefore only an entitlement, I believe that recognition by a country’s supreme court that privacy is a right, which is unlikely to be overturned or legislatively reversed, should be treated as a right.

Privacy is an Expectation

Persons falling into this third category value privacy as something they expect to receive, whether or not they have a right or entitlement to it. New technologies (such as drones and biometric identifiers) and practices (such as marketing strategies) tend to be ahead of laws specifically governing them, and people in this category expect to receive privacy protections regardless of whether existing laws or other rights cover the technology or practice. They may also expect societal norms with respect to privacy to be followed by businesses and other organizations, whether or not stricter than applicable legal requirements. There are also certain expectations of privacy that are generally recognized within a given society. For example, in the United States, many people have an expectation of privacy in their own home and other private areas such as a public bathroom stall. If a person or organization interferes with this expectation of privacy, there may be legal liability for invasion of privacy under state laws. There are other expectations of privacy on a per-situation basis, such as a private conversation between two individuals.

  • People in this category believe that third parties, such as companies and government entities, should recognize that their expectation of privacy trumps those third parties’ desire (or rights) to access and use their personal information, but they also understand that the expectation of privacy has limits. For example, a person generally does not have an expectation of privacy in a public place (e.g., a public sidewalk), and there is no right of privacy in garbage placed on the street for collection. In the United States, the expectation of privacy in the workplace is also significantly reduced.
  • An expectation of privacy can be overridden by a third party’s superior interest. For example, if a court approves surveillance of someone suspected of engaging in illegal activity, any expectation that person may have that his or her conversations are private is superseded by the government’s interest in preventing and prosecuting crime.
  • People in this category generally do not question or challenge the terms of a privacy policy or other agreement granting rights to collect or use their personal information. They also tend to expect that businesses and other organizations collecting and/or using their personal information will not do so unreasonably and will respect opt-out requests.
  • Privacy as an expectation is a middle-of-the-road view, in which the individual view of privacy as paramount is tempered with the understanding that in some cases the general or specific value of allowing a third party to receive and use their personal information outweighs the personal interest.

Privacy is a Commodity

Persons falling into this fourth category value privacy as a commodity that they are willing to exchange for other benefits, goods or services. We live in an information economy, where data has been commoditized. For many companies, a core or important part of the product or service offering (i.e., part of the general value of the product or service) or business strategy is the ability to monetize personal, aggregated, and/or anonymized data collected through its use. These companies argue that the value derived from data monetization is factored into the value and cost of the product or service. Other companies offer something of specific value, such as an extended product warranty, in exchange for personal information such as an email address or demographic information. Many people give businesses some rights to use their personal information simply by visiting a webpage, requesting information, or purchasing goods or services, in the course of which they agree to be bound by the company’s privacy policy or terms of use/terms of sale. We also live in a world where many people are willing to sacrifice some privacy in exchange for increased security against terrorism and other potential physical and cyber threats. People falling into this category have a strong understanding of the trade-off between privacy and other benefits.

  • People in this category are more willing to give third parties the right to use their information as long as the thing they receive in return is valuable enough to them – they view their personal information as currency. If a company or organization offers something of value, they are very likely to agree to share personal information with that company or organization. These are the kind of people who don’t really care that they’re receiving targeted ads while surfing online.
  • Conversely, if they do not believe they are receiving value in return for their personal information, people in this category are more likely not to share their information.
  • Privacy as a commodity is a transactional view, meaning that an individual is willing to allow a third party to receive and use their personal information if the general or specific value of allowing that third party to receive and use the information outweighs their personal interest in keeping the information private.
  • It may require a greater transfer of value to convince someone viewing privacy as a right, entitlement or expectation to treat it as a commodity.

 

As a closing thought, these four reasons why privacy matters to people are not mutually exclusive, meaning that there are additional sub-categories of people for whom two or more of these reasons are important. For example, it is possible for someone to view privacy as both an entitlement and a commodity. Such a person would expect that while they have the ability to exchange their personal information for something of value, it must always be a voluntary exchange – they would reject being required to trade away their personal information. Businesses that take the time to understand the “why of privacy” will find themselves better positioned to create sample customer profiles based on their customers’ privacy values, leading to more robust privacy practices, processes and policies and a potential competitive advantage on privacy in the marketplace.

Eric Lambert has spent most of his legal career working in-house as a proactive problem-solver and business partner. He specializes in transactional agreements, technology/software/e-commerce, privacy, marketing and practical risk management. Any opinions in this post are his own. This post does not constitute, nor should it be construed as, legal advice. He is a technophile and Internet evangelist/enthusiast. In his spare time, Eric enjoys reading, implementing and integrating connected-home technologies, and dabbling in voice-over work.

The Augmented World — Legal and Privacy Perspectives on Augmented Reality (AR)

You’ve likely heard that Augmented Reality (AR) is the next technology that will transform our lives. You may not realize that AR has been here for years. You’ve seen it on NFL broadcasts when the first down line and down/yardage appear on the screen under players’ feet. You’ve seen it in the Haunted Mansion ride in Disneyland when ghosts seem to appear in the mirror riding with you in your cart. You’ve seen it in cars and fighter jets when speed and other data are superimposed onto the windshield through a heads-up display. You’re seeing it in the explosion of Pokémon Go around the world. AR will affect all sectors, much as the World Wide Web did in the mid-1990s. Any new technology such as AR raises questions about how it fits under the umbrella of existing legal and privacy frameworks, where it pushes the boundaries and requires adjustments to the size and shape of that umbrella, and whether it leads to a fundamental shift in certain areas of law. This article will define augmented reality and the augmented world, and analyze its impact on the legal and privacy landscape.

What is “augmented reality” and the “augmented world?”

One of the hallmarks of an emerging technology is that it is not easily defined. Similar to the “Internet of Things,” AR means different things to different people, can exist as a group of related technologies instead of a single technology, and is still developing. However, there are certain common elements among existing AR technologies from which a basic definition can be distilled.

I would define “augmented reality” as “a process, technology, or device that presents a user with real-world information, commonly but not limited to audiovisual imagery, augmented with additional contextual data elements layered on top of the real-world information, by (1) collecting real-world audiovisual imagery, properties, and other data; (2) processing the real-world data via remote servers to identify elements, such as real-world objects, to augment with supplemental contextual data; and (3) presenting in real time supplemental contextual data overlaid on the real-world data.” The real world as augmented through various AR systems and platforms can be referred to as the “augmented world.” AR and the augmented world differ from “virtual reality” (VR) systems and platforms, such as the Oculus Rift and Google Cardboard, in that VR replaces the user’s view of the real world with a wholly digitally-created virtual world, whereas AR augments the user’s view of the real world with additional digital data.

“Passive” AR (what I call “first-generation AR”) is a fixed system — you receive augmented information but do not interact with it, as when going through the Haunted Mansion ride or watching your television set. The next generation of AR is “active,” meaning the augmented world is delivered in a changing environment and viewed through a device you carry or wear. Google Glass and the forthcoming Microsoft HoloLens are examples of “active AR” systems with dedicated hardware; when worn, the world is augmented with digital data superimposed on the real-time view of the world. AR has also found ways to use existing hardware — your smartphone. HP’s Aurasma platform is an early example of an active AR system that uses your smartphone’s camera and screen to superimpose digital content on the real world. What AR needed to go fully mainstream was a killer app that made it appeal to the masses, and it now has one — Pokémon Go. Within days of its launch in early July, TechCrunch reported that Pokémon Go had an average daily user base of over 20 million users. Some declared it the biggest “stealth health” app of all time as it was getting users out and walking.

Active AR has the capacity to change how people interact with the world, and with each other. It is an immersive and engaging user experience. It has the capacity to change the worlds of shopping, education and training, law enforcement, maintenance, healthcare, and gaming, among others. Consider an AR system that shows reviews, product data, and comparative prices while you look at a shelf display; identifies an object or person approaching you and makes it glow, flash, or otherwise stand out to give you more time to avoid a collision; gives you information on an artist, or the ability to hear or see commentary, while you look at a painting or sculpture; identifies to a police officer in real time whether a weapon brandished by a suspect is real or fake; or shows you in real time how to repair a household item (or how to administer emergency aid) through images placed on that item or on a stricken individual. For some, the augmented world will be life-altering, such as an assistive-technology headset that reads road signs aloud to a blind person or announces that a vehicle is coming (and how far away it is) when the user looks in the vehicle’s direction. For others, the ability to collect, process and augment real-world data in real time could be viewed as a further invasion of privacy, or worse, technology that could be used for illegal or immoral purposes.

As with any new technology, there will be challenges from a legal and privacy perspective. A well-known example is the Internet when the World Wide Web became mainstream in the mid-1990s. In some cases, existing laws were interpreted to apply to the online world, such as the application of libel and slander to online statements, the application of intellectual property laws to file sharing over peer-to-peer networks, and the application of contract law to online terms of use. In others, new laws such as the Digital Millennium Copyright Act were enacted to address shortcomings of the existing legal and regulatory landscape with respect to the online world. In some instances, the new technology led to a fundamental shift in a particular area of law, such as how privacy works in an online world and how to address online identity theft and breaches of personal information. AR’s collection of data, and presentation of augmented data in real time, creates similar challenges that will need to be addressed. Here are some of the legal and privacy challenges raised by AR.

  • Rethinking a “reasonable expectation of privacy.” A core privacy principle under US law is that persons have a reasonable expectation of privacy, i.e., a person can be held liable for unreasonably intruding on another’s interest in keeping his/her personal affairs private. However, what is a “reasonable expectation of privacy” in a GoPro world? CCTV/surveillance cameras, wearable cameras, and smart devices already collect more information about people than ever before. AR technology will continue this trend. As more and more information is collected, what keeping “personal affairs private” looks like will continue to evolve. If you know someone is wearing an AR device, and still do or say something you intend to keep private, do you still have a reasonable expectation of privacy?

  • Existing Privacy Principles. Principles of notice, choice, and “privacy by design” apply to AR systems. Providers of AR systems must apply the same privacy principles to AR as they do to the collection of information through any other method. Users should be given notice of what information will be collected through the AR system, how long it will be kept, and how it will be used. Providers should collect only information needed for the business purpose, store and dispose of it securely, and keep it only as long as needed.

AR systems add an additional level of complexity — they collect information not just about the user, but also about third parties. Unlike a cellphone camera, where the act of collecting information from third parties is initiated by the user, an AR system may collect information about third parties as part of its fundamental design. Privacy options for third parties should be an important consideration in, and element of, any AR system. For example, an AR system provider could give users the ability to toggle whether third-party personal data is collected or augmented, so personal information is only augmented when the user wants it to be. AR system providers may also consider an indicator on the outside of the device, such as an LED, to let third parties know that the AR system is actively collecting information.
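To make this concrete, the following is a minimal sketch (in Python, with entirely hypothetical names; no real AR SDK or API is assumed) of how an AR pipeline might default to blocking third-party personal data and drive an outward-facing recording indicator:

```python
# Hypothetical sketch only: illustrates "privacy by design" defaults for an AR pipeline.
# None of these names correspond to a real AR SDK.

from dataclasses import dataclass, field

@dataclass
class PrivacySettings:
    collect_third_party_data: bool = False   # off by default ("privacy by design")
    recording_indicator_on: bool = True      # e.g., drives an outward-facing LED

@dataclass
class Frame:
    image: bytes
    detected_faces: list = field(default_factory=list)    # third-party personal data
    detected_objects: list = field(default_factory=list)  # non-personal context

def signal_recording_led(active: bool) -> None:
    # Placeholder for hardware control of an external indicator light.
    print(f"Recording indicator {'ON' if active else 'OFF'}")

def process_frame(frame: Frame, settings: PrivacySettings) -> Frame:
    """Strip third-party personal data unless the user has opted in."""
    if not settings.collect_third_party_data:
        frame.detected_faces = []            # do not store or augment bystanders' data
    signal_recording_led(active=settings.recording_indicator_on)
    return frame
```

The key design choice in this sketch is that collection of third-party personal data is off by default and must be affirmatively enabled by the user.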

Additionally, AR may create interesting issues from a free speech and recording of communications perspective. Some, but not all, court rulings have held that the freedom of speech guaranteed by the First Amendment extends to making recordings of matters of public interest. An AR system that is always collecting data will push the boundaries of this doctrine. Even if something is not in the public interest, many states require the consent of both parties to record a conversation between them. An AR system which persistently collects data, including conversations, may run afoul of these laws.

  • Children’s Privacy. It is worth a special note that AR creates an especially difficult challenge for children’s privacy, especially children under 13. The Children’s Online Privacy Protection Act (“COPPA”) requires operators of online services, including mobile apps, to obtain verifiable parental consent before collecting any personal information from children under 13. “Personal information” includes photos, videos, and audio of a child’s image or voice. As AR systems collect and process data in real time, the passive collection of a child’s image or voice (versus collection of children’s personal information provided to a company through an interface such as a web browser) is problematic under COPPA. AR operators will need to determine how to ensure they are not collecting personal information from children under 13. I expect the FTC will amend the COPPA FAQ to clarify their position on the intersection of AR and children’s privacy.
  • Intellectual Property.  Aside from the inevitable patent wars that will occur among the early inventors of AR technologies, and patent holders who believe their patent claims cover certain aspects of AR technologies, AR will create some potentially interesting issues under intellectual property law. For example, an AR system that records (and stores) everything it sees will invariably capture some things that are protected by copyright or other IP laws. Will “fair use” be expanded in the augmented world, e.g., where an album cover is displayed to a user when a song from that album is heard? Further, adding content to a copyrighted work in the augmented world may constitute a prohibited derivative work. From a trademark perspective, augmenting a common-law or registered trademark with additional data, or using a competitor’s name or logo to trigger an ad about your product overlaid on the competitor’s name or logo, could create issues under existing trademark law.
  • Discrimination.  AR systems make it easy to supplement real-world information by providing additional detail on a person, place or thing in real time. This supplemental data could intentionally or inadvertently be used to make real-time discriminatory decisions, e.g., using facial or name recognition to provide supplemental data about a person’s arrest history, status in a protected class, or other restricted information that is then used in a hiring or rental decision. An AR system that may be used where certain data must be excluded from the decision-making process must therefore include the ability to automatically exclude categories of data from the user’s augmented world.

  • Digital Marketing and Advertising.  The world of online digital marketing and advertising will expand to include digital marketing and advertising in the augmented world. Imagine a world where anything — and I mean anything — can be turned into a billboard or advertisement in real time. Contextual ads in the augmented world can be superimposed anytime a user sees a keyword. For example, if you see a house, imagine if an ad for a brand of paint appears because the paint manufacturer has bought contextual augmented ads to appear in an AR system whenever the user sees a house through the augmented world.

Existing laws will need to be applied to digital marketing and advertising in the augmented world. For example, when a marketing disclaimer appears in the online world, the user’s attention is on the ad. Will the disclaimer have the same effect in an augmented environment, or will it need to be presented in a way that calls attention to it? Could this have the unintended consequence of shifting the user’s attention away from something they are doing, such as walking, thereby increasing the risk of harm? There are also some interesting theoretical advertising applications of AR in a negative context. For example, “negative advertising” could be used to blur product or brand names and/or to make others more prominent in the augmented world.

  • The Right of Publicity.  The right of publicity — a person’s right to control the commercial use of his or her name, image, and likeness — is also likely to be challenged by digital marketing in the augmented world. Instead of actively using a person’s likeness to promote a product or service, a product or service could appear as augmented data next to a person’s name or likeness, improperly (and perhaps inadvertently) implying an endorsement or association. State laws governing the right of publicity will be reinterpreted when applied to the augmented world.
  • Negligence and Torts. AR has the capacity to further exacerbate the problem of “distracted everything,” paying more attention to your AR device than your surroundings, as some users of Pokémon Go have discovered. Since AR augments the real world in real time, the additional information may distract a user, or, if the augmented data is erroneous, may lead a user to harm him/herself or others. Many have heard the stories of a person dutifully following their GPS navigation system into a lake. Imagine an AR system identifying a mushroom as safe to eat when in fact it is highly poisonous. Just as distracted driving and distracted texting can be used as evidence of negligence, a distracted AR user can find him/herself facing a negligence claim for causing third-party harm. Similarly, many tort claims that can arise through actions in the real world or online world, such as libel and slander, can occur in the augmented world. Additionally, if an AR system augments the real world in a way that makes someone think they are in danger, inflicts emotional distress, or causes something to become dangerous, the AR user, or the system provider, could be legally responsible.
  • Contract liability. We will undoubtedly see providers of AR systems and platforms sued for damages suffered by their users. AR providers have shifted, and will continue to shift, liability to the user through contract terms. For example, Niantic, the company behind Pokémon Go, states in their Terms of Use that you must “be aware of your surroundings and play safely. You agree that your use of the App and play of the game is at your own risk, and it is your responsibility to maintain such health, liability, hazard, personal injury, medical, life, and other insurance policies as you deem reasonably necessary for any injuries that you may incur while using the Services.” AR providers’ success at shifting liability will likely turn primarily on tried-and-tested principles such as whether an enforceable contract exists.

None of the above challenges is likely to prove insurmountable, and none is expected to slow the significant growth of AR. What will be interesting to watch is how lawmakers choose to respond to AR, and how early hiccups are seized on by politicians and reported in the press. Consider automobile autopilot technology. The recent crash of a Tesla in Autopilot mode is providing bad press for Tesla, and fodder for those who believe the technology is dangerous and must be curtailed. Every new technology brings both benefits and potential risks. If the benefits outweigh the risks on the whole, the public interest is not served when the legal, regulatory and privacy pendulum swings too far in response. Creating a legal, regulatory and privacy landscape that fosters the growth of AR, while appropriately addressing the risks AR creates and exacerbates, is critical.

Safe Harbor Framework for EU to US Personal Data Transfers May Not Be “Adequate” After All

This week, the Advocate General of the European Court of Justice (ECJ) issued a preliminary and non-binding assessment in an ECJ case recommending that the ECJ find the US-EU Safe Harbor Framework to be invalid.

For US companies with European subsidiaries that regularly need to transfer data back to the US home office, one of the primary data privacy considerations is compliance with the EU’s Data Protection Directive. Each EU member state has adopted its own data protection law based on the Directive. The Directive covers personal data in the European Economic Area (the EU, Iceland, Liechtenstein and Norway).

Under Article 25 of the Directive, the transfer of personal data to a country or territory outside of the EEA is prohibited unless that country or territory can guarantee an “adequate” level of data protection in the eyes of the EU.  In some cases, the EU will declare a country to have “adequate” protections in place (e.g., Canada, based on its national PIPEDA data privacy law).

The US is one of the countries that is not deemed “adequate” by the EU.  (The US does not have a comprehensive national privacy law like Canada or the EU, but instead uses a “sectoral” approach to regulate data privacy.)  Because of this, the EU controller of the personal data must ensure that the US company receiving the data has an adequate level of protection for personal data to permit the data transfer.  This can be achieved in a number of ways, including:

  • The Directive defines a number of situations in which adequacy is presumed statutorily, such as where the data subject consents to the transfer, the transfer is necessary for the performance of, or conclusion of, the contract between the data subject and data controller, or it is necessary to protect the vital interests of the data subject.
  • A company’s Board of Directors can adopt binding corporate rules requiring adequate safeguards within a corporate group to protect personal data throughout the organization.
  • The EU entity and US entity can enter into an approved contract (utilizing model contract clauses approved by the EU) with provisions ensuring data is adequately protected.
  • The transfer is to a US entity which participates in the Safe Harbor Framework, a program agreed upon by the US and EU in 2000 under which US companies that self-certify that their data protection policies and practices comply with the requirements of the Framework are deemed to have an “adequate” level of data protection for EU data transfer purposes.  Over 5,000 companies have certified their compliance with the Safe Harbor Framework.

Edward Snowden’s revelations regarding US government surveillance programs and practices created many questions about whether the Safe Harbor Framework was truly “adequate” for EU purposes, since regardless of a company’s own policies and practices the US government could access the personal data of EU data subjects stored on US servers.  This week, in a case brought by an Austrian student challenging the transfer of his data to the US by Facebook under the Safe Harbor Framework, the ECJ’s Advocate General issued a preliminary and non-binding assessment recommending that the ECJ find the Safe Harbor Framework to be invalid.  The ECJ can ignore the Advocate General’s recommendation, but does so only rarely.

The language of the decision will be very important, as the potential for US government surveillance of and access to personal data of EU data subjects stored in the US goes beyond the Safe Harbor Framework.  A broad decision could create problems for the ability of US companies to achieve adequacy for EU data transfer purposes, regardless of the adequacy approach used — US government surveillance could be determined to trump any adequacy approach taken by US companies in the eyes of the EU. A finding that the US government’s surveillance practices call into question the adequacy of data transfers to US companies in general could cause major headaches and disruptions for US businesses, and would have political and economic ramifications. It will be interesting to see how deep down this rabbit hole the ECJ is willing to go.

Companies that participate in the Safe Harbor Framework should immediately start looking at alternative approaches to achieving “adequacy” in the eyes of the EU to allow for continued data transfers.  Companies should also look at whether any of their vendors rely on the Safe Harbor Framework in the performance of their obligations, and contact them regarding their contingency plans if the Framework is found to be invalid. If the ECJ adopts the Advocate General’s recommendation, it is unclear whether it will provide any grace period for companies to implement an alternative approach.  Public reporting companies participating in the Safe Harbor Framework may also want to consider whether this uncertainty should be cited in their risk factors for SEC reporting purposes.

FTC opens their nationwide tour to promote Start with Security

It’s not the latest group on tour with a band name and album name that needed a lot more thought.  Earlier this year, the FTC announced that they would be releasing guidance for businesses on data security.  In June, they did just that, releasing a guide called Start with Security: A Guide for Business.  It’s subtitled “Lessons Learned From FTC Cases” for a reason — it uses the 50+ FTC enforcement actions on data security to provide ten lessons companies should learn when approaching data security, to avoid the missteps that led others to enforcement actions, along with practical guidance on reducing risks.  The lessons are:

  1. Start with security.  The FTC has long advocated the concept of “privacy by design,” meaning companies should bake an understanding of and sensitivity to privacy into every part of the business, making it part of the design process for new products and processes.  The FTC is advocating a similar concept of “security by design.” Guidance:  don’t collect personal information you don’t need (the RockYou enforcement action); don’t use personal information when it’s not necessary (Accretive and foru International); don’t hold on to information longer than you have a legitimate business need for it (BJ’s Wholesale Club).
  2. Control access to data sensibly.  Keep data in your possession secure by controlling access to it – limit access to those with a need to know for a legitimate business purpose (e.g., no shared user accounts, lock up physical files). Guidance: don’t let employees access personal information unless they need to access it as part of their job (Goal Financial); don’t give administrative access to anyone other than employees tasked with administrative duties (Twitter).
  3. Require secure passwords and authentication.  Use strong password authentication and sensible password hygiene (e.g., suspend accounts after a set number of unsuccessful attempts; prohibit common dictionary words; require at least 8 characters; require at least one upper case character, one lower case character, one numeric character, and one special character; prohibit more than two repeating characters; etc.)  Guidance: require complex and unique passwords (Twitter); store passwords securely (Guidance Software, Reed Elsevier, Twitter); guard against brute force attacks (Lookout Services, Twitter, Reed Elsevier); protect against authentication bypass such as predictable resource location (Lookout Services). (A minimal sketch of these password rules appears after this list.)
  4. Store sensitive personal information securely (“at rest”) and protect it during transmission (“in motion”). Use strong encryption when storing and transmitting data, and ensure the personnel implementing encryption understand how you use sensitive data and can determine the right approach on a situation-by-situation basis.  Guidance: keep sensitive information secure throughout the data life-cycle (receipt, use, storage, transmission, disposal) (Superior Mortgage Corporation); use industry-tested and accepted methods (ValueClick); make sure encryption is properly configured (Fandango, Credit Karma).
  5. Segment your network and monitor who’s trying to get in and out.  Be sure to use firewalls to segment your network to minimize what an attacker can access.  Use intrusion detection and prevention tools to monitor for malicious activity.  Guidance: segment your network (DSW); monitor activity on your network (Dave & Buster’s, Cardsystem Solutions).
  6. Secure remote access to your network. Make sure you develop and implement a remote access policy, implement strong security measures for remote access, and put appropriate limits on remote access, such as limiting access by IP address and revoking remote access promptly when no longer needed.  (The compromise of a vendor’s system via phishing, leading to remote network access, is how the Target breach started.)  Guidance: ensure remote computers have appropriate security measures in place, e.g., “endpoint security” (Premier Capital Lending, Settlement One, LifeLock); put sensible access limits in place (Dave & Buster’s).
  7. Apply sound security practices when developing new products. Use “security by design” to ensure data security is considered at all times during the product development life-cycle.  Guidance: train engineers in secure coding (MTS, HTC America, TRENDnet); follow platform guidelines for security (HTC America, Fandango, Credit Karma); verify that privacy and security features work (TRENDnet, Snapchat); test for common vulnerabilities (Guess?).
  8. Make sure your service providers implement reasonable security measures. Make sure you communicate your security expectations to your service providers and vendors, and put their feet to the fire through contractual commitments and auditing/penetration testing. Guidance: put it in writing (GMR Transcription); verify compliance (Upromise).
  9. Put procedures in place to keep your security current and address vulnerabilities that may arise.  Data security is a constant game of cat-and-mouse with hackers – make sure to keep your guard up.  Apply updates to your hardware and software as they are issued, and ensure you are spotting vulnerabilities in, and promptly patching, your own software. Have a mechanism to allow security warnings and issues to be reported to IT.  Guidance: update and patch third-party software (TJX Companies); heed credible security warnings and move quickly to fix them (HTC America, Fandango).
  10. Secure paper, physical media, and devices.  Lastly, while the focus these days seems to be on cybersecurity, don’t forget about physical security of papers and physical media.  Guidance: securely store sensitive files (Gregory Navone, LifeLock); protect devices that process personal information (Dollar Tree); keep safety standards in place when data is en route (Accretive, CBR Systems); dispose of sensitive data securely (Rite Aid, CVS Caremark, Goal Financial).
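To illustrate lesson 3’s password-hygiene rules in code, here is a minimal Python sketch. The thresholds, the tiny dictionary list, and the lockout limit are the illustrative values from the text or simple placeholders, not FTC-mandated requirements, and a real system should also store passwords salted and hashed rather than handling them in plain text.

```python
# Minimal sketch of the password-hygiene rules described in lesson 3.
# Illustrative values only; not an FTC requirement or a complete password policy.

import re

MAX_FAILED_ATTEMPTS = 5                           # suspend after x unsuccessful attempts
COMMON_WORDS = {"password", "letmein", "qwerty"}  # stand-in for a real dictionary list

def is_acceptable_password(pw: str) -> bool:
    """Return True only if the password satisfies the hygiene rules from the text."""
    if len(pw) < 8:                                # require at least 8 characters
        return False
    if pw.lower() in COMMON_WORDS:                 # prohibit common dictionary words
        return False
    if not re.search(r"[A-Z]", pw):                # at least one upper case character
        return False
    if not re.search(r"[a-z]", pw):                # at least one lower case character
        return False
    if not re.search(r"[0-9]", pw):                # at least one numeric character
        return False
    if not re.search(r"[^A-Za-z0-9]", pw):         # at least one special character
        return False
    if re.search(r"(.)\1\1", pw):                  # prohibit more than two repeating characters
        return False
    return True

def should_suspend_account(failed_attempts: int) -> bool:
    """Guard against brute-force attacks by suspending the account after repeated failures."""
    return failed_attempts >= MAX_FAILED_ATTEMPTS
```

For example, is_acceptable_password("Tr0ub&dor") returns True, while "password1" (no upper case character) and "AAAbbb11!" (three repeating characters) are rejected.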

As this guidance is based on what companies did wrong or didn’t do that led to FTC enforcement actions, it will be interesting to see how the FTC treats a company that suffers a data breach but demonstrates that it used reasonable efforts to comply with the FTC’s guidance.  I suspect the FTC will take a company’s compliance with this guidance into consideration when determining penalties in an enforcement action. The guidance is very high-level, so companies must rely on their IT and Legal teams to determine what steps, processes and protocols need to be implemented in alignment with the FTC’s guidance.

In addition to publishing the guide, the FTC has embarked on a conference series aimed at SMBs (small and medium-sized businesses), start-up companies, and developers to provide information on “security by design,” common security vulnerabilities, secure development strategies, and vulnerability response.  The first conference took place September 9 in San Francisco, CA; the second will take place November 5 in Austin, TX.

The FTC also announced a new website at which they’ve gathered all of their data security guidance, publications, information and tools as a “one-stop shop”.  You can find it at http://www.ftc.gov/datasecurity.

Podcast – the in-house perspective on trade secrets, privacy, and other topics

I recently had the privilege of being interviewed for IP Fridays®, a podcast series by Ken Suzan (of counsel and a trademark attorney at the Minneapolis office of Barnes & Thornburg LLP) and Dr. Rolf Claessen (partner at Patent Attorneys Freischem in Cologne, Germany).  We discussed the in-house perspective on a variety of topics, including trade secrets, copyrighting software code, and privacy.  Head to IPFridays.com if you’d like to listen, or click here to head straight to the podcast.