Tuesday, July 25, 2017

Why A Recent Federal Decision Involving A Grocery Store Matters to Most Organizations with Websites and Apps




A recent case has organizations all over the U.S. concerned about litigation over website accessibility.

In the first federal decision of its kind, a federal judge in Florida concluded that Winn-Dixie, a regional grocery store chain, was obligated to make its website accessible to a blind man, and that it failed to do so.

As a result, the court awarded the plaintiff his attorneys' fees and ordered the parties to agree on a compliance deadline by the end of this month.

I've written previously about the trend in demand letters and the uncertainty in the law regarding the applicability of the Americans With Disabilities Act to websites, applications and other online interfaces. 

Background

By way of background, when the Americans with Disabilities Act was first drafted in 1988 (and adopted in 1990), it is unlikely that even a single member of Congress contemplated that it could be applied to the Internet. The ADA (and specifically Title III) was written with brick-and-mortar facilities in mind, intended to ensure that people with disabilities could access and enjoy them. Common examples are wheelchair ramps and braille menus. In the quarter-century since, almost everything that was once only brick-and-mortar has developed a presence on the Internet.

One of the greatest ADA questions of our day is whether the ADA applies to websites, apps, and other online interfaces. Only a few courts have addressed this issue, and the results have been mixed and often very fact-specific. Courts must decide whether a given website is a "public accommodation" and, if so, whether the website operator has made "reasonable modifications" to make the website available to people with disabilities. 

The ADA is enforced by the U.S. Department of Justice (DOJ) and through private litigation. The DOJ is reviewing organizations' websites to determine whether they comply with the law’s access requirements. In addition, a number of plaintiffs' law firms across the country are filing lawsuits alleging that organizations' websites are in violation of the ADA. Internet companies, including Netflix, have settled cases that alleged their websites were inaccessible to people with disabilities.

There are currently no specific federal standards for websites under the ADA. Since 2010, the DOJ has been telling us that it is in the process of developing regulations for website accessibility, but those standards are not expected until 2018 or later. In the meantime, the DOJ says it expects organizations to make their websites accessible to people with disabilities. The DOJ has indicated that it considers the Web Content Accessibility Guidelines (WCAG) 2.0 Level AA to be satisfactory for the time being (and these standards may go further than legally necessary). Many organizations have been working toward compliance with those standards on the assumption that any future DOJ standards will be consistent with them, although there are no guarantees.
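For readers curious about what WCAG conformance looks like at a technical level, here is a minimal sketch (in Python, using only the standard library) of one of the most common accessibility failures: images with no text alternative for screen readers. This is purely illustrative, not a compliance tool, and real WCAG 2.0 Level AA audits cover far more than alt text.

```python
from html.parser import HTMLParser

class MissingAltChecker(HTMLParser):
    """Flags <img> tags that lack alt text, a frequent WCAG failure.

    Illustrative sketch only; a real audit tool would cover many more
    success criteria (contrast, keyboard navigation, form labels, etc.).
    """

    def __init__(self):
        super().__init__()
        self.missing_alt = []  # src values of images lacking alt text

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attr_dict = dict(attrs)
            if not attr_dict.get("alt"):
                self.missing_alt.append(attr_dict.get("src", "(no src)"))

checker = MissingAltChecker()
checker.feed('<img src="logo.png"><img src="sale.png" alt="Weekly sale flyer">')
print(checker.missing_alt)  # images a screen reader cannot describe
```

A scan like this catches only mechanical failures; whether a site is "accessible" in the legal sense remains a fact-specific question, as the cases discussed above illustrate.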

Why the Winn-Dixie Case Matters

The decision in Gil v. Winn Dixie is the first federal court opinion addressing the applicability of the ADA to the website of a brick-and-mortar retailer. While it is not binding on courts elsewhere in the U.S., it sets an important persuasive precedent. 

The court concluded that the ADA applied because Winn Dixie's website is “heavily integrated” with and serves as a “gateway” to its physical stores. That's an important consideration for brick-and-mortar retailers, who may want to re-evaluate accessibility in light of this recent development.





Monday, May 8, 2017

Can Young Lawyers Learn Something From Older Lawyers About Managing Their Professional Reputations Online (and Vice Versa)?

Here's an article that was published this week in the North Carolina Lawyer magazine that might be of interest to some of you.


Can Young Lawyers Learn Something From Older Lawyers About Managing Their Professional Reputations Online (and Vice Versa)?


by Matt Cordell, NCBA YLD Chair


When I have the opportunity to give advice to law students and young lawyers, one of the things I try to impress upon them is the importance of their reputations, including their “online reputations.” Usually the comment is quickly met with a knowing nod. Everyone seems to know that their reputation is important. However, having witnessed many lawyers of all ages impair their professional reputations online, I have begun to realize that many of us fail to recognize some aspects of maintaining our online reputations, and I have begun to be much more specific in my advice to younger lawyers.

...Read the rest here.

Sunday, March 5, 2017

A New Chapter


This photo was taken
for the firm's website
when I joined in 2007

In 2005, I met two exceptional people, Don Eglinton and Leigh Wilkinson, during on-campus interviews at my law school.  I could immediately tell from the way they talked about Ward and Smith and its people that there was something special about the firm.   In the years since, I've experienced firsthand the remarkable culture of this firm and the people who make it so special. I have also had the opportunity to work with some incredibly smart, innovative clients in a number of fields, and I've learned a great deal from many of them.  

My practice has evolved over the past decade, and I have found that I very much enjoy practicing in the areas of privacy law, information security law, and technology law, in particular.  A very attractive opportunity has arisen which will enable me to work on these issues on a global scale.

I will be joining the legal department of VF Corp in Greensboro, N.C. If you are unfamiliar with VF, you are likely familiar with its brands, which include The North Face, Lee, Wrangler, Vans, Timberland, Nautica, Smartwool, Reef, Eagle Creek, Eastpak, JanSport, Kipling, and others.  VF has more than 50,000 employees globally and about $12 billion in annual revenue.  The legal department, like the rest of the company, spans the globe.  I will be managing a small group within the legal department handling privacy, information security, and information technology contracting.

Volunteering at a workday at Camp Challenge
(a financial literacy camp for underprivileged kids)
with my Ward and Smith colleagues
just a few months after joining the firm in 2007
Even though I will miss my law partners and clients, I am looking forward to this new challenge and to starting a new phase of my career.  I am also looking forward to spending a little more time with my family.  We will be moving to the Triad area very soon.

I am confident that all of the clients with whom I have worked over the years are in good hands with the other (nearly 100) lawyers at Ward and Smith.

I intend to continue to write about interesting legal developments on my personal blogs: www.BizLawNC.com and www.LawOfPrivacy.com / www.PrivacyLawNC.com.  I hope you'll continue to check back in from time to time. 



Saturday, December 17, 2016

The FCC Creates Privacy, Data Protection, and Data Breach Rules for Internet Service Providers



Image of Federal Communications Commission Seal


The Federal Communications Commission is venturing into new areas of privacy regulation.  By a narrow vote, the FCC has approved new rules that govern how internet service providers ("ISPs") use consumers' information.

 

ISPs long ago realized that customer data is valuable, and are continuing to develop ways to monetize that information.  For example, last month, AT&T explained that a major factor in its decision to bid on Time Warner was the lure of new possibilities in targeted advertising.  Last year, Comcast bought targeted advertising firm Visible World for similar reasons.

 

Efforts by ISPs to monetize user data have triggered concerns among privacy watchdogs and the FCC.  On October 27, 2016, the FCC adopted new rules to control when and how this information can be used and shared.  "It's the consumers' information.  How it is used should be the consumers' choice" said FCC Chairman Tom Wheeler. 

 

According to the FCC, the rules "do not prohibit ISPs from using or sharing their customers’ information – they simply require ISPs to put their customers into the driver’s seat when it comes to those decisions.”  The new rules require specific notices to consumers about:


  • The types of information the ISP collects from them

  • How the ISP uses and shares the information

  • The types of entities with whom the ISP shares the information

The rules also require ISPs to give a degree of control to the consumer.  ISPs will be required to obtain consumer consent (an "opt-in") before sharing certain categories of "sensitive" information, including:


  • Health information

  • Financial information

  • Geo-location

  • Children’s information

  • Social Security numbers

  • Web browsing history

  • App usage history

  • Content of communications

For other categories of information (those not deemed "sensitive," such as an email address or service level), ISPs must still offer users the opportunity to "opt-out" of the use and sharing of their information, with some exceptions.  Customer consent can be inferred for certain uses, such as providing services and for billing and collection activities.

 

ISPs are prohibited from rejecting a customer for refusing to provide a requested consent.  Because it is more profitable for the ISP if the customers permit data use and sharing, the rules permit an ISP to give customers a discount or other financial incentive to provide a requested consent.

 

The FCC has made it clear that its rules “do not regulate the privacy practices of websites or apps, like Twitter or Facebook, over which the FTC has authority.”  Websites and apps currently collect much more data than ISPs, so the practical impact of the rules on consumer privacy is likely to be limited.

 

The new rules impose a requirement that ISPs implement reasonable data security practices, including robust customer authentication and data disposal practices.  The rules also include a data breach notification requirement, which preempts those in existence in 47 states, but only to the extent that the FCC rules are inconsistent with a state's requirements.   

 

The rules become effective with respect to different sections at different times, with all of the rules likely becoming enforceable within one year. 

 

This action by the FCC creates just one more piece in the mosaic of statutes, regulations, and treaties that together comprise privacy and data security law. 

Sunday, November 20, 2016

"Cyber Safeguards and Procedures" for Law Firms

I recently spoke about information security issues to a group of approximately 175 attorneys in the Triad, at a continuing legal education event sponsored by Lawyers Mutual.  The session was titled "Cyber Safeguards and Procedures" and focused on data security risks faced by law firms and how they can mitigate those risks.  I was joined by Lawyers Mutual claims attorney Troy Crawford.  If you would like a copy of the slides from this presentation, please email me.



Photo of Matt Cordell and Troy Crawford on stage speaking to audience
photo by Camille Stell, Lawyers Mutual

Sunday, October 16, 2016

A Few Thoughts on Selecting a HIPAA Privacy and Security Officer


Perhaps your organization is becoming a HIPAA covered entity or a business associate for the first time, and you now understand that your organization will have to comply with HIPAA. One of your first, and most important, tasks will be to designate a Privacy Officer and Security Officer.  This post describes some considerations you should think through when making this decision.

One person or two?
The HIPAA Privacy Rule requires that a Privacy Officer be designated, and the HIPAA Security Rule requires that a Security Officer be designated.  It is legally permissible to have one person serve in both roles or to split them between two people. You'll need to decide whether to combine or bifurcate these roles.  


First, you need to decide whether you have one person within your organization who has the capabilities required for both roles.  The Privacy Officer is responsible for understanding who is allowed to access protected health information (PHI), and will need to answer questions about practices, address requests for information, and handle training and monitoring of other staff. The Security Officer is primarily focused on protecting electronic protected health information (ePHI) from unauthorized access (e.g., meeting encryption requirements, etc.). If the person you would prefer to designate as the Privacy/Security Officer does not have an understanding of the technological aspects of protecting ePHI, there are two solutions: (a) designate someone with the technological understanding to be the Security Officer, or (b) instruct someone with the technological understanding (either inside or outside of the organization) to assist the Privacy/Security Officer.


What is most effective? The benefit of designating two officers is that each can be more specialized, and potentially more effective in their respective areas. However, the risk associated with having two officers is that things that are not clearly just privacy or just security might fall through the cracks if the two do not coordinate well.

What is most efficient? For administrative purposes, it's hard to argue that having one designated officer isn't substantially easier than having two. There is so much overlap in the two areas of responsibility that if you can have one person be responsible for both, it may avoid a lot of duplication of effort. Combining the roles is more common in smaller organizations.

All that said, there's no legally incorrect answer here. Just like the debate over whether a CEO should also be the Chairman of the Board, there are good arguments on either side, and the answer often boils down to the size of the organization and administrative ease.
 

Can (and should) an organization have more than one Privacy Officer or Security Officer?  Some organizations are both a HIPAA "covered entity" (e.g., healthcare provider or sponsor of an employee health plan) as well as a "business associate" (e.g., service provider to a covered entity). Those organizations will need to decide whether the Privacy and Security Officer(s) they designate for themselves as a covered entity should be the same person(s) designated for purposes of the protected health information they acquire as a business associate.  Generally speaking, an organization's obligations as a covered entity are similar to its obligations as a business associate. With the exception of contractual obligations in business associate agreements, the basic legal obligations are almost identical. (The Security Rule obligations to protect ePHI are basically identical. The Privacy Rule obligations are very, very similar.)  


Generally, I don't think there is a compelling reason to have separate Privacy Officers (or Security Officers) for these two capacities in which an organization might be acting, and I don't believe that is a common practice.  I think it is most efficient to have one Privacy Officer and Security Officer who is responsible in both contexts, and who understands the subtle differences between those contexts.  Organizations that find themselves acting as both a covered entity and a business associate should be aware of the distinctions, however, and should have policies and procedures that reflect those distinctions.  Here is one practical example:  Most employees should be shielded from access to PHI that is held by a plan sponsor of an employee benefit plan.  However, within the same organization, far more employees might have a legitimate need to access the PHI the organization holds in its capacity as a business associate of other organizations. 


Once you've made this important decision, you can begin building a HIPAA compliance policy and procedures around the basic structure you've chosen. (Let me know if you'd like some help with that.) - Matt


Sunday, October 9, 2016

Is Your Customer Data Your Greatest Asset or Your Greatest Liability (or Both)?




Customer data can be a treasure trove for an organization.  Many organizations believe customer and prospect data to be their most valuable asset.  Unfortunately, some have discovered that, unless handled with care, it can also be their greatest liability.


Organizations of all kinds collect, store, analyze, use, and share consumer data for myriad reasons.  Consumer data may help an organization maintain contact with a customer or prospective customer.  Properly analyzed, it can often predict customer behavior, allowing an organization to tailor its communications and offerings.  It can reveal patterns that help increase revenue, minimize expenses, and ultimately drive profitability.  Data can be leveraged and monetized by sharing with affiliated and non-affiliated entities.  Given the immense value of consumer data, it is no surprise that some of the most valuable companies in North Carolina and the world are data analytics firms.




Over the past few years, however, it has become widely acknowledged that such valuable data can also be a liability of the greatest magnitude.  The costs of the largest data security breaches have made headlines.  But these sensational headlines sometimes create the misleading impression that only large organizations incur massive costs, and that the losses are solely attributable to hackers.




The Risks, by the Numbers
One of the best sources of information about risks associated with consumer data is NetDiligence's annual study of "cyber insurance" policy claims.  Although the information is limited to incidents for which the targets had insurance coverage, and is limited to covered losses, it is still an excellent source of data.  The most recent study, covering claims data from 2012 to 2015, showed the average insurance claim amount was $673,767, with average legal fees of $434,354.




Smaller Organizations Face Increasing Risks
In the NetDiligence study, organizations were categorized by size (revenue), which provides some interesting insights.  The smallest organizations represented the largest raw number of incidents, probably due to the fact that there are simply more small organizations than there are large ones.  While the three smallest categories of organizations accounted for a combined 71% of the reported incidents in 2015, they were responsible for only 38% of records exposed.  It was surprising, however, that, according to NetDiligence, some of the largest claims came from smaller organizations.  This may be a result of the smaller organizations being less aware of their exposure or having fewer resources to provide data protection and security awareness training for employees.  By contrast, mid- and large-revenue organizations accounted for only 17% of incidents, but were responsible for 60% of the consumer records exposed.  This seems intuitive, because larger organizations would be expected to have more consumer records, on average, than smaller organizations.




Risks Are Spread Across Industries
The NetDiligence study also reveals a good deal about the source of recent risks.  While risks in prior years were concentrated in certain industries, they are becoming less concentrated year by year.  According to the study, recent losses were more evenly dispersed among business sectors, with healthcare reporting the most at 21% and financial services coming in second at 17%.  In other words, the categories of affected data resulting in the highest losses, from all industries, were health information and financial data, but the majority of losses were incurred outside of these two historically most targeted industries.




Vendors: The Weak Link?
Vendors are a common source of privacy and data security risk.  Vendors include service providers and others with access to an organization's data or systems.  In 2015, 25% of claims were attributable to vendors.  Of those claims, approximately half were hacking incidents, with the other half largely accidental or intentional disclosures.  Another interesting observation is that the vendor events exposed significantly more consumer records than events that occurred at the organization itself, indicating that failures by vendors may tend to be more systemic than failures at the level of the primary organization.




Healthcare providers and other HIPAA-covered entities, financial institutions, and defense contractors have long been required to obtain contractual commitments from their vendors requiring security protections.  Following the breach of a Target vendor that resulted in a massive theft of Target's customer data, organizations of all kinds began imposing contractual privacy, security, and, importantly, indemnity terms on vendors, and these terms are sometimes heavily negotiated.




Data Use Violations: A Bigger Risk Than Breach?
Data-related liability in the context of nefarious hackers breaching security systems from foreign lands dominates the headlines, but much less dramatic circumstances lead to large numbers of significant incidents every year.  An analysis of what triggered the losses that gave rise to cyber liability claims in 2015 reveals that targeted security breaches are not the only source of loss.
There were many reported causes of claims, and while the most expensive were malicious hacking attacks, the second greatest cause was the wrongful collection of data; in other words, data use (or "privacy") claims.  Data use violations involve the intentional collection, storage, use, or sharing of consumer information in a way that violates the law, a contract, or an individual's rights. 


Organizations and individuals throughout the United States are collecting, using, and sharing data in ways that expose them to liability, often without realizing it.  One of the most frequent violations involves collecting consumer information without consent, followed closely by using consumer information for purposes that were not consented to at the time of collection.


An Ounce of Prevention
Perhaps nowhere else is the axiom "an ounce of prevention is worth a pound of cure" more appropriate than in the context of the modern explosion in the collection and use of customer data.  Preventing a data security- or privacy-related loss involves more than just purchasing defensive technology.  According to reports, simply adopting and implementing good policies and procedures for correctly collecting, storing, using, and sharing data would have prevented a large portion of the reported losses.  Data governance policies and procedures should be carefully crafted and followed, and should cover the following areas:
  • Document retention and data destruction
  • Consumer consent practices and electronic signatures
  • Payment card information
  • Employee email and telephone monitoring
  • Website and application monitoring and advertising
  • Email marketing
  • Telephone and text message marketing
  • Fax marketing
  • International consumers and international data transfers
  • Password administration and limited access
  • Background checks and credit reports
  • Identity theft and "red flags"
  • Employee and consumer health information
  • Educational records
  • Sharing customer information with affiliates
  • Sharing customer information with non-affiliates
The policies should address the following:
  • Designated categories of data based on sensitivity (low risk, high risk, etc.) and business necessity (critical, valuable, low-value, etc.); and,
  • Established guidelines for collecting, using, storing, and sharing various categories of data.
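To make those two bullet points concrete, here is a hypothetical sketch of a data-classification lookup: each data category is assigned a sensitivity tier, and each tier maps to a handling guideline. The category names, tiers, and rules below are illustrative assumptions for discussion, not a legal standard, and any real policy should be tailored to the laws that apply to your organization.

```python
# Hypothetical data-classification scheme: categories, tiers, and rules
# are illustrative assumptions only, not legal requirements.
SENSITIVITY = {
    "social_security_number": "high",
    "health_information": "high",
    "purchase_history": "medium",
    "email_address": "low",
}

HANDLING_RULES = {
    "high": "encrypt at rest; opt-in consent; need-to-know access only",
    "medium": "encrypt at rest; opt-out consent",
    "low": "opt-out consent",
}

def handling_rule(category: str) -> str:
    """Look up the handling guideline for a data category.

    Unknown categories default to the most restrictive tier, on the
    theory that unclassified data should be treated cautiously.
    """
    tier = SENSITIVITY.get(category, "high")
    return HANDLING_RULES[tier]

print(handling_rule("email_address"))  # prints: opt-out consent
```

The design choice worth noting is the default: data that has not yet been classified falls into the most restrictive tier, so a gap in the inventory fails safe rather than open.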


Telling the World
Organizations frequently publish privacy policy statements to inform their customers and others about their privacy practices.  Financial institutions, healthcare providers, and website operators are all required by law to make such statements publicly available.  Many organizations, unfortunately, misunderstand the purpose of this document.  A privacy policy statement is not the same as an internal policy or procedure; it is a public-facing disclosure that should be simple and flexible.
Organizations are often their own worst enemies in misconstruing the purpose of privacy statements.  They frequently draft and distribute privacy policy statements that include lofty language and make promises the organizations are not required to make, only to later fail to fulfill those unnecessary promises, thereby creating unnecessary liability.  Practices that do not live up to the statements made in a privacy policy statement are the number one source of Federal Trade Commission enforcement actions.  

Not If, But When
It is natural for an organization, just like an individual, to hope that it is immune from risks that others face.  If, however, the federal government, the United States military, and major multinational corporations are susceptible to major privacy and data security incidents, your organization probably is as well.  Therefore, it is most reasonable to think of a data security or privacy incident not in terms of "if," but rather "when."



Breaches and intentional, but unauthorized, data disclosure events trigger reporting obligations to federal and state officials, customers, and sometimes the media, and often result in regulatory enforcement actions and litigation (including class action lawsuits).  There are, however, steps that an organization can take to prepare for such unwelcome events and that can help mitigate resulting losses.  Two of the most important steps an organization can take are:
  • Purchase cyber insurance; and,
  • Adopt a breach response plan.
Cyber insurance is a term that refers to a category of insurance policies that transfer, in return for the payment of a premium, some of the financial risk of a data security incident to an insurance company.  Cyber insurance policies are not standardized, and they vary dramatically in the scope of coverage.  For example, the direct loss of funds from a hacked bank account is almost never covered by a cyber insurance policy, but many potential liabilities and defense costs can be covered.  It can be helpful to have the assistance of a knowledgeable attorney when evaluating cyber insurance coverage options.



Having an incident response plan in place is always a good idea.  Once an incident has occurred, the required timeframes for reporting the incident and mitigating any resulting harm can be very short (sometimes less than a week).  Having a plan in place, and a designated team ready to implement the plan, can make a tremendous improvement in your organization's response and potentially limit losses associated with the incident.  Additionally, incident response assistance (such as forensic computer expertise, call centers, printing and mailing services, and public relations) can be vetted and prices negotiated in advance, with potentially massive savings.




Ready or Not, It's Time
Complying with privacy laws, mitigating risks, and preparing for the possibility of a loss may seem daunting.  Given the scope and magnitude of the risks, however, it is simply a necessity in today's environment.  The task is manageable with some professional guidance, and the peace of mind that preparation can bring is well worth the effort.






Matt Cordell is the leader of the Privacy and Information Security practice group at Ward and Smith, P.A., a full-service law firm with five offices and approximately 100 attorneys across North Carolina.  He is a Certified Information Privacy Professional (CIPP/US) and a member of the International Association of Privacy Professionals.  Matt is also the chair of the NC State Bar privacy and information security specialization exploratory committee. 


Matt Cordell has been frequently rated one of the best lawyers in North Carolina. 







Tuesday, July 5, 2016

Business Associates of HIPAA Covered Entities Beware!


If your organization is a business associate of a HIPAA covered entity (such as a health care provider or employee health benefit plan), you should know that the Department of Health and Human Services' Office of Civil Rights (OCR) is actively pursuing business associates over privacy and information security violations.

Business Associate Fined >$1,500 Per Patient

This past week, Catholic Health Care Services of the Archdiocese of Philadelphia (CHCS) agreed to settle with OCR after alleged violations of the HIPAA Security Rule that came to light after the loss of an iPhone containing protected health information (PHI) of 412 nursing home residents. The settlement requires a monetary payment of $650,000 and a corrective action plan. (For those who have not already done the math, the fine alone will cost CHCS more than $1,500 per patient!)

In announcing the settlement, OCR's Director Jocelyn Samuels emphasized the importance of a comprehensive program: “Business associates must implement the protections of the HIPAA Security Rule for the electronic protected health information they create, receive, maintain, or transmit from covered entities. This includes an enterprise-wide risk analysis and corresponding risk management plan, which are the cornerstones of the HIPAA Security Rule.”  In the case of CHCS, the iPhone was unencrypted and was not password protected. To make matters much worse, OCR learned that CHCS had no policies addressing the loss of mobile devices containing PHI, no security incident response plan, no risk analysis, and no risk management plan.

As part of the settlement, OCR will monitor CHCS for two years to ensure compliance. You can read the Resolution Agreement and Corrective Action Plan on the OCR website at:
http://www.hhs.gov/hipaa/for-professionals/compliance-enforcement/agreements/catholic-health-care-services/index.html.

 
Business Associate Audits
 
This announcement comes just months after the launch of the second phase of OCR's much-anticipated audit program for business associates. Rather than awaiting reports of violations, the OCR is actively auditing business associates. When announcing the audit program, OCR explained the process:
  • First, OCR will contact organizations by email to verify contact information and complete a pre-audit questionnaire.
  • Organizations selected will be subject to either a desk audit, an onsite audit, or both.
  • Organizations will have about 10 business days to produce requested documents, so there will be insufficient time to create or update HIPAA privacy and security policies, security risk assessments, breach notification documentation, business associate agreements, and other HIPAA documentation after notification.
Business associates should not wait until an audit is initiated.  Now is the time to ensure that HIPAA programs are in place, complete, and up to date.  If this week's CHCS settlement is any indicator, the OCR will be seeking large fines when it uncovers violations.






Matt Cordell is a North Carolina lawyer with expertise in HIPAA and health care privacy and information security. 
 

Tuesday, June 28, 2016

BREXIT: Unchartered Territory for EU and UK Data Protection Standards

My law partner, Deana Labriola, has written a piece about the Brexit and its impact on the GDPR. 


BREXIT: Unchartered Territory for EU and UK Data Protection Standards

| Deana A. Labriola
So what changed on June 23, 2016? Maybe everything, and then again, maybe nothing at all.  The UK is leaving the EU.  While this decision will have far reaching implications for years to follow, it may be far less impactful for data protection laws, at least in the short term.


You can read the rest here:   http://www.wardandsmith.com/articles/brexit-unchartered-territory-for-eu-and-uk-data-protection-standards

Tuesday, June 14, 2016

Don't Be Tardy. Get Schooled on North Carolina's New Education Technology Law Now!


New NC Law Enhances Student Privacy Rights and Restricts Providers of Online Educational Resources

Education technology (or "EdTech") organizations will want to pay close attention to a new North Carolina statute that was signed into law a couple of days ago.  On Thursday, June 9, 2016, a new law titled "An Act to Protect Student Online Privacy" was enacted to further protect the privacy of K-12 students in North Carolina.  It becomes effective October 1st, so education technology companies have very little time to prepare before the upcoming school year begins.  They should review their data collection, storage, use and sharing policies and procedures in light of the new law, and adjust their practices if necessary.  In some cases, this may require changing or disabling the features and functions of websites or applications.


Who Is Affected?


The law is primarily aimed at the fast-growing EdTech sector.  Organizations may be affected whether or not they have a contract with a school, school board, or the State of North Carolina.  The statute applies to the operators of websites, online services, online applications, or mobile applications who know that the site, service, or application is used primarily for K-12 school purposes.  School boards are also affected, because they should ensure that their contracts with providers of online services require those providers to comply with the new law.
Like the existing student privacy statute, the law applies to public schools only.  Private schools, and their service providers, will remain unaffected.  (If private schools wish to protect the privacy of their students, they must do so by including contractual protections with their service providers.  I would strongly suggest that they do so.)


New Prohibitions


Online operators are prohibited from selling or renting a student's information without parental consent.  They are also generally prohibited from disclosing a student's covered information (defined below) except for six specific purposes.  The permissible disclosures include disclosures to a subcontractor who is contractually prohibited from further disclosure of the information and who agrees to implement reasonable security procedures.


Online operators may not engage in so-called "targeted advertising" (better known as "behavioral advertising") based on information received for "school purposes."  "Targeted advertising" means presenting an advertisement to a student where the advertisement is selected based on information obtained (or inferred over time) from that student's online behavior, usage of applications, or covered information.  Furthermore, they are prohibited from "amassing a profile" of a student except for school purposes.


New Requirements


In addition to imposing these new prohibitions, the statute places two affirmative obligations on online operators.  All operators must "implement and maintain reasonable security procedures" and "protect covered information from unauthorized access, destruction, use, modification, or disclosure."  Operators are also required to delete a student's information at the request of the school board, or when the operator stops providing service to the school board, unless the student's parent consents to the record retention.


Broader Scope of Covered Information


Although the student privacy statute already contained a definition of the term "personally identifiable information," the new statute creates a significantly broader definition of the same term that is applicable only for purposes of the online privacy protections.  It includes twenty-nine (29) categories of information.


Interaction with Existing Law


You may recall that I wrote in mid-2014 about a then-new student privacy law in North Carolina.  You can read that summary here.  Titled "An Act to Ensure the Privacy and Security of Student Educational Records," the law prohibited schools from collecting certain categories of information, restricted the disclosure of personally identifiable student data, required school boards to give parents an annual summary of parental rights and opt-out opportunities, and directed the State Board of Education to make rules regarding privacy standards, audits, breach notification and data retention and destruction policies.  The 2016 law described in this article amends and enhances the 2014 statute.


It should be noted that the federal Children's Online Privacy Protection Act (better known as COPPA) already protects children's online privacy in the educational context as well as in all other contexts.  Any organization affected by North Carolina's new statute should already be in compliance with COPPA, but if it is not, there is no better time than now to become compliant.


Don't Get Sent to the Principal's Office!


Education technology companies and school boards have very little time to revise their policies and practices in order to comply with the new statute.  They should consult with their privacy counsel quickly so that they will not be "sent to the principal's office" when the summer break ends!








You can find more posts like this by Ward and Smith, P.A. attorney and Certified Information Privacy Professional (CIPP/US) Matt Cordell at the North Carolina Privacy and Information Security Law Blog: www.PrivacyLawNC.com.  Matt Cordell practices in the areas of privacy law, information security law, data use law and related consumer protection laws, and has offices in Raleigh, New Bern, Greenville, Wilmington and Asheville.  This article is not intended to give, and should not be relied upon for, legal advice in any particular circumstance or fact situation. No action should be taken in reliance upon the information contained in this article without obtaining the advice of an attorney.