Saturday, November 15, 2014

One More Reason to Handle Consumer Electronic Consents Correctly

From time to time, clients balk when I describe the components of an effective consumer consent to an electronic transaction. They say "I've seen lots of other websites, and they don't require this."

They are correct, in part. Most websites have deficient disclosures and consent language. Most of the time, it does not result in anything catastrophic. That does not make it legal...or smart.

One aspect of consumer electronic transactions that people question most often is affirmative consent. They ask whether it is truly necessary to provide detailed disclosures and obtain affirmative consent from consumers when entering into agreements through electronic means. Affirmative consent means that the consumer expressly agrees to the terms, or "opts in." An example of affirmative consent is the following:
"By clicking the button labelled 'Accept' below, you agree to the terms and conditions of this Agreement and acknowledge that you have read and understand the disclosures provided above."
Most businesses would generally prefer negative consent, or "opt out." An example of negative consent is the following:
"By using this website, you are agreeing to the terms of these Terms and Conditions."
Obviously, obtaining "negative" consent is easier and cheaper than obtaining affirmative consent.
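To make the distinction concrete, here is a minimal sketch, in Python, of what an "opt-in" consent capture might look like in an online checkout flow. The function and field names are hypothetical, not from any statute or real system; the point is that a record is created only when the consumer expressly clicks "Accept" after the disclosures have been shown, whereas mere use of the site produces no record at all.

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional

@dataclass
class ConsentRecord:
    """Evidence that a consumer affirmatively opted in."""
    clicked_accept: bool      # consumer clicked a button labeled "Accept"
    disclosures_shown: bool   # required disclosures were displayed first
    timestamp: str            # when the consent was captured (UTC)

def capture_affirmative_consent(clicked_accept: bool,
                                disclosures_shown: bool) -> Optional[ConsentRecord]:
    """Create a consent record only on an express opt-in made after
    the disclosures were shown. 'Negative' consent (simply using the
    website) never produces a record under this sketch."""
    if clicked_accept and disclosures_shown:
        return ConsentRecord(True, True, datetime.now(timezone.utc).isoformat())
    return None
```

Note the design choice: the record preserves *when* and *on what showing* consent was given, because the business, not the consumer, will bear the burden of proving consent later.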

However, the (federal) E-SIGN Act and the (state) Uniform Electronic Transactions Act require that if any other statute, regulation, or rule requires that a consumer be given a document or disclosure in writing, then in order for a consumer to effectively agree to receive it in electronic format, the consumer must affirmatively consent after having been given very specific disclosures. In some circumstances, it may be difficult to identify a specific law requiring a written disclosure in connection with the contemplated transaction. However, there are a number of disclosure requirements contained within the millions of pages of law affecting consumer transactions. Just because you can't think of one off the top of your head doesn't mean none exist. For this reason, I almost always advise my clients to obtain affirmative consent from consumers for online agreements.

In this post, I'm going to give you a real-world example of a situation in which obtaining a proper consumer electronic consent could save a lot of money.

ABC Corp. (fictional) sells products and services to consumers in North Carolina through its website and the telephone. It has collected information from tens of thousands of consumers over the past few years, and stores that information on its database on its own server. Included in the information are the consumers' credit card numbers (so that regular customers will not have to provide all of their information with every order). The credit card numbers are not encrypted on the database. ABC Corp. becomes aware of an incident of unauthorized access to its database. Customer information likely has been accessed, and the available information indicates that the person who accessed the information has nefarious intent.

Under North Carolina law, ABC Corp. is obligated to notify each consumer of the data security breach. The North Carolina Identity Theft Protection Act says that ABC Corp. can notify the consumers via email only if the consumer's consent has been properly obtained in accordance with the E-SIGN Act. If ABC Corp. has records of consumers' email addresses, but has not obtained the proper consent to provide subsequent legally-mandated notices by email, ABC Corp. cannot provide the notice by email. Instead, the Identity Theft Protection Act requires that the notice be provided by mail (if mailing addresses are available). Therefore, because ABC Corp. failed to obtain consumer consent in the proper way at the outset, the cost of responding to a subsequent data security breach will be tens of thousands of dollars more as a result of printing and postage alone.
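The arithmetic behind "tens of thousands of dollars" is simple to sketch. The unit costs below are assumptions for illustration only (actual printing and postage rates will vary), but they show how quickly mailed breach notices dwarf the near-zero marginal cost of email:

```python
# Hypothetical figures: 50,000 affected consumers, $0.75 per mailed
# notice (printing + first-class postage), effectively $0 per email.
consumers = 50_000
cost_per_letter = 0.75
cost_per_email = 0.00

mail_total = consumers * cost_per_letter
email_total = consumers * cost_per_email
savings = mail_total - email_total
print(f"Mailed notices: ${mail_total:,.2f}; saved by proper e-consent: ${savings:,.2f}")
```

Under these assumptions, the mailed notices cost $37,500 that a properly obtained electronic consent would have avoided, and that is before counting staff time and envelope stuffing.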

This is just one example of the many ways in which handling consumer consent correctly at the start of a relationship with the consumer can pay off later.

Monday, September 15, 2014

Panel Discussion on Data Security, Breach Response, and Emergency Management


I was honored to be asked to participate in a panel discussion on business security issues with some top thought leaders in North Carolina. Topics included data security, risk management, breach response, and emergency management. I enjoyed hearing the insights of these three smart, accomplished people. Please feel free to view the video on YouTube and share it with others who might be interested.


Tuesday, September 9, 2014

Social Media for Financial Institutions: Maximizing the Rewards while Minimizing the Risks

(This article was published in the Carolina Banker magazine by the North Carolina Bankers Association in the Fall 2014 issue.)


Social Media for Financial Institutions: Maximizing the Rewards while Minimizing the Risks



By now almost everyone knows that social media has tremendous potential for businesses of all kinds to connect with important constituent groups. The average American spends 37 minutes per day on social media. Facebook alone has more than 1.2 billion users, and a quarter of them log in more than five times per day. Twitter has twice as many users as the United States has citizens. In addition to marketing products and services to customers and prospects, banks now use social media to obtain feedback and market intelligence, recruit and engage employees, and enhance shareholder relationships. These attractive opportunities do not come without risk; fortunately, however, these risks can be mitigated by an effective social media compliance and risk management program.

Regulatory Attention

A few months ago, the Federal Financial Institutions Examination Council ("FFIEC"), which includes representatives from federal and state regulators, issued guidance for banks regarding the legal, operational and reputational risks associated with social media. Soon, examiners will likely expect banks to have written risk assessments and social media policies and procedures.

The FFIEC guidance addressed many — but not all — of the outstanding banking law questions about social media. Most of the regulations the guidance discusses involve the nature and placement of consumer disclosures, recordkeeping, and other straightforward issues. The guidance also raised more complex issues, however, such as the risk of disparate impact, an anti-discrimination legal theory favored by the Consumer Financial Protection Bureau. For the questions the guidance left open, good, practical judgment will be needed to apply existing regulations in a new environment. For example, customer privacy issues can arise in social media that require banks to respond to customer communications differently than other businesses might.

Importantly, the guidance states that even banks that do not have any official social media accounts should still consider the risks posed by social media, document the risk assessment, and adopt any policy needed to address identified risks. Risks faced by banks that do not have an official social media account include reputational risks of negative comments and complaints by customers, as well as risks posed by employees' use of social media. The regulators have made clear that a bank may be held responsible for an employee's social media use if it appears the employee is acting on behalf of the bank and the bank has not taken adequate steps to address the risk. (How certain are you that none of your bank's employees are talking about the bank's products and services on their own social media accounts?)

Reputation Management

A widespread concern among bankers about social media is the potentially damaging effect of publicly-aired customer complaints. This is a real risk, but it is important to note that it is present whether or not a bank has a social media presence. Disgruntled customers can — and do — air grievances on social media and customer review websites whether or not you have a Facebook page or Twitter profile. If your bank has a presence on social media, however, you may have a better opportunity to identify and address those grievances.

Both legal and practical considerations come into play in determining whether, and how, to respond to a public complaint. Well-crafted social media policies and procedures, coupled with a well-trained and savvy team, can effectively handle most public complaints, and may achieve net-positive outcomes. When the commenter can be identified, the recommended approach is usually to simply ask the customer to remove the offending post. If a commenter refuses to remove a false, misleading, or abusive comment voluntarily, you may resort to dealing with the platform provider (e.g., Facebook, Twitter, Google, Yelp, etc.). Each platform has terms and conditions that establish unique criteria for removing posts. Understanding these criteria can help you draft a request to the platform that is more likely to result in the removal of an offending comment. A letter sent from a knowledgeable lawyer on behalf of the bank is often helpful.

Spoofing

Social media presents opportunities for others to impersonate or "spoof" the bank. However, this can happen whether or not a bank is active on social media, and in fact, by being active in social media, a bank can actually reduce the likelihood and effectiveness of these nefarious efforts. Fortunately, most social media platforms are generally quick to shut down fraudulent accounts.

Promotions

Social media and promotional contests seem to go together like peanut butter and jelly. They can be useful tools to encourage social sharing of your bank's content. As with any promotional contest, various state and federal laws must be observed, and liability and reputational risks must be mitigated. Also, some social media platforms restrict certain types of promotions. It may be worthwhile to consult a knowledgeable lawyer before beginning any contest or drawing.

Developing a Policy, Procedures, and Implementation Team

The size and complexity of a social media program should be commensurate with the degree of the bank's involvement in social media. For example, a bank that uses only one platform (e.g. Facebook) should have a more focused program. A bank using several media (e.g., Facebook, LinkedIn, Twitter, Yelp, Google +, and YouTube) should have more comprehensive procedures.

The FFIEC advises that a social media program should be designed with participation from specialists in compliance, technology, information security, legal issues, human resources, and marketing. Ideally, a team will be small, with individuals whose expertise spans more than one of these categories. After a program is crafted, it can be implemented by a smaller team or an individual, with support from specialists as necessary.

A recent survey revealed that banks in the southeastern United States have the lowest rates of social media participation in the nation. In some other regions of the country, banks are more than three times as likely to have a social media presence. Given the size of the potential audiences and the high level of user engagement, it seems likely that more banks in our region will implement or expand social media strategies soon. Though all risks cannot be eliminated, a well-crafted plan can manage the risks while maximizing the rewards.

Saturday, August 16, 2014

Directors Should Be Paying Attention to Data Security Practices

Directors should take an active role in managing data security risks rather than leaving it up to management and IT staff, according to recent remarks by SEC Commissioner Luis Aguilar.

Commissioner Aguilar recently delivered a speech at the New York Stock Exchange in which he emphasized that cybersecurity has become a “top concern” and pleaded with corporate directors to “take seriously their obligation to make sure that companies are appropriately addressing those risks.”

The Commissioner reported that U.S. companies experienced a 42% increase from 2011 to 2012 in the number of successful cyber-attacks.  He also pointed out a number of recent high-profile incidents, including the following:
  • The October 2013 cyber-attack on the software company Adobe in which data from more than 38 million customer accounts was breached;
  • The December 2013 cyber-attack on Target, in which the payment card data of approximately 40 million Target customers and the personal data of up to 70 million Target customers was breached;
  • The January 2014 cyber-attack on Snapchat, a mobile messaging service, in which a reported 4.6 million user names and phone numbers were leaked;
  • The multiple cyber-attacks against several large U.S. banks, in which their public websites have been shut down for hours at a time; and
  • The numerous cyber-attacks on securities exchanges. (According to a 2012 global survey of 46 securities exchanges, 53% reported experiencing a cyber-attack in the previous year.)
Commissioner Aguilar said that cybersecurity has become a "top concern" of American companies over a relatively short period of time.  That's good news.  But, according to the Commissioner, directors themselves should be involved in addressing cybersecurity risks.

The essence of Commissioner Aguilar's comments related to the board’s role in corporate governance and overseeing risk management.   He pointed out that since the financial crisis, there has been an increased focus on how boards address risk management.  While acknowledging that primary responsibility for risk management has historically belonged to management, he emphasized that boards are responsible for ensuring that the corporation has established appropriate risk management programs and for overseeing how management implements those programs.  Not surprisingly, he mentioned the SEC's 2009 rule change which calls for the public disclosure of the board's role in risk management (usually in a proxy statement).

In addition to the SEC's rule changes, proxy advisory firms appear to be applying pressure to boards to focus on data security risks.  A prominent proxy advisory firm has recommended that shareholders vote against the election of most of Target's directors because of their alleged “failure…to ensure appropriate management of [the] risks” resulting in Target’s December 2013 breach.

The result of these influences is encouraging: Boards have begun to assume greater responsibility for overseeing the risk management efforts of their companies, according to evidence cited by the Commissioner.  For example, according to a survey of 2013 proxy statements filed by S&P 200 companies, the full boards have almost universally assumed responsibility for the risk oversight of their respective companies.

The Commissioner concluded by expressing his view that "board oversight of cyber-risk management is critical to ensuring that companies are taking adequate steps to prevent, and prepare for, the harms that can result from such attacks. There is no substitution for proper preparation, deliberation, and engagement on cybersecurity issues."

You can read the Commissioner's full remarks here.



(c) Matt Cordell 2013

Tuesday, July 15, 2014

North Carolina Has a New Education Privacy Law

A new education privacy bill was signed into law earlier this month, and became effective immediately.  Formally titled "An Act to Ensure the Privacy and Security of Student Educational Records," the Act (Senate Bill 815, Session Law 2014-50) contains a number of privacy-related provisions.  This post summarizes some of the key aspects of the Act.

Prohibited Information
 
The new statute prohibits schools from collecting or storing the following categories of data:
  • biometric information
  • political affiliation
  • religion
  • voting history 
The term "biometric information" does not appear to be defined in the Act or in the larger Article or Chapter.  I assume it covers fingerprints, retina scans, and DNA records.  (It is not perfectly clear to me where the line is drawn, however, between "biometric information" and other identifying information.)
 

Restrictions on Information Disclosure

The Act also prohibits schools from sharing "personally identifiable student data," which includes, but is not limited to, the following:
  • A student's name
  • The name of the student's parent or other family member
  • An address of the student or student's family
  • A personal identifier, such as the student's Social Security number or unique student identifier
  • Other indirect identifiers, such as the student's date of birth, place of birth, and mother's maiden name
  • Other information that, alone or in combination, would allow a reasonable person to identify the student with reasonable certainty
  • Other information requested by a person who the Department of Public Instruction or local school administrative unit reasonably believes knows the identity of the student to whom the education record relates
However, "personally identifiable student information" does not include "directory information" if the local board of education has provided parents with notice of an opportunity to opt out of the disclosure of that information [consistent with the Family Educational Rights and Privacy Act ("FERPA," 20 U.S.C. § 1232g)].


Parental Rights and Notices


The Act requires local school boards to provide parents, on an annual basis, with information about how state and federal privacy laws and regulations apply to school records and student data, including parental rights and opt-out opportunities relating to disclosure of directory information (as provided under FERPA) and surveys (covered by the Protection of Pupil Rights Amendment, 20 U.S.C. § 1232h).

New Rules and Procedures

The statute requires the State Board of Education to create more clearly defined rules and procedures for the safeguarding and use of student data.  Among other things, the statute requires the State Board of Education to develop a detailed data security plan that includes the following:
  • Guidelines for authorizing access to the student data system and to individual student data, including guidelines for authentication of authorized access
  • Privacy compliance standards
  • Privacy and security audits
  • Breach planning, notification, and procedures
  • Data retention and disposition policies
  • Data security policies, including electronic, physical, and administrative safeguards such as data encryption and training of employees
Covered Schools

The statute adds language to Article 29 of Chapter 115C of the General Statutes, which applies to public elementary and secondary schools.  Therefore, private schools, colleges, and universities appear to be unaffected.  

Widespread Support

The bill arose from a recommendation by the Joint Legislative Oversight Committee on Information Technology, and was unanimously approved by both houses of the General Assembly.

More Information

You can read the full text of the new statute here.



Tuesday, July 1, 2014

What's going on with mugshot publication in North Carolina?

Publishing mugshots has become big business, and is now attracting legislative scrutiny.  Critics point out that an innocent person can be arrested and photographed before the charges are dropped, only to find his or her mugshot in the local press or on the internet.  The mugshot might be seen by a potential employer, customer, girlfriend's dad, etc., resulting in reputational and financial loss.  In the most egregious cases, internet mugshot publishers charge a fee to remove an innocent person's mugshot from their website.

A number of states have enacted laws to curb perceived abuses relating to publication of mugshots, and North Carolina's General Assembly is currently considering similar legislation.

Rep. Tim Moffitt (Buncombe County) introduced a bill to prevent any mugshots for misdemeanor charges (not felonies) from being published unless and until the accused person is convicted.  Moffitt explained that "publishing pictures for all the world to see of people arrested for charges that may not be sustained just serves no public purpose.  It’s not journalism and it's not fair – it’s sensationalism to drive web traffic that plays to the worst part of our natures."  Moffitt's bill would create the following new statutory language:
 
G.S. 15A‑502 is amended by adding a new subsection to read: "(f) A photograph of a person charged with the commission of a misdemeanor or felony taken by a law enforcement officer or agency pursuant to this section is confidential and exempt from disclosure as a public record under Chapter 132 of the General Statutes, except that the photograph may be disclosed to the public if (i) the person is charged with a felony or (ii) the officer or agency determines that release of the photograph is reasonably necessary to secure the public's safety. Any photograph exempt from disclosure under this subsection shall become public upon conviction of the person charged."
Last week, the bill was rewritten and placed in Senate Bill 734.  The new language would require the Administrative Office of the Courts and the Department of Public Safety to study this issue and report back to the General Assembly before the end of 2014.  The revised language reads as follows:
 
The Administrative Office of the Courts and the Department of Public Safety shall study whether or not photographs of individuals charged with a crime should be a public record, including the admissibility of such photographs, posting on the Internet of such photographs prior to conviction, and any other matters related to the use of photographs of charged individuals. The Administrative Office of the Courts and the Department of Public Safety shall report, with recommendations, to the Joint Legislative Oversight Committee on Justice and Public Safety on or before December 31, 2014.
Similar bills in each house would be more limited, focusing on the mug shot publishers who charge to remove images.  It's unclear at this point which, if any, of the bills will be enacted.

Some commentators have little confidence any legislative action will have much effect.  Mugshot publishers have already proven adept at avoiding similar state laws. 


Monday, May 26, 2014

An Introduction to the Law of Electronic Signatures and Electronic Records in North Carolina (Part 3)

In the first part of this series, we explored the history of state and federal legislation governing electronic signatures and records, explained the key terminology, and addressed the fundamental principles undergirding the laws. In the second part, we covered the consent requirement, retention, and authentication. In this third installment, we address the exemptions, exclusions, and exceptions to the general rules.

Exemptions, Exclusions, Exceptions

The purpose of the UETA and the E-SIGN Act was to ensure that electronic signatures and records were given the same legal status as ink signatures and paper records.  However, each piece of legislation contains a number of exceptions.  

One of the most important things to know about the UETA and the E-SIGN Act is the areas in which they do not apply.

The first exception relates to requirements in other laws for a particular method of delivery. If another law requires a record (i) to be posted or displayed in a certain manner, (ii) to be sent by a specified method, or (iii) to contain information that is formatted in a certain manner, the other law controls. For example, if another law requires transmittal by First Class or Certified USPS Mail, you may not rely upon email (but you may email in addition to USPS).

Another important exception to the general validity of electronic signatures and records is for the requirements of the various laws governing the creation and execution of wills, codicils, or testamentary trusts.  It would be a grave mistake to attempt to execute a will using an electronic signature.   


Exemptions from the UETA and E-SIGN Act can become confusing when some--but not all--of an area of law is exempt.  Most of the Uniform Commercial Code is exempt from the UETA and the E-SIGN Act, but the following remain subject to UETA and E-SIGN (and therefore electronic signatures and records are valid):
  • Sales of Goods (UCC Article 2)
  • Leases of Goods (UCC Article 2A)
  • G.S. 25-1-306 (an authenticated record of the settlement of a claim involving the sale of or lease of goods) (This last carve-out is found in the UETA only--not in the E-SIGN Act.)
Laws governing adoption, divorce, or other matters of family law are exempt from the E-SIGN Act (though the UETA does not specifically exclude them).
  
In addition, the UETA and E-SIGN Act do not apply to the following:
  • cancellation of utility services;
  • any notice of default, acceleration, repossession, foreclosure or eviction, or the right to cure, under a loan or lease for a primary residence;
  • cancellation of health or life insurance benefits;
  • any notice of a product recall; or
  • the transportation or handling of hazardous materials.
 
The E-SIGN Act and UETA may, or may not, apply to transactions involving government entities.  In North Carolina, transactions with government entities are controlled by the Electronic Commerce in Government Act (G.S. Ch. 66, Article 11A), which allows for the use of the UETA as an alternative to the more complex (and secure) procedures described in that Article (which establishes the role of the “certification authority,” a person authorized by the Secretary of State to vouch for the relationship between a signatory and a public agency).

By understanding the circumstances in which electronic signatures and electronic records may--and may not--be used, as well as the requirements imposed by law, we can effectively utilize electronic signatures and records and enjoy the benefits of technology with confidence in their validity and enforceability.




Sunday, May 25, 2014

Why California's Online Privacy Laws Matter to Businesses in Every State

 
People sometimes assume that the laws of states in which they do not have a physical presence do not apply to them.  Businesses and other organizations that engage with the public online, however, may be subject to the rules of the states in which their users reside.  I have previously written about how a few states have their own website (or web application) privacy rules, and the widespread view that California's are the most significant. 
 
Because California's online privacy laws are so important to organizations across the country, it is important to monitor relevant legal developments in California, including the actions of California's Attorney General.  This post summarizes recent developments in California affecting website operators and application operators.

  • Early in 2012, California's Attorney General reached a voluntary resolution with Amazon, Apple, Facebook, Google, Hewlett-Packard, Microsoft, and Blackberry, requiring that mobile apps provide privacy policies that users could find in a consistent location before downloading an app.

  • In October of 2012, California's Attorney General sent letters to approximately 100 mobile app developers and companies that were not in compliance with the California Online Privacy Protection Act and gave them 30 days to comply.

  • In December of 2012, the Attorney General filed an enforcement action against Delta Airlines over its mobile application privacy policy statement.

  • In 2013, Attorney General Harris issued Recommendations for the Mobile Ecosystem, which provided app developers with recommendations to develop privacy policies and procedures.


  • In February of 2014, California's Attorney General issued a guide, Cybersecurity in the Golden State, intended to help organizations protect against, and respond to, data breaches and other cyber risks. 

Any organization with an online presence would do well to keep an eye on California's online privacy laws and enforcement actions.  Check the North Carolina Privacy & Information Security Law Blog from time to time for updates on this and other important legal updates.

Saturday, May 24, 2014

An Introduction to the Law of Electronic Signatures and Electronic Records in North Carolina (Part 2)

In the first part of this series, we explored the history of state and federal legislation governing electronic signatures and records, explained the key terminology, and addressed the fundamental principles undergirding the laws. In this part, we address some of the key requirements of the applicable laws.

The Consent Requirement

A fundamental requirement of the E-SIGN Act and UETA is the consent requirement. The E-SIGN Act and UETA do not require any party to a transaction to accept or agree to use electronic signatures or documents; they merely ensure the enforceability of such documents and signatures if the parties agree to use them.
The consent of a commercial party may sometimes be inferred. Many consumer electronic transactions, however, require affirmative consent. To avoid the possibility of a consumer transaction slipping through the cracks, it may be prudent to include an opt-in provision in all electronic transactions without regard to whether they are believed to be consumer transactions.
Another aspect of the consent issue is that although electronic transactions are governed by UETA and the E-SIGN Act, the parties may opt out of many aspects of those laws. In other words, the parties may vary, waive or disclaim most of the provisions of UETA or E-SIGN Act by agreement, if both parties are "commercial" parties. This is not the case when one party is a consumer.  

Consumer Consent

The North Carolina version of the UETA differs from the model UETA drafted by the NCCUSL. It contains heightened consumer protection provisions, consisting of disclosures and procedural steps, that were not in the original UETA. These provisions were taken from the E-SIGN Act, although they do not match the E-SIGN Act perfectly. Importantly, if a party to an electronic transaction is a consumer in North Carolina, North Carolina law is deemed to apply regardless of any other contractual provision purporting to apply another state's laws.
When one party to an electronic agreement is a consumer, you must ask whether any statute, regulation, or rule of law requires the transaction be in writing or requires any information relating to the transaction be in writing. The answer is usually "yes." There are so many consumer protection laws and other laws requiring written disclosures or contracts, that it is easy to overlook one. The best practice is often to assume that some requirement of this sort applies to all consumer transactions. In these cases, the customer must be given certain disclosures prior to entering into an electronic transaction, including statements regarding various rights, as well as hardware and software requirements. It should be noted that North Carolina's UETA and the E-SIGN Act do not match perfectly in this regard, and care should be taken to satisfy both. 
When any law requires a consumer be given something related to the transaction in writing, the consumer's consent must reflect the following procedural requirements:
  • If the consumer provides an electronic signature using the other party’s equipment, such as a signature capture pad or a personal computer, the consumer must be given a hard copy of the relevant documents.
  • If any other law requires future notices to the consumer (such as periodic statements, change in terms, etc.), they can be provided electronically, such as by email, but only if the consumer has reasonably demonstrated his or her ability to receive and access the notice in the electronic form that will be used to provide the information that is the subject of the consent. The burden of proof is on the non-consumer party, and a built-in assertion of ability to access might not suffice.
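The "reasonably demonstrated" requirement in the second bullet can be sketched as a simple gate. This is an illustrative Python sketch, not language from the E-SIGN Act or UETA: the idea is that the consumer must actually retrieve a test document in the format that will later be used (for example, open a PDF and report back a code printed inside it), and a mere self-asserted checkbox ("I can open PDFs") may not carry the business's burden of proof.

```python
def may_deliver_electronically(retrieved_test_document: bool,
                               self_asserted_only: bool) -> bool:
    """Gate electronic delivery on a demonstrated ability to access
    the format that will be used. A bare self-assertion (e.g., a
    checkbox claiming the consumer can open the format) fails the
    gate; actually retrieving a test document in that format passes."""
    if self_asserted_only:
        return False  # built-in assertion of ability may not suffice
    return retrieved_test_document
```

In practice, a business might email a short PDF containing a confirmation code and enable electronic statements only after the consumer enters that code, creating a record of the demonstration.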

 

The Retention Requirement

If any law requires an electronic document be sent, provided, disclosed, retained or “in writing,” then an electronic form may be used only if the document is “capable of retention.” An electronic record is capable of retention if: (a) the recipient can print or store the electronic record; or (b) it is capable of being accurately reproduced for later reference by all parties entitled to access it. It can be accurately reproduced if it correctly reflects the information set forth in the record at the time it was first generated. The retention requirement applies to both consumer and non-consumer contexts. This does not, however, impose any new requirement to store records if the law does not otherwise require record retention.
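The "accurately reproduced" test can be operationalized by fingerprinting a record when it is first generated and comparing the fingerprint whenever a copy is later retrieved. A short sketch, with hashing offered as one common engineering approach rather than a statutory requirement:

```python
import hashlib

def fingerprint(record_bytes: bytes) -> str:
    """Hash an electronic record at the time it is first generated."""
    return hashlib.sha256(record_bytes).hexdigest()

def accurately_reproduced(stored_bytes: bytes, original_fingerprint: str) -> bool:
    """A retrieved copy 'correctly reflects the information set forth in
    the record at the time it was first generated' only if its hash
    matches the fingerprint taken at generation time."""
    return fingerprint(stored_bytes) == original_fingerprint
```

A mismatch does not tell you what changed, only that something did, so the original fingerprint should itself be stored somewhere the record cannot silently overwrite.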
Even though the UETA and E-SIGN Act are technologically neutral—that is, no particular format is given preferential treatment—if the format is proprietary, it must be accessible to all who are entitled to access the record. For this reason, widely-available formats are highly recommended. For example, it is easy to provide a link to the free Adobe Acrobat Reader, thereby making the PDF format accessible to a consumer. If a provider of a document chooses to use a proprietary format, the obligation is on that party to ensure that the other party can access it.

Authentication

For millennia, people have been denying that a signature or mark was made by them. Clearly, forgeries happen, and just as clearly, people try to get out of agreements they have made by claiming never to have made them. There is nothing new about this problem.
In the context of an electronic signature, technology could make the authentication easier or more difficult. Obviously, it is easy to type anyone's name into the signature block of a document, but that does not mean it is always hard to prove who did the typing. (That is the magic of metadata.)

The laws provide that the attribution of a signature or record to a person may be shown in any manner, including "a showing of the efficacy of any security procedure." Context can also be used. If you meet with someone in person, and they give you their email address, and you then email back and forth with them, you have evidence that an emailed signature is authentic.
In sum, while electronic signatures need to be capable of authentication, the issue is really no more problematic than in the realm of paper documents.
[In Part 3 of this series, we will look at the exceptions, exemptions, and exclusions from the UETA and the E-SIGN Act.]


Image by Jomphong via freedigitalphotos.net

New Guidance on the New Website Privacy Requirements

Those of you who read this blog regularly know that I've previously written about how a few states have their own website privacy rules, and expressed the widely-held view that California's are the most rigorous. I have also explained that websites directed at U.S. audiences generally need to comply with California's strict rules.
Image source material  Truthout / Foter.com
A few weeks ago, I wrote about a new website privacy law that amends California's existing Online Privacy Protection Act, which became effective at the first of the year.
 
Last week, California's Attorney General published some guidance to aid organizations in complying with the recent changes in California privacy law.   The portion of the guidance that relates to the newest requirements offers the following general recommendations:
  • Make it easy for a consumer to find the section in which you describe your policy regarding online tracking by labeling it, for example: "How We Respond to Do Not Track Signals," "Online Tracking" or "Do Not Track Disclosures." 
  • Describe how you respond to a browser’s Do Not Track signal or to other such mechanisms. This is more transparent than linking to a "choice program."
  • State whether other parties are or may be collecting personally identifiable information of consumers while they are on your site or service.
  • Explain your uses of personally identifiable information beyond what is necessary for fulfilling a customer transaction or for the basic functionality of an online service.
  • Whenever possible, provide a link to the privacy policies of third parties with whom you share personally identifiable information.
More specific recommendations are included in the guidance relating to these and other aspects of California privacy law.

Organizations that have a nationwide audience should update their website privacy policy statements in light of the new rules and guidance, if they have not already.

Sunday, May 18, 2014

An Introduction to the Law of Electronic Signatures and Electronic Records in North Carolina (Part 1)

Image by Jomphong via freedigitalphotos.net
Among the hottest topics these days are electronic signatures and electronic records. In this series of articles, we will examine the key aspects of the state and federal laws governing their use in the private sector. This series will focus on the salient legal aspects of:

• Electronic Signatures;

• Electronic Records; and

• Electronic Notarizations.

The Benefits of Electronic Signatures and Records

The ever-growing interest in electronic signatures and electronic records is based on potential benefits that can be compelling in some circumstances. Electronic signatures and records offer the promise of convenience – rapid execution, storage, and recall (including by searchable text) – and accessibility – the ability to access records twenty-four hours per day, seven days per week, three hundred sixty-five days per year. There is also the potential for massive cost savings, including lower transaction and storage costs.

The Legal Landscape

There are several sources of law that apply to electronic signatures and records in North Carolina. The primary state law applicable to electronic signatures and records in North Carolina is our version of the Uniform Electronic Transactions Act ("UETA") (codified at N.C.G.S. Ch. 66, Article 40). Electronic signatures were first given explicit legal recognition by state statutes, beginning with California, in the 1990s. A clear need for uniformity among the emerging state laws soon became apparent. The National Conference of Commissioners on Uniform State Laws, the body responsible (together with the American Law Institute) for the Uniform Commercial Code, took up the task, creating the Uniform Electronic Transactions Act in 1999. Forty-seven states, the District of Columbia, Puerto Rico, and the Virgin Islands have now adopted the Uniform Electronic Transactions Act. Three states, Illinois, New York and Washington, have not adopted the UETA, but have their own statutes pertaining to electronic transactions.


Infographic credit: National Conference of State Legislatures

North Carolina's Electronic Commerce in Government Act (N.C.G.S. Ch. 66, Article 11A) applies when a governmental entity is involved. The ECGA allows for use of UETA standards, but also provides for an alternative employing more secure procedures, which involve a “certification authority”—a person authorized by the Secretary of State to vouch for the relationship between a signatory and a public agency. This is a more complex method of electronic contracting.

The UETA acknowledges that an electronically-notarized signature or record is valid, but it does not address how an electronic notarization works. For that, we look to North Carolina's Electronic Notary Public Act (N.C.G.S. Ch. 10B, Article 2). An electronic notarization is an official act (acknowledgement, verification, etc.) performed by an electronic notary public using their electronic seal and electronic signature on electronic documents. Before performing notarial acts electronically, a notary must register with the Secretary of State. (See N.C.G.S. 10B-106). Registration requires a three-hour training course, an examination, and a $50 fee. An electronic notarization requires the signer to be in the notary’s presence (telephones, computers, videoconferences, etc. do not qualify), so the advantage over manual signature and notarization is minimal in most circumstances. The anecdotal evidence I have seen indicates that adoption and use of electronic notarization is minimal at this time.

Congress noted the trend of states adopting the UETA and decided that a federal statute was needed to provide for uniformity and to clearly govern this area in the context of interstate commerce, so it enacted the Electronic Signatures in Global and National Commerce Act (better known as the "E-SIGN Act") in mid-2000. Fortunately for us, the E-SIGN Act and the UETA are very similar, and employ much of the same terminology.

The existence of both state and federal laws covering the same issues raises the question whether the UETA or the E-SIGN Act applies in any given instance. The answer is more complicated than perhaps we would wish. Because Congress acted on the E-SIGN Act after states had begun adopting the UETA, it included a conditional reverse preemption provision, which effectively says that if a state adopts the UETA and includes provisions more protective of consumers than the E-SIGN Act, the UETA controls; otherwise, the E-SIGN Act controls. Therefore, my recommendation usually is to comply with the stricter of the two in any given instance.

Key Terminology

Understanding the terminology is critical to applying the E-SIGN Act and UETA. Under both acts, the term "electronic signature" means “an electronic sound, symbol, or process, attached to or logically associated with a contract or other record and executed or adopted by a person with the intent to sign the record.” The term is deliberately made as broad as possible, but the key, just as under the common law of signatures, is an act plus intent. Some common examples help illustrate the breadth of the term:

• a manual signature scanned to an image (e.g., to PDF);
• a name typed into an email message (but not always in NC);
• the click of an “Accept” button;
• a voicemail message (for non-consumers, and even then only if the intent to be bound is clearly shown); and
• a “digital signature” (using algorithms, public keys, and private keys).

The reference to digital signatures raises the oft-repeated question whether digital signatures and electronic signatures are the same thing. They are not: a digital signature is a particular kind of electronic signature in which cryptographic technology (key pairs and encryption) is used to verify the signer and the integrity of the record. Not all electronic signatures are digital signatures, in the same way that not all pens are fountain pens.
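The key-pair mechanism can be illustrated with textbook RSA. The sketch below uses deliberately tiny, cryptographically worthless parameters purely to show the idea: the private key is applied to a digest of the record to produce the signature, and anyone holding the public key can check it:

```python
import hashlib

# Textbook-RSA parameters -- far too small for real use, illustration only.
P, Q = 61, 53
N = P * Q          # public modulus (3233)
E = 17             # public exponent
D = 2753           # private exponent: (E * D) % ((P - 1) * (Q - 1)) == 1

def _digest(message: bytes) -> int:
    """Reduce the message to a fixed-size value (a SHA-256 hash mod N)."""
    return int.from_bytes(hashlib.sha256(message).digest(), "big") % N

def sign(message: bytes) -> int:
    """The signer applies the PRIVATE key to the digest."""
    return pow(_digest(message), D, N)

def verify(message: bytes, signature: int) -> bool:
    """Anyone can apply the PUBLIC key; a match shows both who signed
    (the private-key holder) and that the record was not altered."""
    return pow(signature, E, N) == _digest(message)
```

Real systems use key sizes in the thousands of bits, padding schemes, and certificate authorities to bind a key to an identity, but the verify-with-the-public-key structure is the same.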

Another key term is "electronic record", which means "a contract or other record created, generated, sent, communicated, received, or stored by electronic means." Essentially the same definition is used under federal and state law. Again, the term is defined as broadly as possible. Common examples will illustrate the breadth of the term:

• Scanned images of physical (paper) documents
• Electronically-generated documents (e.g., MS Word .doc files)
• Electronic records of actions (e.g., file log, history)

Fundamental Principles
 

The fundamental principles undergirding the E-SIGN Act and the UETA are fairly straightforward, and can be summarized in two very succinct statements:

• A record or signature may not be denied legal effect or enforceability solely because it is in electronic form.

• If a law requires a record to be "in writing", an electronic record satisfies the law provided it complies with the E-SIGN Act or UETA and the other requirements of the applicable law.


Two additional key principles help explain some provisions of the E-SIGN Act and UETA for us. The first is technological neutrality. No format is given preferential treatment by the laws. However, if a format is proprietary, it must be accessible to all who are entitled to access the electronic record. Some formats are better-established, and therefore more "accessible" than others (e.g., Adobe's PDF). The second key principle is flexibility. By avoiding a preference for any particular technology, the UETA and E-SIGN Act facilitate technological innovation and limit the need for updates to the respective laws.


[In Part 2 of this series, we will look at the specific requirements of the state and federal laws.]

Understanding the Law of Electronic Signatures and Electronic Records

I recently delivered a continuing education presentation on behalf of the North Carolina Bar Association regarding the legal aspects of electronic signatures and records.

 For those who are interested and were unable to attend, I am providing the slides here.  


Saturday, May 17, 2014

Before the Aftermath: How to Prepare NOW for the Possibility of a Future Privacy Lapse or Data Security Breach

image by cohdra
Despite a greater focus on prevention than ever before, privacy lapses and data security breaches continue to increase as a source of financial, legal, and reputational risk for a wide array of businesses. According to the Identity Theft Resource Center, a nonprofit group that tracks data security breach reports, there were 614 data security breaches reported in 2013, covering almost 92 million records.

The litany of recent breaches in the headlines includes names of venerable brands and fast-growing technology companies. If even large businesses with significant resources and tech-savvy companies cannot always prevent data security breaches, what are the odds your company will be 100% successful in avoiding a breach? It seems almost irresponsible these days to assume that you can stop every attack and prevent every oversight indefinitely. Instead, every business must face the reality that a breach is possible, and take steps now to address the possibility. This article will focus not on prevention, but upon preparing for an effective response.

Notice Requirements

In the event of a significant breach, approximately 46 states and the District of Columbia have laws that require a business suffering a breach to notify the affected customers, the state attorney general, and the consumer reporting bureaus.

One result of the notices required by these laws is that watchdog groups are better able to monitor breaches. However, this is not the entire picture. Breaches affecting a small number of persons may not be required to be reported and are, therefore, not included in the publicly-available statistics. For example, under North Carolina's Identity Theft Protection Act, only breaches affecting 1,000 or more individuals must be reported to the North Carolina Attorney General and consumer reporting bureaus. Many commentators believe that the majority of data breaches in North Carolina and elsewhere go unreported for this and other reasons.

A Costly Matter

Data security breaches can be very expensive. A study of insurance claims in 2013 conducted by the risk management firm NetDiligence showed that the average total reported cost of a security breach to a business was $954,253, with average legal fees of $574,000. The same study found that 29.3% of data breach-related insurance claims were made by businesses in the health care sector, with 15% in the financial services sector.

Preparing for the Possibility of a Breach

Given the immense financial and reputational risks of a privacy violation or data security breach, and the near impossibility of absolute prevention, it is important for each business to prepare in advance for the possibility of a breach. Prudent breach preparation can help a business to more effectively respond to a breach, mitigate losses and liability, and demonstrate compliance with applicable laws. The following categories of measures are strongly suggested:

• Conduct a Risk Assessment and Document It.

Although breach prevention measures are beyond the scope of this article, evidence of a reasonable risk assessment can be useful in the aftermath of a breach to document that commercially-reasonable steps were taken to identify vulnerabilities and weigh the costs of addressing them.

• Implement Commercially-Reasonable Policies and Security Measures.

After identifying the categories of sensitive information held and the likely sources of risks, a business should then take and document reasonable measures to prevent them. This should include the adoption of effective technological standards and well-thought-out policies and procedures. The scope and rigor of the measures will depend upon the risk profile and resources of the business. Although primarily intended to prevent a breach, the existence and documentation of a reasonable prevention program can help to mitigate liability following a breach.

• Review Policies and Procedures Periodically.

At regular intervals, policies and procedures should be re-evaluated to ensure they remain current, and revised as needed to reflect changes in the risk profile and landscape. Again, the scope and frequency of these reviews will depend upon the risk profile and resources of the business.

• Prepare a Response Plan.

Just as every business should have a documented disaster recovery plan, every business that holds sensitive data should have a documented breach response plan ("Response Plan") ready (and tested) to guide the business's efforts in the hours and days following a breach. Assembling a Response Plan from scratch in the immediate aftermath of a breach wastes valuable time and risks overlooking important matters in the rush to handle an emergency.

The Response Plan need not address every conceivable contingency, but should contain the basic, universal response protocols that will form the basis of the business's response. The Response Plan should be created with the input of security personnel, IT personnel, legal counsel, and senior management.

• Select a Response Team.

Every business is composed of individuals with unique strengths. In advance of a privacy or security incident, each business should determine who is best suited to perform each task addressed in the Response Plan. Those individuals should be assigned to a response team and trained to implement the Response Plan so that they will be able to "hit the ground running" when called upon to respond. The response team should include security, IT, communications/public relations, and legal experts, as well as senior management.

• Perform Due Diligence on Third Parties.


Several recent major data security breaches have arisen from the actions of vendors who obtained customer information from another business. The vendor usually has no direct relationship with the customer, and the customers typically sue the business with which they have a relationship instead of, or in addition to, the vendor.

Selecting third-party vendors to handle your customers' information should involve a commercially-reasonable due diligence process to ensure that only responsible vendors are deemed to be eligible to receive customer information. Knowing the right questions to ask is key.

• Use Carefully-Crafted Contracts.

Some risks of liability and other losses arising from data security can be reduced through well-drafted contracts with third-party vendors. Many contracts presented to businesses by third-party vendors are woefully inadequate to protect the business if the vendor fails to prevent a breach of the business’s customer data. A review by a lawyer who understands the relevant issues can potentially help a business save large sums in litigation fees and liability in the event of a subsequent breach.

• Consider Cybersecurity Insurance.

A number of firms now offer insurance against losses arising from data security breaches. This category of coverage is available as an addition to directors and officers liability insurance coverage (better known as a "D&O" policy) or as stand-alone coverage. Coverage terms are not standardized in the way that, for example, homeowner's policies are, and there are usually significant exclusions from coverage. Therefore, it may be useful to have a proposed policy reviewed by legal counsel and technology professionals to ensure that the offered coverage is adequate and that the remaining risks are understood.

Prior Planning Prevents Poor Performance
A business should do all it reasonably can to prevent a privacy or information security breach, but must recognize that some risk of a breach inevitably remains. By taking a few responsible steps in advance, the losses associated with a breach can be mitigated efficiently and effectively. In addition to commercially-reasonable preventative measures, a solid and well-documented Response Plan can go a long way toward helping customers, employees, managers, shareholders, and other stakeholders sleep more soundly at night.




This article was originally published in May 2014 in Legal Currents under the title "Before the Aftermath: How to Prepare Now for the Possibility of a Privacy Lapse or Data Security Breach."

Sunday, April 27, 2014

New Website Privacy Policy Requirements

Photo credit: Truthout / Foter.com

A new website privacy law has recently become effective and may require you to make a change to your existing website privacy policy.

 
In prior articles regarding website (and application) privacy policies, I've mentioned that a few states have their own website privacy rules, and California's are the most rigorous. If your company's website is directed at California residents (and this includes websites directed at U.S. audiences generally), it will need to comply with California's unique rules. 
 
A new privacy law that amends California's existing Online Privacy Protection Act became effective on January 1, 2014.  The new law requires a website operator to disclose (i) how it responds to “do not track” signals and (ii) whether other parties may collect personally identifiable information when a consumer uses the operator’s Web site or service. 
 
As amended, California's Online Privacy Protection Act now requires the following from an operator of a website or online service that collects personally identifiable information (which is defined very broadly) about residents of California:
  1.  Conspicuously post a privacy policy on its website or online service and comply with that policy.
  2. Identify the categories of information collected.
  3. Identify the parties with whom the operator shares the information.
  4. Describe the process by which users are notified of material changes to the privacy policy.
  5. Describe any process for the review and request of changes to personally identifiable information.
  6. State the effective date of the policy.
  7. Describe how the operator responds to web browser “do not track” signals.  (This can be satisfied by a link to a separate disclosure.  Note that there is no legal obligation to honor such signals.)
  8. Disclose whether other parties may collect personally identifiable information about the user's activities over time and across different websites ("tracking").
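CalOPPA requires a description of how you respond to the signal, not that you honor it, and detecting the signal itself is simple: browsers transmit the preference as an HTTP request header named DNT. A minimal sketch for a Python WSGI application follows (the function name and return strings are my own, for illustration):

```python
def dnt_status(environ: dict) -> str:
    """Inspect the Do Not Track signal a browser sends as the 'DNT'
    HTTP header, which WSGI exposes to applications as 'HTTP_DNT'."""
    value = environ.get("HTTP_DNT")
    if value == "1":
        return "user opted out of tracking"
    if value == "0":
        return "user consented to tracking"
    return "no preference expressed"
```

Whatever your application does with this value, the privacy policy's "Do Not Track" section should describe that behavior accurately.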
Website privacy policies are gaining increasing attention from governmental entities, consumer groups, and plaintiffs' class action attorneys, and are an emerging source of risk for many businesses. Having advised local, national and international businesses on website privacy issues, I believe most of that risk is avoidable if care is taken to observe the patchwork of applicable legal requirements, including the laws of states other than your own.




P.S. Don't confuse the Children's Online Privacy Protection Act ("COPPA") with the California Online Privacy Protection Act ("CalOPPA").  I've written about COPPA here. 

Saturday, March 15, 2014

How Financial Institutions Can Manage Social Media Risks

image by Matt Cordell using Creative Commons content BY-SA 3.0
Bankers, does your bank use social media? Do employees use social media on behalf of the bank? Do you know what examiners will be looking for in a social media risk assessment and a social media risk management program?

The Federal Financial Institutions Examination Council (FFIEC) recently published guidance that will be used by the Federal Deposit Insurance Corporation (FDIC), the Office of the Comptroller of the Currency (OCC), the Board of Governors of the Federal Reserve System (Board), the National Credit Union Administration (NCUA), and the Consumer Financial Protection Bureau (CFPB) to evaluate financial institutions' compliance with various privacy and other laws and regulations.

What kinds of social media are covered?

The guidance defines "social media" as any form of interactive online communication in which users can generate and share content through text, images, audio, and/or video. Examples include micro-blogging sites (e.g., Facebook, Google Plus, MySpace, and Twitter), forums, blogs (e.g., BizLawNC.com or PrivacyLawNC.com), customer review web sites and bulletin boards (e.g., Yelp), photo and video sites (e.g., Flickr and YouTube), sites that enable professional networking (e.g., LinkedIn), virtual worlds (e.g., Second Life), and social games (e.g., Farmville). These platforms have a wide spectrum of uses, and their user profiles vary.

Financial institutions most often use social media for marketing directly to customers, but it can also be used to provide incentives, collect feedback from the public, recruit employees, and to otherwise engage with prospects and customers. Each of these efforts carries with it particular goals and varying types and degrees of risk.

The FFIEC Guidance states that every financial institution must conduct a risk assessment addressing the risks raised by its use of social media and maintain a risk management program tailored to its risk profile. Every institution using social media should identify, measure, monitor, and control the risks related to social media.

How detailed should the policy statement be? How comprehensive should the procedures be?

The size and complexity of the program should be commensurate with the degree of the institution's involvement in social media, both in terms of depth and breadth. For example, a financial institution that relies heavily on one medium (e.g. Facebook) should have a more focused program. An institution using several media (e.g., Facebook, LinkedIn, Twitter, Yelp, Google +, and YouTube) should have procedures that are more comprehensive.

Who should be involved?

The FFIEC advises that a social media risk management program should be designed with participation from specialists in compliance, technology, information security, legal, human resources, and marketing. A better suggestion, in my opinion, is the inclusion of individuals whose expertise spans more than one of these categories. (Can you think of anyone who might know about more than one of these areas?)

What are the elements of a social media risk management program?

  • A governance structure with clear roles and responsibilities;
  • Policies and procedures (either stand-alone or incorporated into other policies and procedures) regarding the use and monitoring of social media and compliance with all applicable consumer protection laws and regulations;
  • A process for selecting and managing third-party relationships;
  • An employee training program;
  • An oversight process;
  • Audits to ensure ongoing compliance; and
  • Reporting to the board of directors or senior management to enable periodic evaluation of the program.

What are the key areas of risk?


What if we don't use social media at our bank?

Even financial institutions that do not use social media should perform a risk assessment, say the regulators: "a financial institution that has chosen not to use social media should still consider the potential for negative comments or complaints that may arise within the many social media platforms described above, and, when appropriate, evaluate what, if any, action it will take to monitor for such comments and/or respond to them." I have already written about online reputation management at length, and rather than repeat my advice here, I will refer you to my earlier post on the subject.

Furthermore, just because an institution does not have an official social media account does not mean individual employees (especially those with business development responsibilities) are not posting on LinkedIn, Facebook, Twitter, and other platforms about, and apparently on behalf of, the institution. It is unusual these days to find anyone in a sales role who is not active on social media.

Conclusion

The FFIEC Guidance is intended to help financial institutions understand and successfully manage (not eliminate) the risks associated with the use of social media. The regulators expect institutions to manage potential risks to themselves and their customers by identifying areas of risk proactively and adopting and implementing programs to mitigate those risks effectively...and more importantly, so do an increasing number of customers.




Wednesday, February 19, 2014

What You Need to Know about the Children's Online Privacy Protection Act (COPPA)

Online privacy and information security are areas of ever-increasing concern for the Federal Trade Commission, state and federal prosecutors, plaintiffs' lawyers, and consumer advocates. There is now a smattering of laws and regulations relating to these issues with which operators of websites and applications, as well as advertisers, must comply. Anyone who (a) operates a website designed for kids or (b) operates a general-audience website but is aware that it is collecting information from someone under 13 should understand and comply with the Children's Online Privacy Protection Act, the FTC's rules, and the FTC's guidance.

The Children's Online Privacy Protection Act (COPPA) became law almost 15 years ago, but in 2013, the Federal Trade Commission's revisions to the COPPA Rule, which were intended to modernize the Rule, became effective. 

image credit: Mike Licht

What Is the Children's Online Privacy Protection Act Rule?
 
The COPPA Rule requires operators of websites or online services directed to children under 13 years of age (and operators of other websites or online services that have actual knowledge that they are collecting personal information online from a child under 13 years of age, even if not by design) to provide notice to parents and obtain verifiable parental consent prior to collecting, using, or disclosing personal information from children under 13 years of age. The Rule also requires operators to keep secure the information they collect from children, and prohibits them from requiring the disclosure of more personal information than is reasonably necessary.
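In practice, operators often implement the Rule's trigger as an age gate that blocks collection from under-13 users until consent is recorded. A simplified sketch (the function names are my own, and it deliberately does not address the Rule's "verifiable" consent methods, which require more than a stored flag):

```python
from datetime import date

COPPA_AGE_THRESHOLD = 13  # the Rule applies to children under 13

def needs_parental_consent(birthdate: date, today: date) -> bool:
    """Return True if the user is under 13, in which case personal
    information may not be collected until verifiable parental
    consent is on file."""
    age = today.year - birthdate.year - (
        (today.month, today.day) < (birthdate.month, birthdate.day))
    return age < COPPA_AGE_THRESHOLD

def may_collect(birthdate: date, parental_consent_on_file: bool,
                today: date) -> bool:
    """Gate collection on COPPA status: users 13 and over pass;
    under-13 users pass only with verifiable parental consent."""
    return (not needs_parental_consent(birthdate, today)
            or parental_consent_on_file)
```

Note that asking for a birthdate in a way that encourages falsification can itself draw FTC scrutiny, so the gate should be age-neutral in its presentation.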
What Revisions Took Effect in 2013?
The lengthy 2013 revisions were designed to achieve the following:
  • Modify the definition of "operator" to make clear that the Rule covers an operator of a child-directed site or service where it integrates outside services, such as plugins or advertising networks, that collect personal information from its visitors;
  • Modify the definition of "Web site or online service directed to children" to clarify that the Rule covers a plug-in or ad network when it has actual knowledge that it is collecting personal information through a child-directed Web site or online service;
  • Modify the definition of "Web site or online service directed to children" to allow a subset of child-directed sites and services to differentiate among users, and requiring notice and parental consent only for users who self-identify as under age 13;
  • Modify the definition of "personal information" to include geolocation information and persistent identifiers that can be used to recognize a user over time and across different Web sites or online services;
  • Modify the definition of "support for internal operations" to expand the list of defined activities;
  • Streamline and clarify the direct parental notice requirements to ensure that key information is presented to parents in a succinct ‘‘just-in-time’’ notice;
  • Expand the non-exhaustive list of acceptable methods for obtaining prior verifiable parental consent;
  • Create three new exceptions to the Rule’s notice and consent requirements;
  • Strengthen data security protections by requiring operators to take reasonable steps to release children’s personal information only to third parties who are capable of maintaining the confidentiality, security, and integrity of the information;
  • Require reasonable data retention and deletion procedures;
  • Strengthen the FTC’s oversight of self-regulatory "safe harbor" programs; and
  • Institute voluntary pre-approval mechanisms for new consent methods and for activities that support the internal operations of a Web site or online service.
You can read more about the 2013 Rule changes here, here, and here.