Action Item: Benchmark-based Planning for 2020
This series, written for the IAPP's The Privacy Advisor by the team at RadarFirst, is about establishing program metrics and benchmarking your privacy incident management program.
We’re nearing the end of the year and approaching an entirely new decade. It’s typically a time for reflection, to consider lessons learned, and to begin planning for a fresh start come Jan. 1.
For privacy professionals, this type of analysis and reflection is second nature, of course. Every privacy incident that comes our way requires us to perform an investigation, assess whether it constitutes a data breach, and then carry out the process of notification and documentation. Reflecting on what happened, identifying areas to improve and planning for what comes next is practically part of the privacy pro job description.
This reflection got me thinking about this continuing benchmarking article series and the major topics we’ve covered in the past year. We approach this series with the goal of providing industry statistics to better evaluate and fine-tune your privacy program. In many ways, we consider the questions privacy professionals find themselves asking every day and see how we can add clarity with metadata we share from the RadarFirst platform. We hope to arm our colleagues in the field with metrics and data so we can all know whether we’re doing well and, if not, identify areas to improve.
With that in mind, I wanted to review the high-level findings from the last year of these articles and consider what major lessons we can take with us into our 2020 planning. Keep in mind that the below benchmarking metrics represent organizations that are employing best practices in incident response, including leveraging technology to ensure consistency, accelerate the decision-making process, and eliminate the risk of over- and under-notifying.
January: Are organizations meeting breach notification timelines?
U.S. state data breach notification regulations are increasingly moving from ambiguous affected-individual notification timeline language, such as “in the most expeditious time possible, without unreasonable delay,” to hard-and-fast specified notification time frames: 30 days, 45 days, 60 days. At the time of that article, we reported that 89% of notifications to affected individuals met the time frame specified within the jurisdiction. The remaining 11% that did not meet the time frame were associated with large incidents impacting many individuals across multiple state regulations.
February: How often do breach notification exceptions apply?
As stringent as privacy regulations are, they all allow for exceptions to notification obligations to varying degrees. But exceptions shouldn’t be considered a “get-out-of-jail-free” card. If anything, the nuance of when exceptions may apply creates even more regulatory complexity for privacy officers and their teams to navigate. The data also reveals exceptions to be exceptionally rare: Metadata from incidents risk assessed in 2018 indicates that exceptions applied in 2.36% of cases, the most common being that the data was encrypted and the encryption key was not compromised.
March: What is the breach notification rate under the GDPR to supervisory authorities and data subjects?
RadarFirst metadata from May 25, 2018, to March 2019 indicated that 83% of incidents involving personal data can and should be sufficiently risk mitigated to keep them from reaching the risk or high-risk thresholds for notification under the EU General Data Protection Regulation. That means only 17% of incidents were reported by RadarFirst users to supervisory authorities. In contrast, organizations using an ad hoc, subjective risk assessment process experience much higher notification ratios, creating avoidable regulatory and reputational risk.
An update to this statistic: We recently looked at the 2019 year-to-date data and found that the percentage of breaches requiring notification to supervisory authorities has diminished to just 10%. RadarFirst users achieved this further reduction through the consistency and defensibility of their risk assessment and notification decision-making.
April: Are organizations making use of the GDPR one-stop shop?
Looking deeper into the adoption of the GDPR, we dove into who typically was the recipient of supervisory authority notifications. Under the GDPR, there is a mechanism called the “one-stop shop” in which, as a rule, organizations with cross-border personal data processing activities only have to deal with one supervisory authority, called the lead supervisory authority, in the event that a data breach impacts data subjects in multiple member states of the EU. The lead supervisory authority is entrusted with coordinating investigations and is entitled to involve other supervisory authorities. The incident metadata reveals that use of the “one-stop shop” has been mixed: Some organizations choose to notify only their lead supervisory authority, while others opt to notify each local supervisory authority in addition to their lead supervisory authority in cross-border breaches.
May: How can you accelerate your breach notification time frame?
RadarFirst metadata for all incidents assessed in 2018 reveals the average time from incident occurrence to discovery to be 21 days and from discovery to notification to be 27 days. This clearly demonstrates the value of a thoughtful incident response and risk assessment program and the need for a solution to manage the incident life cycle. Compare this to the 2019 BakerHostetler Data Security Incident Response Report, which showed the average time from incident occurrence to discovery to be 66 days and from discovery to notification to be 56 days for organizations without purpose-built automation. Or the 2019 Verizon Data Breach Investigations Report, which cites that 56% of the breaches included in the report took months or longer to discover. The data further indicates that the data breaches taking the longest to discover, assess and notify tend to be the small, everyday breaches involving only one or two records.
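For readers who want to track these intervals in their own programs, the sketch below shows how the occurrence-to-discovery and discovery-to-notification averages can be computed from an incident log. The field names and dates are illustrative assumptions, not the RadarFirst schema; the sample values are chosen to land on the 21- and 27-day averages above.

```python
from datetime import date

# Hypothetical incident records; field names and dates are illustrative only.
incidents = [
    {"occurred": date(2018, 3, 1), "discovered": date(2018, 3, 20), "notified": date(2018, 4, 17)},
    {"occurred": date(2018, 6, 5), "discovered": date(2018, 6, 28), "notified": date(2018, 7, 24)},
]

def avg_gap_days(records, start_key, end_key):
    """Average gap in days between two date fields across all records."""
    return sum((r[end_key] - r[start_key]).days for r in records) / len(records)

occurrence_to_discovery = avg_gap_days(incidents, "occurred", "discovered")    # 21.0
discovery_to_notification = avg_gap_days(incidents, "discovered", "notified")  # 27.0
```

Tracking these two gaps separately matters: the first measures detection capability, the second measures the efficiency of the assessment and notification process.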
June: As you prepare for the CCPA, what is the current benchmark for breaches reported to California regulators and affected individuals?
There is a prevailing attitude that most incidents trigger breach notification obligations in California due to the state’s stringent breach laws and lack of a harm standard. Contrary to this sentiment, RadarFirst metadata indicates that since January 2017, fewer than 6% of incidents impacting California residents were considered notifiable.
July and August: How long should it take to risk score a privacy incident?
On average, incidents managed within RadarFirst are risk assessed, scored and decisioned in a significantly shorter time period than generally reported industry standards. Breaking that down into industry averages, we see slight variations, perhaps reflecting the regulatory burden within select industries. For example, the health care industry reports a total incident response time frame from occurrence to decision of a little less than 20 days, while the financial services industry averages a 39-day response time frame.
September: Using data breach benchmarking stats to prove incident response ROI
In updated stats, we reveal that only 6% of incidents from January to September 2019 required notification, demonstrating that with incident risk assessment automation and best practices in place, organizations with a strong culture of privacy can mitigate the risk of most unauthorized disclosures and prove that notification is not required. Additional considerations: What should you look at if you are notifying more frequently than the industry rate? What if you’re notifying less frequently? Both over- and under-notifying pose significant risk to your organization in terms of fines and damage to your brand or reputation.
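One way to operationalize that comparison is a simple check of your own notification rate against the industry figure. A minimal sketch follows; the 6% benchmark comes from the stats above, but the tolerance band and category labels are assumptions for illustration, not an established threshold.

```python
# Classify a program's notification rate relative to an industry benchmark.
# The 0.03 tolerance band is an illustrative assumption, not a standard.
def benchmark_flag(your_rate, industry_rate=0.06, tolerance=0.03):
    """Flag possible over- or under-notification versus a benchmark rate."""
    if your_rate > industry_rate + tolerance:
        return "possible over-notification"
    if your_rate < industry_rate - tolerance:
        return "possible under-notification"
    return "in line with benchmark"
```

For example, a program notifying on 15% of incidents would be flagged for possible over-notification and might warrant a review of its risk assessment consistency.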
October: Introducing quarterly privacy incident response benchmarking metrics to evaluate your program
At the beginning of each new quarter, we will provide the previous quarter’s statistics on key benchmarking metrics, so that you will always have the most up-to-date figures to compare your program against, and we can illustrate how incident response is changing over time. The three metrics we will revisit on a quarterly basis include:
- How many incidents rise to the level of a notifiable data breach? (Less than 8% were data breaches that required notification.)
- What is the disposition of incidents? Are they malicious in intent? (Less than 2% of incidents were malicious in nature for Q3 of 2019.)
- How long does it take to discover an incident and provide notification to affected individuals?
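The three metrics above can be derived directly from a quarterly incident log. Below is a minimal sketch with invented field names and counts chosen to fall inside the benchmark ranges cited above; none of it is real data.

```python
# Hypothetical quarterly log: 50 incidents, 3 notifiable, 1 malicious.
# Field names and counts are illustrative assumptions only.
incidents = [
    {"notifiable": k < 3, "malicious": k < 1, "days_to_discover": 21, "days_to_notify": 27}
    for k in range(50)
]

n = len(incidents)
notifiable_pct = 100 * sum(i["notifiable"] for i in incidents) / n       # 6.0
malicious_pct = 100 * sum(i["malicious"] for i in incidents) / n         # 2.0
avg_discovery_days = sum(i["days_to_discover"] for i in incidents) / n   # 21.0
```

Computing these three figures each quarter gives a consistent basis for comparison against the published benchmarks.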
The next quarterly report will be published in the January 2020 benchmarking article in The Privacy Advisor.
Wrapping up 2019; looking ahead to 2020
So that brings us to the final 2019 article in this series. What are the lessons we can take from the year of statistics and research represented above? And what are the key learnings we can take into 2020?
First, we can say unequivocally that the work of privacy professionals is complex and urgent and will only get more complex and more urgent as time goes on. From the ever-shifting data breach regulatory landscape to the shrinking incident response time frames, privacy professionals have their work cut out for them.
Second, not all hope is lost. Yes, this work is complex. And yes, the stakes are very real in terms of fines, damage to your reputation and brand, and increased regulatory oversight. But best practices are emerging in the field, along with new ways of efficiently and consistently managing privacy incidents and data breach notification.
As we approach the new year and a new decade, I want to issue a challenge.
When setting your 2020 goals and budgets, consider how you are using data to demonstrate accountability, celebrate your team’s accomplishments, and identify risks and opportunities for further improvement. For example: Wouldn’t it be nice to have a dashboard that helps you track your incident trends, root causes, occurrence-to-discovery timelines, risk assessment efforts, notification rate and incident closure rate? This is where purpose-built automation and a sophisticated incident response program are essential.
And as you consider the ways data helps you advocate for your program, consider how benchmarking may help you defend your budgetary and resource “asks,” not as a budget-line item request, but as a strategy built on objective, data-driven insights.
In the new year, let’s use data to prove the ROI of privacy programmatic spend.
Topics: Benchmarking Series