
The Glass Is Half-Full: Using a Data First Approach to Create Value and Reduce Risk

ALM Law article

DAN PANITZ – June 29, 2023 – Most companies today would likely say their people are their most important asset. If we take this to be true, then corporate data is arguably the next most important asset, as it encompasses all the information we store about the company and how it operates.

Our company data tells us who our customers are, how much money we have in receivables, the state of our supply chain and how to chart a course for company growth and prosperity while avoiding pitfalls along the way. While data security for the perimeter continues to evolve in its mission to protect company data from the outside/in, we have only recently begun a new era of continuous improvement for data governance and more effective utilization from the inside/out.

A data first approach focuses on data utilization as a critical part of overall inside/out data security posture management (DSPM). It establishes a path for understanding everything that affects our company data including its access, usage, storage/retrieval and overall security posture and is a guide to how we can continuously improve data intelligence to better manage and grow our business.

This then informs us where sensitive data lives anywhere in the company’s cloud environment, who can access specific data, how it is being used, and what is the quickest way to keep our organization’s data safe and secure while more effectively leveraging that data for critical company functions. This opens the door to classifying, tagging and monitoring our company data while enabling more efficient corresponding processes that better use and derive higher value from it.

The headline here is that recent advances in data first approach platforms and processes can be applied behind the corporate firewall while the company data rests in place. Now we can achieve measurable risk reduction, significant cost avoidance and highly efficient data process opportunities in multiple areas, all while our data remains exactly where we want it, safe and sound within our company.

Why Has Continuous Data Protection & Utilization Improvement Become So Important?

While improved protection and utilization of data is not new to this world, it has skyrocketed in importance due to several key factors:

  1. In 2023, approximately 328.77 million terabytes of data are created each day, and this number is growing exponentially. Companies ingest much of this data on an ongoing basis, a volume further expanded by M&A activity.
  2. It has been predicted that ransomware will cost victims $265 billion annually by 2031, with an attack on a business, consumer or device every two seconds. Global ransomware damage costs reached $20 billion in 2021, 57 times more than in 2015, with an attack on businesses every 11 seconds in 2021, up from every 40 seconds in 2016.
  3. Increased regulatory compliance* – We witnessed the first public CCPA enforcement action in August 2022, a $1.2 million fine against Sephora, and an astronomical $1.3 billion fine against Meta by Ireland’s Data Protection Commission on May 22, 2023, marking the beginning of a new era of government enforcement.
  4. Elimination of redundant, obsolete or trivial (ROT) data,* the need for more effective data access control and usage visibility, and data loss exposure across computer systems, platforms and application programming interfaces (APIs).
  5. Litigation exposure containment and downstream burden reduction* – Absent a regulatory or litigation hold, data should be disposed of at the end of its legitimate business purpose (LBP). Data maintained beyond its LBP is a high-risk item: it remains discoverable, increasing liability exposure for both existing and new claims.
  6. Enhanced incident response* – A proactive data first approach ensures that our company will be well prepared to respond to data breaches and security incidents swiftly and effectively. It increases our company’s ability to detect and respond to potential threats in a timely manner, minimizing any impact.

* Risk reduction and cost avoidance process

Where To Begin Or Improve Upon The Journey

Assuming we have varying levels of experience, knowledge and focus on where our corporate data lives, breathes, replicates and takes a nap afterwards, our starting point is a data mapping exercise. Think of this as our DNA genomic testing (with periodic re-testing based upon trigger events) to understand the what, where, who, when, why and how of our data throughout varying corporate processes.
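The what/where/who/when/why/how of a data mapping exercise can be captured in a simple structured record. A minimal sketch in Python; the field names and example values here are illustrative assumptions, not a standard data-map schema:

```python
from dataclasses import dataclass

@dataclass
class DataMapEntry:
    """One row of a hypothetical data map: the what/where/who/when/why/how."""
    what: str        # category of data (e.g., customer PII)
    where: str       # system or store holding it
    who: list        # roles with access
    when: str        # when it is collected, moved or replicated
    why: str         # legitimate business purpose (LBP)
    how: str         # how it is collected and used

# Illustrative entry for a single data flow
entry = DataMapEntry(
    what="customer email addresses",
    where="CRM database (cloud)",
    who=["sales", "support"],
    when="collected at account sign-up; replicated nightly",
    why="customer communications",
    how="entered via web form, synced to the data warehouse",
)
print(entry.where)
```

Recording each flow this way gives the later risk analysis a consistent inventory to score against.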

Following a data mapping exercise, our next step is to perform a data risk analysis to evaluate and manage data risks proactively, rather than reactively. The deliverable data risk report, particularly one done by experts who work in our vertical (experienced on a cross section of like-situated systems and infrastructure), identifies red flags, yellow flags and green flags for resolution/enhancement in order of priority.

An effective data risk report consists of three primary components: an assessment of data risks, identification of gaps and vulnerabilities, and recommendations for improvement (inclusive of data utilization opportunities). Beyond actionable risk reduction, recommendations for improvement enable a clearer picture of what our data is telling us and how we can accrete value from more effective data utilization.
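The red/yellow/green prioritization described above can be sketched as a simple scoring rule. In this illustration, each finding gets a likelihood and impact score; the thresholds and example findings are assumptions for demonstration, not an industry standard:

```python
def flag_risk(likelihood: int, impact: int) -> str:
    """Map 1-5 likelihood and 1-5 impact scores to a red/yellow/green flag.
    Thresholds are illustrative only."""
    score = likelihood * impact
    if score >= 15:
        return "red"      # resolve first
    if score >= 6:
        return "yellow"   # resolve next
    return "green"        # monitor / enhance

# Hypothetical findings: (likelihood, impact)
findings = {
    "stale PII in legacy file share": (4, 5),
    "broad access to finance folder": (3, 3),
    "tagged, encrypted HR archive": (1, 2),
}
# Report in order of priority (highest score first)
for name, (lik, imp) in sorted(findings.items(),
                               key=lambda kv: -(kv[1][0] * kv[1][1])):
    print(f"{flag_risk(lik, imp):6s} {name}")
```

The point of the sketch is the ordering: resolution effort goes to red flags first, then yellow, while green items are candidates for enhancement rather than remediation.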

Following The Path Upward

By applying the right data discovery software, process-specific tools and documentation, a company can learn exactly what data it has, where it is located, when data movement or replication occurs, how it is being collected and used, who has access to it, and why, for each aspect of its data. Now we know where sensitive information lives in the data flows, and what access/usage controls (along with corresponding risks/solutions) should be enforced.

To ensure risk remains at a known and acceptable level on a go-forward basis, data minimization best practices should be continually employed. Data minimization means that the data controller (our company) should limit the collection of personal information to what is directly relevant and necessary to accomplish a specified legitimate business purpose. We should also retain the data only as long as is necessary to fulfill that legitimate business purpose.
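Retention against a legitimate business purpose can be enforced mechanically. A minimal sketch, assuming each record carries a collection date, a purpose, and a hold status; the per-purpose retention periods below are hypothetical — real schedules come from counsel, regulation and the company’s retention policy:

```python
from datetime import date, timedelta

# Hypothetical retention periods per purpose, in days
RETENTION_DAYS = {
    "billing": 7 * 365,
    "marketing": 2 * 365,
}

def past_lbp(collected: date, purpose: str, today: date,
             on_hold: bool = False) -> bool:
    """True if a record has outlived its legitimate business purpose
    and is not under a regulatory or litigation hold."""
    if on_hold:
        return False  # holds override disposition
    limit = timedelta(days=RETENTION_DAYS[purpose])
    return today - collected > limit

today = date(2023, 6, 29)
records = [
    ("invoice-001", date(2015, 1, 1), "billing", False),
    ("lead-042", date(2022, 1, 1), "marketing", False),
    ("invoice-007", date(2014, 1, 1), "billing", True),  # litigation hold
]
for rid, collected, purpose, hold in records:
    if past_lbp(collected, purpose, today, hold):
        print(f"dispose: {rid}")
```

Note how the hold check comes first: disposition never applies to data under a regulatory or litigation hold, matching the exception described above.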

How Can We Improve Upon Data Processes?

Among the many business processes which show continuous improvement from a single data analysis platform behind the corporate firewall (while the data rests in place), several top the list:

  • Regulatory & litigation process automation – Data discovery to hold, review and produce data responsive to regulatory and litigation requests is a cost- and resource-intensive ongoing event for many companies, and an area of notable recent advances. Now, data can be assessed earlier in the process with greater ease and less burden for both litigation claims and internal investigations. The strategic advantage of this alone is substantial.
  • Data retention/disposition enforcement – Data retention and disposition offers significant opportunities to reduce risk and spend, and directly affects regulatory/litigation exposure (e.g., data privacy, class actions and others).
  • Data access controls, classification/tagging and anomalous data usage behavior can now be monitored and controlled earlier, improving data loss prevention efficacy, lessening the potential for data breaches and better preparing the company for incident response.
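Classification and tagging of data in place can be sketched with simple pattern matching. The patterns and tag names below are illustrative assumptions only; production DSPM platforms use far richer detection (validation, context, entity recognition):

```python
import re

# Illustrative detectors; real classifiers are considerably more robust
PATTERNS = {
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
}

def classify(text: str) -> set:
    """Return the set of sensitivity tags found in a piece of text."""
    return {tag for tag, pattern in PATTERNS.items() if pattern.search(text)}

doc = "Contact jane.doe@example.com; SSN on file: 123-45-6789."
print(classify(doc))  # both tags fire
```

Once documents carry tags like these, access controls and usage monitoring can be applied per tag rather than per file, which is what makes earlier intervention possible.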

Next Steps

Once we have our data map and data risk report (with periodic refresh exercises upon trigger events such as infrastructure, data or network material changes or external actions like mergers, acquisitions or divestitures), we have the key to improve any number of corporate processes which may present current pain points, resource burden or otherwise offer increased data intelligence, monetization and value opportunities.

Continuous data first approach improvement, using a best-in-class platform behind the corporate firewall while the data rests in place, can be revolutionary in reducing risk, avoiding cost and lessening spend, while enabling more efficient corresponding processes that better use and derive higher value from our company data, our company’s most important asset next to its people.

Dan Panitz, SVP, Regulatory and Litigation for Trustpoint.One, is an experienced attorney based in New York with more than 25 years of combined legal, technology and corporate advisory experience. Having worked with SEC Enforcement and NASD (now FINRA) Arbitration, Panitz also holds Anti-Bribery & Corruption specialty certifications for the PRC, UK and the United States and has led complex regulatory, litigation and spend analytics support programs for major pharmaceutical and financial services providers globally. Dan can be reached at: or (212) 226-2928