In the digital era, data is everything for businesses. Without data, it becomes difficult to formulate strategy. Key business areas such as competitive intelligence, business analytics, market research, and consumer behavior research all require data. But leveraging more data creates multiple points of conflict: not just whether data is retrieved from consumers without their consent, but also how that data is processed.
The good news is that there are many laws governing the processing of sensitive information, such as the California Privacy Rights Act. Yet many companies have yet to fully comply with them: either they are unaware of these laws, or the required controls have not been implemented appropriately. Data governance standards have risen, and newer laws and regulations on protecting consumer data have been formulated. It is now up to companies to start complying with them.
Businesses thrive on data and need to access and process it quickly. But in the process, they must not forget to factor in the privacy concerns of end users. Complying with standards such as GDPR, PCI DSS, and HIPAA, along with several state regulations, can hamper a company's productivity, but violating them gives rise to fines and penalties.
Sometimes, in the effort to upgrade technology and revamp processes, basic considerations for ensuring data privacy are missed. The framework to protect sensitive data is not thought through, and the structure for managing it is either ill-defined or missing from the scheme of things entirely.
Often, the first step towards building a framework for sensitive data is to understand what sensitive data is. Identifying every individual piece of data and determining its sensitivity is not a scalable option. A better approach is to create categories on a broad scale: information such as passwords, user lists, system information, medical records, trade secrets, and financial information can be classified as sensitive.
This step implicitly creates categories into which any future information can be slotted. Within categories, hierarchies can be created. For example, classified financial data such as credit card information and social security numbers can sit at different levels. Depending on the context, one type of information can rank higher than another, and vice versa.
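The category-and-hierarchy idea above can be sketched in code. The tier names, category labels, and default rules below are illustrative assumptions, not a prescribed scheme; the point is that new data types map onto existing categories instead of being classified one by one, and context can raise a category's level.

```python
from enum import IntEnum
from typing import Optional

class Sensitivity(IntEnum):
    """Hypothetical sensitivity tiers; higher values mean stricter handling."""
    PUBLIC = 0
    INTERNAL = 1
    CONFIDENTIAL = 2
    RESTRICTED = 3

# Broad categories mapped to a default tier. Future information finds its
# way into one of these categories rather than being assessed individually.
CATEGORY_TIERS = {
    "passwords": Sensitivity.RESTRICTED,
    "financial": Sensitivity.CONFIDENTIAL,
    "medical_records": Sensitivity.CONFIDENTIAL,
    "trade_secrets": Sensitivity.CONFIDENTIAL,
    "system_information": Sensitivity.INTERNAL,
    "user_lists": Sensitivity.INTERNAL,
}

def classify(category: str,
             context_override: Optional[Sensitivity] = None) -> Sensitivity:
    """Return a category's tier; context may raise (never lower) it."""
    # Unknown categories default to a cautious tier.
    base = CATEGORY_TIERS.get(category, Sensitivity.CONFIDENTIAL)
    return max(base, context_override) if context_override else base
```

In this sketch, `classify("financial")` yields `CONFIDENTIAL`, while a context override of `RESTRICTED` wins over the base tier, mirroring the point that one type of information can rank higher than another depending on context.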
It is also a good idea to define a restricted category of data that the company will not touch at any cost: data it will not retrieve, access, store, transmit, or process. Raw credit card numbers, for instance, are one type of information a company may choose never to retrieve and store. Any intentional collection of such data would then be an infringement of the company's own data policy.
Data governance and information protection are not a one-time activity; they require a long-term vision and evolve over time. From the nuts and bolts of the process, such as picking up pieces of data and categorizing them, to the more elaborate processes that rely on technology, the first step is crucial. Assessing the data governance needs of an organization goes a long way towards establishing a robust framework. This way, the organization will have created a model for protecting and processing sensitive data from customers, end users, vendors, suppliers, and other stakeholders.
There are various ways to store data, but not all of them align with a business's objectives and ideals, so it is critical to define storage requirements. For example, using a cloud service may or may not be ideal: if an organization wants total control over its data and does not want to rely on any external entity for storage, it may consider a private data center. Even when data is stored locally, within the company's own network, compliance standards such as HIPAA, GDPR, and PCI DSS must still be implemented; these obligations follow the data regardless of where it resides.
One of the key aspects of processing sensitive information is sharing data, and controls covering the procedures and technicalities involved in data sharing must be implemented. To start thinking along these lines, it is important to first set the objectives for data sharing. Technology should not dominate such decisions, yet often the presence of a particular technology overshadows decision-making about data sharing. The problem with deciding this way is that when the technology, or the technology paradigm, changes, the data-sharing procedures and technicalities must be completely overhauled.
When data is collected over a period of time, it gets archived, and after a certain point it is purged. The purge policy depends on company requirements: an organization might keep historical data for an indefinite time before purging it. While many organizations are comfortable with this strategy, it raises the potential for data breaches. Historical data must be regulated as much as active data. Change-management policies have to be implemented, and there must be organization-wide standards for information archival, removal, deletion, and re-activation.
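An organization-wide purge standard usually boils down to a retention window per data category. The sketch below assumes hypothetical category names and windows; a real policy would come from legal and compliance requirements, not code constants.

```python
from datetime import datetime, timedelta, timezone
from typing import Optional

# Hypothetical retention windows, in days, per data category.
RETENTION_DAYS = {
    "session_data": 30,
    "audit_logs": 365,
    "customer_records": 7 * 365,
}

def is_due_for_purge(category: str, created_at: datetime,
                     now: Optional[datetime] = None) -> bool:
    """True when a record has outlived its category's retention window.

    Unknown categories fall back to a cautious 90-day default rather
    than being kept indefinitely.
    """
    now = now or datetime.now(timezone.utc)
    window = timedelta(days=RETENTION_DAYS.get(category, 90))
    return now - created_at > window
```

A batch job could run this check daily against archived records; passing `now` explicitly also makes the policy testable and auditable.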
An organization must assess the control points it wants to use to enforce policy. Security checkpoints must be conceptualized; they act as gateways for the processing and passage of sensitive information. Monitoring systems must be identified, and not just the technologies: the objectives of these systems must be spelled out. At a foundational level, a monitoring system must be able to detect anomalies, analyze information flows, and identify unauthorized traffic.
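As a minimal sketch of the anomaly-detection objective, the hypothetical checkpoint below flags users whose access counts for a resource exceed a fixed baseline. The names, events, and threshold are illustrative assumptions; a production monitoring system would learn per-user baselines and inspect flows, not just counts.

```python
from collections import Counter

def flag_anomalies(events, threshold=10):
    """Flag users whose access count to any single resource exceeds
    a fixed baseline threshold. (user, resource) pairs are counted."""
    counts = Counter(events)
    return [user for (user, resource), n in counts.items() if n > threshold]

# Toy access log seen at a checkpoint: normal users touch "payroll"
# a few times; one user hammers it.
events = ([("alice", "payroll"), ("alice", "payroll"), ("bob", "payroll")] * 3
          + [("mallory", "payroll")] * 20)
```

Here `flag_anomalies(events)` flags only the user with 20 accesses, leaving the low-volume users alone; tuning the threshold is exactly the kind of objective-setting the text argues should precede any technology choice.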
With vast amounts of digital information passing through corporate networks, and ever more users accessing it, monitoring systems are a bare necessity. But organizations should focus less on the specific technologies, software, and hardware products that achieve this, and more on the end goals of monitoring and data-governance enforcement, using technology as an enabler.
Whether in transit, at rest, or being processed, data should be encrypted until the point of final access, using strong encryption technologies. Encryption ensures that even in the event of a breach, the data cannot be read without the decryption keys. Currently, many organizations are also implementing a zero-trust network model, in which networks are segmented into small administrative domains. Each domain comprises a handful of nodes such as computers, storage devices, processing devices, network elements, and printers. Passwordless or password-based access can be granted to such administrative domains depending on the type of user and their permission levels. Such dynamic approaches to authorization, coupled with data encryption, ensure the confidentiality of sensitive user data.
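The zero-trust idea of per-domain access decisions can be sketched as an explicit policy check that every request must pass, with nothing trusted by default. The domain names, levels, and methods below are made-up examples, not a real product's configuration.

```python
# Hypothetical per-domain policies: a minimum permission level and the
# authentication methods the domain accepts.
DOMAIN_POLICY = {
    "finance-segment": {"min_level": 3, "auth": {"passwordless"}},
    "print-segment": {"min_level": 1, "auth": {"passwordless", "password"}},
}

def authorize(user_level: int, auth_method: str, domain: str) -> bool:
    """Zero-trust check: every request is evaluated against the target
    domain's policy; unknown domains are denied by default."""
    policy = DOMAIN_POLICY.get(domain)
    if policy is None:
        return False
    return user_level >= policy["min_level"] and auth_method in policy["auth"]
```

The key design choice is the deny-by-default fallback: no network location or domain is implicitly trusted, which is the defining property of the zero-trust model the text describes.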
When processing sensitive information, not all users have the same preferences; some might still want password-based authentication. When offering such flexibility, it is also important to use up-to-date authentication techniques, one of which is two-factor authentication. For example, when a login attempt is made, a randomly generated, short-lived code is sent to or generated on the user's mobile device. If the user possesses that device, they can key the code into the login screen and subsequently gain access.
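The short-lived codes described above are commonly generated with the TOTP scheme of RFC 6238, which authenticator apps implement: server and device derive the same code from a shared secret and the current time, so possession of the device is what is proven. A compact sketch using only the Python standard library:

```python
import base64
import hashlib
import hmac
import struct
import time
from typing import Optional

def totp(secret_b32: str, timestep: int = 30, digits: int = 6,
         now: Optional[float] = None) -> str:
    """RFC 6238 time-based one-time password.

    'secret_b32' is the base32-encoded secret shared between the server
    and the user's device. Codes rotate every 'timestep' seconds.
    """
    key = base64.b32decode(secret_b32)
    counter = int((now if now is not None else time.time()) // timestep)
    msg = struct.pack(">Q", counter)
    digest = hmac.new(key, msg, hashlib.sha1).digest()
    # Dynamic truncation (RFC 4226): pick 4 bytes at an offset taken
    # from the last nibble of the digest.
    offset = digest[-1] & 0x0F
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 4226/6238 test secret ("12345678901234567890" in base32);
# at time 59 the 6-digit code is "287082".
RFC_TEST_SECRET = "GEZDGNBVGY3TQOJQGEZDGNBVGY3TQOJQ"
```

At login, the server computes `totp(secret)` for the current time window (and usually the adjacent windows, to tolerate clock skew) and compares it to what the user typed.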
Data organization and data discovery models are also required to protect sensitive information. High-volume data needs good taxonomy and good organization, which yield better data models, faster data convergence, and better data discovery. Tools, technologies, and external systems that use the data can retrieve it easily. Importantly, all of this leads to better data management. Without it, data management may not be optimal, gaps appear, and those gaps can be exploited to compromise sensitive information.
When processing sensitive information, an organization has to identify which types of data can be handled by external service providers. Only categories with a less-severe or non-critical confidentiality level should be handled and processed by external providers. The terms of agreements and service conditions with these providers must be dynamic, situational, context-driven, flexible, and agile.
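The rule above amounts to a confidentiality ceiling for external sharing. The category names and levels in this sketch are hypothetical; only categories at or below the ceiling pass, and anything unclassified is withheld.

```python
# Hypothetical confidentiality levels (higher = more sensitive) and the
# ceiling at or below which data may go to external service providers.
CONFIDENTIALITY = {
    "marketing_lists": 1,
    "usage_metrics": 1,
    "medical_records": 3,
    "credit_cards": 3,
}
EXTERNAL_CEILING = 1

def shareable_externally(category: str) -> bool:
    """Allow only less-severe / non-critical categories to leave the
    organization; unknown categories are withheld by default."""
    return CONFIDENTIALITY.get(category, 99) <= EXTERNAL_CEILING
```

Encoding the ceiling in one place keeps the decision auditable and lets it change with agreements, rather than being scattered across integrations.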
Even as security systems evolve, so do the technologies that seek to breach them. So there are no off-days for sensitive data processing systems; each day demands continuous evolution and high alert. One way to achieve this is through continuous audits, which identify potential process gaps, technology weak links, system discrepancies, and loopholes. Identifying them and acting on them is a proactive way to manage security systems. Organizations must be proactive while also being ready to react, with contingency and business continuity plans in place.
Data security and the processing of sensitive information is a long-standing topic. With the onset of laws such as the California Privacy Rights Act, the way sensitive information is processed is being standardized, and organizations now have a better handle on processing it consistently and in a standards-based manner.