Tokenization and Encryption: Not Just for Credit Cards
Prevent exposure of confidential data and satisfy regulations
By: Patrick Barnett, Incident Response
Tokenization conceals sensitive content, guarding it from unauthorized access by replacing the data with mathematically irreversible token values. Many credit card processors use this technology to protect credit card transactions; for example, a cryptographic hash function can create a token from the credit card number, store name, and transaction amount and date. These tokens also provide integrity by ensuring that the content cannot be modified by unauthorized individuals. Tokenization can also be used to limit exposure of personally identifiable information (PII), protected health information (PHI), or other types of confidential data, including unstructured data such as long passages of text or entire documents. It can help satisfy regulatory requirements in areas such as the payment card industry (PCI DSS), healthcare (HIPAA-HITECH), finance (GLBA), defense (ITAR), and European data protection (GDPR), and it can dramatically reduce risks associated with storing or processing confidential data.
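As a minimal sketch of the hash-based approach described above, the snippet below derives a token from transaction details. The key name and field layout are illustrative assumptions; a keyed HMAC is used rather than a bare hash so that the small space of valid card numbers cannot be brute-forced back out of the token.

```python
import hmac
import hashlib

# Hypothetical secret held by the tokenization provider; keying the
# hash prevents an attacker from recomputing tokens for guessed PANs.
SECRET_KEY = b"example-provider-key"

def make_token(pan: str, store: str, amount: str, date: str) -> str:
    """Derive an irreversible token from the transaction details."""
    message = "|".join([pan, store, amount, date]).encode()
    return hmac.new(SECRET_KEY, message, hashlib.sha256).hexdigest()

token = make_token("4111111111111111", "Store 42", "19.99", "2023-05-01")

# Integrity: any change to the underlying content yields a different token,
# so unauthorized modification is detectable.
tampered = make_token("4111111111111111", "Store 42", "99.99", "2023-05-01")
assert token != tampered
```

Because the same inputs always produce the same token, downstream systems can still match records on the token without ever seeing the real card number.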
When combined with security controls such as strong perimeter defenses, a strong security policy, patch management, robust logging and alerting, endpoint protection, and intrusion detection and prevention systems (IDS/IPS), tokenization can substantially mitigate risks. Many organizations’ backend systems rely on confidential data such as credit card, Social Security, passport, and driver’s license numbers as unique identifiers to access information for billing, order status, and customer service. Tokenization allows organizations to maintain the functionality of backend systems without exposing confidential data to attackers.
Tokenization can be used with encryption to protect data stored in cloud services or applications. Encryption secures data exchanged with third parties, safeguarding the data and using keys to ensure that only authorized users can view the content. Transport Layer Security (TLS), the foundation of secure data sharing on the Internet, relies on encryption to create a secure tunnel between the end user and the website. Asymmetric key encryption is also an important component of TLS certificates used to validate identity. Depending on the use case, an organization may use encryption, tokenization, or a combination of both to secure different types of data and meet regulatory requirements.
Some organizations might be concerned that tokenization and encryption are too difficult or too expensive to implement, but implementation can be simple — even seamless in many cases — if the tokenization provider tailors its solution to the organization’s needs and minimizes impact to the existing environment.
Step 1: Discover and Convert Legacy Data
An organization first needs to inventory where confidential data resides within or traverses its network. For example, payment processors might not store data, but it could be exposed while in transit. Tokenization service providers and automated tools such as data loss prevention (DLP) software can assist with this process. Data is frequently lost in areas where it was not known or intended to exist. Large companies often use real post-transaction data in back-office applications such as data analysis, marketing, and customer loyalty programs. Stored legacy data can proliferate to spreadsheets, emails, and other documents on computers throughout the network, and all of these locations are subject to regulatory audits even if the data is highly encrypted.
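A lightweight discovery sweep of the kind DLP tools automate can be sketched as a pattern scan over text. The patterns below are illustrative assumptions; production tools use far broader patterns plus validation and context checks.

```python
import re

# Hypothetical patterns for a simple discovery sweep: candidate card
# numbers (13-16 digits, optionally separated) and US-style SSNs.
PATTERNS = {
    "credit_card": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}

def scan(text: str) -> dict:
    """Return the kinds of candidate confidential data found in text."""
    return {kind: pat.findall(text)
            for kind, pat in PATTERNS.items() if pat.search(text)}

sample = "Order notes: card 4111 1111 1111 1111, customer SSN 123-45-6789"
hits = scan(sample)
```

Running a scan like this across file shares, spreadsheets, and mail stores is what surfaces the "data in unexpected places" problem described above.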
Companies that do not store data long term may opt to skip this step, although SecureWorks® analysts recommend following the data discovery process to verify that sensitive data is not located in unexpected places. A data inventory provides peace of mind and may enable a warranty from the tokenization provider. All organizations should also have a data classification plan.
After the data is discovered and inventoried, it should be converted to token numbers. This conversion serves several purposes: it facilitates regulatory compliance, it saves costs by reducing the need to protect the data and the scope of future audits, and it reduces the organization’s risk posture while protecting customers. The tokenization service provider should be able to conduct the entire data discovery and token conversion process. An organization can submit legacy data files to the provider and receive tokenized data files in return. In most cases, the tokenized data can be substituted for real data in the back-office applications without disrupting business processes.
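The conversion step can be pictured as a vault-style mapping: the provider holds the only link between real values and tokens, and the tokenized file returned to the organization preserves enough format (here, length and last four digits) for back-office use. This is a minimal sketch under assumed requirements, not a provider's actual scheme.

```python
import secrets

# Minimal vault sketch: the real mapping lives only with the
# tokenization provider; back-office systems see only the token.
_vault: dict[str, str] = {}

def tokenize(pan: str) -> str:
    """Replace a card number with a random, format-preserving token
    that keeps the last four digits for customer-service lookups."""
    for token, real in _vault.items():
        if real == pan:
            return token  # same input always maps to the same token
    body = "".join(secrets.choice("0123456789") for _ in range(len(pan) - 4))
    token = body + pan[-4:]
    _vault[token] = pan
    return token

t = tokenize("4111111111111111")
```

Because the token is random rather than derived from the card number, compromise of the tokenized data set reveals nothing without the vault.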
Step 2: Modify the Message Specification
When adding tokenization, the organization needs to provide recipients with clear instructions regarding what data is being sent upstream, its format, and what should be returned to the data owner. Organizations that have an established method for data processing already use a message specification to tell the recipient about the inbound data (e.g., “Here is a clear text PAN and card number. Send back this number along with the authorization code.”). This message must be modified to include processor-defined tokenization instructions such as, “Send back a token number along with the authorization code.” If an organization also chooses to embed encryption in the upstream data, the message specification should indicate that encrypted data is now replacing what was previously clear text data.
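The specification change above can be summarized as a small before-and-after sketch. The field names are illustrative assumptions; real message specifications are processor-defined.

```python
# Hypothetical message specifications, before and after tokenization.
legacy_spec = {
    "request":  ["clear_text_pan", "amount", "merchant_id"],
    "response": ["clear_text_pan", "authorization_code"],
}

tokenized_spec = {
    "request":  ["clear_text_pan", "amount", "merchant_id"],
    "response": ["token_number", "authorization_code"],  # token replaces PAN
}
```

The key change is on the response side: the processor returns a token number in place of the clear text PAN, and every downstream consumer of the response must be told to expect it.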
Step 3: Embed Encryption for Additional Security
Encrypting confidential data before sending it to another processor is optional but highly recommended when implementing a tokenization solution. Tokenization service providers typically offer an encryption service for additional security. Organizations that use a secure private line such as a frame relay to transmit data to another entity may not feel the need for additional encryption. Moreover, some companies may have an existing encryption solution and would not need that service from the tokenization provider.
Encryption does not necessarily require an investment in new swipe terminals or cash registers. There are reliable software solutions that add the encryption routine to a point-of-sale (POS) device so that sensitive data is encrypted as close as possible to the point of entry. The data remains encrypted until it is received by the payment processor, where it is decrypted to traverse the processing network and complete the authorization process. When a transaction authorization is returned to the merchant, the response includes an encrypted token number instead of clear text.
Step 4: Adjust Business Processes if Necessary
In this phase of the tokenization process, an organization should review its internal rules and processes for working with data and assess the impact of tokenized or encrypted data. A seamless implementation requires that the organization and the tokenization provider discuss the systems that touch confidential data and understand the business requirements of those systems. This step is particularly important for large organizations that use confidential data for ancillary purposes beyond payment authorization, such as merchants that use credit cards or entities that transmit Social Security numbers or medical information.
For example, tokenization can affect post-authorization bank identification number (BIN) analysis to determine the bank name or card type because the BIN is not maintained with the token number. To continue doing this type of analysis, the merchant would need to slightly modify the application. Tokenization can also affect credit card validation that relies on the Luhn algorithm, also known as a mod 10 check, to validate the authenticity of a credit card number received from the payment processor. This validation fails once token numbers are used in place of real card data because token values are designed to avoid accidentally matching a real credit card number. The company needs to modify its processes in these cases, possibly by turning off this check or removing the code that performs that action.
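To make the mod 10 failure concrete, here is a minimal implementation of the Luhn check, shown with a known test card number that passes and a token-style value that does not. This illustrates why the validation step must be relaxed or removed once tokens replace real card data.

```python
def luhn_valid(number: str) -> bool:
    """Mod 10 (Luhn) check used to validate real card numbers."""
    checksum = 0
    for i, ch in enumerate(reversed(number)):
        d = int(ch)
        if i % 2 == 1:      # double every second digit from the right
            d *= 2
            if d > 9:
                d -= 9      # equivalent to summing the two digits
        checksum += d
    return checksum % 10 == 0

assert luhn_valid("4111111111111111")      # well-known test PAN passes
assert not luhn_valid("4111111111111112")  # token-style value fails
```

A token provider deliberately issues values that fail this check, so any code path that rejects non-Luhn numbers will reject every token.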
A data breach can lead to millions of dollars in losses, large drops in stock price, negative impact on branding and reputation, and loss of customer and market confidence. Investing in tokenization and encryption can mitigate these consequences by protecting confidential information such as customer account numbers, Social Security numbers, patients’ medical information, birthdates, telephone numbers, email addresses, and passport information. They can also help organizations satisfy regulatory requirements and save time and money by reducing the scope of yearly assessments and audits.
Applying encryption and tokenization can be straightforward with a service-based solution, and depending on the organization’s environment, implementation can be relatively simple; the tokenization provider should help ensure a seamless experience. Token numbers should not break a valid business process. An organization should work with its tokenization provider to understand which processes might be affected and how token numbers can accommodate the business need.