Tokenization has the attention of industries worldwide due to its effectiveness in securing sensitive data sets and reducing PCI compliance scope. Most importantly, people are taking note of the simple fact that if your tokenized environment is breached, all that has been exposed is an unrelated token—not the sensitive data, which is stored outside of your internal systems.
However, not all tokenization solutions are alike, nor are they equally effective in your environment. Here, we will compare and contrast tokenization in payment processing (also known as payment service provider, or PSP, tokenization) with an agnostic cloud-tokenization platform.
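At its core, tokenization swaps sensitive data for a random surrogate and keeps the real value in a vault outside your systems. The sketch below is a minimal, hypothetical illustration of that idea (not any vendor's actual implementation; real platforms add encryption, access control, and format-preservation rules):

```python
import secrets

class TokenVault:
    """Minimal sketch of a token vault: tokens are random surrogates with
    no mathematical relationship to the PAN they stand in for."""

    def __init__(self):
        # token -> PAN; in practice this mapping lives outside the
        # merchant's environment, held by the tokenization provider.
        self._vault = {}

    def tokenize(self, pan: str) -> str:
        # Generate a random 16-digit token so it resembles a card number.
        token = "".join(str(secrets.randbelow(10)) for _ in range(16))
        self._vault[token] = pan
        return token

    def detokenize(self, token: str) -> str:
        # Only the vault can map a token back to the original PAN.
        return self._vault[token]
```

Because the token is random, an attacker who breaches systems that hold only tokens learns nothing about the underlying card numbers.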
Tokenization in Payment Processing: What is Your Current State?
Are you utilizing a PSP tokenization solution that was offered to your organization for free? Can you tokenize only payment card (PCI) data? What about the rest of the sensitive data sets (PII, NPI, ACH, PHI, etc.) currently coursing through your internal systems? Should you enlist a second platform to secure those other data sets? What are the potential pitfalls of running multiple tokenization platforms? And what happens when you want your payment gateway to return your data?
These are all questions you should ask yourself when evaluating your current state for tokenization in payment processing. Many PSPs offer their own credit card payment tokenization, but it's often limited in terms of freedom and flexibility.
Tokenization in Payment Processing: "Free" isn't Free
Your payment processor had such a great pitch, enticing you to try their payment-processing tokenization with a first-year-free offer for their platform. Nothing is free, however. When you partner with a tokenization platform, there should be no lock-in for accessing your own data or receiving your detokenized data for internal initiatives. Your data should always be your data, with the tokenization provider’s role limited to acting as custodian, keeping the data secure and out of your environment.
Tokenization in Payment Processing: Processors Only Tokenize PCI
The vast majority of tokenization solutions from PSPs deal only with PCI—not ACH, PII, PHI data, or the vast amount of other sensitive data sets covered by diverse international rules and regulations that vary from country to country. We have several customers who were unable to tokenize other data sets with their PSPs. Their PSPs acknowledged that they did not want the liability of handling PII or any other sensitive data set outside of PCI.
If your PSP guarantees secure tokenization, then why would they not tokenize your other sensitive data sets? Unfortunately, their platforms were intentionally architected to handle only PCI, so organizations are left to find a separate solution that will safely tokenize the rest of their sensitive data sets. It is uneconomical and much more work to maintain two separate tokenization platforms. Even worse than the economics, though, is having multiple sets of tokens—using different methods for tokenizing data—as this can lead to the risk of commingling token sets and, ultimately, data corruption.
Tokenization in Payment Processing: Cross-Domain Tokenization
Among the main culprits of cross-domain tokenization are payment service providers who also supply tokenization. A processor introduces a cross-domain condition when it uses the same token-to-PAN mapping schema for every merchant it services. For example, suppose Merchant A and Merchant B both use Payment Processor Z, and Processor Z generates the same token for a given PAN no matter which merchant submits it. The same token now represents that PAN at both Merchant A and Merchant B, so a token stolen from Merchant A could be replayed at Merchant B to process a fraudulent payment. A true tokenization solution generates a unique token/PAN combination for each merchant and stores each merchant's mappings in a separate location.
When selecting service providers and payment processors to perform your tokenization, ask about their cross-domain tokenization and cross-vault contamination policies. Before committing to a tokenization vendor, perform a Proof of Concept with the service providers to ensure that the resulting token and PAN data is unique to each token set.
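The contrast can be sketched as follows. This is a hypothetical illustration, not any processor's actual scheme: a deterministic hash stands in for a shared token-to-PAN mapping, while per-merchant vaults draw independent random tokens. The final checks mirror what a Proof of Concept should verify:

```python
import hashlib
import secrets

def shared_token(pan: str) -> str:
    # Cross-domain condition: one deterministic mapping for all merchants,
    # so every merchant receives the identical token for a given PAN.
    return hashlib.sha256(pan.encode()).hexdigest()[:16]

class MerchantVault:
    """Per-merchant vault: the same PAN gets an unrelated random token
    in each merchant's vault."""

    def __init__(self):
        self._pan_to_token = {}

    def tokenize(self, pan: str) -> str:
        if pan not in self._pan_to_token:
            self._pan_to_token[pan] = secrets.token_hex(8)
        return self._pan_to_token[pan]

pan = "4111111111111111"

# Shared scheme: Merchant A's token is identical to Merchant B's,
# so a stolen token is valid at either merchant.
assert shared_token(pan) == shared_token(pan)

# Per-merchant vaults: the same PAN yields different tokens, so a
# token lifted from one merchant is useless at another.
vault_a, vault_b = MerchantVault(), MerchantVault()
assert vault_a.tokenize(pan) != vault_b.tokenize(pan)
```

In a real PoC you would submit the same test PAN through each merchant account and confirm the returned tokens differ and resolve only within their own vault.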
Tokenization in Payment Processing: Data Retrieval
After your first year of “free” tokenization, your PSP will likely hit you with additional hidden fees. That’s on top of the cost of setting up your tokenization solution (integration into your environment), the cost of converting your data to tokens, and the cost of getting your tokens back when you decide you want to use your data for third-party analytics or customer storage profiles. If you find yourself looking for a new PSP that better suits your organization, your current PSP might not be contractually obligated to return your tokens, making it even more difficult and expensive to switch PSPs and tokenization solutions. So before agreeing to your PSP's tokenization “deals,” scrutinize the contract to ensure that your data remains your data, regardless of the end goals.
Tokenization in Payment Processing: You Have Options
Commingling tokens, data corruption, cost-prohibitive pricing, the inability or unwillingness to tokenize sensitive data sets outside of PCI, and having to pay for your “own” data should cause you to rethink that first year of "free" tokenization. The need to secure PII is not going away, and organizations will have to deploy data security solutions that protect this very sensitive data set.
Moreover, for international organizations, regulations such as the General Data Protection Regulation (GDPR) mandate safeguarding PII through measures like pseudonymization. Tokenization is a recognized pseudonymization technique, but your PSP could be years away from securing PII, or might never secure it at all.
Your PSP is really good at brokering payments, but it can put your organization in harm’s way when it tries to act as a holistic data security solution. The good news is that there are flexible tokenization solutions that do not create this litany of expensive and unsafe problems for your organization.