Companies interested in Payment Card Industry (PCI) tokenization are getting a helping hand from the PCI Security Standards Council (SSC), the standards body focused on the PCI Data Security Standard (DSS), PIN Transaction Security (PTS) requirements and the Payment Application Data Security Standard (PA-DSS). The council has published the PCI DSS Tokenization Guidelines Information Supplement to provide greater clarity on how specific technologies relate to the PCI Security Standards and impact PCI DSS compliance.
A survey at the start of the year found that PCI compliance was expected to drive a significant increase in security-related spending. Another survey, from Verizon, found that organizations struggle when they have to engage in continuous security activity, and that organizations that had suffered breaches of cardholder data performed dismally against most PCI requirements. Three in five organizations (60%) were using some form of point-to-point encryption.
While encryption helps to protect customer data, tokenization offers a much simpler solution, says Jeremy King, European director for the PCI Security Standards Council. "It's easier to make sure the data is not there in the first place."
With tokenization, the Primary Account Number (PAN) is replaced with a token. For PCI DSS, this involves substituting specific customer data with non-specific token values, which can reduce or remove the need for a merchant to retain sensitive customer information in their environment once the initial transaction has been processed.
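The idea can be illustrated with a minimal sketch of vault-based tokenization. All names here are hypothetical and for illustration only; they do not come from the PCI SSC guidelines, and a production system would add access controls, collision handling, and a hardened vault. The merchant stores only the token, while the PAN stays in the secured vault:

```python
import secrets

class TokenVault:
    """Hypothetical token vault: maps random tokens back to PANs.

    In a real deployment this mapping lives inside the secured
    cardholder data environment, not in merchant systems.
    """

    def __init__(self):
        self._vault = {}  # token -> PAN

    def tokenize(self, pan: str) -> str:
        # Generate a random token of the same length as the PAN,
        # preserving the last four digits for receipts, as many
        # tokenization schemes do.
        token = "".join(
            secrets.choice("0123456789") for _ in range(len(pan) - 4)
        ) + pan[-4:]
        self._vault[token] = pan
        return token

    def detokenize(self, token: str) -> str:
        # Only the tokenization system can recover the original PAN.
        return self._vault[token]

vault = TokenVault()
token = vault.tokenize("4111111111111111")
print(token[-4:])                      # last four digits survive
print(vault.detokenize(token) == "4111111111111111")
```

Because the token is random rather than derived from the PAN, systems that hold only tokens have nothing an attacker can reverse, which is the basis for the scope reduction the guidelines describe.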
The council has been working on the guidelines for at least eight months; the process took time because tokenization presents some challenges, King says. Now that the guidelines are available, they will be reviewed by members, especially at the upcoming community meetings in North America (September) and Europe (October), to gather feedback on next steps, if any are required.
The guidelines outline explicit scoping elements for consideration, recommendations on scope reduction, the tokenization process itself, deployment and operation factors, and best practices for selecting a tokenization solution. They also define the domains, the areas where specific controls need to be applied and validated, in which tokenization could potentially minimize the cardholder data environment.
This is a pretty significant development in PAN protection, says Diana Kelley, analyst at SecurityCurve. "If the solution is flawed or doesn't have independent validation that it meets the functionality and security bar for PCI, merchants/providers must figure that out for themselves. This is extra work and could lead to adoption of solutions that aren't robust or secure enough to protect the PAN data."
It's also an endorsement of tokenization, she says. "That the council has finally said that tokenization can limit scope is significant. PCI watchers have been talking about this for a long time. But until the official blessing from the council, it was not explicitly defined as a technology that can reduce scope, so a [qualified security assessor] could, theoretically, push back during the [report on compliance] process and say tokenization wasn't reducing scope/protecting the PAN adequately."