Our newsletter #9 on PCI scoping introduced “tokenization” as one acceptable technique for reducing the scope of the cardholder data environment (CDE). Let’s clarify this concept in this newsletter.
The concept of tokenization is quite simple: replace a valuable asset with a non-valuable one. It is the same principle a museum follows when it exhibits replicas while keeping the authentic artworks secure in its safe, a casino follows when it uses tokens while keeping cash secured in the vault, or you follow when you leave your coat in a cloakroom in exchange for a ticket.
Tokenization for PCI: Killing two birds with one stone
PCI isn’t really concerned with the protection of artworks, cash or coats, right? Here, the valuable asset is the cardholder data, and more specifically the PAN (Primary Account Number, i.e. the payment card number). Tokenization therefore consists of replacing PANs, wherever they are stored, with a piece of information (the token) that is of no value to criminals: a token can’t be used for transactions or fraudulent charges, so there is little harm done if it’s stolen. The original PANs can then either be eliminated or kept for future reference in an electronic vault located internally or externally.
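To make the swap concrete, here is a minimal sketch in Python of the idea, assuming a simple in-memory vault and a format-preserving-style token that keeps the PAN’s last four digits. All names here are illustrative, and this is of course not a PCI-compliant implementation (a real solution would also encrypt the stored PANs and strictly control access to the vault):

```python
import secrets


class TokenVault:
    """Toy tokenization vault, for illustration only."""

    def __init__(self):
        self._token_to_pan = {}  # the "vault": token -> original PAN
        self._pan_to_token = {}  # so a repeated PAN maps to the same token

    def tokenize(self, pan: str) -> str:
        """Return a token for this PAN, creating one if needed."""
        if pan in self._pan_to_token:
            return self._pan_to_token[pan]
        while True:
            # Random digits plus the PAN's last four, so receipts stay
            # readable; the token is not derived from the PAN and thus
            # cannot be reversed without access to the vault.
            token = "".join(secrets.choice("0123456789")
                            for _ in range(len(pan) - 4)) + pan[-4:]
            if token != pan and token not in self._token_to_pan:
                self._token_to_pan[token] = pan
                self._pan_to_token[pan] = token
                return token

    def detokenize(self, token: str) -> str:
        """Look up the original PAN; only in-scope systems may do this."""
        return self._token_to_pan[token]
```

Downstream systems that only ever see the token, never the PAN, are the ones a merchant can hope to remove from scope.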
Actually, the notion of tokenization within the PCI framework was originally introduced with PCI DSS v2.0 as an acceptable way to comply with requirement 3.4:
- “Render PAN unreadable anywhere it is stored (including on portable digital media, backup media, and in logs)”.
But we didn’t have to wait long to see it used in the context of requirement 3.1:
- “Keep cardholder data storage to a minimum”.
Hence: killing two birds with one stone.
As tokens replace the sensitive PANs, any component that now processes or stores only tokens can be removed from scope. The downside is that all elements of the tokenization system itself, including the PAN vault and any system component or process with access to it, must be considered part of the CDE and are therefore in scope for PCI compliance.
Additionally, one should not overlook the effort and cost of selecting an appropriate solution that supports all of one’s platforms, as well as the effort and cost of implementing such a solution in one’s environment.
Guidance and regulation
The Council quickly understood the urgency of establishing guidance and regulation in this area. The result is available in the Council’s document library under the title “PCI Tokenization Guidelines”.
Will you consider using a tokenization solution in your environment?
If it is already the case, what difficulties did you face during implementation, and what would be your number one recommendation for the community?