“Tokenization is a mechanism that replaces sensitive data, such as card details, with a unique digital identifier known as a token, which has no inherent value if compromised,” explains Alex Gatiragas, Director, Solution Experience, at G+D. “The token allows payments to be processed without exposing actual details, such as card data. With network tokenization, the token for card details, for example CoF for a merchant, is created and managed by the payment network provider, and the mapping of the payment card details is kept in a very secure fashion.”
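The mechanism Gatiragas describes can be sketched in a few lines. This is a hypothetical, minimal illustration, not a real payment-network API: the `TokenVault` class and its methods are invented for the example, and in practice the token-to-card mapping is held by the payment network in hardened infrastructure.

```python
import secrets

class TokenVault:
    """Hypothetical in-memory token vault, for illustration only."""

    def __init__(self):
        self._token_to_pan = {}

    def tokenize(self, pan: str) -> str:
        # The token is random, so it reveals nothing about the card
        # number and has no inherent value if compromised.
        token = secrets.token_hex(8)
        self._token_to_pan[token] = pan
        return token

    def detokenize(self, token: str) -> str:
        # Only the vault holder can map a token back to the real card.
        return self._token_to_pan[token]

vault = TokenVault()
token = vault.tokenize("4111111111111111")
print(token != "4111111111111111")           # the card number is never exposed
print(vault.detokenize(token) == "4111111111111111")
```

The point of the sketch is the asymmetry: anyone intercepting the token learns nothing, while the party holding the mapping can still resolve it for authorization.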
A network-tokenized transaction passes through six steps:
- A customer enters their payment information at checkout
- The merchant’s payment service provider (PSP) requests a network token from the card scheme
- The card scheme then shares the information with the card issuer
- The card scheme replaces the customer’s payment information with the network token
- In turn, the card scheme shares the network token with the merchant’s PSP
- Payment is processed using the network token
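The six steps above can be sketched as a message flow between the parties. This is a simplified, hypothetical model under assumed names (`PSP`, `CardScheme`, `Issuer`); it is meant only to show who talks to whom and where the token is minted, not to reflect any real scheme’s interfaces.

```python
import secrets

class Issuer:
    """Stands in for the card issuer."""

    def approve_tokenization(self, pan: str) -> bool:
        # Step 3: the scheme shares the card details with the issuer,
        # which confirms the card may be tokenized.
        return True

class CardScheme:
    """Stands in for the card network, which mints and maps tokens."""

    def __init__(self, issuer: Issuer):
        self.issuer = issuer
        self._vault = {}  # token -> card number, held only by the network

    def issue_token(self, pan: str) -> str:
        if not self.issuer.approve_tokenization(pan):
            raise ValueError("issuer does not support network tokenization")
        # Step 4: the scheme replaces the card details with a token.
        token = secrets.token_hex(8)
        self._vault[token] = pan
        return token

    def authorize(self, token: str, amount: float) -> bool:
        # Step 6: only the scheme can resolve the token for authorization.
        return token in self._vault

class PSP:
    """Stands in for the merchant's payment service provider."""

    def __init__(self, scheme: CardScheme):
        self.scheme = scheme

    def pay(self, pan: str, amount: float) -> bool:
        # Steps 1-2: the customer enters card details at checkout and
        # the PSP requests a network token from the scheme.
        token = self.scheme.issue_token(pan)
        # Step 5: the PSP receives the token; the real card number
        # need never be stored by the merchant.
        return self.scheme.authorize(token, amount)

psp = PSP(CardScheme(Issuer()))
print(psp.pay("4111111111111111", 25.00))  # True
```

Note that in this model the merchant side only ever holds the token: the card number exists solely inside the scheme’s vault, which is the property that makes the flow safe even if the merchant’s systems are breached.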
That may sound like a lot for a single transaction, but it’s an automatic and near-instant process that safeguards security while making life more convenient for customers.
Network tokenization also represents a major step forward from the historical practice of storing CoF account numbers in merchants’ own databases, Excel documents, or CRM systems. It is also key to establishing end-to-end security across the payments ecosystem.
In recent years, network tokenization has been the backbone payment mechanism for many online giants, including Netflix, Google Pay, and Amazon. Visa and Mastercard have also signed a reciprocal tokenization agreement to use tokens across their payment value chains.
That is helping to overcome one of the core challenges of the mechanism, says Gatiragas. “Network tokenization requires that participating card issuers also support network tokenization. This capability is rapidly growing. However, there may still be instances where a customer’s CoF details cannot be network tokenized, and that requires other non-network tokenization capabilities to be used in parallel – at least for a transition period.”