Over the last few years, the way the words ‘token’ and ‘tokenization’ are used at Bell ID has evolved a great deal. The last 12 months, however, have brought the biggest change yet, both at Bell ID and across the wider industry, as ‘payment tokenization’ has come to the fore.
Where did it all start?
Bell ID first started using the term ‘token’ in 2010 as part of its Token Manager range of software products. Here, it was a catch-all term for the complete range of secure chip form factors supported by our products. From EMV chip cards and chip-based ID cards or passports to the various secure element form factors in mobile devices, we grouped them all as ‘tokens’.
Then everything changed.
Payment Tokenization and the Cloud
With the advent of host card emulation (HCE) and cloud-based payments, we were already evolving our product names. Mobile Token Manager evolved into Secure Element in the Cloud, for example, when support for HCE in Android enabled payment credentials to be stored, and transactions processed, remotely in the cloud rather than in a secure element on the device. In this context, a ‘token’ was deemed to be the combination of static and dynamic card data required to make a payment. This combination has since become known as a payload.
Shortly after this, tokenization took on a whole new meaning when EMVCo announced work to standardize payment tokenization and defined the role of the token service provider (TSP). In this context, a token refers to a surrogate primary account number (PAN). In essence, the unique payment credential for a consumer’s account is replaced by an alternative PAN. This boosts the security of online and mobile payment transactions: if stolen, the surrogate PAN has little or no value to fraudsters, as it can only be used under a specific set of circumstances.
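To make the surrogate-PAN idea concrete, here is a minimal sketch of how a token service provider might map a real PAN to a restricted token. This is purely illustrative: the class, method names and the single "domain" restriction are assumptions for the example, not the EMVCo specification or any Bell ID product API.

```python
import secrets
from typing import Optional


class TokenVault:
    """Hypothetical token vault: issues surrogate PANs and only
    resolves them back within a permitted usage domain."""

    def __init__(self) -> None:
        # token -> (real PAN, domain the token may be used in)
        self._vault: dict[str, tuple[str, str]] = {}

    def tokenize(self, pan: str, allowed_domain: str) -> str:
        # Issue a surrogate PAN of the same length but with random
        # digits, so the token is worthless outside the vault.
        token = "9" + "".join(
            secrets.choice("0123456789") for _ in range(len(pan) - 1)
        )
        self._vault[token] = (pan, allowed_domain)
        return token

    def detokenize(self, token: str, domain: str) -> Optional[str]:
        # The real PAN is released only when the token is presented
        # within its permitted domain (e.g. one e-commerce channel);
        # anywhere else, a stolen token resolves to nothing.
        entry = self._vault.get(token)
        if entry is None or entry[1] != domain:
            return None
        return entry[0]
```

For example, a token issued for an e-commerce channel resolves to the real PAN there, but yields nothing if replayed at a point-of-sale domain, which is what limits the value of a stolen token.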
With our 20+ years of history in the industry, it is no surprise that terminology has evolved. And we have evolved with it. Our software platforms are trusted by banks, service providers, governments, transit operators and numerous other entities around the world. We have been successful because our technology has not only moved with the times, it has supported that evolution, as we have seen in the last year with HCE and tokenization.