Definition: Tokenization is the process of replacing sensitive data with a non-sensitive equivalent (referred to as a token) that has no extrinsic or exploitable meaning or value. The token could be a unique ID number or other identifying symbol that retains all of the data’s essential information without jeopardizing its security.
Tokenization explained
Tokenization can protect privacy by ensuring that only tokens, rather than a permanent identity number or other personally identifiable information, are exposed or stored during a transaction. Because the sensitive data is turned into an unrecognizable string of characters that is unusable without the tokenization system, it provides no value to cybercriminals if stolen.
Digital tokenization and encryption are two different methods used for data security. The main distinction between them is that tokenization does not alter the length or type of the data being protected, whereas encryption changes both. A token is a stand-in with no mathematical relationship to the secret data and cannot be decrypted; encrypted data can be recovered by anyone holding the key.
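To illustrate that distinction, the sketch below (Python, assuming the third-party cryptography package is available; the variable names and the sample card number are hypothetical) generates a format-preserving token for a 16-digit value and encrypts the same value, then compares the results: the token keeps the original length and digit format, while the ciphertext is longer, of a different type, and reversible with the key.

```python
import secrets
from cryptography.fernet import Fernet  # assumes the 'cryptography' package is installed

pan = "4111111111111111"  # sample 16-digit card number (test value, not real)

# Tokenization: replace the value with a random token of the same length and type.
# The mapping lives only inside the tokenization system's vault.
vault = {}
token = "".join(secrets.choice("0123456789") for _ in range(len(pan)))
vault[token] = pan

# Encryption: the same value encrypted with a key can be decrypted by anyone
# holding that key, and the ciphertext no longer matches the original length or type.
key = Fernet.generate_key()
ciphertext = Fernet(key).encrypt(pan.encode())

print(token)            # e.g. '8302917465038221' -- 16 digits, no mathematical link to the original
print(len(ciphertext))  # roughly 100 bytes of base64 ciphertext, reversible with the key
print(vault[token])     # detokenization requires the vault, not a key
```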
Tokenization has long been used in credit and debit card systems to replace data on the card (e.g., an account number or a payment card number) with a unique, randomly generated token that can represent the card data in transactions without revealing it. This reduces the number of systems that have access to the original card data, and with it the risk of fraud if a system is compromised.
The tokenization system must be secured and validated in accordance with security best practices for sensitive data protection, secure storage, audit, authentication, and authorization. It provides data-processing applications with the authority and interfaces they need to request tokens or to detokenize them back into sensitive data.
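A minimal sketch of such interfaces, with hypothetical class and parameter names (Python standard library only): applications call tokenize to request a token and detokenize to exchange it for the original value, and the service checks the caller's authorization before releasing sensitive data.

```python
import secrets


class TokenizationService:
    """Hypothetical token vault exposing tokenize/detokenize interfaces."""

    def __init__(self, authorized_callers):
        self._vault = {}                        # token -> sensitive value (secure storage in practice)
        self._authorized = set(authorized_callers)

    def tokenize(self, sensitive_value: str) -> str:
        # Issue a random, meaningless token and record the mapping in the vault.
        token = secrets.token_hex(16)
        self._vault[token] = sensitive_value
        return token

    def detokenize(self, token: str, caller: str) -> str:
        # Only authorized applications may exchange a token for the original data.
        if caller not in self._authorized:
            raise PermissionError(f"{caller} is not authorized to detokenize")
        return self._vault[token]


# Usage: most applications only ever handle the token.
service = TokenizationService(authorized_callers={"payment-processor"})
token = service.tokenize("4111111111111111")
print(token)                                           # safe to store or log
print(service.detokenize(token, "payment-processor"))  # original value, restricted path
```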
Source: Utimaco: What is Tokenization?