Source:
Tokenization
Definition:
Tokenization is the process of replacing a value with an alternative value, a token. The token is usually randomly generated and can have the same structure and format as the input value. Tokenization can be applied consistently (the same input always yields the same token) or inconsistently, and it can be made reversible or irreversible.
Commentary:
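As a minimal illustration of the definition above, the following Python sketch shows consistent, reversible tokenization that preserves the structure of the input (digits map to digits, letters to letters). The in-memory vault, the function names, and the card-number-like example value are illustrative assumptions, not part of any standard; a production system would store the mapping securely and handle collisions and access control far more carefully.

```python
import secrets
import string

# Illustrative in-memory token vault (assumption: a real system would persist this securely).
_vault = {}          # original value -> token
_reverse_vault = {}  # token -> original value (needed only if reversibility is required)

def _random_token_like(value: str) -> str:
    """Generate a random token with the same structure and format as the input:
    digits map to digits, letters to letters, other characters are preserved."""
    out = []
    for ch in value:
        if ch.isdigit():
            out.append(secrets.choice(string.digits))
        elif ch.isalpha():
            out.append(secrets.choice(string.ascii_letters))
        else:
            out.append(ch)  # keep separators such as '-' or ' '
    return "".join(out)

def tokenize(value: str) -> str:
    """Consistent tokenization: the same input always yields the same token."""
    if value in _vault:
        return _vault[value]
    token = _random_token_like(value)
    # Retry on the unlikely chance of colliding with an existing token.
    while token in _reverse_vault:
        token = _random_token_like(value)
    _vault[value] = token
    _reverse_vault[token] = value
    return token

def detokenize(token: str) -> str:
    """Reverse lookup; possible only because the vault retains the mapping."""
    return _reverse_vault[token]

# Example: a card-number-like value keeps its length and digit structure.
t = tokenize("4111-1111-1111-1111")
assert t != "4111-1111-1111-1111"
assert tokenize("4111-1111-1111-1111") == t    # consistent
assert detokenize(t) == "4111-1111-1111-1111"  # reversible via the vault
```

Dropping the reverse vault (and the vault lookup) would make the scheme irreversible and inconsistent, illustrating the other end of the spectrum described in the definition.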
Categories: CDMC
Tags: CDMC