April 12, 2022
Everyone at the chocolate factory was in shock, especially the owner.
“It's gone! It's gone!”
Fifty pounds of unwrapped chocolate was stolen.
“If they get to my chocolate before we put the gloss on, they will be able to unlock the secret recipe just by tasting it,” he said in a frenzy.
Billy was sitting in a corner. He started to laugh.
The owner composed himself for a second. Then he shouted.
“What's so funny?”
“I am thinking about the thieves. Boy, are they in for a surprise.”
“Yeah. A $2 billion surprise. My life's work.”
“Boss, I put my formula inside the chocolate. Until the chocolate is glossed and wrapped, my formula turns all the sugar into salt.”
Billy took a piece of the chocolate that fell on the floor.
The boss took a bite, then spit it out.
“That's disgusting. It's just lard and salt.”
“Yeah. Anyone who tries to get access to our candy before it is available to the public is going to gag. I put the second part of my formula at the glossing stage. The salt becomes sugar right after our chocolate is sealed inside its shipping crate.”
He walked to a crate, took out a double fudge square, and dug in. Smiling, he motioned to the boss.
Tokenization is the process of replacing your data with a reference, so the data itself is never sent out where outsiders could steal it. It's like keeping the chocolate as lard and salt while prying hands can reach it, and turning it back into chocolate only once it is sealed away from them.
When someone needs to use your data, they are sent something that merely represents it. It's called a token, and it can be submitted to an approved query point, called a tokenization system, which uses the token to act on your data without handing the data to whoever is requesting it.
The tokenization system is isolated from the application that stores the data, so any user querying the data sits between two systems that actually hold it. The user gets the token from the source, sends it to the tokenization system, and the system answers the query. What the user carries from one point to the other is a turkey sandwich: a reference with no relation to the data itself.
Any application, system, or point of service between the original data store and the tokenization system carries not the data but a token, which can be exchanged for access to the data, or for actions the tokenization system allows or denies based on the information it retrieves and processes.
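The flow above can be sketched in a few lines of Python. This is a toy illustration, not any vendor's API; the `TokenVault` class and its method names are made up for the example. The key property it shows is that the token is pure randomness, so nothing about the data survives in it, and only the vault's internal lookup table can exchange it back.

```python
import secrets

class TokenVault:
    """Toy tokenization system: maps random tokens to sensitive values.
    The mapping lives only inside the vault; the token itself carries
    no information about the data it stands for."""

    def __init__(self):
        self._vault = {}  # token -> sensitive value, held server-side only

    def tokenize(self, sensitive_value: str) -> str:
        # The token is random, not derived from the data,
        # so there is nothing to reverse or crack.
        token = secrets.token_hex(16)
        self._vault[token] = sensitive_value
        return token

    def detokenize(self, token: str) -> str:
        # Only the vault can exchange a token for the original value.
        return self._vault[token]

vault = TokenVault()
token = vault.tokenize("4111-1111-1111-1111")
# Every system in between stores and passes around only the token.
assert token != "4111-1111-1111-1111"
assert vault.detokenize(token) == "4111-1111-1111-1111"
```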
Tokenization is used for personally identifiable information (PII): medical records, financial information, even a driver's license, which holds a person's home address.
The most common use for tokenization is credit card processing.
Why the need for tokenization to keep a party out of the loop when they want to use your data for a productive purpose?
Say you are driving late at night on the freeway to an early meeting. It's 4AM, and you need something to keep you going. The next rest stop is about 50 minutes out. You park your car and make your purchase.
At the register is Eddie. He is a simple storekeeper in this quaint rural town.
You want Eddie to process your credit card, but you don't want him to use it. He needs to access the $3.75 for the tuna melt and Diet Coke, but not the credit card number or expiration date to pay for his next shipment of Hootie and the Blowfish CD sets.
That's what tokens are for.
When Eddie swipes your card, all the register gets from your credit card is a token that represents the data Eddie needs to get paid.
His register uses that token to contact the credit card company. The credit card company uses the token to access your credit card data only to tell Eddie whether or not they gave him the money from your account. That's all Eddie needs to know. Once the company returns yes, he has his money and you get your card back.
This is the need for tokenization: enabling applications and people to use your data without seeing it.
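Eddie's transaction can be sketched the same way. Everything here is hypothetical (the function names, the in-memory "card network"); real payment networks are far more involved. The point is what crosses the counter: Eddie's register sends a token and an amount, and gets back only a yes or no.

```python
import secrets

# Toy "card network" state: the vault mapping tokens to real card numbers.
CARD_VAULT = {}

def issue_token(card_number: str) -> str:
    """Card network side: swiping the card yields a token for the register."""
    token = secrets.token_hex(8)
    CARD_VAULT[token] = card_number
    return token

def authorize(token: str, amount: float) -> bool:
    """Card network side: looks up the real card and answers yes or no.
    The merchant never sees the card number, only the verdict."""
    card = CARD_VAULT.get(token)
    if card is None:
        return False
    # Balance and limit checks against the real account would happen here.
    return amount > 0

# Eddie's register:
token = issue_token("4111-1111-1111-1111")  # produced by the card swipe
paid = authorize(token, 3.75)               # all Eddie sends: token + amount
assert paid is True
```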
Why not use data encryption?
Data encryption can also protect your data, and it sounds a lot cooler.
Data encryption is altering your data, making it unreadable to anyone who doesn't possess the encryption key. The strength of the key is based on the algorithm used to secure the data.
The more complex the algorithm, the stronger the key, and the more protected your data.
The risk is that a malicious attacker can try to crack your encryption; given enough time or a weak algorithm, they can recover the key and decrypt your data. With encryption, even if the data sits inside the digital equivalent of a steel safe with five locks around it, the user is still carrying the data itself.
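The contrast can be made concrete with a deliberately weak toy cipher (a one-time-pad-style XOR, for illustration only): ciphertext is a mathematical function of the data and the key, so whoever obtains the key can always invert it, while a token is random noise with no key to recover.

```python
import secrets

card = b"4111-1111-1111-1111"

# Encryption: ciphertext is derived from the data and the key.
key = secrets.token_bytes(len(card))
ciphertext = bytes(c ^ k for c, k in zip(card, key))  # toy XOR cipher

# Anyone who obtains the key can invert the math and recover the data:
recovered = bytes(c ^ k for c, k in zip(ciphertext, key))
assert recovered == card

# Tokenization: the token is random. No key exists that turns it back
# into the card number; only the vault's lookup table can.
token = secrets.token_hex(16)
assert card.decode() not in token  # nothing of the data survives in the token
```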
Tokenization empowers the owner to decide which users never get to hold the data.
Tokenization replaces the data with a turkey sandwich. Those parties that need to use the data, even for necessary purposes, are given a turkey sandwich, or Converse sneakers, or an old Rubik’s Cube.
The only use the turkey sandwich has is for the tokenization system to reference this particular lunch with its specific trove of information. Even then, the tokenization system does not give the third party the data, but rather instructions on what events come next based on the data it is querying.
Tokenization prevents source-of-truth information from being shared at any point the owner wants kept private.
This brings a lot of additional benefits:
The Payment Card Industry Data Security Standard (PCI DSS) requires organizations that handle debit and credit cards to comply with its standards.
Its third requirement mandates the protection of data at rest. The aim is to minimize the amount of sensitive data stored on third-party servers.
Tokenization meets this standard by making sure there are no third-party servers with sensitive information. They are all holding tokens, which reference such information only in approved tokenization systems.
The data is never at rest with Eddie, or in any other location that might not have enough resources to adequately secure information about where your money is.
A quality tokenization provider can offer additional benefits, like adding a layer of encryption on top of your tokenization, using variations of the process like vaultless tokenization, consolidating multiple data types from different technologies, and achieving compliance using only their platform.
All the major cloud platforms have tokenization solutions, including tokenization-as-a-service options. Security companies and service providers offer tokenization as well; just make sure their tokenization offering is a prime product, not a secondary addition to what they focus on.