Technique
Tokenization
The process of splitting raw text into tokens — the units (sub-words, words, or characters) that an LLM actually processes.
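The idea can be illustrated with a toy word-level tokenizer; this is a minimal sketch using only the standard library, not the learned sub-word (e.g. BPE) tokenizers that production LLMs actually use.

```python
import re

def tokenize(text: str) -> list[str]:
    # Toy tokenizer: split text into word runs and individual
    # punctuation marks. Real LLM tokenizers map text to learned
    # sub-word units instead, so their splits differ from this.
    return re.findall(r"\w+|[^\w\s]", text)

print(tokenize("LLMs don't read text; they read tokens."))
# → ['LLMs', 'don', "'", 't', 'read', 'text', ';', 'they', 'read', 'tokens', '.']
```

Note how "don't" splits into three tokens here; a sub-word tokenizer would choose splits based on frequency statistics learned from a corpus rather than on punctuation boundaries.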