Well, google for a LLaMA 2 tokenizer then, e.g. -> https://github.com/belladoreai/llama-tokenizer-js (or use the official Python SentencePiece tokenizer like this -> https://github.com/meta-llama/llama/blob/main/llama/tokenizer.py)
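For the Python route, a minimal token-counting sketch with the SentencePiece library (assumes you've already downloaded Meta's `tokenizer.model` for LLaMA 2; the file path below is a placeholder):

```python
# Minimal sketch: count LLaMA 2 tokens with the SentencePiece model file.
# Assumes "tokenizer.model" from Meta's LLaMA 2 release is in the working directory.
import sentencepiece as spm

sp = spm.SentencePieceProcessor(model_file="tokenizer.model")

text = "Hello, world!"
token_ids = sp.encode(text, out_type=int)  # list of integer token ids
print(len(token_ids), token_ids)

# Round-trip back to text as a sanity check.
print(sp.decode(token_ids))
```

The JS library linked above does the same thing in the browser or Node, which is handy if you just need a client-side token count.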