Limitation questions

Hi, I had a question about the token limits on the Workers AI models. I saw in the docs that certain models, like @cf/baai/bge-base-en-v1.5, list a max of 512 input tokens (with 768 output dimensions), but I didn't see anything listed for the @cf/meta/llama-2-7b-chat-fp16 or @cf/meta/llama-2-7b-chat-int8 models.

Is there anywhere I can see that info for those models?