Does @cf/meta/llama-2-7b-chat-int8 in Workers AI have a token restriction that generates incomplete results?