Any clue why ``@hf/meta-llama/meta-llama-3-8b-instruct`` is returning the prompt in the response with these tags?
Any clue why ``@hf/meta-llama/meta-llama-3-8b-instruct`` is returning the prompt in the response with these tags? Previously, the typical behaviour appeared to be returning only the actual response, which in our scenario would be all text excluding ``<|begin_of_text|><|start_header_id|>system<|end_header_id|>\n\nGenerate a joke from the users response.<|eot_id|><|start_header_id|>assistant<|end_header_id|>\n\n``. Can we expect such inconsistencies in a non-beta model?
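
For context, here is a minimal sketch of how we invoke the model through the Workers AI binding, plus the post-processing we added as a stopgap to strip the echoed prompt. The binding name and user message are illustrative, and the cleanup step is our own workaround rather than anything documented:

```ts
export interface Env {
  // Workers AI binding configured in wrangler.toml (name assumed here).
  AI: any;
}

export default {
  async fetch(request: Request, env: Env): Promise<Response> {
    const result = await env.AI.run("@hf/meta-llama/meta-llama-3-8b-instruct", {
      messages: [
        { role: "system", content: "Generate a joke from the users response." },
        { role: "user", content: "I just locked myself out of my house." },
      ],
    });

    const raw: string = (result as { response?: string }).response ?? "";

    // Workaround: if the prompt was echoed back, keep only the text after the
    // final assistant header, then drop any remaining Llama 3 special tokens
    // such as <|eot_id|> or <|begin_of_text|>.
    const afterAssistant = raw.split("<|end_header_id|>").pop() ?? raw;
    const cleaned = afterAssistant.replace(/<\|[^|]*\|>/g, "").trim();

    return new Response(cleaned);
  },
};
```

This cleanup hides the symptom, but we would rather understand whether the model is expected to return the raw templated prompt at all.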