What is the prompt template format, and how can I avoid garbled generation results? #304
Comments
Should I use https://github.com/chujiezheng/chat_templates/blob/main/chat_templates/openchat.jinja (which is said to be compatible with Qwen1.5, per https://github.com/chujiezheng/chat_templates/blob/main/generation_configs/qwen2-chat.json)?
The chat template is embedded in the official GGUF files. They are also provided in tokenizer.json as the standard practice in
I'm using llama.cpp via https://github.com/withcatai/node-llama-cpp. How do I use the embedded template?
It will use the embedded template if you don't pass a custom one. That is what I did, and I still got messy results. So it might be caused by the tokenizer? I'm not sure; I'm going to try that.
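For context on what the embedded template should produce: Qwen1.5 chat models use the ChatML format, where each turn is wrapped in `<|im_start|>` / `<|im_end|>` markers. Below is a minimal sketch (not node-llama-cpp's actual internals) of how a ChatML prompt is assembled from a message list, which can help spot whether a custom template is producing the wrong layout:

```javascript
// Minimal sketch of the ChatML prompt layout used by Qwen1.5 chat models.
// <|im_start|> and <|im_end|> are the standard ChatML role markers.
function formatChatML(messages, addGenerationPrompt = true) {
  let prompt = "";
  for (const m of messages) {
    prompt += `<|im_start|>${m.role}\n${m.content}<|im_end|>\n`;
  }
  if (addGenerationPrompt) {
    // Leave the assistant turn open so the model continues from here.
    prompt += "<|im_start|>assistant\n";
  }
  return prompt;
}

const prompt = formatChatML([
  { role: "system", content: "You are a helpful assistant." },
  { role: "user", content: "Hello" },
]);
console.log(prompt);
```

If the text your wrapper actually sends to the model deviates from this layout (missing markers, wrong role names), garbled output is a likely symptom.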
If you're not using
I'm currently using a template like this,
but the generated results are very poor. What prompt template does the Qwen1.5 GGUF need? I haven't found it in the docs yet. Is there a Jinja template I can reference? For example:
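For reference, a ChatML-style Jinja template along these lines should match what Qwen1.5 chat models expect. This is a sketch; verify it against the `chat_template` field embedded in the official GGUF / tokenizer config rather than treating it as authoritative:

```jinja
{% for message in messages %}{{ '<|im_start|>' + message['role'] + '\n' + message['content'] + '<|im_end|>' + '\n' }}{% endfor %}{% if add_generation_prompt %}{{ '<|im_start|>assistant\n' }}{% endif %}
```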