
ChatGPT cog does not respect (or circumvent) 2000 character limit. #4

Open · desentizised opened this issue May 21, 2024 · 3 comments

@desentizised

I've been experiencing this problem for a while now, and with the switch to GPT-4 it has become more prevalent.

[screenshot]

What happened:
[screenshot]

What should've happened:
The bot sends the full response it gets from OpenAI, possibly by splitting it into multiple messages?

@Skylark95 (Owner)

I've played with shortening the character limit, but as you've mentioned there are still situations where it goes over. I've typically used the bot in my server for short responses and haven't had an issue, but I will gladly welcome a pull request.

@desentizised (Author)

When I was using GPT-3.5 I had the token limit at 300, and I assume that simply kept OpenAI from generating responses long enough to exceed 2000 characters in the first place. I might try lowering that again.

When you say you use your bot for short responses, are you specifying that as an explicit instruction? That's how I currently work around it: I just append "Answer briefly." to everything.
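For reference, my workaround amounts to something like this. A minimal sketch assuming the modern openai Python client; the model name and max_tokens value are illustrative, not what the cog actually uses:

```python
# Minimal sketch of the "Answer briefly." workaround, assuming the
# modern openai client. Model name and max_tokens are illustrative only.
from openai import OpenAI

client = OpenAI()

response = client.chat.completions.create(
    model="gpt-4",
    max_tokens=300,  # the token ceiling mentioned above
    messages=[
        # Appending the instruction to every prompt nudges the model
        # toward replies that fit under Discord's 2000-character cap.
        {"role": "user", "content": "Explain how Discord bots work. Answer briefly."},
    ],
)
print(response.choices[0].message.content)
```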

While I have a background in software engineering, I'm not sure I could be of much service in terms of contributing here. My assumption was that the reply on line 79 ends up being whatever OpenAI sends back, and that it could theoretically be split up into multiple ctx.send() calls if need be. I take it it's probably not that easy if you've already played around with the problem.
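In case it does turn out to be that easy, the splitting I have in mind would look roughly like this. A hypothetical sketch, assuming a discord.py-style ctx; chunk_message is a made-up helper, not code from this repo:

```python
# Hypothetical sketch: split a long OpenAI reply into chunks that fit
# Discord's 2000-character message limit and send each one separately.
DISCORD_LIMIT = 2000

def chunk_message(text: str, limit: int = DISCORD_LIMIT) -> list[str]:
    """Split text into pieces no longer than limit, preferring newline breaks."""
    chunks = []
    while len(text) > limit:
        # Break at the last newline before the limit when possible,
        # otherwise hard-cut at the limit.
        split_at = text.rfind("\n", 0, limit)
        if split_at <= 0:
            split_at = limit
        chunks.append(text[:split_at])
        text = text[split_at:].lstrip("\n")
    if text:
        chunks.append(text)
    return chunks

async def send_reply(ctx, reply: str) -> None:
    # One ctx.send() per chunk instead of a single oversized message.
    for chunk in chunk_message(reply):
        await ctx.send(chunk)
```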

@Skylark95 (Owner) commented Jun 15, 2024

As a workaround, I've truncated the message when it goes over 2000 characters (3d56510). In the future I may explore either splitting the response into multiple messages or adding the long reply as an attachment.
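For anyone landing here, the truncation boils down to something like this. A hedged sketch of the approach only, not the actual code from 3d56510:

```python
# Sketch of the truncation workaround (approach only, not commit 3d56510).
DISCORD_LIMIT = 2000
SUFFIX = "..."

def truncate_reply(reply: str, limit: int = DISCORD_LIMIT) -> str:
    """Cut the reply so it fits Discord's limit, marking the cut with '...'."""
    if len(reply) <= limit:
        return reply
    return reply[: limit - len(SUFFIX)] + SUFFIX
```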

As you mentioned, playing with the token limit can help, but since tokens aren't an exact 1:1 mapping to characters, it takes some experimenting to get right.
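To illustrate the mismatch, here's a quick check with tiktoken (an assumption on my part; the cog itself may not use tiktoken):

```python
# Rough demonstration that token counts and character counts diverge.
import tiktoken

enc = tiktoken.encoding_for_model("gpt-4")
sample = "Discord caps messages at 2000 characters, but OpenAI limits tokens."
tokens = enc.encode(sample)

# English prose averages roughly 4 characters per token, so a max_tokens
# of 300 often stays under 2000 characters, but the ratio varies, and
# code or non-English text can blow well past it.
print(f"{len(sample)} characters -> {len(tokens)} tokens")
```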
