
azservicebus.Receiver.ReceiveMessages ignoring context timeout #17979

Closed
adeturner opened this issue May 16, 2022 · 1 comment
Labels
Client: This issue points to a problem in the data-plane of the library.
customer-reported: Issues that are reported by GitHub users external to the Azure organization.
question: The issue doesn't require a change to the product in order to be resolved.
Service Bus

Comments


adeturner commented May 16, 2022

Bug Report

github.com/Azure/azure-sdk-for-go/sdk/messaging/azservicebus v0.4.1
go version go1.18.1 linux/amd64

  • What happened?

From what I can see, azservicebus.Receiver.ReceiveMessages hangs when no messages are present in the queue. The call is not cancelled when the context times out (whether the timeout is set by the caller or internally). This makes it tricky to write KEDA-driven serverless customer code, e.g. for Container Apps.

  • What did you expect or want to happen?

I would prefer that ReceiveMessages time out based on the customer-controlled setting, or on the internal values if required (20 ms / 1 second).

  • How can we reproduce it?

Please ask if you would like me to provide some code to run; a rough sketch follows below for reference.
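
For illustration only, a minimal reproduction sketch against an empty queue (not the author's original code; the connection string and queue name are placeholders, and the constructors assume the documented azservicebus API):

package main

import (
	"context"
	"fmt"
	"log"
	"time"

	"github.com/Azure/azure-sdk-for-go/sdk/messaging/azservicebus"
)

func main() {
	// Placeholder connection string and queue name; point these at an empty queue.
	client, err := azservicebus.NewClientFromConnectionString("<connection-string>", nil)
	if err != nil {
		log.Fatal(err)
	}

	receiver, err := client.NewReceiverForQueue("<queue-name>", nil)
	if err != nil {
		log.Fatal(err)
	}

	// Expected: this call returns within roughly one second when the context
	// deadline expires. Observed (per this report): it blocks indefinitely.
	ctx, cancel := context.WithTimeout(context.Background(), time.Second)
	defer cancel()

	start := time.Now()
	messages, err := receiver.ReceiveMessages(ctx, 10, nil)
	fmt.Printf("returned after %v with %d messages, err = %v\n", time.Since(start), len(messages), err)
}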

  • Anything we should know about your environment.

n/a

  • Investigation

ctx, cancel := context.WithTimeout(context.TODO(), time.Second)
defer cancel()
var receiver *azservicebus.Receiver
messages, err = receiver.ReceiveMessages(ctx, limit, nil)

fetchMessages doesn't yield to the context timeout.

defaultTimeAfterFirstMessage is set to 20ms or 1 second.

I propose a change to fetchMessages, something like the one below.

Note: due to the interfacing, it's unclear to me whether the same change is required in internal.AMQPReceiver, but a quick look at Receive suggests it handles things OK there.

func fetchMessages(ctx context.Context, receiver internal.AMQPReceiver, maxMessages int, defaultTimeAfterFirstMessage time.Duration, messages *[]*ReceivedMessage) error {
	if err := receiver.IssueCredit(uint32(maxMessages)); err != nil {
		return err
	}

	var cancel context.CancelFunc

	for {
		// BEGIN CHANGE: return as soon as the caller's context is cancelled
		// or its deadline passes, instead of blocking in Receive.
		select {
		case <-ctx.Done():
			// context cancelled or timed out
			return nil
		default:
			// END CHANGE
			amqpMessage, err := receiver.Receive(ctx)

			if err != nil {
				return err
			}

			*messages = append(*messages, newReceivedMessage(amqpMessage))

			if len(*messages) == maxMessages {
				return nil
			}

			// BEGIN CHANGE
			// if cancel == nil {
			if _, deadlineSet := ctx.Deadline(); !deadlineSet {
				// replace the context that we're using for everything with a new one that will cancel
				// after a period of time.
				ctx, cancel = context.WithTimeout(ctx, defaultTimeAfterFirstMessage)
				defer cancel()
			}
			// END CHANGE
		}
	}
}
adeturner (Author) commented

Apologies, I'd convinced myself that there was an issue, but I was overwriting my ctx. #facepalm
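
For anyone hitting the same symptom, a hypothetical sketch (not the actual code from this report; the function and variable names are made up, and it assumes an initialized *azservicebus.Receiver as in the sketch further up) of how overwriting ctx drops the deadline and makes ReceiveMessages appear to hang:

// Hypothetical illustration only.
func receiveOnce(receiver *azservicebus.Receiver) {
	ctx, cancel := context.WithTimeout(context.Background(), time.Second)
	defer cancel()

	// The mistake: ctx is overwritten, so the one-second deadline is lost.
	ctx = context.Background()

	// With an empty queue this now blocks until messages arrive, which looks
	// like ReceiveMessages ignoring the context timeout.
	messages, err := receiver.ReceiveMessages(ctx, 10, nil)
	fmt.Println(len(messages), err)
}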
