
feat(kafka-logger): support sasl conf #7782

Closed · wants to merge 33 commits

Conversation

@biubiue (Contributor) commented Aug 24, 2022

Description

Fixes # (issue)

Checklist

  • I have explained the need for this PR and the problem it solves
  • I have explained the changes or the new features added to this PR
  • I have added tests corresponding to this change
  • I have updated the documentation to reflect this change
  • I have verified that this change is backward compatible (If not, please discuss on the APISIX mailing list first)

--- config
location /t {
content_by_lua_block {
local producer = require "resty.kafka.producer"
Member

We should test the kafka-logger plugin directly like other tests, instead of using the producer outside of the plugin.

Contributor Author

> We should test the kafka-logger plugin directly like other tests, instead of using the producer outside of the plugin.

Em... I may not have understood https://github.com/apache/apisix/pull/7782#discussion_r956718372 .
OK, I will add a test case like the others.

Comment on lines 644 to 649
=== TEST 21: access
--- request
GET /hello
--- response_body
hello world
--- wait: 2
Member

We can inject producer.new like this:

=== TEST 7: verify request
--- extra_init_by_lua
local cls = require("apisix.plugins.tencent-cloud-cls.cls-sdk")
cls.send_to_cls = function(self, logs)
if (#logs ~= 1) then
ngx.log(ngx.ERR, "unexpected logs length: ", #logs)
return
end
return true
end
--- request
GET /opentracing
--- response_body
opentracing
--- error_log
Batch Processor[tencent-cloud-cls] successfully processed the entries
--- wait: 0.5

Contributor Author

> we can inject the producer.new like: […]

Em, like this?
=== TEST 20: create producer with sasl_config
--- extra_init_by_lua
    local producer = require("resty.kafka.producer")
    local klogger = require("apisix.plugins.kafka-logger")
    producer.new = function(klogger)
        if not klogger.sasl_config then
            ngx.say("create producer without sasl_config")
            return producer
        end
        producer.sasl_config = klogger.sasl_config
        ngx.say("create producer with sasl_config")
        return producer
    end
--- request
GET /hello
--- response_body
hello world
--- wait: 2
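The injection idea above can be reduced to a runnable sketch. This is a hypothetical, stand-alone illustration (plain Lua, no ngx/Test::Nginx): `producer` here is a stand-in table for `require("resty.kafka.producer")`, and the `(self, broker_list, producer_config)` signature is an assumption based on lua-resty-kafka's usual calling convention, not the final APISIX test.

```lua
-- stand-in for the real resty.kafka.producer module
local producer = {}

-- stand-in for the real constructor: remembers the config it was given
function producer.new(self, broker_list, producer_config)
    return { config = producer_config }
end

-- inject: wrap new() so the test can observe whether sasl_config arrived
local original_new = producer.new
producer.new = function(self, broker_list, producer_config)
    if producer_config and producer_config.sasl_config then
        print("create producer with sasl_config")
    else
        print("create producer without sasl_config")
    end
    return original_new(self, broker_list, producer_config)
end

-- simulate what the plugin would do when sasl_config is set
local p = producer:new(
    { { host = "127.0.0.1", port = 9092 } },
    { sasl_config = { mechanism = "PLAIN", user = "admin", password = "secret" } }
)
assert(p.config.sasl_config.mechanism == "PLAIN")
```

Because the wrapper delegates to the original constructor, the rest of the (stubbed) plugin flow keeps working while the test asserts on the side effect.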

properties = {
mechanism = { type = "string", description = "mechanism" },
password = { type = "string", description = "password" },
user = { type = "string", description = "user" },
Contributor

These fields should be required if the sasl config exists?

Contributor Author

> These fields should be required if the sasl config exists?

I modified these fields and marked the required ones.

@@ -38,6 +38,11 @@ It might take some time to receive the log data. It will be automatically sent a
| Name | Type | Required | Default | Valid values | Description |
| ---- | ---- | -------- | ------- | ------------ | ----------- |
| broker_list | object | True | | | List of Kafka brokers (nodes). |
| sasl_config | object | False | | | Kafka sasl conf. |
| sasl_config.mechanism | string | False | | | Kafka mechanism. |
Contributor

Add the default value here.

Contributor Author

> Add the default value here.

OK.

type = "object",
description = "sasl config",
properties = {
mechanism = { type = "string", description = "mechanism", default = "PLAIN" },
Contributor

The mechanism should be restricted to only use a few enumerations like PLAIN, SCRAM and so on.

Contributor Author

> The mechanism should be restricted to only use a few enumerations like PLAIN, SCRAM and so on.

OK, I will change it.
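Putting the three review points together (a default value, an enum restriction on the mechanism, and required credential fields), the sasl_config schema could look roughly like the fragment below. This is a hedged sketch assembled from the diff fragments quoted in this thread, not the merged code; the exact enum values and required list may differ in the final plugin.

```lua
sasl_config = {
    type = "object",
    description = "sasl config",
    properties = {
        mechanism = {
            type = "string",
            description = "mechanism",
            default = "PLAIN",
            -- restrict to supported mechanisms; SCRAM support in
            -- lua-resty-kafka was still marked "to do" at review time
            enum = { "PLAIN" },
        },
        user = { type = "string", description = "user" },
        password = { type = "string", description = "password" },
    },
    -- user/password become mandatory only when sasl_config itself is given
    required = { "user", "password" },
},
```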

@biubiue (Contributor Author) commented Sep 8, 2022

> The mechanism should be restricted to only use a few enumerations like PLAIN, SCRAM and so on.

Just now I reviewed lua-resty-kafka: it only has local MECHANISM_SCRAMSHA256 = "SCRAM-SHA-256" -- to do, so SCRAM-SHA-256 may not work yet.

Contributor

OK, SCRAM-SHA-256 is copied from Confluent docs. No worries at all.

@spacewander (Member)
Please make the CI pass, thanks!

@biubiue (Contributor Author) commented Sep 9, 2022

> Please make the CI pass, thanks!

Yeah, I also want to make the CI pass, but writing the test case is not easy for me. I will do it soon.

            ngx.log(ngx.ERR, "unexpected broker_list length: ", #broker_list)
            return nil
        end
        return producer
Member

The producer should not be returned here; you can see that the original function returns a table.

Contributor Author

> The producer should not be returned here, you can see that the original returned is a table.

Hi, when injecting producer.new, it also depends on client.new. That means I will need to inject client.new too.
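One way around the client.new dependency (a hypothetical sketch, not the final test): replace producer.new wholesale with a stub that returns a plain table exposing only the method the plugin calls, so no real client is ever constructed. The `send` method name follows lua-resty-kafka's producer API; the stand-in module table below is illustrative.

```lua
-- stand-in for require("resty.kafka.producer"); in the real test this
-- would be the actual module loaded via extra_init_by_lua
local producer = {}

producer.new = function(self, broker_list, producer_config)
    -- return a plain table (not the module itself), mimicking the shape
    -- the plugin expects; client.new is never reached
    return {
        send = function(_, topic, key, message)
            return true  -- pretend delivery succeeded
        end,
    }
end

local p = producer:new({ { host = "127.0.0.1", port = 9092 } }, {})
assert(p:send("test-topic", nil, "hello") == true)
print("stubbed send ok")
```

Since the stub never touches the network, the test stays deterministic and needs no running Kafka broker.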

4 participants