
Fluentbit not parsing my json logs as needed and sending to Opensearch #815

homerakshay opened this issue May 10, 2024 · 0 comments

homerakshay commented May 10, 2024

Hi there,

Fluent Bit version: latest

Part of my Fluent Bit config is below:
```yaml
data:
  fluent-bit.conf: |
    [SERVICE]
        Flush         5
        Log_Level     info
        Daemon        off
        Parsers_File  parsers.conf
        HTTP_Server   On
        HTTP_Listen   0.0.0.0
        HTTP_Port     2020
    @INCLUDE input-kubernetes.conf
    @INCLUDE filter-kubernetes.conf
    @INCLUDE output-elasticsearch.conf
  input-kubernetes.conf: |
    [INPUT]
        Name              tail
        Tag               kube.*
        Path              /var/log/containers/*.log
        multiline.parser  cri
  filter-kubernetes.conf: |
    [FILTER]
        Name             kubernetes
        Match            kube.*
        Kube_URL         https://kubernetes.default.svc:443
        Kube_CA_File     /var/run/secrets/kubernetes.io/serviceaccount/ca.crt
        Kube_Token_File  /var/run/secrets/kubernetes.io/serviceaccount/token
        Kube_Tag_Prefix  kube.var.log.containers.
        Merge_Log        On
        Merge_Log_Key    log_processed
    [FILTER]
        Name      parser
        Match     *
        Key_Name  message
        Parser    json
  output-elasticsearch.conf: |
    [OUTPUT]
        Name                es
        Match               kube.*
        Host                xxxxxxxxxxx
        Port                443
        TLS                 On
        AWS_Auth            On
        AWS_Region          af-south-1
        Replace_Dots        On
        Retry_Limit         false
        Logstash_Format     On
        Logstash_Prefix     nonprod
        Suppress_Type_Name  On
  parsers.conf: |
    [PARSER]
        Name         cri
        Format       regex
        Regex        ^(?<time>[^ ]+) (?<stream>stdout|stderr) (?<logtag>[^ ]*) (?<message>.*)$
        Time_Key     time
        Time_Format  %Y-%m-%dT%H:%M:%S.%L%z
    [PARSER]
        Name         json
        Format       json
        Time_Key     time
        Time_Format  %Y-%m-%dT%H:%M:%S.%L%z
```
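For context, a containerd/CRI log line carries a time/stream/logtag prefix ahead of the application payload, and the `cri` parser's regex above is what strips that prefix into separate fields. A quick sketch in Python of what the regex extracts (the sample log line is hypothetical):

```python
import re

# The same regex as the cri parser above, in Python named-group syntax.
CRI_RE = re.compile(
    r'^(?P<time>[^ ]+) (?P<stream>stdout|stderr) (?P<logtag>[^ ]*) (?P<message>.*)$'
)

# Hypothetical containerd log line: CRI prefix + the application's JSON payload.
line = '2024-05-10T12:00:00.123456789Z stdout F {"id":"xxxx","amount":33}'

fields = CRI_RE.match(line).groupdict()
# fields["message"] now holds only the JSON payload; the CRI prefix is gone.
print(fields["stream"])   # stdout
print(fields["message"])  # {"id":"xxxx","amount":33}
```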

Basically, I am trying to remove the prefix entries appended by containerd before I parse my JSON. I would also like my JSON object keys to become root-level fields in Fluent Bit. However, this is not happening: the message/log field contains the entire JSON as a string, and nothing appears at the root level as needed for my field filter.

I tried using the nest and modify filters, but with no luck. Could anyone share how they got this working with EKS 1.29 and JSON app logs? It would be much appreciated.
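To make the intended outcome concrete, here is a hedged Python sketch of what the `parser` filter (or `Merge_Log`) is expected to do with such a record: decode the JSON string and lift its keys to the top level. The record shown is hypothetical, shaped like what the cri parser emits:

```python
import json

# Hypothetical record as the cri parser leaves it: the whole JSON
# payload is still one string under the "message" key.
record = {
    "stream": "stdout",
    "message": '{"id":"xxxx","amount":33,"currency":"KES"}',
}

# The desired behavior: decode the string and merge its keys
# into the root of the record.
payload = json.loads(record.pop("message"))
record.update(payload)

print(record["amount"])    # 33
print(record["currency"])  # KES
```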

My JSON logs look like the following, for example:

```json
{"id":"xxxx","amount":33,"paid_amount":0,"gateway_type":"cell","currency":"KES","country":"KE","metadata":{},"description":"connectorhub payment test34","statement_descriptor":"Cellulant Payment","capture":true,"card_details":null,"initiation_type":null,"visual_codes":[],"textual_codes":[],"customer_details":{"correlation_id":"","gc_type":"cell","gc_original_error_code":"","gc_status":null,"_gc_raw_data":[],"gc_error_code":null,"first_name":"Gunnar","last_name":"Stewart","email":"testing@gmiaL.com","phone_number":"43434343","rapyd_token":"xxxx"},"instructions":null,"remitter_information":null,"billing_address":{"name":"Gunnar Stewart","line1":"P O Box 76658 00508","line2":"","city":"sas","phone_number":"","state":"","country":"KE","zip_code":""},"payment_method_details":{"fields":{},"pmt_method_config":{"gw_pmt_id":{"prod":"22","test":"11"}},"payment_method":"sdsdsd","category":"bank_transfer","next_action":"initial"},"merchant_identifier":{"env":"test","merchant_service_code":"2323232","submid_mcc":null,"organization_details":{"org_id":"sasasas","alias":"sas cell asa","organization_entity":"asa","organization_address":{"city":null,"zip_code":null},"signup_country_iso_alpha_2":"IL","mcc":null}},"complete_payment_url":null,"error_payment_url":null,"operation_id":null,"reference_id":null,"payment":"saas","created_at":null,"paid_at":null,"error_code":"INTERNAL_SERVER_ERROR","error_message":null,"expiration":44444,"reconciliation":null,"gateway_name":"sas","gateway":null,"redirect_url":null,"status":"ERR"}
```
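One detail worth noting when building field filters against this payload: once decoded, the top-level keys become root fields, but nested objects such as `customer_details` stay nested maps rather than flattening to the root. A quick check on a trimmed, hypothetical subset of the sample:

```python
import json

# Trimmed subset of the sample payload above (hypothetical excerpt).
sample = ('{"id":"xxxx","amount":33,"currency":"KES",'
          '"customer_details":{"first_name":"Gunnar","last_name":"Stewart"}}')

doc = json.loads(sample)

# Top-level keys are directly accessible once decoded...
print(doc["amount"])  # 33
# ...but nested objects remain nested maps, not root-level fields.
print(doc["customer_details"]["first_name"])  # Gunnar
```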
