
Install and configure Fluentd

Fluentd is a data collection and processing tool that supports multiple input sources, data formats, and output destinations. It lets you stream and consolidate logs from different applications into one unified format and send them to Gcore Managed Logging.

Install Fluentd

Download Fluentd from the official Fluentd website and install it using your preferred method.
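For example, on a system with Ruby installed, Fluentd can be set up as a gem and a starter configuration generated. This is only one of the supported install methods; package-based installs (fluent-package/td-agent) use different paths and service names.

# Install the Fluentd gem (requires Ruby)
gem install fluentd

# Generate a sample configuration in ./fluent/fluent.conf
fluentd --setup ./fluent

# Run Fluentd with the generated configuration to verify the installation
fluentd -c ./fluent/fluent.conf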

Configure Fluentd

A standard Fluentd configuration consists of three sections: input, filter, and output. The input and filter sections depend on the sources of logs.
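For instance, a minimal input section that accepts records from other Fluentd instances or from Docker's fluentd logging driver uses the built-in forward plugin. The port shown is the conventional default, not a Gcore requirement.

# Input: listen for records sent over the forward protocol
<source>
  @type forward
  port 24224
  bind 0.0.0.0
</source>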

To send logs to Gcore Managed Logging, configure the Kafka Integration Plugin as the output in your Fluentd installation.
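The kafka2 output used below is provided by the fluent-plugin-kafka gem. It is bundled with the td-agent/fluent-package distributions, but with a plain gem install you may need to add it yourself, for example:

# Install the Kafka output plugin if it is not already bundled with your Fluentd
fluent-gem install fluent-plugin-kafka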

1. Configure Fluentd with Kafka output by adding the following data to the fluent.conf file:

<match pattern>
  @type kafka2

  # list of seed brokers
  brokers laas-example.gcore.com:443
  username username
  password password
  scram_mechanism "sha512"
  default_topic namespace_username.topic_name
  sasl_over_ssl true

  # buffer settings
  <buffer>
    @type memory
    ## OPTIONAL PARAMS
    timekey 30s
    timekey_wait 10s
    timekey_use_utc true
    chunk_limit_size 4MB
    total_limit_size 64MB
    chunk_full_threshold 0.9
    queued_chunks_limit_size 12
    flush_thread_count 12
  </buffer>

  # data type settings
  <format>
    @type json
  </format>
</match>
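The memory buffer above keeps chunks in RAM, so unsent data is lost if Fluentd stops. If you need buffering that survives restarts, Fluentd's built-in file buffer can be used instead; the path below is an example location, adjust it to your setup.

  # Alternative buffer: persist chunks on disk so they survive a Fluentd restart
  <buffer>
    @type file
    path /var/log/fluent/kafka-buffer
    chunk_limit_size 4MB
    total_limit_size 64MB
    flush_thread_count 12
  </buffer>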

2. Customize the following values:

  • brokers: Kafka servers to which Fluentd logs will be sent
  • username: Your username
  • password: Your password
  • default_topic: Kafka topic to which logs will be exported; your username and topic name, separated with a dot (.)

You can find your username, password, and topic name on the Logging page in the Gcore Customer Portal. Learn more about Managed Logging configuration in our dedicated guide.

The remaining settings in the example control authentication and transport security:

  • scram_mechanism: Authentication mechanism used to verify the username and password for your logs storage
  • sasl_over_ssl: When set to "true", SASL is used for authentication and SSL for secure communication

To explore more settings, check out the official Fluentd documentation.
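For illustration only, with hypothetical credentials the customized lines might look like this; replace every value with the ones shown on your Logging page:

  # Hypothetical example values — do not copy as-is
  brokers laas-example.gcore.com:443
  username john_doe
  password s3cr3t-example
  default_topic john_doe.my-logs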

3. Save the changes in the Fluentd configuration file.

4. Restart Fluentd. It will start sending logs to Gcore Managed Logging.
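The exact command depends on how Fluentd was installed and is managed; on systemd-based hosts the service is typically named fluentd or td-agent, for example:

# Restart the Fluentd service so it picks up the new configuration
sudo systemctl restart fluentd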
