
Install and configure Logstash

Logstash is an open-source data collection and processing tool. It’s used to collect, parse, enrich, and transform log data from various sources, including log files, event streams, and databases.

Install Logstash

Download Logstash from the official Elastic website and install it per the instructions.
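
For example, on Debian-based systems Logstash can be installed from Elastic's APT repository (the repository line below follows Elastic's standard instructions; adjust the 8.x version to the release you want):

# Add Elastic's signing key and APT repository, then install Logstash
wget -qO - https://artifacts.elastic.co/GPG-KEY-elasticsearch | sudo gpg --dearmor -o /usr/share/keyrings/elastic-keyring.gpg
echo "deb [signed-by=/usr/share/keyrings/elastic-keyring.gpg] https://artifacts.elastic.co/packages/8.x/apt stable main" | sudo tee /etc/apt/sources.list.d/elastic-8.x.list
sudo apt-get update && sudo apt-get install logstash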

Configure Logstash

A standard Logstash configuration consists of three sections: input, filter, and output. The input and filter sections depend on the sources of logs.
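
For instance, a minimal input section that tails a local log file could look like the sketch below (the path is just an example; point it at your own log files):

input {
  file {
    path => "/var/log/app/*.log"      # example path, replace with your own
    start_position => "beginning"     # read existing content, not only new lines
  }
}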

To send logs to Gcore Managed Logging, configure Logstash with Kafka output and enable the Kafka Integration Plugin in your Logstash installation.
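
Recent Logstash releases bundle the Kafka Integration Plugin by default; if it's missing from your installation, it can usually be added with the bundled plugin manager:

bin/logstash-plugin install logstash-integration-kafka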

1. Configure Logstash with Kafka output by adding the following data to the logstash.conf file:

output {
  kafka {
    codec => json
    topic_id => "yourlogin.yourtopic"
    bootstrap_servers => "endpoint"
    sasl_mechanism => "SCRAM-SHA-512"
    security_protocol => "SASL_SSL"
    sasl_jaas_config => "org.apache.kafka.common.security.scram.ScramLoginModule required username='yourlogin' password='yourpassword';"
    key_serializer => "org.apache.kafka.common.serialization.StringSerializer"
    value_serializer => "org.apache.kafka.common.serialization.StringSerializer"
  }
}

2. Replace the placeholder values:

  • yourlogin.yourtopic: Your username and topic name, separated with a dot (.)
  • endpoint: Kafka endpoint
  • yourlogin: Your login
  • yourpassword: Your password

You can find your login, password, and topic name in the Gcore Customer Portal on the Logging page. Learn more about logging configuration in our dedicated guide.

The settings used in the example have the following meanings; for more options, check out the Kafka output plugin documentation. A complete sample configuration follows the list.

  • topic_id: ID of the Kafka topic to which Logstash publishes messages
  • bootstrap_servers: A comma-separated list of Kafka broker addresses
  • sasl_mechanism: SASL mechanism used to authenticate your username and password against the logs storage
  • security_protocol: Security protocol used to encrypt data in transit
  • sasl_jaas_config: JAAS configuration required for SASL authentication
  • key_serializer: Serialization format used to convert message keys sent to Kafka
  • value_serializer: Serialization format applied to message values before they are sent to Kafka
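
Putting it all together, a complete logstash.conf that tails a local file and ships it to Gcore Managed Logging could look like this sketch (the input path is an example, and yourlogin, yourtopic, endpoint, and yourpassword must be replaced with your own values from the Customer Portal):

input {
  file {
    path => "/var/log/app/*.log"     # example source, replace with your own input
  }
}

output {
  kafka {
    codec => json
    topic_id => "yourlogin.yourtopic"      # username and topic name, separated with a dot
    bootstrap_servers => "endpoint"        # Kafka endpoint from the Logging page
    sasl_mechanism => "SCRAM-SHA-512"
    security_protocol => "SASL_SSL"
    sasl_jaas_config => "org.apache.kafka.common.security.scram.ScramLoginModule required username='yourlogin' password='yourpassword';"
    key_serializer => "org.apache.kafka.common.serialization.StringSerializer"
    value_serializer => "org.apache.kafka.common.serialization.StringSerializer"
  }
}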

3. Save the changes in the Logstash configuration file.

4. Restart Logstash. It will start sending logs to Gcore Managed Logging.
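
If Logstash was installed from a package and runs as a system service, the restart typically looks like this (the service name assumes a standard package installation):

sudo systemctl restart logstash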
