Logstash is an open-source data collection and processing tool. It’s used to collect, parse, enrich, and transform log data from various sources, including log files, event streams, and databases.
Download Logstash from the official Elastic website and install it per the instructions.
A standard Logstash configuration consists of three sections: input, filter, and output. The input and filter sections depend on the sources of your logs.
To send logs to Gcore Managed Logging, configure Logstash with Kafka output and enable the Kafka Integration Plugin in your Logstash installation.
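The Kafka Integration Plugin ships with recent Logstash releases, but you can confirm it is present and install it if needed. The commands below are one way to do that; the path assumes a default package installation under /usr/share/logstash, so adjust it to match your setup.

    # Check whether the Kafka integration plugin is already installed
    /usr/share/logstash/bin/logstash-plugin list | grep kafka

    # Install it if it is missing
    /usr/share/logstash/bin/logstash-plugin install logstash-integration-kafka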
1. Configure Logstash with Kafka output by adding the following data to the logstash.conf file:
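The sketch below shows a minimal configuration of this kind. The input section is only a placeholder (a simple file input), since your inputs and filters depend on your log sources. The bootstrap server address, topic name, username, and password are placeholders to replace with the values from the Gcore Customer Portal, and the security settings shown (SASL_SSL with SCRAM-SHA-512) are assumptions; use whatever the Logging page specifies for your account.

    input {
      # Placeholder input; replace with whatever matches your log sources.
      file {
        path => "/var/log/*.log"
      }
    }

    output {
      kafka {
        # Placeholder connection details; take the real values from the
        # Logging page in the Gcore Customer Portal.
        bootstrap_servers => "<kafka-endpoint>:9092"
        topic_id => "<your-topic-name>"
        codec => "json"
        # The security settings below are assumptions; use the protocol and
        # SASL mechanism specified for your account.
        security_protocol => "SASL_SSL"
        sasl_mechanism => "SCRAM-SHA-512"
        sasl_jaas_config => "org.apache.kafka.common.security.scram.ScramLoginModule required username='<your-username>' password='<your-password>';"
      }
    }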
2. Customize the placeholder values:
You can find your username, password, login, and topic name on the Logging page in the Gcore Customer Portal. Learn more about logging configuration in our dedicated guide.
For more settings, check out the Kafka output plugin documentation.
3. Save the changes in the Logstash configuration file.
4. Restart Logstash. Logstash will start sending logs to Gcore Managed Logging.
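If Logstash runs as a system service (the default for package installations), restarting it and confirming that it starts cleanly might look like this; the commands assume a systemd-based host.

    sudo systemctl restart logstash
    # Follow the service log to confirm Logstash starts and connects to Kafka
    sudo journalctl -u logstash -f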