Install and configure Filebeat

Filebeat is a log shipping tool. It collects logs from a device and sends them to external storage.

In Filebeat’s settings, specify the source of the logs you need, for example, logs of a specific service, internal storage, a TCP port, or OS events, and set the export destination to Gcore’s Logging servers. After that, Filebeat works automatically: once a required log is recorded in the system, the log shipper transfers it to our storage, and the log appears in OpenSearch Dashboards.

Install

An up-to-date installation package is available on the official Filebeat webpage. Open the Filebeat website and click Download.

Configure

The Filebeat configuration file consists of two sections: inputs and output. inputs determines the source of the logs you need to collect, and output indicates the destination where they will be sent. Here’s how to set up the configuration file correctly:

  1. Open the filebeat.yml file and add the following data:
     filebeat.inputs:
     - type: log
       enabled: true
       paths:
         - /var/log/*.log

     output.kafka:
       hosts: ["laas-example.gcore.com:443"]
       topic: 'yourlogin.yourtopic'
       sasl.mechanism: SCRAM-SHA-512
       username: yourlogin
       password: yourpassword
       ssl.enabled: true

Customize the following values:

  • /var/log/*.log: File(s) path
  • laas-example.gcore.com:443: Kafka Endpoint on the Logging page
  • yourlogin.yourtopic: Your username on the Logging page and your topic name separated with a dot (.)
  • yourlogin: Your username on the Logging page
  • yourpassword: Your password
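For illustration only, here is how the output section might look once the placeholders are filled in. The login, topic, and password below are hypothetical values, and the endpoint should be replaced with the Kafka endpoint shown on your Logging page:

     output.kafka:
       hosts: ["laas-example.gcore.com:443"]   # Kafka endpoint from the Logging page
       topic: 'jdoe.nginx-access'              # hypothetical login "jdoe" and topic "nginx-access", separated with a dot
       sasl.mechanism: SCRAM-SHA-512
       username: jdoe                          # hypothetical login from the Logging page
       password: s3cretPassw0rd                # hypothetical password
       ssl.enabled: true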

You can also collect logs from other sources such as TCP, UDP, or syslog. For more details, refer to the "Configure inputs" section of the Filebeat documentation: click the log source you need and configure the inputs section following the provided guide.
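As a minimal sketch, assuming you want Filebeat to listen for log lines on a TCP port instead of reading files, the inputs section could use Filebeat's tcp input type (the host and port below are arbitrary example values). The output.kafka section stays the same regardless of which input type you choose:

     filebeat.inputs:
       - type: tcp
         host: "0.0.0.0:9000"   # example: listen on all interfaces, TCP port 9000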

  • output.kafka: Servers for log export (Kafka servers).
  • hosts: Address of server(s) where logs will be exported to.
  • topic: Topic(s) where logs will be exported to.
  • sasl.mechanism: Authentication mechanism used to verify the username and password for your log storage.
  • username: Username that helps to verify the sender.
  • password: Password that helps to verify the sender.
  • ssl.enabled: Enables TLS to encrypt your data in transit (true).
  2. Save the changes in the Filebeat configuration file.

  3. Restart Filebeat, and it will begin sending logs to Gcore Managed Logging.
