How to use: configure Logging and view your logs

1. Open the Logging tab and click "Set up Logging".


The Logging page will open; complete Steps 2, 3, 4, and 7 on it. The screenshot below highlights the sections and buttons you need to interact with at these steps.


2. Select the region: the city of the data center where the storage for your logs will be deployed.

Logging is not available in all regions. Customers usually choose a region close to their equipment so that logs reach the storage as quickly as possible.

3. Click "Create topic", give it a name and confirm.  

A topic is a logical storage unit where logs are collected. You can compare it to a folder on your PC: logs are collected into a topic just as you put different files into a folder.

4. Click "Generate credentials". 

You will receive a login and password for the storage. Your equipment will use them to connect to the topic for log export. Save the credentials to your PC immediately: once the window is closed, you cannot view them again.

If you forget your login and password, click "Generate credentials" again: a new pair will be generated, and the previous one will become invalid.

If you use Logging in multiple regions (the region is selected at Step 2), a separate storage is created in each of them, and you need to generate separate credentials to connect to each one. Your login is the same for all storages, while the passwords differ. For example, if you use Logging in six regions, you will have one login and six passwords.

5. Install a log shipping tool on the equipment from which you want to collect logs.

For example, you can install Fluent Bit or Filebeat.
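As a rough sketch only: on a Linux host, Fluent Bit is typically installed with the vendor's install script or a distribution package. The exact commands depend on your operating system; the ones below assume a Debian/Ubuntu system with the Fluent Bit repository already available — check the Fluent Bit installation documentation for your platform.

```shell
# Option 1: official Fluent Bit install script (assumes curl is available)
curl https://raw.githubusercontent.com/fluent/fluent-bit/master/install.sh | sh

# Option 2: on Debian/Ubuntu, via the package manager
sudo apt-get update && sudo apt-get install -y fluent-bit

# Enable and start the service so it survives reboots
sudo systemctl enable --now fluent-bit
```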

6. In the log shipper settings, specify which logs you need to collect and configure export to our Kafka servers: specify the topic name, the Kafka Endpoint, and the credentials generated at Step 4.

If you use Fluent Bit or Filebeat, you can configure it with our instructions: Configure Fluent Bit, Configure Filebeat.

Once the log shipper is set up, your device's logs will be sent to your Kafka topic. The Kafka servers will then automatically transfer them to our OpenSearch servers.
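To illustrate Step 6, a minimal Fluent Bit output section might look like the sketch below. The broker address, topic name, security protocol, and SASL mechanism shown here are assumptions for illustration, not values from our service: substitute the Kafka Endpoint, the topic created at Step 3, and the credentials generated at Step 4 as shown on your Logging page.

```ini
[OUTPUT]
    Name                       kafka
    Match                      *
    # Kafka Endpoint from the Logging page (placeholder value)
    Brokers                    kafka.example.com:9092
    # Topic created at Step 3, prefixed with your namespace (placeholder)
    Topics                     namespace1.exampletopic
    # Credentials generated at Step 4; the protocol and SASL
    # mechanism below are assumptions — use the values our service specifies
    rdkafka.security.protocol  SASL_SSL
    rdkafka.sasl.mechanism     SCRAM-SHA-512
    rdkafka.sasl.username      your-login
    rdkafka.sasl.password      your-password
```

Fluent Bit passes `rdkafka.*` keys through to the underlying librdkafka client, which is how authentication options are supplied to the kafka output plugin.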

7. Go to the OpenSearch Dashboards URL specified on the Logging page.  

8. Click "Create an index pattern" at the bottom of the screen. 

On the OpenSearch server, your logs are linked to your personal index. The index has the same name as your topic. At this step, you create an index pattern: a filter that tells OpenSearch Dashboards which index's logs to display.


9. In the index pattern field, enter your namespace and the name of the topic you created at Step 3, followed by an asterisk. For example, if your namespace is namespace1 and your topic is named exampletopic, enter namespace1.exampletopic* in the field. Then click "Next step".


10. In the dropdown list, select the field that indicates date and time in your logs. Then click "Create index pattern".

The index pattern is created and saved, and your logs are available in OpenSearch Dashboards. The next time you log in to OpenSearch Dashboards, you will not need to configure it again.


11. The setup is finished. In the main menu, select "Discover" to see your logs.

