This step-by-step guide helps you collect and parse logs with your Logstash processing pipeline: define input sources, filter the data according to your configuration, and send it to Axiom for storage.
Logstash sits between your data and where you want to keep it.
Include the `org-id` header if you are using a personal token. However, it's best to use an API token so that you don't need to set the `org-id` header. Learn more about API and personal tokens.
In the `logstash.conf` file, define the source, set the rules to format your data, and set Axiom as the destination where the data is sent. The Logstash configuration works with OpenSearch, so you can use the OpenSearch output syntax to define the destination.
The Logstash pipeline has three stages: inputs, filters, and outputs.
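The three stages map directly onto the three top-level blocks of a Logstash configuration file. An empty skeleton (stage bodies omitted) looks like this:

```
input  { }   # where events come from (files, syslog, beats, ...)
filter { }   # how events are parsed and transformed
output { }   # where events are sent (Axiom, in this guide)
```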
In `logstash.conf`, configure your Logstash pipeline to collect and send logs to Axiom. The example below shows a Logstash configuration that sends data to Axiom:
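A minimal sketch, assuming a file input at a hypothetical path and placeholder values for the dataset name and API token; check Axiom's documentation for the exact ingest URL and credentials for your deployment:

```
input {
  file {
    # Hypothetical log path; adjust to your environment
    path => "/var/log/app/*.log"
    start_position => "beginning"
  }
}

output {
  opensearch {
    # Placeholder dataset name and API token
    hosts => ["https://api.axiom.co:443/v1/datasets/my-dataset/elastic"]
    user => "axiom"
    password => "xaat-your-api-token"
  }
}
```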
This filter parses Apache access logs using the grok pattern `COMBINEDAPACHELOG`. It renames `hostname` to `host`, converts the `response` field value to an integer, changes the `method` field to uppercase, and removes the `request` and `httpversion` fields.
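The filter described above can be written as a `grok` stage followed by a `mutate` stage; the field names follow the description in the text:

```
filter {
  grok {
    # Parse Apache combined-format access logs
    match => { "message" => "%{COMBINEDAPACHELOG}" }
  }
  mutate {
    rename       => { "hostname" => "host" }
    convert      => { "response" => "integer" }
    uppercase    => [ "method" ]
    remove_field => [ "request", "httpversion" ]
  }
}
```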
This filter drops any event of type `syslog` with severity `debug`.
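A minimal sketch of this drop filter, assuming each event carries `type` and `severity` fields:

```
filter {
  if [type] == "syslog" and [severity] == "debug" {
    drop { }
  }
}
```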
This filter creates a new event of type `cloned_event` that is a clone of the original event.
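A sketch of this clone filter; each entry in the `clones` setting names the `type` given to the copied event:

```
filter {
  clone {
    # Emits a copy of each event with its type set to "cloned_event"
    clones => ["cloned_event"]
  }
}
```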
This filter adds geographical information to events based on the `ip` field. Note that you may need to specify the path to the GeoIP database file in the plugin configuration, depending on your setup.
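A sketch of the geoip filter as described; the database path shown is a hypothetical example:

```
filter {
  geoip {
    # Look up geographical data for the address in the "ip" field
    source => "ip"
    # Uncomment and adjust if your setup needs an explicit database path:
    # database => "/path/to/GeoLite2-City.mmdb"
  }
}
```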