
Forwarders

Overview

Coralogix forwarders let you send your parsed and enriched data from Coralogix to an external Kafka cluster, integrating it with your existing data pipelines and systems. Forwarders let you tap into specialized analytics tools, comply with data retention policies, and expand your observability setup. Moreover, because AWS MSK runs open-source versions of Apache Kafka, you avoid lock-in to a single observability platform. Currently, Coralogix supports only AWS MSK as a target; more streaming destinations will be added soon.

This tutorial details how to configure Coralogix to send your telemetry data to AWS MSK. For instructions on AWS MSK deployment and adapting it to Coralogix requirements, refer to this guide.

Prerequisites

Configuration

STEP 1. Navigate to Data Flow > Forwarders.

STEP 2. Select Add a forwarder.

STEP 3. Configure the forwarder parameters:

  • Name - A meaningful forwarder name.

  • AWS MSK URLs - URLs of the public endpoints of the AWS MSK instance. The MSK instance must be in the same region as your Coralogix platform.

  • Topic Name - The name of the Kafka topic.
    Note: Data streaming might fail if the Kafka topic does not exist on the cluster and automatic topic creation is disabled (auto.create.topics.enable = false). To resolve this, create the topic manually (see the example command after this list) or enable automatic topic creation (auto.create.topics.enable = true).

  • Compression - Data compression mode. Currently, this is always set to Gzip.

  • DPXL - DataPrime expression(s) that define which data to send to AWS MSK.
    If a query is not included, all events will be forwarded.
    This guide explains how to use DataPrime Expression Language (DPXL).
    For example, the following expression forwards logs whose application name starts with dev- and whose region ID is us-east-1 or us-west-2.
    <v1> $l.applicationName.startsWith('dev-') && region_id:string.in('us-east-1', 'us-west-2')
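
If the topic needs to be created manually (see the Topic Name note above), you can do so with the standard Kafka CLI. The command below is a minimal sketch: the broker list, topic name, and client.properties file are placeholders, and the partition and replication settings should be adjusted to your cluster.

 ./kafka-topics.sh --bootstrap-server <kafka-brokers> --command-config client.properties --create --topic <topic> --partitions 3 --replication-factor 3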

STEP 4. Turn on the Active switch to activate the forwarder.

STEP 5. If you want to test your connection before saving the forwarder:

  • Hover over the Example Log icon to view a sample log to be sent.

  • Click Send sample logs to forward the sample log.

  • Run the kafka-console-consumer command to verify that the data has arrived at the Kafka topic.

 ./kafka-console-consumer.sh --bootstrap-server <kafka-brokers> --consumer.config client.properties --topic <topic>
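
The client.properties file referenced above holds the connection security settings for your cluster. Its exact contents depend on how the public MSK endpoints authenticate clients; the following is a minimal sketch assuming SASL/SCRAM over TLS, with placeholder credentials:

 security.protocol=SASL_SSL
 sasl.mechanism=SCRAM-SHA-512
 sasl.jaas.config=org.apache.kafka.common.security.scram.ScramLoginModule required username="<msk-username>" password="<msk-password>";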

The test message should appear in the topic, similar to the following:

{"metadata":{"timestamp":"2024-08-28T11:02:01.784000","severity":"Info","logid":"2d7a18b8-ddd3-4429-aab9-d510f341c3a4"},"labels":{"applicationname":"sample","subsystemname":"sample-logs"},"data":{"msg":"test message from 2024-08-28T11:02:01.784Z"}}

STEP 6. If the test log has been received successfully, save the forwarder to start sending your data to the AWS MSK.

Additional resources

  • Documentation: Deployment of customized AWS MSK
  • Documentation: DataPrime Expression Language (DPXL)
  • External: Getting started using Amazon MSK

Support

Need help?

Our world-class customer success team is available 24/7 to walk you through your setup and answer any questions that may come up.

Feel free to reach out to us via our in-app chat or by sending an email to [email protected].