Getting Started with the Mezmo Platform

Welcome to Mezmo! This Getting Started Guide will walk you through the process of setting up your organization, building your first telemetry pipeline, monitoring telemetry and creating alerts, and trying out the search and visualization features of Mezmo Log Analysis.

Set Up Your Organization

When you created your Free Trial account, you also set up an Organization that you can add members to. This enables you to share reusable pipeline components like Shared Sources and Processor Groups with other members of your organization.

Check out an Example Pipeline

For an example of a specialized Pipeline for data optimization, check out the topic Pipeline Example: Kubernetes Telemetry Data Optimization.

Build Your First Pipeline

Pipeline Basics

A Telemetry Pipeline is built from three components:

  • Sources
  • Destinations
  • Processors

These topics will provide you with an introduction to each type of component, and a list of the component options for Mezmo Telemetry Pipelines, along with configuration instructions.

Our docs topic Build and Deploy a Mezmo Telemetry Pipeline provides an overview of the basic Pipeline construction process, while View and Sample Pipeline Data will explain how to view your pipeline data in-stream and create samples to use in testing your pipeline.

With Mezmo Flow

With Mezmo Flow guiding the way, you can set up your first log volume reduction pipeline in minutes! Set up your data source, then let Mezmo Flow profile your data and make recommendations for processors to reduce your log volume by as much as 50%.

Add a Data Source

The first step is to set up the Source of your telemetry data. Your options include using the Mezmo Agent, an OpenTelemetry Collector, or Demo Logs. You can find a complete list of Supported Telemetry Pipeline Sources in our product documentation.

As part of the Mezmo Platform onboarding flow, you can choose to set up a Pipeline specifically to reduce the log volume, and cost, associated with a Datadog Agent. During the onboarding flow process, set up your Datadog Agent, and Datadog Metrics and Datadog Logs Destinations, then follow the instructions to finish setting up your Pipeline.

Create a Profile

A Data Profile provides you with an in-depth analysis of the most common types, volume, and sources of your incoming telemetry data. When you set up a Source with Mezmo Flow, you will generate a data profile for that source.

When the data profiler completes its analysis, you’ll see charts that provide you with information about the composition of the source logs, the most common message patterns, and a breakdown of log metrics by app, host, level, and log type.
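The pattern analysis at the heart of a data profile can be illustrated with a small sketch. This is a simplified, hypothetical analogue of what a log profiler does, not Mezmo's actual implementation: variable tokens (numbers, hex IDs) are masked so that lines with the same shape collapse into one message pattern, which can then be counted.

```python
import re
from collections import Counter

def message_pattern(line: str) -> str:
    """Mask variable tokens so lines with the same shape share a pattern."""
    line = re.sub(r"0x[0-9a-fA-F]+", "<HEX>", line)
    line = re.sub(r"\d+", "<NUM>", line)
    return line

def profile(lines):
    """Count how many log lines share each message pattern."""
    return Counter(message_pattern(l) for l in lines)

logs = [
    "GET /api/users/42 200 12ms",
    "GET /api/users/7 200 9ms",
    "cache miss for key 0xdeadbeef",
]
top = profile(logs).most_common(1)[0]
print(top)  # the most common pattern and its count
```

The two request lines collapse into a single pattern, so the profile reports it as the dominant message shape, which is the kind of insight the profiler's charts surface at scale.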

Add a Data Destination

All telemetry pipelines terminate in a Destination like an observability tool or a storage location. When you set up a Pipeline using Mezmo Flow, Mezmo Log Analysis is automatically added as a Destination. You can find a complete list of supported Destinations in our product documentation.

Add Processors

Once Mezmo Flow has analyzed your data and presented you with a data profile, you can apply Processors to specific message patterns to reduce the volume of log data you’re sending to your destination.

  1. Select the Processor you want to apply to the log data from the Process Logs menu.
  2. As you select a Processor, you will see the effect it has on your overall log volume.
  3. When you’re satisfied with the results, click Apply Processors to Pipeline.
  4. Mezmo Flow will generate a visualization of your Pipeline, with your selected processors grouped into a Processor Group.
  5. You can now edit your Pipeline, add or edit the configuration of the components, or set up additional functionality like In-Stream Alerts. Just click Edit Pipeline to get started.
  6. When you’re finished working on your Pipeline, don’t forget to click Deploy to make the changes active!

You can find a complete list of the available processors, along with links to configuration instructions and usage information, on our Supported Processors page.
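To illustrate the kind of volume reduction a pattern-targeted Processor performs, here is a generic sketch (not Mezmo's implementation) of a sampler that keeps only one in every N lines matching a noisy pattern while letting everything else pass through:

```python
import re

class PatternSampler:
    """Drop all but 1-in-N log lines matching a pattern -- a generic
    sketch of a volume-reduction processor, not Mezmo's implementation."""

    def __init__(self, pattern: str, keep_one_in: int):
        self.pattern = re.compile(pattern)
        self.keep_one_in = keep_one_in
        self.seen = 0

    def process(self, line: str):
        if not self.pattern.search(line):
            return line            # non-matching lines pass through untouched
        self.seen += 1
        if self.seen % self.keep_one_in == 1:
            return line            # keep the 1st, (N+1)th, ... matching line
        return None                # drop the rest

sampler = PatternSampler(r"health[- ]check", keep_one_in=10)
lines = ["health-check ok"] * 20 + ["payment failed for order 17"]
kept = [l for l in lines if sampler.process(l) is not None]
print(len(kept))  # 3: two sampled health checks plus the payment error
```

Twenty repetitive health-check lines shrink to two, while the one line that matters survives untouched, which is the trade-off you preview in the Process Logs menu before clicking Apply Processors to Pipeline.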

If you need more information or advice on building a telemetry pipeline to meet your data management requirements, feel free to reach out to our Technical Services team!

Learn with a Tutorial

These tutorials, which include interactive demos, will show you how to build a mini-pipeline, also known as a Pipette, for specific processing functionality. Along the way you’ll learn about best practices like using the Blackhole destination for testing, using Pipeline Tap to view the changes in data as it passes along the Processor chain, and using Simulation Mode to test the end-to-end processing of your data.

View and Analyze Telemetry Data

Once you’ve set up your telemetry data pipeline with a Mezmo Log Analysis Destination, you can use the log viewing and search functionality to take a deep dive into your optimized data.

Create Alerts

One of the most important features of a telemetry data platform is its ability to notify you of critical system conditions in a timely way. With most observability tools, alerts are sent only after the data has been indexed and analyzed, which can have a significant negative impact on your ability to respond.

With the Mezmo Platform, you can set alerts not only on the volume of data being indexed and on specific views, but also on any data stream in your telemetry pipeline. In-stream alerts notify you within milliseconds of an event occurring, rather than after the data has been indexed.
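The idea behind in-stream alerting can be sketched as a rolling-window check evaluated as each event arrives. This is an illustrative analogue only; Mezmo's alert conditions are configured in the product, not written in code:

```python
from collections import deque

class StreamAlert:
    """Fire when more than `threshold` matching events arrive within
    `window_s` seconds -- evaluated per event, before any indexing."""

    def __init__(self, threshold: int, window_s: float):
        self.threshold = threshold
        self.window_s = window_s
        self.times = deque()

    def observe(self, now: float) -> bool:
        self.times.append(now)
        # discard events that have fallen out of the window
        while self.times and now - self.times[0] > self.window_s:
            self.times.popleft()
        return len(self.times) > self.threshold

alert = StreamAlert(threshold=3, window_s=60.0)
fired = [alert.observe(t) for t in (0, 10, 20, 30)]
print(fired)  # [False, False, False, True]
```

Because the condition is checked on each event in the stream, the alert fires the moment the fourth error lands, rather than whenever a downstream index next runs its query.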

Visualize Telemetry Data

With Mezmo's Log Management visualization features, you can create graphs that will enable you to track trends in your log data over time.
