
Figure out how to collect any Azure logs using the generic event hub integration #74

zmoog opened this issue Jan 22, 2024 · 4 comments
zmoog commented Jan 22, 2024

I want to collect Azure Application Insights logs using the Elastic Agent.

Unfortunately, at the time of this writing, there isn't a specialized integration for these logs. However, we can leverage the generic Event Hub integration to collect Azure Application Insights logs, as well as any other log exported using a diagnostic setting.

zmoog commented Jan 22, 2024

Prerequisites

Application

Pick an Application Insights resource whose logs we will export using a diagnostic setting.

For this test, I will use an Application Insights app (or component) named return-of-the-jedi:

[Screenshot: CleanShot 2024-01-23 at 00 29 55@2x]

Event Hub

We need a new event hub to collect all the logs for this application.

  • Create or use an existing Event Hub namespace
  • Create a new event hub named "insightslogs"

[Screenshot: CleanShot 2024-01-23 at 00 30 30@2x]
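The same event hub setup can be scripted with the Azure CLI. This is a sketch only: it assumes an active `az login` session, and the resource group, namespace name, and location below are placeholders.

```shell
# Sketch: resource group, namespace name, and location are placeholders.
# Requires the Azure CLI and an active `az login` session.
az eventhubs namespace create \
  --resource-group my-resource-group \
  --name my-eventhub-namespace \
  --location westeurope

# Create the "insightslogs" event hub inside the namespace.
az eventhubs eventhub create \
  --resource-group my-resource-group \
  --namespace-name my-eventhub-namespace \
  --name insightslogs
```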

zmoog commented Jan 22, 2024

Configuration

Set up the Diagnostic Settings

Using the application return-of-the-jedi:

  • Visit Application > Monitoring > Diagnostic Settings and click on Add diagnostic setting.
  • Set a name
  • Select all the categories you're interested in
  • On Destination details select Stream to an event hub
  • Select the namespace and event hub name from the drop-down lists
  • Click Save

[Screenshot: CleanShot 2024-01-23 at 00 23 56@2x]
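The same diagnostic setting can be created with the Azure CLI. A sketch, assuming the Application Insights component and event hub already exist; the subscription ID, resource group, namespace, authorization rule, and log categories below are placeholders to adapt:

```shell
# Sketch: IDs, names, and categories below are placeholders.
# --resource is the full ID of the Application Insights component;
# --event-hub-rule is the ID of an Event Hub namespace authorization rule.
az monitor diagnostic-settings create \
  --name export-to-insightslogs \
  --resource "/subscriptions/<subscription-id>/resourceGroups/my-resource-group/providers/microsoft.insights/components/return-of-the-jedi" \
  --event-hub insightslogs \
  --event-hub-rule "/subscriptions/<subscription-id>/resourceGroups/my-resource-group/providers/Microsoft.EventHub/namespaces/my-eventhub-namespace/authorizationRules/RootManageSharedAccessKey" \
  --logs '[{"category": "AppRequests", "enabled": true}, {"category": "AppTraces", "enabled": true}]'
```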

Generate some logs

Use the application connected to the Application Insights resource to generate some test logs. In this example, return-of-the-jedi is connected to an Azure Function with an HTTP endpoint.

I am sending a few requests to the HTTP endpoint, and here are a few logs:

[Screenshot: CleanShot 2024-01-23 at 00 35 50@2x]
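The test traffic can be generated with a simple loop; the function URL below is a placeholder for the real HTTP endpoint:

```shell
# Placeholder URL: replace with the function's real HTTP endpoint.
for i in $(seq 1 10); do
  curl -s "https://<function-app>.azurewebsites.net/api/hello?name=test-$i" > /dev/null
done
```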

Check the Event Hub for exported logs

If I go back to the event hub "insightslogs", the charts start reporting some data:

[Screenshot: CleanShot 2024-01-23 at 01 21 16@2x]

zmoog commented Jan 22, 2024

Collect the logs

Set up the agent

  • Create a new "Application Insights logs" agent policy for this test
  • Install the generic Azure Event Hub input integration

[Screenshot: CleanShot 2024-01-23 at 00 45 17@2x]

Set up the integration using the "insightslogs" event hub and the other options. See https://docs.elastic.co/integrations/azure#setup to learn more.

[Screenshot: CleanShot 2024-01-23 at 00 48 12@2x]

In this first iteration:

  • Leave "Parse azure message" off
  • Turn "Preserve original event" on

Explore the logs

Assign the agent policy to an agent and start exploring the logs.

Open Analytics > Discover and then filter documents using data_stream.dataset : "azure.eventhub":

[Screenshot: CleanShot 2024-01-23 at 00 59 34@2x]

zmoog commented Jan 23, 2024

Explore the logs

Basic parsing

With the current configuration, the integration collects the Application Insights logs as a string in the message field:

[Screenshot: CleanShot 2024-01-23 at 01 03 23@2x]
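To make this concrete, here is a minimal Python sketch. The record below is made up and only shaped like an exported Application Insights event; real field names vary by service and log category. It shows the difference between the raw message string and the parsed object:

```python
import json

# Made-up sample shaped like an exported Application Insights record;
# real field names and categories vary by service and log category.
message = '{"category": "AppRequests", "operationName": "GET /api/hello", "properties": {"DurationMs": 12.5}}'

# With "Parse azure message" off, this whole string is stored verbatim
# in the document's `message` field. Parsing it yields a nested object:
event = json.loads(message)
print(event["category"])                  # AppRequests
print(event["properties"]["DurationMs"])  # 12.5
```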

Next Steps

At this point, we have two options:

  • Enable the "Parse azure message" option to turn the content of the message field into an object, leveraging dynamic mapping.
  • Add a custom pipeline and mapping to fine-tune the documents.

Enable the "Parse azure message" option

This is a quick option to start using the logs. Go back to the agent policy and flip the "Parse azure message" switch:

[Screenshot: CleanShot 2024-01-23 at 01 06 11@2x]

Here is an example document with parsing enabled:

[Screenshot: CleanShot 2024-01-23 at 01 09 10@2x]

Add a custom pipeline and mapping

The document parsing is great, but there are downsides:

  1. The automatic parsing turns the JSON log into an object; field names can vary a lot, depending on the conventions used by the Azure team responsible for the service.
  2. Conflicts may occur; for example, two log categories may use the same field name with different types.
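As a starting point for the custom-pipeline route, an Elasticsearch ingest pipeline with a `json` processor can do the parsing under our control. The pipeline name and target field below are illustrative, not a convention from the integration:

```
PUT _ingest/pipeline/azure-application-insights-logs
{
  "description": "Parse Application Insights records from the message field",
  "processors": [
    {
      "json": {
        "field": "message",
        "target_field": "azure.app_insights"
      }
    }
  ]
}
```

Parsing into a dedicated target field keeps the mappings under our control, so conflicting categories can be handled explicitly (renamed or re-typed) before indexing.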

Conclusions

"Parse azure message" is a great option, but consider building custom pipelines and mappings to take complete control.
