February 24, 2022 By Sumo Logic

How to monitor ActiveMQ logs and metrics

ActiveMQ is a message-oriented middleware, which means that it is a piece of software that handles messages across applications. It acts as a broker that can help facilitate asynchronous communication patterns like publish-subscribe and message queues. The main goal of those servers is to create a scalable and reliable message bus that different components can use to communicate with each other.

ActiveMQ competes with similar brokers like RabbitMQ and Kafka, and although it’s not as widely adopted as those tools, it is still common in the enterprise world.

In this post, we will explain how to monitor ActiveMQ logs and metrics using Sumo Logic Cloud SIEM. Then, we’ll walk you through the process of installing and configuring a sample ActiveMQ server using Docker. Finally, we will show you how to install both Hosted and Installed Sumo Logic Collectors, then configure monitoring and alerting for them.

Let’s get started.

Setting up an ActiveMQ cluster

In this section, we will show you how to install and set up an ActiveMQ cluster using Docker.

ActiveMQ is a classic Java-based distributed system that implements JMS. Before you set up monitoring on it, you must configure it properly with the correct settings and permission levels. We will deploy an ActiveMQ cluster using Docker in this tutorial, but note that the process of deploying it in a dedicated virtual machine (VM) is similar.

First, we need a stable Docker image. We’ll use the rmohr/activemq image, which follows the traditional approach of downloading the release jars and running the standalone server.

The default ActiveMQ configuration will not be enough to expose monitoring via JMX for our Collectors, so we’ll need a way to configure it. We’ll use the Docker host to copy the configuration and data from the Docker image, then configure it appropriately and start the server with that config.

Start with running a shell inside the container:

$ docker run --user root --rm -ti \

-v $(pwd)/conf:/mnt/conf \

-v $(pwd)/data:/mnt/data \

rmohr/activemq:5.15.9 /bin/bash


root@ecefc320f4b1:/opt/apache-activemq-5.15.9#


With this command, you share the conf and data folders under your current working directory on the host with /mnt/conf and /mnt/data inside the container, respectively. We use these mounts to copy the configuration and data files out of the image, as follows:

$ chown activemq:activemq /mnt/conf

$ chown activemq:activemq /mnt/data

$ cp -a /opt/activemq/conf/* /mnt/conf/

$ cp -a /opt/activemq/data/* /mnt/data/

$ exit

Now that we’ve copied the config and data to the host, we are ready to configure them. Edit the activemq.xml file and enable JMX by setting useJmx="true" on the broker element:

conf/activemq.xml

<broker xmlns="http://activemq.apache.org/schema/core" brokerName="localhost" dataDirectory="${activemq.data}" useJmx="true">


You will also need to expose JMX over RMI if you want to inspect the container using JConsole from your host. To do that, start the container with the following Java configuration values:

export ACTIVEMQ_SUNJMX_START="-Dcom.sun.management.jmxremote \

-Dcom.sun.management.jmxremote.authenticate=false \

-Dcom.sun.management.jmxremote.ssl=false \

-Dcom.sun.management.jmxremote.port=1099 \

-Dcom.sun.management.jmxremote.rmi.port=1099 \

-Djava.rmi.server.hostname=activemq \

-Dcom.sun.management.jmxremote.host=0.0.0.0 \

-Dcom.sun.management.jmxremote.local.only=false"

Those are the standard flags that we need to connect to JMX metrics within Docker. In a production environment, you might want to limit this to localhost and require authenticated credentials.
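
For reference, a more locked-down variant could look like the sketch below. The jmx.password and jmx.access file paths are placeholders chosen for illustration; you would need to create those files yourself (they map users to passwords and to readonly/readwrite permissions, respectively), and a real production setup would likely enable SSL as well:

# Sketch only: jmx.password and jmx.access are placeholder paths; create and secure these files first.
export ACTIVEMQ_SUNJMX_START="-Dcom.sun.management.jmxremote \
-Dcom.sun.management.jmxremote.authenticate=true \
-Dcom.sun.management.jmxremote.password.file=/opt/activemq/conf/jmx.password \
-Dcom.sun.management.jmxremote.access.file=/opt/activemq/conf/jmx.access \
-Dcom.sun.management.jmxremote.ssl=false \
-Dcom.sun.management.jmxremote.port=1099 \
-Dcom.sun.management.jmxremote.rmi.port=1099 \
-Djava.rmi.server.hostname=127.0.0.1"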

Now, start the container and expose the necessary ports:

$ docker run -p 61616:61616 -p 8161:8161 -p 1099:1099 --name=activemq -e ACTIVEMQ_SUNJMX_START="${ACTIVEMQ_SUNJMX_START}" \

-v $(pwd)/conf:/opt/activemq/conf \

-v $(pwd)/data:/opt/activemq/data \

rmohr/activemq:5.15.9


In the console, you should see logs with important information about the server status:

INFO | For help or more information please see: http://activemq.apache.org

INFO | No Spring WebApplicationInitializer types detected on classpath

INFO | ActiveMQ WebConsole available at http://0.0.0.0:8161/

INFO | ActiveMQ Jolokia REST API available at http://0.0.0.0:8161/api/jolokia/

INFO | Initializing Spring FrameworkServlet 'dispatcher'

INFO | No Spring WebApplicationInitializer types detected on classpath

INFO | jolokia-agent: Using policy access restrictor classpath:/jolokia-access

You can access the exposed Jolokia REST API at http://0.0.0.0:8161/api/jolokia/ to make sure that it responds with information:

Figure 1.1 – An ActiveMQ Jolokia REST API
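
If you prefer the command line, you can also query Jolokia’s version endpoint with curl. This is a quick sketch that assumes the default admin/admin console credentials:

# Sketch: adjust the credentials if you have changed the console defaults.
$ curl -u admin:admin http://localhost:8161/api/jolokia/version

A JSON response with the agent and broker details confirms that the endpoint is reachable.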

You can also connect to the exposed JMX endpoint using JConsole:

$ jconsole


Then, use the following connection string, which points to the JMX RMI service:

service:jmx:rmi:///jndi/rmi://localhost:1099/jmxrmi

Figure 1.2 – ActiveMQ JConsole Access

When successful, you should be able to inspect the JVM metrics:

Figure 1.3 – ActiveMQ JConsole Metrics

Now we are ready to start collecting logs using the Sumo Logic Hosted Collector.

Setting up a Hosted Collector for collecting ActiveMQ metrics

In this section, we will show you how to use a Hosted Collector for collecting ActiveMQ metrics.

Hosted Collectors are collection endpoints hosted by Sumo Logic: each source you add to one exposes a unique URL to which you send data from various services. They can handle thousands of requests per second, depending on your account plan. You’ll use a Hosted Collector in combination with the Telegraf agent, which you will also need to install on your system.
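
Installing Telegraf is usually a one-line package-manager command. The sketch below assumes macOS with Homebrew; on Linux you would instead add InfluxData’s apt or yum repository and install the telegraf package from there:

# Sketch: assumes Homebrew on macOS; use your distribution's package repository on Linux.
$ brew install telegraf
$ telegraf --version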

First, you need to create a new Collector for ActiveMQ events using the Sumo Logic UI. Follow the steps outlined here and add the necessary metadata fields. The figure below shows what we filled in:

Figure 1.4 – Our New ActiveMQ Hosted Collector Form

When you save the form, you will be prompted to add a source for the Collector. Select “HTTP Logs and Metrics” and fill in the following fields:

Figure 1.5 – New ActiveMQ HTTP Logs and Metrics

Once saved, the form should give you a unique URL. You will need that URL for the Telegraf config, so be sure to save it temporarily.
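
Before wiring the URL into Telegraf, you can optionally confirm that the endpoint accepts data with a quick test POST. This is a sketch; substitute the unique URL you copied from the UI:

# Sketch: replace the placeholder with your own Hosted Collector source URL.
$ curl -X POST -d "activemq collector connectivity test" "<URL FROM HOSTED COLLECTOR>"

An HTTP 200 response indicates that the source is receiving data, and the test line should show up in a log search shortly afterwards.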

Now, we need to create a config file for the Telegraf agent. We’ll use the following standard config and fill in the values for our server and metadata fields:

[[inputs.disk]]

mount_points = ["/"]

[inputs.disk.tags]

environment="prod"

component="messaging"

messaging_system="activemq"

messaging_cluster="activemq_production"

[[inputs.jolokia2_agent]]

urls = ["http://localhost:8161/api/jolokia"]

name_prefix = "activemq_"

username = "admin"

password = "admin"

[inputs.jolokia2_agent.tags]

environment="prod"

component="messaging"

messaging_system="activemq"

messaging_cluster="activemq_production"

[[inputs.jolokia2_agent.metric]]

name = "OperatingSystem"

mbean = "java.lang:type=OperatingSystem"

[[inputs.jolokia2_agent.metric]]

name = "jvm_runtime"

mbean = "java.lang:type=Runtime"

paths = ["Uptime"]

[[inputs.jolokia2_agent.metric]]

name = "jvm_memory"

mbean = "java.lang:type=Memory"

[[inputs.jolokia2_agent.metric]]

name = "jvm_garbage_collector"

mbean = "java.lang:name=*,type=GarbageCollector"

paths = ["CollectionCount"]

tag_keys = ["name"]

[[inputs.jolokia2_agent.metric]]

name = "queue"

mbean = "org.apache.activemq:brokerName=*,destinationName=*,destinationType=Queue,type=Broker"

tag_keys = ["brokerName","destinationName"]

[[inputs.jolokia2_agent.metric]]

name = "topic"

mbean = "org.apache.activemq:brokerName=*,destinationName=*,destinationType=Topic,type=Broker"

tag_keys = ["brokerName","destinationName"]

[[inputs.jolokia2_agent.metric]]

name = "broker"

mbean = "org.apache.activemq:brokerName=*,type=Broker"

tag_keys = ["brokerName"]

[[outputs.sumologic]]

url = "<URL FROM HOSTED COLLECTOR>"

data_format = "prometheus"

Now you can start the agent. You should be able to inspect the logs and verify that it can send data to the output URL:

$ telegraf --debug --config telegraf.conf

2022-01-18T11:55:45Z I! Starting Telegraf 1.21.2

2022-01-18T11:55:45Z D! [agent] Successfully connected to outputs.sumologic

2022-01-18T11:55:45Z D! [agent] Starting service inputs
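
To confirm that the metrics are arriving, you can open a Metrics Explorer tab in Sumo Logic and filter by the tags we set in the Telegraf config. A selector such as the following (a sketch; adjust the tag values to whatever you used) should list the incoming ActiveMQ metrics:

messaging_system=activemq messaging_cluster=activemq_production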

With metrics now flowing through our Hosted Collector, we can also show you how to set up an Installed Collector for logs.

Setting up an Installed Collector for collecting ActiveMQ logs

In this section, we will show you how to use an Installed Collector for collecting ActiveMQ logs.

As the name implies, an Installed Collector is a Collector that you install on the host that serves the ActiveMQ broker. It collects application logs and events and sends them directly to Sumo Logic.

To begin, you need to install the Collector on your platform by following these steps. You will need to create access credentials for it, which you can do on the “Preferences” page.

When you’re finished, you need to use the UI to create a new local file source for ActiveMQ. You can use the following figure to fill in the form:

Figure 1.6 – New ActiveMQ Local File Source

You will need to make sure that you point to the correct log files (which we already shared with the host in the data folder when we created the container). In our example, they are inside:

/users/theo.despoudis/Workspace/activemq/data/*.log

While you are there, you can change the log4j.properties file to enable debug logging:

conf/log4j.properties

log4j.appender.logfile.maxFileSize=10240MB

log4j.logger.org.apache.activemq=DEBUG

Now, give it some time or create topics and messages using the management UI located at http://0.0.0.0:8161/admin/ so that you can populate some event logs. After a while, you should be able to see them in your Sumo Logic Logs Dashboard:

Figure 1.7 – Logged Events from the ActiveMQ Installed Collector
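
If you would rather generate test traffic from the command line than from the management UI, ActiveMQ’s demo REST messaging API can enqueue messages for you. The sketch below assumes the default admin/admin credentials and the demo web applications that ship with the standard distribution; the queue name TEST is arbitrary:

# Sketch: enqueue a message on a queue named TEST (assumes the default admin/admin credentials).
$ curl -u admin:admin -d "body=hello from curl" "http://localhost:8161/api/message/TEST?type=queue"

Each POST enqueues a message and produces corresponding broker activity in the logs.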

Now that we have configured collection for both logs and metrics, we need to set up specialized monitors for ActiveMQ that will trigger alerts when predefined thresholds are exceeded.

Setting up monitoring for ActiveMQ

In this section, we will show you how to create monitoring rules for ActiveMQ. The following steps are mainly taken from this documentation page. You can install predefined monitors using the UI’s import option. When you copy the JSON file (and before you save it), modify the following keys so that their values match the cluster we just created:

messaging_cluster=activemq_production

host=theo-despoudis

messaging_system=activemq

Once they’re imported, you want to enable each individual monitor:

Figure 1.8 – List of ActiveMQ Monitors

With the monitors in place, you should be able to receive event alerts from monitoring the ActiveMQ cluster. You might also consider installing the dedicated ActiveMQ app for better visibility.

Sumo Logic apps are custom dashboards built for specific server components and common use cases. You can install the ActiveMQ app from the app catalog page, which is available on the web or from within the product. When you install it, you can specify the usual data filters related to our cluster:

messaging_cluster=activemq_production

host=theo-despoudis

messaging_system=activemq

environment=prod

Then, you should be able to view and explore the status of your ActiveMQ cluster in detail using the dedicated dashboard.

Figure 1.9 – The ActiveMQ - Overview dashboard gives you an at-a-glance view of your ActiveMQ deployment across brokers, queues, topics, and messages.

Next steps

Using Sumo Logic for monitoring and alerting gives you a single platform for troubleshooting a wide variety of application issues in real time. In this article, we showed you how easy it is to set up monitoring and alerts on enterprise application servers like ActiveMQ, and we also explained how to use apps to gain a detailed overview of server activity.

If you’d like to use Sumo Logic to start making sense of all your data, click here to give it a try.
