8. Tower Logging and Aggregation

Logging is a standalone feature introduced in Ansible Tower 3.1.0 that provides the capability to send detailed logs to several kinds of third-party external log aggregation services. Services connected to this data feed serve as a useful means of gaining insight into Tower usage or technical trends. The data can be used to analyze events in the infrastructure, monitor for anomalies, and correlate events from one service with events in another. The types of data that are most useful to Tower are job fact data, job events/job runs, activity stream data, and log messages. The data is sent in JSON format over an HTTP connection using minimal service-specific tweaks engineered in a custom handler or via an imported library.
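To make the data feed concrete, the following is a hedged sketch of what one JSON log record sent over HTTP might look like. The field names here are illustrative examples only, not Tower's exact schema:

```python
import json

# Illustrative sketch of a Tower-style log record sent over HTTP.
# Field names are examples only, not Tower's exact schema.
record = {
    "logger_name": "awx.analytics.job_events",  # which data feed produced this
    "level": "INFO",
    "cluster_host_id": "tower-node-1",          # which Tower node emitted it
    "message": "Job event data saved.",
    "event": "runner_on_ok",
}

# Each record is serialized to JSON before being POSTed to the aggregator.
payload = json.dumps(record)
print(payload)
```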

8.1. Logging Aggregator Services

The logging aggregator service works with the following monitoring and data analysis systems:

8.1.1. Splunk

Ansible Tower’s Splunk logging integration uses the Splunk HTTP Event Collector. Refer to Splunk’s documentation on setting up the HTTP Event Collector for further instructions.
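As background, Splunk's HTTP Event Collector accepts JSON events POSTed with an `Authorization: Splunk <token>` header. The following is a minimal sketch of building such a request with the Python standard library, without sending it; the host and token values are placeholders:

```python
import json
import urllib.request

# Placeholder values -- substitute your own Splunk host and HEC token.
SPLUNK_HOST = "splunk.example.com"
HEC_TOKEN = "00000000-0000-0000-0000-000000000000"

# HEC listens on port 8088 by default; events go to /services/collector/event.
url = f"https://{SPLUNK_HOST}:8088/services/collector/event"
event = {"event": {"message": "Tower connectivity test"}, "sourcetype": "_json"}

request = urllib.request.Request(
    url,
    data=json.dumps(event).encode(),
    headers={
        "Authorization": f"Splunk {HEC_TOKEN}",  # HEC token-based auth header
        "Content-Type": "application/json",
    },
)
print(request.full_url)
print(request.get_header("Authorization"))
```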

8.1.2. Loggly

To set up the sending of logs through Loggly’s HTTP endpoint, refer to Loggly’s documentation. Loggly uses a URL convention that embeds your customer token; enter that URL in the Logging Aggregator field, as shown in the example below:
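The following is a sketch of constructing such an aggregator URL. The token is a placeholder, and the exact endpoint convention should be confirmed against Loggly's current documentation:

```python
# Placeholder customer token -- substitute your own Loggly token.
LOGGLY_TOKEN = "00000000-0000-0000-0000-000000000000"

# Loggly's HTTP event endpoint convention (verify against Loggly's docs):
# the customer token and an optional tag are part of the URL path.
logging_aggregator = f"http://logs-01.loggly.com/inputs/{LOGGLY_TOKEN}/tag/http/"
print(logging_aggregator)
```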


8.1.3. Sumologic

In Sumologic, create search criteria containing the JSON files that provide the parameters used to collect the data you need.


8.1.4. Elastic stack (formerly ELK stack)

You can visualize information from the Tower logs in Kibana, captured via an Elastic stack consuming the logs. Ansible Tower provides compatibility with the Logstash connector and with the Elasticsearch data model. You can use the example settings, and either a library or the provided examples, to stand up containers that demo the Elastic stack use end-to-end.

Tower uses logstash configuration to specify the source of the logs. Use this template to provide the input:

input {
       http {
        port => 8085
        user => logger_username
        password => "password"
       }
}

Add this to your configuration file in order to process the message content:

filter {
       json {
        source => "message"
       }
}
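To complete the pipeline, a Logstash configuration typically also forwards the parsed events on to Elasticsearch with an output stanza. The following is a minimal sketch; the host value is a placeholder for your own Elasticsearch instance:

```
output {
       elasticsearch {
        hosts => ["localhost:9200"]
       }
}
```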

8.2. Set Up Logging with Tower

To set up logging to any of the aggregator types:

  1. From the Settings (settings) Menu screen, click on Configure Tower.
  2. Select the System tab.
  3. Select Logging from the Sub Category drop-down menu list.
  4. Set the configurable options from the fields provided:
  • Logging Aggregator: Enter the hostname or IP address to which you want to send logs.
  • Logging Aggregator Port: Specify the port for the aggregator if it requires one.
  • Logging Aggregator Type: Click to select the aggregator service from the drop-down menu:
  • Logging Aggregator Username: Enter the username of the logging aggregator if it requires it.
  • Logging Aggregator Password/Token: Enter the password of the logging aggregator if it requires it.
  • Loggers to Send Data to the Log Aggregator From: All four types of data are pre-populated by default. Click the tooltip help icon next to the field for additional information on each data type. Delete any data types you do not want to send.
  • Log System Tracking Facts Individually: Click the tooltip help icon for additional information on whether to turn this on; it is off by default.
  • Enable External Logging: Click the toggle button to ON if you want to send logs to an external log aggregator.
  5. Review your entries for your chosen logging aggregation. Below is an example of one set up for Splunk:
  6. When done, click Save to apply the settings or Cancel to abandon the changes.
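Before saving, it can be useful to confirm that an aggregator endpoint accepts Tower-style JSON records at all. The following self-contained sketch exercises the same kind of HTTP POST that the data feed performs, against a throwaway local listener; the port handling and field names here are illustrative, not Tower's own:

```python
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

received = []

class Collector(BaseHTTPRequestHandler):
    """Stand-in for an external log aggregator's HTTP endpoint."""
    def do_POST(self):
        length = int(self.headers["Content-Length"])
        received.append(json.loads(self.rfile.read(length)))
        self.send_response(200)
        self.end_headers()

    def log_message(self, *args):  # silence per-request logging
        pass

server = HTTPServer(("127.0.0.1", 0), Collector)  # port 0: pick a free port
threading.Thread(target=server.serve_forever, daemon=True).start()

# POST one illustrative JSON record, as a log aggregator client would.
record = {"logger_name": "awx.analytics.activity_stream", "message": "test"}
req = urllib.request.Request(
    f"http://127.0.0.1:{server.server_port}/",
    data=json.dumps(record).encode(),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    print(resp.status)  # 200: the listener accepted the record

server.shutdown()
print(received[0]["message"])
```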

8.3. SSL Certificate Verification

SSL certificate verification is enabled for logging aggregation by default. Ansible Tower 3.1.3 allows you to disable it by manually adding the following variable to your local settings file and setting it to False:
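For illustration, disabling verification corresponds to the difference between a verified and an unverified TLS context at the Python standard-library level; the Tower flag toggles the equivalent behavior for its log-forwarding connection. This sketch only demonstrates that distinction, not Tower's internals:

```python
import ssl

# Default: server certificates are verified against the system trust store.
verified = ssl.create_default_context()
print(verified.verify_mode == ssl.CERT_REQUIRED)  # True

# Unverified: what disabling SSL certificate verification amounts to.
# Use only when the aggregator presents a self-signed certificate you trust.
unverified = ssl.create_default_context()
unverified.check_hostname = False        # must be disabled before verify_mode
unverified.verify_mode = ssl.CERT_NONE   # skip certificate validation
print(unverified.verify_mode == ssl.CERT_NONE)  # True
```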