You can collect log data from Amazon RDS instances through a Sumo Logic hosted HTTP collector. Some configuration is required, but once the foundation is in place, this becomes a seamless integration from RDS to Sumo Logic.
Install the AWS RDS Command Line Tools and Configure Access:
This tutorial was performed on a Linux-based EC2 machine. For detailed instructions on Windows, please refer to the documentation linked above.
Obtain the command line tools
Copy the zip file to the desired installation path and unzip
Set up the following environment variables (these may look different on your system; refer to the documentation for additional detail)
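As a rough sketch (the paths and version number below are assumptions; adjust them for your install), the environment for the RDS command line tools typically covers the Java runtime, the tools' install directory, and a PATH entry:

```shell
# Assumed locations -- substitute the paths where you actually unzipped the tools
export JAVA_HOME=/usr/lib/jvm/jre            # JRE used by the RDS CLI tools
export AWS_RDS_HOME=/opt/RDSCli-1.19.004     # install path; version number is only an example
export PATH=$PATH:$AWS_RDS_HOME/bin          # puts the rds-* commands on your PATH
```

Adding these lines to your shell profile keeps them set across sessions.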
Set up the proper credentials for RDS access by entering your access keys here:
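One common mechanism for the legacy RDS tools is a credential file referenced by the AWS_CREDENTIAL_FILE environment variable; a sketch, with an assumed file location, looks like this:

```shell
# Assumed path; any location readable by your user works
cat > ~/.aws-credential-file <<'EOF'
AWSAccessKeyId=<your access key id>
AWSSecretKey=<your secret key>
EOF
chmod 600 ~/.aws-credential-file          # keep the secret key private
export AWS_CREDENTIAL_FILE=~/.aws-credential-file
```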
For detailed instructions on RDS access, please see "Providing Credentials for the Tools": http://docs.aws.amazon.com/AmazonRDS/latest/CommandLineReference/StartCLI.html
You must also be sure that the user account interacting with RDS has the proper permissions configured in IAM: http://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/UsingWithRDS.IAM.html
Verify by issuing the following command
$ rds-describe-db-log-files <rds instance name here>
If a list of the available log files is returned, you are ready to push the data into Sumo Logic.
Set Up a Sumo Logic Hosted HTTP Collector and Source:
Log in to Sumo Logic and select Add Collector
Choose Hosted Collector, name it, and select OK when asked if you would like to add a data source:
Give the source a name and fill out the relevant metadata. Also configure the options for timestamp parsing and multiline settings:
Upon saving the new source, you will be provided with a unique URL. This is the endpoint to which you will push the AWS RDS logs:
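Before wiring up RDS, you can sanity-check the endpoint by POSTing a test message to it with curl; `<unique URL string>` below is a placeholder for the URL Sumo Logic generated for your source:

```shell
# Send a single test message to the hosted HTTP source
# <unique URL string> is a placeholder -- use the endpoint from the previous step
curl -X POST -d "test message from curl" \
  https://collectors.sumologic.com/receiver/v1/http/<unique URL string>
```

If the message appears in a search against the new source, the endpoint is working.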
Collecting Logs from RDS and Pushing them to Sumo Logic:
To list available log files for your RDS instance, issue the following command:
$ rds-describe-db-log-files <db instance name>
You can limit the list by date last written as follows (note that the value is a POSIX timestamp in milliseconds):
$ rds-describe-db-log-files <db instance name> --file-last-written 1395341819000
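Since the value is in milliseconds, you can generate it with date rather than computing it by hand; a sketch assuming GNU date (as found on most Linux EC2 machines):

```shell
# Milliseconds since the epoch for "24 hours ago" (GNU date syntax)
CUTOFF=$(( $(date -d '24 hours ago' +%s) * 1000 ))
echo "$CUTOFF"
# then pass it along, e.g.:
#   rds-describe-db-log-files <db instance name> --file-last-written $CUTOFF
```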
You can manually push logs to your newly configured HTTP endpoint using curl. In the following example, we pull one log file and push it to Sumo Logic:
$ rds-download-db-logfile orasumo --log-file-name trace\/alert_ORASUMO.log | curl -X POST -d @- https://collectors.sumologic.com/receiver/v1/http/redactedKEY
Note: the forward slash in the log file name is escaped with a backslash, and the output of rds-download-db-logfile is piped into a curl command that POSTs the data to Sumo Logic.
Luckily, the RDS command line tools provide an option to continuously monitor log files for activity. To use this feature for an HTTP push, you can do the following:
$ rds-watch-db-logfile sumopostgres --log-file-name error/postgres.log | ./watch-rds.sh
Note that we are piping the output into a shell script. The contents of our sample script, watch-rds.sh, are shown below:
#!/bin/sh
URL="https://collectors.sumologic.com/receiver/v1/http/<unique URL string>"
while read data; do
  curl --data "$data" "$URL"
done
This script will run until cancelled, so it is best to launch it in the background with nohup:
$ nohup sh -c 'rds-watch-db-logfile <your db instance name> --log-file-name <your db log file name> | ./watch-rds.sh' &
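The relay pattern inside watch-rds.sh can be exercised locally without RDS or network access by substituting echo for curl; this is purely illustrative:

```shell
# Feed sample lines through the same while-read loop, with echo standing in for curl
printf 'line one\nline two\n' | while read data; do
  echo "POST: $data"
done
# prints:
#   POST: line one
#   POST: line two
```

Each line read from the pipe becomes one request body, which is why the real script issues one curl POST per log line.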
Installed Collector Alternative:
If you already have a Sumo Logic collector installed and can access your RDS logs via the command line utilities, you can simply pipe the output of the commands above to a local file and let the installed collector send the log messages.
$ rds-watch-db-logfile sumopostgres --log-file-name error/postgres.log > /path/to/localfile.log
Where /path/to/localfile.log is configured as a local file source on the installed collector.
This article originally appeared on DwayneHoover.com