An IT Manager’s Guide to S3 Logging
Predicting and provisioning your storage space needs can be a complex challenge. Amazon Simple Storage Service (Amazon S3) brings cloud-based, scalable, affordable, and reliable storage options under your command.
“How much storage space will my organization need, and how much will it cost to manage and maintain?”
This question has kept IT directors up late since the dawn of computing. In the old days, preparing for your digital storage needs was a combination of forecasting and guesswork, often resulting in either insufficient space for a business’s needs or, perhaps even worse, massively and expensively overbuilt storage, most of which sat idle and wasted.
But thanks to Amazon S3, modern organizations and enterprises have a powerful new way to shape, predict, and scale storage needs dynamically, using (and paying for) only the space you need at any given time. Additionally, Amazon S3 gives you deep insights into storage patterns and usage activity, allowing you to troubleshoot and resize storage buckets on the fly and deliver optimal user experience.
Here are some of the top capabilities and benefits of Amazon S3, with advice on how to unleash its potential for your organization.
Grab Your Bucket: The Basics of S3 Storage Features
Under Amazon S3 architecture, data is stored in scalable containers known as buckets. Buckets can contain almost any kind of data, from tiny text files to massive databases or multimedia repositories.
Amazon S3 Transfer Acceleration intelligently routes your data over the AWS global network at up to six times regular transfer speeds.
S3 buckets are created and managed in the S3 web interface console, where users oversee their storage infrastructure and options. But unlike standard cloud storage folders, buckets come with API and fine-tuning options to help you optimize storage cost. Built-in monitors tell you what buckets your users are accessing most, and you only pay for the storage and data services you require on a given day.
Your data can also be moved across the world in moments, transferred easily to any of the many AWS Regions where S3 operates. S3 also offers the unique Amazon S3 Transfer Acceleration, a service that takes advantage of the AWS global network to speed file movement by up to 500 percent.
This kind of off-site redundancy and accessibility would represent a major investment in privately owned infrastructure, but with S3 you can be up and running with a truly decentralized, redundant storage solution in minutes.
S3: Storage for All Classes of Data
Data stored in the cloud generally falls into three categories. Amazon S3 manages each of these from the central console:
Frequent access data. Think of this as the day-to-day data used or created in normal business operations. Frequently accessed data is the standard storage model in Amazon S3, offering low latency, high availability, and scalable throughput to make sure resources are available no matter what the daily traffic brings.
Infrequent access data. Logs, archived orders, and other important information must be saved and accessible, but probably isn’t required on a daily basis. Amazon S3 offers lower storage and access fees per gigabyte for this data, maximizing the efficiency of your cloud storage budget. Using the S3 web console you can move data between frequent and infrequent storage classes simply and with no changes to core applications.
Archive data. Yearly records, past sales activity, and some other types of data must be safely locked away but under normal circumstances require rare retrieval, or none at all. Amazon solved this storage challenge with Amazon Glacier, a storage solution for your archive data. Offering very low storage costs in exchange for slower but still reliable retrieval, Glacier provides affordable long-term storage.
Amazon Glacier solves your long-term storage needs. It’s just one of the powerful data tools for S3.
Learning to manage your buckets and move them across storage classes is a snap with S3, and all of your data is backed by Amazon’s design for 99.99 percent availability.
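To see how moving data between classes pays off, here is a back-of-envelope sketch. The per-gigabyte monthly prices below are hypothetical placeholders for illustration only; always check the current AWS pricing page before planning.

```python
# Back-of-envelope monthly storage cost comparison across S3 classes.
# The per-GB prices are HYPOTHETICAL placeholders, not current AWS
# pricing -- consult the AWS pricing page for real figures.
PRICE_PER_GB_MONTH = {
    "standard": 0.023,            # frequent access
    "infrequent_access": 0.0125,  # infrequent access
    "glacier": 0.004,             # archive
}

def monthly_cost(gigabytes, storage_class):
    """Estimated monthly storage cost in USD for one storage class."""
    return gigabytes * PRICE_PER_GB_MONTH[storage_class]

# Example: 10 TB of rarely read logs moved out of the standard class.
standard = monthly_cost(10_000, "standard")
infrequent = monthly_cost(10_000, "infrequent_access")
print(f"Monthly savings: ${standard - infrequent:.2f}")
```

Even at these illustrative rates, reclassifying rarely read data roughly halves its monthly storage bill, which is why auditing access patterns (covered below) is worth the effort.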
Pinpoint Users with S3 Geolocations
Identifying and managing traffic, both the good kind and the bad, is simplified with Amazon S3. Spot traffic trends (or sudden drop-offs) by region to investigate anomalies and compare them to normal baselines. If sudden bursts of traffic from unusual regions are requesting access to your data, S3 helps you identify and respond to these potential threats.
Geo-tracking also helps with identifying good traffic patterns and adjusting your AWS resources as necessary to better serve particular regions. The insights from S3 logs can provide actionable information about where, when, and for how long your users are active. Data can then be replicated to S3 Regions closer to your target users, making it more quickly accessible.
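The raw material for this analysis is the remote IP address that S3 server access logs record for every request, which you can feed into any geolocation lookup. As a minimal sketch, here is one way to tally requests per client IP; the sample log lines are shortened, hypothetical examples, since real entries carry many more fields.

```python
import re
from collections import Counter

# In S3 server access logs, the remote IP appears immediately after
# the bracketed timestamp. These lines are shortened, made-up samples.
SAMPLE_LOGS = [
    'owner bucket [06/Feb/2019:00:00:38 +0000] 192.0.2.3 req RID1 '
    'REST.GET.OBJECT photos/a.jpg "GET /photos/a.jpg HTTP/1.1" 200 -',
    'owner bucket [06/Feb/2019:00:00:39 +0000] 192.0.2.3 req RID2 '
    'REST.GET.OBJECT photos/b.jpg "GET /photos/b.jpg HTTP/1.1" 200 -',
    'owner bucket [06/Feb/2019:00:01:02 +0000] 198.51.100.7 req RID3 '
    'REST.GET.OBJECT secret.txt "GET /secret.txt HTTP/1.1" 403 AccessDenied',
]

REMOTE_IP = re.compile(r'\]\s+(\d{1,3}(?:\.\d{1,3}){3})\s')

def requests_by_ip(lines):
    """Count log entries per remote IP address."""
    hits = Counter()
    for line in lines:
        match = REMOTE_IP.search(line)
        if match:
            hits[match.group(1)] += 1
    return hits

print(requests_by_ip(SAMPLE_LOGS).most_common())
# [('192.0.2.3', 2), ('198.51.100.7', 1)]
```

An IP that suddenly dominates this tally, especially one paired with 403 responses, is a natural starting point for the anomaly investigation described above.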
Audit Log Data to Perfect Your Storage Plan
A fast, clean user experience is key to any cloud service. Amazon S3 helps you detect points where your network is bottlenecked and easily make the changes needed to improve performance.
Underneath the smooth graphic interfaces that simplify modern data management, typical network activity still consists primarily of the following data requests:
- GET, with which a user requests access to an object;
- PUT, which requests permission to add data to a resource and make it available via URL;
- LIST, a simple inventory requesting the contents of a resource; and
- DELETE, a restricted privilege for removing a resource.
A basic Amazon S3 configuration provisions storage buckets to handle 300 PUT/LIST/DELETE and 800 GET requests per second. If your traffic patterns exceed these rates, you may see latency issues develop, which you can address by duplicating resources or adding new ones to ensure your user experience remains optimal.
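The request mix is easy to extract from S3 server access logs, where each entry carries an operation field such as REST.GET.OBJECT or REST.PUT.OBJECT (a bucket listing is logged as REST.GET.BUCKET). A minimal sketch, using shortened, hypothetical log lines:

```python
from collections import Counter

# Shortened, hypothetical S3 server access log lines; real entries
# contain additional fields (bytes sent, latency, user agent, ...).
SAMPLE_LOGS = [
    'owner bucket [06/Feb/2019:00:00:38 +0000] 192.0.2.3 req RID1 '
    'REST.GET.OBJECT photos/a.jpg "GET /photos/a.jpg HTTP/1.1" 200 -',
    'owner bucket [06/Feb/2019:00:00:39 +0000] 192.0.2.3 req RID2 '
    'REST.PUT.OBJECT photos/b.jpg "PUT /photos/b.jpg HTTP/1.1" 200 -',
    'owner bucket [06/Feb/2019:00:00:40 +0000] 192.0.2.3 req RID3 '
    'REST.GET.BUCKET - "GET /?list-type=2 HTTP/1.1" 200 -',
]

def classify(line):
    """Map a log entry to GET, PUT, LIST, DELETE, or OTHER."""
    for token in line.split():
        if token.startswith('REST.'):
            verb, _, target = token[len('REST.'):].partition('.')
            if verb == 'GET' and target == 'BUCKET':
                return 'LIST'  # bucket listings appear as REST.GET.BUCKET
            return verb if verb in {'GET', 'PUT', 'DELETE'} else 'OTHER'
    return 'OTHER'

mix = Counter(classify(line) for line in SAMPLE_LOGS)
print(mix)  # Counter({'GET': 1, 'PUT': 1, 'LIST': 1})
```

Dividing these counts by the time window covered by the logs gives per-second rates you can compare directly against the provisioned request thresholds.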
On the other hand, analysis may reveal buckets in active storage that are rarely visited, offering the opportunity to move them to the less-expensive infrequent access data class.
All of this management takes place in the S3 web management console, giving you full control of your latencies and storage configurations from anywhere in the world.
Record and Respond to AWS Error Codes
A wealth of critical information about your network and applications resides in the errors monitored and logged in Amazon S3. Amazon provides a complete list of codes generated when something in your operation goes awry, but common ones include:
Access Denied. When a user request produces a ‘403 Forbidden’ response, it’s usually caused by mistake (improper permission settings on new or moved resources) or by malice; repeated Access Denied errors coming from one region or targeting particular resources are good indicators of hacking attempts or malware. S3 makes it easy to capture these errors, trace them to their sources, and take corrective action.
Slow Down. Lots of traffic is generally good for business, but this error, which produces a ‘503 Slow Down’ code in the logs, usually indicates that one region or resource is getting a bit too much traffic. S3 buckets come configured with default maximum request rates, so this error tells you that one or more of them needs attention.
Invalid Request. Though this error code in your S3 logs can indicate a variety of issues, it commonly appears when attempting to use Amazon’s Transfer Acceleration on buckets that either don’t support it or aren’t yet configured for it. Often the cure for Invalid Request errors is as simple as ticking the Transfer Acceleration enable box in the S3 management console.
There are many different error codes, and the ones appearing in your S3 logs can come from a variety of causes. But the data you need to act quickly is always at your fingertips.
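Spotting these patterns can be scripted. Here is a minimal sketch, again using shortened, hypothetical log lines, that tallies the HTTP status and S3 error code fields so repeated 403s or 503s stand out at a glance:

```python
import re
from collections import Counter

# Shortened, hypothetical S3 server access log lines. The HTTP status
# and the S3 error code follow the quoted request-URI.
SAMPLE_LOGS = [
    'owner bucket [06/Feb/2019:00:00:38 +0000] 192.0.2.3 req RID1 '
    'REST.GET.OBJECT a.txt "GET /a.txt HTTP/1.1" 200 -',
    'owner bucket [06/Feb/2019:00:00:40 +0000] 198.51.100.7 req RID2 '
    'REST.GET.OBJECT b.txt "GET /b.txt HTTP/1.1" 403 AccessDenied',
    'owner bucket [06/Feb/2019:00:00:41 +0000] 198.51.100.7 req RID3 '
    'REST.GET.OBJECT c.txt "GET /c.txt HTTP/1.1" 503 SlowDown',
]

# Status code and error code are the two fields after the quoted URI.
STATUS_AND_ERROR = re.compile(r'"[^"]*"\s+(\d{3})\s+(\S+)')

def error_summary(lines):
    """Count (HTTP status, S3 error code) pairs across log entries."""
    summary = Counter()
    for line in lines:
        match = STATUS_AND_ERROR.search(line)
        if match:
            summary[(match.group(1), match.group(2))] += 1
    return summary

for (status, error), count in error_summary(SAMPLE_LOGS).items():
    print(f"{status} {error}: {count}")
```

Running a summary like this on a schedule turns the log archive into an early-warning system: a spike in any one (status, error) pair points you straight at the bucket or region that needs attention.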
Stretch Your Data Storage Budget and Save Money
Amazon S3’s key strength lies in putting the power of Amazon’s global network, and all the technology behind it, in your hands. Instead of building out a massive international infrastructure, staffing it with trained professionals, and hiring 24/7 security experts to keep things running safely, you can get comparable capabilities from S3 with none of the overhead.
Amazon Web Services offers many tools and plugins that expand on S3’s power, like Glacier for deep storage and CloudFront, a global content delivery service that accelerates delivery of your AWS content to users around the world.
S3 is a safe and viable option for any new or changing organization, one whose potential should be fully investigated before investing in private infrastructure. You may find it’s the easiest, fastest, and most affordable way to tackle your IT challenges.