AWS Partner Network (APN) Blog

Category: Amazon Simple Storage Service (S3)


Turning Data into a Key Enterprise Asset with a Governed Data Lake on AWS

Data and analytics success relies on providing analysts and data end users with quick, easy access to accurate, quality data. Enterprises need a high-performing, cost-efficient data architecture that supports demand for data access while providing the data governance and management capabilities IT requires. A governed data lake on AWS delivers this data management excellence, capturing quality data and making it available to analysts quickly and cost-effectively.


MongoDB Atlas Data Lake Lets Developers Create Value from Rich Modern Data 

With the proliferation of cost-effective storage options such as Amazon S3, there's little reason not to keep your data forever. The challenge is creating value from that much data in a timely and efficient way. MongoDB's Atlas Data Lake enables developers to mine their data for insights with more storage options and the speed and agility of the AWS Cloud. It provides a serverless, parallelized compute platform that gives you a powerful and flexible way to analyze and explore your data on Amazon S3.
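
For a sense of what querying S3-resident data through Atlas Data Lake looks like, here is a minimal sketch using the standard pymongo driver. The connection string, database, and collection names are hypothetical placeholders; in practice they come from the Data Lake configuration that maps your Amazon S3 buckets to virtual databases and collections.

```python
# Minimal sketch: querying S3-backed data through a MongoDB Atlas Data Lake
# endpoint with pymongo. The URI, database, and collection names below are
# hypothetical; your Data Lake configuration maps S3 paths to collections.
from pymongo import MongoClient

# Hypothetical Atlas Data Lake connection string
client = MongoClient(
    "mongodb://user:password@datalake0-example.a.query.mongodb.net/"
    "?ssl=true&authSource=admin"
)

# A virtual database/collection backed by JSON or Parquet files in S3
orders = client["sales"]["orders"]

# Ordinary MongoDB aggregation pipelines run against the S3 data
pipeline = [
    {"$match": {"status": "shipped"}},
    {"$group": {"_id": "$region", "revenue": {"$sum": "$total"}}},
    {"$sort": {"revenue": -1}},
]
for row in orders.aggregate(pipeline):
    print(row["_id"], row["revenue"])
```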

How to Create a Continually Refreshed Amazon S3 Data Lake in Just One Day

Data management architectures have evolved drastically from the traditional data warehousing model to today's more flexible systems, which use pay-as-you-go cloud computing models for big data workloads. Learn how AWS services like Amazon EMR can be used with Bryte Systems to deploy an Amazon S3 data lake in one day. We'll also detail how AWS and the BryteFlow solution can automate modern data architecture to significantly accelerate the delivery of business insights at scale.
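
As a rough illustration of the AWS building blocks involved (not the BryteFlow product itself), the sketch below uses boto3 to launch a transient Amazon EMR cluster whose Spark step reads from and writes to an S3 data lake. The bucket names, script path, and sizing are hypothetical.

```python
# Illustrative sketch only: launching a transient EMR cluster with boto3.
# Bucket names, the script path, and sizing are hypothetical placeholders;
# BryteFlow automates this kind of plumbing for you.
import boto3

emr = boto3.client("emr", region_name="us-east-1")

response = emr.run_job_flow(
    Name="data-lake-refresh",
    ReleaseLabel="emr-6.15.0",
    Applications=[{"Name": "Spark"}],
    LogUri="s3://my-datalake-logs/emr/",        # hypothetical bucket
    Instances={
        "InstanceGroups": [
            {"InstanceRole": "MASTER", "InstanceType": "m5.xlarge",
             "InstanceCount": 1},
            {"InstanceRole": "CORE", "InstanceType": "m5.xlarge",
             "InstanceCount": 2},
        ],
        "KeepJobFlowAliveWhenNoSteps": False,   # terminate when steps finish
    },
    Steps=[{
        "Name": "refresh-data-lake",
        "ActionOnFailure": "TERMINATE_CLUSTER",
        "HadoopJarStep": {
            "Jar": "command-runner.jar",
            "Args": ["spark-submit",
                     "s3://my-datalake-code/refresh.py"],  # hypothetical script
        },
    }],
    JobFlowRole="EMR_EC2_DefaultRole",
    ServiceRole="EMR_DefaultRole",
)
print("Cluster started:", response["JobFlowId"])
```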


How to Reduce AWS Storage Costs for Splunk Deployments Using SmartStore

It can be overwhelming for organizations to keep pace with the amount of data being generated by machines every day. There's a great deal of meaningful information that can be extracted from that data, but companies need software vendors to develop tools that help. In this post, learn about Splunk SmartStore and how it helps customers reduce storage costs in a Splunk deployment on AWS. Many customers use SmartStore to shrink their Amazon EBS volumes by moving data to Amazon S3.
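
To make the mechanics concrete, SmartStore is enabled per index in indexes.conf by defining a remote volume that points at an S3 bucket. The snippet below is a hedged sketch following the documented SmartStore settings; the bucket name and endpoint are hypothetical placeholders.

```
# Sketch of SmartStore stanzas in indexes.conf; the bucket name and
# endpoint are hypothetical placeholders.
[volume:remote_store]
storageType = remote
path = s3://my-smartstore-bucket/indexes
remote.s3.endpoint = https://s3.us-east-1.amazonaws.com

[main]
# Warm and cold data now live in S3; only hot data stays on EBS,
# so the EBS volumes can be much smaller.
remotePath = volume:remote_store/$_index_name
```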

How Cloud Backup for Mainframes Cuts Costs with BMC AMI Cloud Data and AWS

Mainframe cold storage based on disks and tapes is typically expensive and rigid. BMC AMI Cloud Data improves the economics and flexibility of mainframe data protection with a software-defined solution that archives, backs up, and recovers data directly to and from AWS storage, enabling mainframe customers to leverage modern cloud technologies and economics to reduce data recovery risks and improve application availability.


Using Amazon CloudFront with Multi-Region Amazon S3 Origins

By leveraging services like Amazon S3 to host content, AWS Competency Partner Cloudar has a cost-effective way to build websites that are highly available. If content is stored in a single Amazon S3 bucket, however, all of it resides in a single AWS Region. To serve content from other Regions, you need to route requests to different Amazon S3 buckets. In this post, explore how to accomplish this by using Amazon CloudFront as a content delivery network and Lambda@Edge as a router.
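
To sketch the routing idea, here is a hedged example of a Lambda@Edge origin-request handler in Python that rewrites the S3 origin based on the viewer's country. The bucket domain names are hypothetical, and the CloudFront-Viewer-Country header only appears in the event if the distribution is configured to forward it.

```python
# Hedged sketch of a Lambda@Edge origin-request handler that routes viewers
# to a regional S3 bucket. Bucket domains are hypothetical, and the
# CloudFront-Viewer-Country header is only present when the distribution
# is configured to forward it.
EU_COUNTRIES = {"BE", "DE", "FR", "NL", "ES", "IT"}

ORIGINS = {
    "eu": ("my-site-eu.s3.eu-west-1.amazonaws.com", "eu-west-1"),
    "us": ("my-site-us.s3.us-east-1.amazonaws.com", "us-east-1"),
}

def lambda_handler(event, context):
    request = event["Records"][0]["cf"]["request"]
    headers = request["headers"]

    country = headers.get("cloudfront-viewer-country",
                          [{"value": "US"}])[0]["value"]
    domain, region = ORIGINS["eu"] if country in EU_COUNTRIES else ORIGINS["us"]

    # Point the S3 origin at the regional bucket, keeping Host in sync
    request["origin"]["s3"]["domainName"] = domain
    request["origin"]["s3"]["region"] = region
    headers["host"] = [{"key": "Host", "value": domain}]

    return request
```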


How to Migrate Mainframe Batch to Cloud Microservices with AWS Blu Age

While modernizing customer mainframes, the team at AWS Blu Age discovered that batch processing can be a complex aspect of a mainframe migration to AWS. It's critical to design your AWS architecture to account for the stringent performance requirements of batch workloads, such as intensive I/O, large datasets, and short durations. Let's explore how to migrate mainframe batch to AWS microservices using AWS Blu Age automated transformation technology.


Getting the Most Out of the Amazon S3 CLI

Amazon S3 makes it possible to store an unlimited number of objects, each up to 5 TB in size. Managing resources at this scale requires quality tooling. When it comes time to upload many objects, a few large objects, or a mix of both, you'll want the right tool for the job. This post looks at one option that is sometimes overlooked: the AWS Command Line Interface (AWS CLI) for Amazon S3.
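
As a quick taste of what the post covers, the commands below show the S3 operations most people reach for first. The bucket names are placeholders, and the tuning values are illustrative rather than recommendations.

```
# Copy one large file; the CLI transparently uses multipart upload
aws s3 cp ./backup.tar s3://my-bucket/backups/

# Mirror a local directory tree, uploading only new or changed objects
aws s3 sync ./site s3://my-bucket/site/

# Tune transfer parallelism for workloads with many objects
# (values are illustrative, not recommendations)
aws configure set default.s3.max_concurrent_requests 20
aws configure set default.s3.multipart_chunksize 64MB
```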