AWS Database Blog

Amazon RDS Multi-AZ with two readable standbys: Under the hood

In this post, I discuss Amazon Relational Database Service (Amazon RDS) Multi-AZ DB cluster configurations for Amazon RDS for MySQL and Amazon RDS for PostgreSQL database instances. When you create a Multi-AZ DB cluster, Amazon RDS maintains a primary and two readable standby copies of your data. If there are problems with the primary copy, […]
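To try this topology out programmatically, a Multi-AZ DB cluster can be created with the AWS SDK. The following boto3 sketch is illustrative only: the identifiers, credentials, sizes, and storage settings are placeholder assumptions, so check the current Multi-AZ DB cluster requirements before running it.

```python
import boto3

rds = boto3.client("rds", region_name="us-east-1")

# Create a Multi-AZ DB cluster: one writer plus two readable standbys.
# All identifiers, sizes, and credentials below are placeholders.
rds.create_db_cluster(
    DBClusterIdentifier="my-multiaz-cluster",
    Engine="mysql",                      # or "postgres"
    DBClusterInstanceClass="db.m6gd.large",
    AllocatedStorage=100,
    StorageType="io1",
    Iops=3000,
    MasterUsername="admin",
    MasterUserPassword="change-me",      # prefer AWS Secrets Manager in practice
    BackupRetentionPeriod=7,
)

# The cluster exposes a writer endpoint and a reader endpoint that
# load-balances read traffic across the two standbys.
cluster = rds.describe_db_clusters(
    DBClusterIdentifier="my-multiaz-cluster"
)["DBClusters"][0]
print(cluster["Endpoint"], cluster["ReaderEndpoint"])
```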

Build a mortgage-backed securities data model using Amazon Neptune

As organizations adopt modern application architectures such as microservices, application teams tend to retrofit one-size-fits-all databases. The mortgage industry is going through an unprecedented transformation driven by new technologies such as API adoption. In the mortgage industry, API-enabled software allows lenders, issuers, borrowers, and more to integrate different functionalities into their portals, meaning they bring […]
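As a rough illustration of what such a graph model can look like, the following gremlinpython sketch creates a loan vertex, a pool vertex, and an edge between them, then asks which loans back the pool. The Neptune endpoint, labels, and properties are assumptions made for the example, not the data model from the post.

```python
from gremlin_python.process.anonymous_traversal import traversal
from gremlin_python.process.graph_traversal import __
from gremlin_python.driver.driver_remote_connection import DriverRemoteConnection

# Placeholder Neptune endpoint; vertex labels and properties are illustrative.
conn = DriverRemoteConnection("wss://your-neptune-endpoint:8182/gremlin", "g")
g = traversal().withRemote(conn)

pool = g.addV("Pool").property("poolId", "MBS-2024-01").next()
loan = g.addV("Loan").property("loanId", "L-1001").property("balance", 250000).next()
g.V(loan.id).addE("pooledInto").to(__.V(pool.id)).iterate()

# Which loans back this pool?
print(g.V(pool.id).in_("pooledInto").values("loanId").toList())
conn.close()
```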

Migrate from self-managed Db2 to Amazon RDS for Db2 using AWS DMS

We’re excited to announce that AWS Database Migration Service (AWS DMS) now supports Amazon Relational Database Service (Amazon RDS) for Db2 as a target endpoint. This development simplifies the process of migrating self-managed Db2 workloads to Amazon RDS for Db2, a managed service designed to ease the setup, operation, and scaling of Db2 databases in […]
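At a high level, the migration still follows the usual AWS DMS flow: create source and target endpoints, then a replication task. The boto3 sketch below shows only the target endpoint; the engine name, host, and credentials are assumptions for illustration, so confirm the exact values for RDS for Db2 targets in the AWS DMS documentation.

```python
import boto3

dms = boto3.client("dms", region_name="us-east-1")

# Target endpoint pointing at an RDS for Db2 instance.
# EngineName, host, and credentials below are placeholder assumptions.
dms.create_endpoint(
    EndpointIdentifier="rds-db2-target",
    EndpointType="target",
    EngineName="db2",                      # assumed value; verify in the DMS docs
    ServerName="your-rds-db2-endpoint.amazonaws.com",
    Port=50000,
    DatabaseName="SAMPLE",
    Username="db2admin",
    Password="change-me",                  # prefer AWS Secrets Manager in practice
)
```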

Migrate JSON data from Oracle’s CLOB to Amazon Aurora PostgreSQL and Amazon RDS for PostgreSQL

Migrating databases from Oracle to either Amazon Aurora PostgreSQL-Compatible Edition or Amazon Relational Database Service (Amazon RDS) for PostgreSQL presents a unique challenge when migrating JSON data. The Oracle CLOB data type can contain both well-formed and invalid JSON, whereas the PostgreSQL JSON and JSONB data types require JSON data to be correctly formatted according to […]
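Because the JSONB type rejects malformed documents, one simple pre-migration step is to validate each CLOB value and route the invalid ones to a cleanup path. A minimal sketch, assuming the rows have already been extracted from Oracle as (id, text) pairs:

```python
import json

def classify_clob_json(rows):
    """Split CLOB values into well-formed JSON and invalid entries.

    rows: iterable of (id, clob_text) pairs extracted from Oracle.
    Valid rows can be loaded into a PostgreSQL jsonb column; invalid
    rows need cleanup or a plain text column instead.
    """
    valid, invalid = [], []
    for row_id, clob_text in rows:
        try:
            json.loads(clob_text)              # jsonb rejects malformed JSON
            valid.append((row_id, clob_text))
        except (TypeError, ValueError):
            invalid.append((row_id, clob_text))
    return valid, invalid

valid, invalid = classify_clob_json([(1, '{"a": 1}'), (2, "{not json}")])
print(len(valid), len(invalid))  # 1 1
```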

Real-time serverless data ingestion from your Kafka clusters into Amazon Timestream using Kafka Connect

Organizations need systems and mechanisms to gather and analyze large amounts of data as it is created, so they can derive insights and respond in real time. Stream processing technologies enable organizations to ingest data as it is created, process it, and analyze it as soon as it is accessible. In this […]
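Kafka Connect sink connectors are registered through the Connect REST API. The following Python sketch shows the general shape of that call; the connector class and the Timestream-related property names are placeholders, not the exact configuration from the post, so substitute the values documented for the Timestream sink connector you deploy.

```python
import requests

# Register a sink connector with the Kafka Connect REST API.
# The connector class and property names below are placeholders.
connector = {
    "name": "timestream-sink",
    "config": {
        "connector.class": "<timestream-sink-connector-class>",
        "topics": "sensor-readings",
        "tasks.max": "2",
        # Illustrative property names for the target database and table
        "timestream.database.name": "iot",
        "timestream.table.name": "sensor_readings",
    },
}

resp = requests.post("http://localhost:8083/connectors", json=connector, timeout=30)
resp.raise_for_status()
print("created connector:", resp.json()["name"])
```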

Intuit’s implementation of Amazon Aurora mixed-configuration cluster: Achieving high availability, disaster recovery, and up to 55% cost savings

This post was co-written with Rajesh Saluja, Principal Engineer at Intuit. Intuit is the global financial technology platform that powers prosperity for 100 million consumer and small business customers with TurboTax, Credit Karma, QuickBooks, and Mailchimp. Intuit has built on AWS since 2013 and has taken an “all-in” approach in its move to the cloud. […]

How a small DevOps team at Deutsche Bahn unlocked analytics for their SaaS product

This is a guest blog post by Oliver Jägle, Software Architect at DB Curbside Management, in partnership with AWS Senior Solutions Architect Ben Freiberg. Have you ever rented a scooter or a bicycle instead of walking or taking a bus? You’re not alone. More and more people around the globe have adopted this sort of […]

Effective data sorting with Amazon DynamoDB

Amazon DynamoDB offers high scalability and performance for applications with varying workloads. While DynamoDB excels at efficiently distributing data across multiple partitions, within each partition it stores items in the order defined by the table’s sort key. In this post, we show two example data models, one designed to store e-commerce order information and one to store game […]
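For example, with an orders table keyed on a customer ID partition key and an order date sort key (placeholder names), the most recent orders for a customer can be read by walking the sort key in reverse:

```python
import boto3
from boto3.dynamodb.conditions import Key

# Placeholder table and attribute names: partition key "CustomerId",
# sort key "OrderDate" (ISO 8601 strings sort chronologically).
table = boto3.resource("dynamodb").Table("Orders")

response = table.query(
    KeyConditionExpression=Key("CustomerId").eq("C-42")
    & Key("OrderDate").begins_with("2024-"),
    ScanIndexForward=False,   # return items in descending sort-key order
    Limit=10,                 # the ten most recent 2024 orders
)
for item in response["Items"]:
    print(item["OrderDate"], item.get("Total"))
```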

Detect PII data in Amazon Aurora with Amazon Comprehend

In this post, we demonstrate how to build a mechanism to automate the detection of sensitive data, in particular personally identifiable information (PII), in your relational database. PII is information connected to an individual and can be used to identify them. Handling PII data in a relational database, such as Amazon Aurora, requires planning and […]
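The detection piece of such a mechanism is a call to the Amazon Comprehend PII API. A minimal sketch, using an inline sample string where the real solution would pass text read from Aurora rows:

```python
import boto3

comprehend = boto3.client("comprehend", region_name="us-east-1")

# Sample text; in the post's scenario this would come from database rows.
text = "Contact Jane Doe at jane.doe@example.com or 555-0100."

result = comprehend.detect_pii_entities(Text=text, LanguageCode="en")
for entity in result["Entities"]:
    snippet = text[entity["BeginOffset"]:entity["EndOffset"]]
    print(entity["Type"], round(entity["Score"], 2), snippet)
```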

Migrate an Oracle associative array to Amazon Aurora PostgreSQL or Amazon RDS for PostgreSQL

The typical migration process for an Oracle database to Amazon Aurora PostgreSQL-Compatible Edition or Amazon Relational Database Service (Amazon RDS) for PostgreSQL requires both automated and manual procedures. The AWS Schema Conversion Tool (AWS SCT) can handle the automated duties of schema conversion. For specific database objects that can’t be automatically migrated, the manual duties […]
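One common manual conversion is to replace an Oracle associative array parameter with a PostgreSQL array (or a temporary table). The psycopg2 sketch below is a hypothetical illustration of the array approach, with made-up table and column names; psycopg2 adapts a Python list to a PostgreSQL array automatically.

```python
import psycopg2

# Hypothetical rework: IDs that used to populate an Oracle associative
# array are passed to PostgreSQL as an integer[] parameter instead.
conn = psycopg2.connect(
    "dbname=app host=your-rds-endpoint user=app password=change-me"
)
with conn, conn.cursor() as cur:
    order_ids = [101, 102, 103]
    cur.execute(
        "SELECT order_id, status FROM orders WHERE order_id = ANY(%s)",
        (order_ids,),   # the list is adapted to a PostgreSQL array
    )
    for order_id, status in cur.fetchall():
        print(order_id, status)
```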