AWS Database Blog
Category: Intermediate (200)
Create a Knowledge Graph application with metaphactory and Amazon Neptune
In a previous post, we described how to securely connect Amazon Neptune to metaphactory and how to explore and search the Neptune graph data using metaphactory. In this post, we show how you can use metaphactory to build an end-user application with its dynamic, model-driven components, which are driven by SPARQL queries.
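The components such an application renders are backed by SPARQL queries against the Neptune endpoint. As a rough illustration (outside metaphactory itself), the following Python sketch issues a SPARQL query to a Neptune cluster's SPARQL endpoint; the endpoint URL, ontology class, and property names are placeholders, and if IAM database authentication is enabled the request would additionally need SigV4 signing.

```python
import requests

# Hypothetical Neptune cluster endpoint; replace with your own.
NEPTUNE_SPARQL = "https://my-cluster.cluster-abc123.us-east-1.neptune.amazonaws.com:8182/sparql"

# A simple SPARQL query of the kind that could drive a model-driven component,
# for example listing resources of a given type with their labels.
query = """
PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
SELECT ?s ?label WHERE {
  ?s a <http://example.org/ontology/Product> ;
     rdfs:label ?label .
} LIMIT 20
"""

response = requests.post(
    NEPTUNE_SPARQL,
    data={"query": query},
    headers={"Accept": "application/sparql-results+json"},
    timeout=30,
)
response.raise_for_status()

for binding in response.json()["results"]["bindings"]:
    print(binding["s"]["value"], binding["label"]["value"])
```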
Workaround for T-SQL global temporary tables in Babelfish for Aurora PostgreSQL
In this post, we show how you can implement T-SQL global temporary table behavior in Babelfish for Aurora PostgreSQL using permanent tables.
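As a minimal sketch of one shape the workaround could take (the schema, table, and connection details are illustrative, and the post may structure it differently), a permanent table with a session identifier column can stand in for a ##global temporary table, with explicit cleanup replacing the automatic drop:

```python
import pymssql  # Babelfish accepts TDS connections, so SQL Server drivers work

# Connection details are placeholders; Babelfish listens on the TDS port (1433 by default).
conn = pymssql.connect(server="babelfish.cluster-abc123.us-east-1.rds.amazonaws.com",
                       user="babelfish_user", password="...", database="appdb", port=1433)
cur = conn.cursor()

# Instead of a T-SQL global temporary table (##stage), use a permanent table in a
# dedicated schema, keyed by session so concurrent sessions can share or partition rows.
cur.execute("""
IF OBJECT_ID('staging.global_stage') IS NULL
    CREATE TABLE staging.global_stage (
        session_id INT NOT NULL,
        item_id    INT NOT NULL,
        payload    NVARCHAR(200) NULL
    );
""")
cur.execute("INSERT INTO staging.global_stage (session_id, item_id, payload) "
            "VALUES (@@SPID, 1, 'example')")
conn.commit()

# Explicit cleanup replaces the automatic drop a ## table would get.
cur.execute("DELETE FROM staging.global_stage WHERE session_id = @@SPID")
conn.commit()
conn.close()
```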
Configure SSL encryption on an SAP ASE source endpoint in AWS DMS
In this post, we walk you through how to configure Secure Sockets Layer (SSL) encryption between the source endpoint in AWS DMS and an on-premises SAP ASE source for secure data transfer. We also show you the steps for enabling SSL on an on-premises SAP ASE database. Configuring SSL encryption on source endpoints encrypts data in transit during the database migration process for enhanced security.
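For orientation, here is a hedged Python (boto3) sketch of the API side of this setup: importing the SAP ASE server's CA certificate into AWS DMS and creating a source endpoint that references it. The identifiers, file name, and connection details are placeholders, and the SslMode shown is an assumption; use the mode the post recommends for SAP ASE.

```python
import boto3

dms = boto3.client("dms", region_name="us-east-1")

# Import the CA certificate used by the on-premises SAP ASE server (placeholder file).
with open("sap-ase-ca.pem", "rb") as f:
    cert = dms.import_certificate(
        CertificateIdentifier="sap-ase-ca-cert",
        CertificatePem=f.read().decode("utf-8"),
    )

# Create the SAP ASE source endpoint with SSL enabled, referencing the imported certificate.
endpoint = dms.create_endpoint(
    EndpointIdentifier="sap-ase-source",
    EndpointType="source",
    EngineName="sybase",
    ServerName="ase.example.internal",
    Port=5000,
    DatabaseName="salesdb",
    Username="dms_user",
    Password="...",
    SslMode="verify-ca",  # assumption; confirm the supported mode for SAP ASE
    CertificateArn=cert["Certificate"]["CertificateArn"],
)
print(endpoint["Endpoint"]["EndpointArn"])
```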
Amazon DynamoDB use cases for media and entertainment customers
In this post, we discuss how Amazon DynamoDB helps media and entertainment customers overcome common challenges in streaming and media supply chain workloads. We also share customer examples, such as Disney, Warner Bros. Discovery, and ViacomCBS, and other media applications built with DynamoDB.
Adding real-time ML predictions for your Amazon Aurora database: Part 2
In this post, we discuss how to implement Aurora ML performance optimizations to perform real-time inference against a SageMaker endpoint at a large scale. More specifically, we simulate an OLTP workload against the database, where multiple clients are making simultaneous calls against the database and are putting the SageMaker endpoint under stress to respond to thousands of requests in a short time window. Moreover, we show how to use SQL triggers to create an automatic orchestration pipeline for your predictive workload without using additional services.
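To make the load pattern concrete, here is a small Python sketch of the kind of simulation described above: many concurrent clients issuing predictions through the database, which in turn drives the SageMaker endpoint behind Aurora ML. The connection details and the predict_churn SQL function are hypothetical stand-ins for whatever Aurora ML function the post defines.

```python
import concurrent.futures
import pymysql  # assuming an Aurora MySQL-compatible cluster; use psycopg2 for PostgreSQL

DB = dict(host="aurora-ml.cluster-abc123.us-east-1.rds.amazonaws.com",
          user="app", password="...", database="retail")

def one_client(n_calls: int) -> int:
    """One simulated OLTP client issuing predictions through the database."""
    conn = pymysql.connect(**DB)
    with conn.cursor() as cur:
        for _ in range(n_calls):
            # predict_churn is a hypothetical Aurora ML function mapped to a SageMaker endpoint.
            cur.execute("SELECT predict_churn(customer_id) FROM customers ORDER BY RAND() LIMIT 1")
            cur.fetchall()
    conn.close()
    return n_calls

# Many clients in parallel put the SageMaker endpoint behind Aurora ML under load.
with concurrent.futures.ThreadPoolExecutor(max_workers=32) as pool:
    total = sum(pool.map(one_client, [100] * 32))
print(f"issued {total} prediction calls")
```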
Automate cross-account backup of Amazon RDS for Oracle including database parameter groups, option groups and security groups
In this post, we showcase the AWS CloudFormation support feature of AWS Backup to automate the backup of Amazon RDS for Oracle across AWS accounts, including customized database resources such as database parameter groups, option groups, and security groups.
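The post drives this through backup plans defined in CloudFormation; as a hedged Python (boto3) illustration of the underlying operations, the sketch below takes an on-demand backup of an RDS for Oracle instance and copies the recovery point to a vault in another account. All ARNs, vault names, and the IAM role are placeholders.

```python
import boto3

backup = boto3.client("backup", region_name="us-east-1")

# On-demand backup of the RDS for Oracle instance (ARNs and role are placeholders).
job = backup.start_backup_job(
    BackupVaultName="source-vault",
    ResourceArn="arn:aws:rds:us-east-1:111111111111:db:oracle-prod",
    IamRoleArn="arn:aws:iam::111111111111:role/aws-backup-service-role",
)

# In practice you would wait for the backup job to complete before copying;
# a backup plan with a copy action handles this automatically.
copy = backup.start_copy_job(
    RecoveryPointArn=job["RecoveryPointArn"],
    SourceBackupVaultName="source-vault",
    DestinationBackupVaultArn="arn:aws:backup:us-east-1:222222222222:backup-vault:central-vault",
    IamRoleArn="arn:aws:iam::111111111111:role/aws-backup-service-role",
)
print(copy["CopyJobId"])
```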
How PayU uses Amazon Keyspaces (for Apache Cassandra) as a feature store
PayU provides payment gateway solutions to online businesses through its award-winning technology and has empowered over 500,000 businesses, including leading enterprises, e-commerce giants, and SMBs, to process millions of transactions daily. In this post, we outline how at PayU, we use Amazon Keyspaces (for Apache Cassandra) as the feature store for real-time, low-latency inference in the payment flow.
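As a minimal sketch of what reading features from Amazon Keyspaces at inference time could look like (not PayU's actual code), the following Python example connects with the Cassandra driver using SigV4 authentication and fetches one feature row. The keyspace, table, region, and CA bundle path are assumptions.

```python
import boto3
from ssl import SSLContext, PROTOCOL_TLSv1_2, CERT_REQUIRED
from cassandra.cluster import Cluster
from cassandra_sigv4.auth import SigV4AuthProvider

# TLS is required by Amazon Keyspaces; the Starfield CA bundle path is a placeholder.
ssl_context = SSLContext(PROTOCOL_TLSv1_2)
ssl_context.load_verify_locations("sf-class2-root.crt")
ssl_context.verify_mode = CERT_REQUIRED

auth = SigV4AuthProvider(boto3.Session(region_name="ap-south-1"))
cluster = Cluster(
    ["cassandra.ap-south-1.amazonaws.com"],
    ssl_context=ssl_context,
    auth_provider=auth,
    port=9142,
)
session = cluster.connect()

# Hypothetical feature-store table: one row of precomputed features per merchant.
row = session.execute(
    "SELECT features FROM feature_store.merchant_features WHERE merchant_id = %s",
    ("m-1001",),
).one()
print(row.features if row else "no features yet")
```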
How Scopely scaled “MONOPOLY GO!” for millions of players around the globe with Amazon DynamoDB
In this post, we show you how Amazon DynamoDB enabled Scopely to quickly respond to their rapid growth with consistent game performance and availability. We also describe how Scopely improved the availability and performance of their matchmaking service with DynamoDB after facing challenges at scale with other solutions.
Unlock the power of parallel indexing in Amazon DocumentDB
Parallel indexing in Amazon DocumentDB (with MongoDB compatibility) significantly reduces the time to create indexes. In this post, we show you how parallel indexing works, its benefits, and best practices for implementation.
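For a rough sense of how a parallel index build is requested, the Python (pymongo) sketch below issues a createIndexes command with multiple workers; the connection string, collection, and especially the "workers" option name are assumptions here, so confirm the exact syntax in the post.

```python
from pymongo import MongoClient

# Connection string is a placeholder; Amazon DocumentDB requires TLS.
client = MongoClient(
    "mongodb://user:pass@docdb.cluster-abc123.us-east-1.docdb.amazonaws.com:27017/"
    "?tls=true&tlsCAFile=global-bundle.pem&retryWrites=false"
)
db = client["catalog"]

# Build the index with multiple parallel workers via the createIndexes command.
# The "workers" option name is an assumption; check the post for the exact parameter.
result = db.command({
    "createIndexes": "products",
    "indexes": [{"key": {"sku": 1}, "name": "sku_idx"}],
    "workers": 4,
})
print(result)
```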
Privileged database user activity monitoring using Database Activity Streams (DAS) and Amazon OpenSearch Service
In this post, we demonstrate how to create a centralized monitoring solution using Database Activity Streams and Amazon OpenSearch Service to meet audit requirements. The solution enables the security team to gather audit data from several Kinesis data streams; enrich, process, and store it with the retention needed to meet compliance requirements; and produce relevant alarms and dashboards.
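As a minimal Python sketch of the first step in such a pipeline, the code below polls one of the activity-stream Kinesis data streams before the records would be enriched and indexed into OpenSearch Service. The stream name is a placeholder, and the databaseActivityEvents payloads are encrypted with the stream's KMS key, so the decryption step (done with the AWS Encryption SDK) is omitted here.

```python
import json
import boto3

kinesis = boto3.client("kinesis", region_name="us-east-1")
STREAM = "aws-rds-das-cluster-ABC123EXAMPLE"  # placeholder activity stream name

shard_id = kinesis.describe_stream(StreamName=STREAM)["StreamDescription"]["Shards"][0]["ShardId"]
iterator = kinesis.get_shard_iterator(
    StreamName=STREAM, ShardId=shard_id, ShardIteratorType="LATEST"
)["ShardIterator"]

# Simple polling loop for illustration; a production consumer would use KCL or Lambda.
while iterator:
    batch = kinesis.get_records(ShardIterator=iterator, Limit=100)
    for record in batch["Records"]:
        envelope = json.loads(record["Data"])
        # The databaseActivityEvents payload is encrypted; decryption is omitted in this sketch.
        print(envelope["type"], len(envelope["databaseActivityEvents"]))
    iterator = batch["NextShardIterator"]
```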