AWS Partner Network (APN) Blog
Category: Analytics
How WSP Digital Improves Geospatial Pipeline Efficiency with AWS Step Functions and AWS Batch
Geospatial ETL pipelines prepare data for business analysis and insights, enabling leaders to make informed decisions. Learn how to migrate a geospatial pipeline to AWS Step Functions and AWS Batch to simplify pipeline management while improving performance and reducing costs. Migrating to AWS Batch and AWS Step Functions has transformed the way WSP Digital handles data processing and orchestration, enabling streamlined and automated workflows.
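As a flavor of the orchestration pattern the post describes, here is a minimal, hypothetical sketch of a Step Functions state machine that submits a containerized processing step to AWS Batch and waits for it to complete. The job queue, job definition, command, and role ARN are placeholders, not WSP Digital's actual pipeline.

```python
# Hypothetical sketch: orchestrating one AWS Batch step from Step Functions.
# All names (GeoEtlQueue, geo-etl-job, role ARN) are placeholders.
import json
import boto3

# Amazon States Language definition with a single AWS Batch task.
# The .sync integration pattern makes Step Functions wait for the job to finish.
definition = {
    "Comment": "Minimal geospatial ETL orchestration sketch",
    "StartAt": "RunGeoEtlJob",
    "States": {
        "RunGeoEtlJob": {
            "Type": "Task",
            "Resource": "arn:aws:states:::batch:submitJob.sync",
            "Parameters": {
                "JobName": "geo-etl-tile-processing",
                "JobQueue": "GeoEtlQueue",        # placeholder Batch job queue
                "JobDefinition": "geo-etl-job",   # placeholder Batch job definition
                "ContainerOverrides": {
                    "Command": ["python", "process_tiles.py", "--region", "apac"]
                }
            },
            "End": True
        }
    }
}

sfn = boto3.client("stepfunctions")
response = sfn.create_state_machine(
    name="geo-etl-pipeline-sketch",
    definition=json.dumps(definition),
    roleArn="arn:aws:iam::123456789012:role/StepFunctionsBatchRole"  # placeholder
)
print(response["stateMachineArn"])
```

The `.sync` service integration keeps the job status inside the state machine, so retries and failure handling can be expressed declaratively rather than in custom polling code.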
Creating the Right Patient Outcomes with Amazon HealthLake and Accenture Health Analytics
The ability to accurately share and analyze patient information between different healthcare providers and systems is critical to the transition to patient-centric care. Learn how AWS and Accenture collaborated to build a population-scale research cohort analytics solution called Accenture Health Analytics (AHA), built with a range of AWS services and containing 54 million longitudinal patient records. It helps healthcare organizations improve patient outcomes and reduce delivery costs.
How to Unlock Real-Time Data Streams with CockroachDB and Amazon MSK
Managing an Apache Kafka deployment can be complex and resource-intensive, often requiring additional support. Integrating Amazon Managed Streaming for Apache Kafka (Amazon MSK) and CockroachDB in your Kafka deployment enables a wide range of use cases, including real-time analytics, event-driven microservices such as inventory management, and data archival for audit logging. This post offers a step-by-step guide to integrate Amazon MSK within the CockroachDB platform.
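To illustrate the kind of integration the guide walks through, below is a hedged sketch that creates a CockroachDB changefeed publishing row changes to an MSK topic over the Kafka protocol. The connection settings, table name, and broker endpoint are placeholders; consult the post for the exact configuration.

```python
# Hypothetical sketch: streaming changes from a CockroachDB table into Amazon MSK
# via a CockroachDB changefeed. Host, credentials, broker, and the "orders" table
# are placeholders, not the configuration from the post.
import psycopg2

conn = psycopg2.connect(
    host="my-cockroach-host",   # placeholder CockroachDB node
    port=26257,
    user="app_user",
    dbname="defaultdb",
    sslmode="require",
)
conn.autocommit = True

# The Kafka sink URI points at the MSK bootstrap broker; TLS options depend on
# how the MSK cluster is configured.
kafka_sink = "kafka://b-1.mymsk.example.amazonaws.com:9094?tls_enabled=true"

with conn.cursor() as cur:
    cur.execute(
        "CREATE CHANGEFEED FOR TABLE orders INTO %s WITH updated, resolved",
        (kafka_sink,),
    )
    print(cur.fetchall())  # returns the changefeed job ID
```

Once the changefeed job is running, downstream consumers of the MSK topic (for example, analytics or inventory microservices) receive each row change as a Kafka message.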
Enriching Snowflake Data with Amazon Location Service and AWS Lambda
The integration of geospatial data into the broader business intelligence and decision-making process is referred to as location intelligence. On AWS, you can use the Snowflake Data Cloud to integrate fragmented data, discover and securely share data, and execute diverse analytic workloads. This post shows how you can enrich your existing Snowflake data with location-based insights using Amazon Location Service for location intelligence workloads.
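As a rough illustration of the enrichment pattern (not the post's exact sample), the sketch below shows a Lambda handler that could sit behind a Snowflake external function and geocode an address column with Amazon Location Service. The place index name is a placeholder, and the request/response handling assumes Snowflake's external function format (a "data" array of rows, row number first).

```python
# Hypothetical sketch of a Lambda handler backing a Snowflake external function.
# "MyPlaceIndex" is a placeholder place index created in Amazon Location Service.
import json
import boto3

location = boto3.client("location")
PLACE_INDEX = "MyPlaceIndex"  # placeholder

def handler(event, context):
    # Snowflake external functions POST a body of the form {"data": [[row_num, value, ...], ...]}
    rows = json.loads(event["body"])["data"]
    results = []
    for row_number, address in rows:
        resp = location.search_place_index_for_text(
            IndexName=PLACE_INDEX, Text=address, MaxResults=1
        )
        point = None
        if resp["Results"]:
            # Point is returned as [longitude, latitude]
            point = resp["Results"][0]["Place"]["Geometry"]["Point"]
        results.append([row_number, point])
    # Snowflake expects the same row numbers back with one result per row.
    return {"statusCode": 200, "body": json.dumps({"data": results})}
```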
Filter and Stream Logs from Amazon S3 Logging Buckets into Splunk Using AWS Lambda
This post showcases a way to filter and stream logs from centralized Amazon S3 logging buckets to Splunk using a push mechanism leveraging AWS Lambda. The push mechanism offers benefits such as lower operational overhead, lower costs, and automated scaling. We’ll provide instructions and a sample Lambda code that filters virtual private cloud (VPC) flow logs with the “action” flag set to “REJECT” and pushes them to Splunk via a Splunk HTTP Event Collector (HEC) endpoint.
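The following is a hedged approximation of such a Lambda function, not the post's sample code: it reads a gzip-compressed flow log object from S3, keeps only records whose action field is REJECT, and posts them to a placeholder HEC endpoint.

```python
# Hypothetical sketch: filter VPC flow logs from an S3 logging bucket and push
# REJECT records to Splunk HEC. The HEC URL and token are placeholders and would
# normally come from environment variables or AWS Secrets Manager.
import gzip
import json
import os
import urllib.request

import boto3

s3 = boto3.client("s3")
HEC_URL = os.environ.get("SPLUNK_HEC_URL", "https://splunk.example.com:8088/services/collector/event")
HEC_TOKEN = os.environ.get("SPLUNK_HEC_TOKEN", "placeholder-token")

def handler(event, context):
    pushed = 0
    for record in event["Records"]:  # S3 event notification from the logging bucket
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        body = s3.get_object(Bucket=bucket, Key=key)["Body"].read()
        lines = gzip.decompress(body).decode("utf-8").splitlines()
        fields = lines[0].split()            # first line of a flow log file is the header
        action_idx = fields.index("action")
        for line in lines[1:]:
            columns = line.split()
            if len(columns) > action_idx and columns[action_idx] == "REJECT":
                payload = json.dumps({"event": line, "sourcetype": "aws:vpcflowlog"}).encode("utf-8")
                request = urllib.request.Request(
                    HEC_URL,
                    data=payload,
                    headers={
                        "Authorization": f"Splunk {HEC_TOKEN}",
                        "Content-Type": "application/json",
                    },
                )
                urllib.request.urlopen(request)  # batching and error handling omitted for brevity
                pushed += 1
    return {"rejected_records_pushed": pushed}
```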
Cognizant’s Patient Health Insights Suite is a Scalable Platform for Improving Healthcare
Traditional preventive measures mainly focus on promoting healthcare benefits and lack methods to process large volumes of data. Cognizant’s Patient Health Insights Suite is a cloud-based, multi-user analytics and insights platform for clinical and real-world evidence data. It provides a suite of interactive self-service applications for comprehensive visual, exploratory, and predictive/prescriptive analyses of patient care and health insights by means of advanced AI algorithms.
Strategies, Patterns, and Security Measures for Integrating Infor CloudSuite with AWS
Infor OS provides deep integration capabilities and includes Intelligent Open Network (ION), which is an interoperability and business process management platform designed to integrate applications, processes, people, and data to run your business. Infor ION enables you to easily integrate your Infor and non-Infor enterprise systems, whether they’re on-premises, in the cloud, or both. In this post, we discuss general scenarios and integration patterns while using ION.
Building a Data Lakehouse with Amazon S3 and Dremio on Apache Iceberg Tables
Learn how to implement a data lakehouse using Amazon S3 and Dremio on Apache Iceberg, which enables data teams to keep up with data and analytics changes quickly, easily, and safely, helping businesses process those changes end-to-end with fast turnaround times. Dremio is an AWS Partner whose data lake engine delivers fast query speed and a self-service semantic layer operating directly against S3 data.
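Because Iceberg is an open table format, the tables Dremio manages on S3 can also be read by other engines. The sketch below uses pyiceberg with an AWS Glue catalog purely as an illustration of that openness; it is not Dremio's API, and the catalog and table names are placeholders.

```python
# Hypothetical sketch: reading an Iceberg table stored on S3 via pyiceberg with
# an AWS Glue catalog. Catalog and table names are placeholders; Dremio users
# would typically query the same tables through Dremio's SQL engine instead.
from pyiceberg.catalog import load_catalog

# Assumes AWS credentials/region are available in the environment and that the
# Glue Data Catalog serves as the Iceberg catalog.
catalog = load_catalog("default", **{"type": "glue"})

table = catalog.load_table("analytics.orders")  # placeholder database.table

# Scan only the rows and columns we need; Iceberg metadata lets the reader prune files.
arrow_table = table.scan(
    row_filter="order_status == 'SHIPPED'",
    selected_fields=("order_id", "order_total"),
).to_arrow()

print(arrow_table.num_rows)
```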
Implementing a Snowflake-Centric Data Mesh on AWS for Scalability and Autonomy
A data mesh architecture is a relatively new approach to managing data in large organizations, aimed at improving the scalability, agility, and autonomy of data teams. There’s a need for an architecture that removes the complexity and friction of provisioning data and managing its lifecycle. This post outlines an approach to implementing a data mesh with Snowflake as the data platform and AWS services supporting all pillars of the data mesh architecture.
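One hedged way to picture the data-as-a-product pillar in a Snowflake-centric mesh is a domain team publishing a curated table as a Snowflake share for other domains to consume. The account, database, and share names below are placeholders, and the post's actual implementation may differ.

```python
# Hypothetical sketch: a domain team exposes a curated table as a Snowflake
# share (one possible "data product" mechanism). All identifiers are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="myorg-sales_domain",   # placeholder domain account
    user="DATA_PRODUCT_OWNER",
    password="***",                 # use key-pair auth or a secrets manager in practice
    role="SALES_DOMAIN_ADMIN",
)

statements = [
    "CREATE SHARE IF NOT EXISTS SALES_ORDERS_PRODUCT",
    "GRANT USAGE ON DATABASE SALES TO SHARE SALES_ORDERS_PRODUCT",
    "GRANT USAGE ON SCHEMA SALES.PUBLIC TO SHARE SALES_ORDERS_PRODUCT",
    "GRANT SELECT ON TABLE SALES.PUBLIC.ORDERS_CURATED TO SHARE SALES_ORDERS_PRODUCT",
    # Consumer accounts (other domains) are added explicitly by the producing domain.
    "ALTER SHARE SALES_ORDERS_PRODUCT ADD ACCOUNTS = myorg.marketing_domain",
]

cur = conn.cursor()
for stmt in statements:
    cur.execute(stmt)
cur.close()
conn.close()
```

The producing domain keeps ownership and governance of the data product, while consuming domains get read access without data copies, which is the autonomy the data mesh approach aims for.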
Best Practices from Rackspace for Modernizing a Legacy HBase/Solr Architecture Using AWS Services
As technology advances and business requirements change, organizations may find themselves needing to migrate away from legacy data processing systems like HBase, Solr, and HBase Indexer. Explore the advantages of migrating from HBase, Solr, and HBase Indexer to a modern data ecosystem based on AWS, and dive deep into the architecture, design, and pathways for implementation. This post offers insights and guidance from Rackspace for those looking to embark on this intricate migration journey.