AWS Public Sector Blog

Category: Technical How-to

How to implement CNAP for federal and defense customers in AWS

In July 2021, the U.S. Department of Defense (DoD) released a cloud native access point (CNAP) reference design that follows zero trust architecture (ZTA) principles and provides a new approach to accessing mission owner (MO) applications. The DoD’s reference design discusses four core capabilities of CNAP: authenticated and authorized entities (C1), authorized ingress (C2), authorized egress (C3), and security monitoring and compliance enforcement (C4). In this blog post, we walk through how to establish the C2 component via a virtual internet access point (vIAP) with AWS. The proposed architectures can reduce operational cost and management overhead, while improving the accessibility, resiliency, and security of mission owner applications.

How to partition your geospatial data lake for analysis with Amazon Redshift

Data lakes are becoming increasingly common across many different workloads, and geospatial is no exception. In 2021, Amazon Web Services (AWS) announced geography and geohash support in Amazon Redshift, so geospatial analysts can quickly and efficiently query geohashed vector data in Amazon Simple Storage Service (Amazon S3). In this blog post, I walk through how to use geohashing with Amazon Redshift partitioning for quick and efficient geospatial data access, analysis, and transformation in your data lake.
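The geohash-as-partition-key idea can be sketched with a minimal pure-Python encoder. This is illustrative only: a real pipeline would use a tested library or Redshift's built-in geospatial functions, and the bucket layout in the comment is hypothetical.

```python
# Minimal geohash encoder (illustrative sketch; production pipelines would
# use a tested library or Redshift's built-in geospatial functions instead).
BASE32 = "0123456789bcdefghjkmnpqrstuvwxyz"  # standard geohash alphabet

def geohash(lat, lon, precision=6):
    """Interleave longitude/latitude bits and emit base-32 characters."""
    lat_rng = [-90.0, 90.0]
    lon_rng = [-180.0, 180.0]
    chars, even, bit_count, ch = [], True, 0, 0
    while len(chars) < precision:
        # Geohash alternates bits: longitude first, then latitude.
        rng, val = (lon_rng, lon) if even else (lat_rng, lat)
        mid = (rng[0] + rng[1]) / 2
        if val > mid:
            ch = (ch << 1) | 1
            rng[0] = mid
        else:
            ch <<= 1
            rng[1] = mid
        even = not even
        bit_count += 1
        if bit_count == 5:  # every 5 bits become one base-32 character
            chars.append(BASE32[ch])
            bit_count, ch = 0, 0
    return "".join(chars)

gh = geohash(57.64911, 10.40744)  # "u4pruy"
# A short prefix groups nearby points, which is what makes it useful as a
# partition key, e.g. s3://my-data-lake/geohash=u4p/...  (hypothetical layout)
prefix = gh[:3]
```

Because nearby coordinates share a common geohash prefix, partitioning on a prefix lets queries bounded to a geographic area prune most partitions instead of scanning the whole data lake.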

Architecture framework for transforming federal customer experience and service delivery

Customer experience (CX) has emerged as a key priority in the US following the 2021 Biden Administration Executive Order (EO) to transform federal customer experience and service delivery. Application modernization enables agencies to simplify business processes and provide customers with flexible, interactive, and simple-to-use applications, resulting in improved CX. In this blog post, we present an AWS architecture framework that agencies can use to develop and deploy a modern application that helps improve CX.

How KHUH built a long-term storage solution for medical image data with AWS

King Hamad University Hospital (KHUH) and Bahrain Oncology Center is a 600-bed hospital in Bahrain. Over the years, KHUH faced constraints with the exponential growth of their on-premises storage needs, particularly for the medical images stored by their picture archiving and communication system (PACS). KHUH turned to AWS to develop a cost- and time-effective long-term storage solution, without making changes to their existing PACS, that reduced storage costs by 40%.

Getting started with healthcare data lakes: Using microservices

Data lakes can help hospitals and healthcare organizations turn data into insights and maintain business continuity, while preserving patient privacy. This blog post is part of a larger series about getting started with setting up a healthcare data lake. In this blog post, I detail how the solution has evolved at a foundational level over the series to include microservices. I describe the design decisions I’ve made and the additional features used. You can access code samples for this solution through a GitHub repo for reference.

How public sector agencies can identify improper payments with machine learning

To mitigate synthetic fraud, government agencies should consider complementing their rules-based improper payment detection systems with machine learning (ML) techniques. By using ML on a large number of disparate but related data sources, including social media, agencies can formulate a more comprehensive risk score for each individual or transaction to help investigators identify improper payments efficiently. In this blog post, we provide a foundational reference architecture for an ML-powered improper payment detection solution using AWS ML services.
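The risk-scoring approach described above can be illustrated with a minimal sketch. The feature names and weights below are hypothetical; a production solution would train models with AWS ML services rather than hand-pick weights, but the shape of the output is the same: one comparable risk score per transaction.

```python
import math

def risk_score(features, weights, bias=0.0):
    # Logistic combination of weighted signals into a 0-1 risk score --
    # the same kind of output a trained binary classifier would produce.
    z = bias + sum(weights[name] * value for name, value in features.items())
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical signals drawn from disparate but related sources (claims
# data, identity records, links such as addresses shared across applications).
claim = {"amount_zscore": 2.1, "new_payee": 1.0, "shared_address_count": 3.0}
weights = {"amount_zscore": 0.8, "new_payee": 1.2, "shared_address_count": 0.5}

score = risk_score(claim, weights, bias=-3.0)  # higher score = higher risk
```

Investigators could then triage by sorting transactions on this score, rather than reviewing every rule hit, which is how an ML score complements an existing rules-based system.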

Enhance operational agility and decision advantage with AWS Snowball Edge

In a data-dependent world, success belongs to the side with decision advantage: the ability to acquire data, make sense of a complex and adaptive environment, and act smarter and faster than the competition. Understanding global environments requires more than just more data: it requires live two- and three-dimensional maps, new support tools, improved processes, seamless connectivity, and better collaboration that can scale to the needs of the environment. This blog post explores how deploying AWS Snowball Edge devices at the edge can help address big data challenges and accelerate time to data insights with machine learning.

Move data in and out of AWS GovCloud (US) with Amazon S3

Increasingly, AWS customers are operating workloads both in AWS GovCloud (US) and standard AWS Regions. Dependencies between workloads, changing data controls, or enrichment of data across multiple data levels are examples of business needs that may require moving data in and out of AWS GovCloud (US). In this blog post, I explain how to move data between Amazon Simple Storage Service (Amazon S3) buckets in the AWS GovCloud (US) and standard partitions.
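At a high level, a cross-partition move can be sketched as a streamed copy between two independently authenticated S3 clients. This is my sketch of the general pattern, not the post's exact steps, and the bucket names are hypothetical. Because AWS GovCloud (US) is its own partition (`aws-us-gov`), a single credential set cannot address buckets on both sides, so a direct `CopyObject` call cannot span the two partitions:

```python
def partition_of(arn):
    """Return the partition segment of an ARN ('aws' or 'aws-us-gov')."""
    # ARN format: arn:<partition>:<service>:<region>:<account>:<resource>
    return arn.split(":")[1]

def stream_copy(std_s3, gov_s3, src_bucket, dst_bucket, key):
    # std_s3 and gov_s3 are boto3 S3 clients built from separate sessions,
    # each holding credentials valid only within its own partition.
    body = std_s3.get_object(Bucket=src_bucket, Key=key)["Body"]
    # The response body is file-like, so the object streams through the
    # caller without being staged on local disk first.
    gov_s3.upload_fileobj(body, dst_bucket, key)
```

With boto3 installed, the two clients would come from something like `boto3.Session(profile_name="standard").client("s3")` and `boto3.Session(profile_name="govcloud", region_name="us-gov-west-1").client("s3")` (profile names assumed here).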

Move data in and out of AWS GovCloud (US) with AWS DataSync

As public sector customers increasingly need to move data between the AWS GovCloud (US) partition and the standard partition, they need tools that lower their operational burden. In this blog post, I walk through how to use AWS DataSync to move data on network file system (NFS) shares between the two partitions.

Virtualizing the satellite ground segment with AWS

As the number of spacecraft and spacecraft missions grows rapidly, moving aerospace and satellite operations to the cloud via digital transformation, including virtualizing the ground segment, is key to economic viability. In this blog post, we explain the benefits of virtualizing the ground segment in the cloud and present the core components of a reference architecture that uses AWS to support several stages of a comprehensive ground segment virtualization. Then, working from this model, we present additional reference architectures for virtualizing the ground segment that can accommodate various requirements and usage scenarios.