AWS Public Sector Blog

Tag: AWS Step Functions

Unlock third-party data with API-driven data pipelines on AWS

Public sector organizations often use third-party Software-as-a-Service (SaaS) to manage various business functions, such as marketing and communications, payment processing, workflow automation, donor management, and more. This common SaaS landscape can lead to data silos, where data becomes isolated in disparate systems and difficult to centralize for business insights. When prebuilt SaaS connectors are not available, public sector organizations can use AWS to build an API-driven data pipeline that consolidates data from SaaS platforms offering open APIs. In this post, learn how to build an API-driven data pipeline on AWS.
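
As a concrete illustration, here is a minimal sketch of one extract task such a pipeline might run: an AWS Lambda handler, invokable as an AWS Step Functions task, that pages through a SaaS platform’s open API and lands the raw JSON in Amazon S3. The endpoint, pagination scheme, environment variables, and bucket layout are hypothetical placeholders, not a prescribed design.

```python
# Sketch of one extract task in an API-driven pipeline, written as an AWS
# Lambda handler that AWS Step Functions could invoke. The SaaS endpoint,
# pagination scheme, and bucket layout below are hypothetical placeholders.
import json
import os
from datetime import datetime, timezone

import boto3
import urllib3

http = urllib3.PoolManager()
s3 = boto3.client("s3")

def handler(event, context):
    # Page through the (hypothetical) SaaS open API and land raw JSON in S3.
    base_url = os.environ["SAAS_API_URL"]    # e.g. https://api.example-saas.com/v1/donations
    token = os.environ["SAAS_API_TOKEN"]
    bucket = os.environ["RAW_BUCKET"]

    page, keys = 1, []
    while True:
        resp = http.request(
            "GET",
            f"{base_url}?page={page}",
            headers={"Authorization": f"Bearer {token}"},
        )
        records = json.loads(resp.data)["results"]
        if not records:
            break
        key = f"raw/donations/dt={datetime.now(timezone.utc):%Y-%m-%d}/page-{page}.json"
        s3.put_object(Bucket=bucket, Key=key, Body=json.dumps(records))
        keys.append(key)
        page += 1

    # Step Functions passes this output to the next state (e.g., a transform job).
    return {"objects_written": keys}
```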

Orbital Sidekick uses AWS to monitor energy pipelines and reduce risks and emissions

Orbital Sidekick (OSK) uses advanced satellite technology and data analytics to help the energy industry protect pipelines and make them less vulnerable to risks such as leaks, contamination, and damage from construction and natural disasters. OSK uses AWS compute and analytics services to power its scalable data pipeline and imagery storage solution, persistently monitoring tens of thousands of miles of energy pipeline infrastructure and delivering real-time, actionable insights to customers.

Modernizing public sector applications using serverless and containers

Application modernization helps public sector customers innovate faster with resilient, highly available, and scalable applications. Serverless and container services help customers accelerate time to market and migrate existing applications to AWS, and they can decrease an organization’s total cost of ownership (TCO). In this blog post, learn how public sector customers use AWS serverless and container technology to modernize their applications.

How government agencies can vet external data in minutes with data interchange zones

Learn how government agencies can use AWS to build data interchange zones that securely automate the ingestion and validation of data from other agencies or external entities. Automating this process saves agencies time so they can focus on more strategic aspects of their mission.
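
The validation step at the heart of a data interchange zone can be as simple as an S3-triggered check, as in the minimal sketch below. The bucket names and expected schema are hypothetical assumptions; a production zone would add controls such as malware scanning, schema registries, and audit logging.

```python
# Sketch of the validation step in a data interchange zone, assuming an
# S3-triggered Lambda. Bucket names and the expected schema are hypothetical.
import csv
import io

import boto3

s3 = boto3.client("s3")

LANDING_BUCKET = "agency-landing-zone"    # untrusted external drop-off (assumed name)
TRUSTED_BUCKET = "agency-trusted-zone"    # validated data, ready for ingestion
QUARANTINE_BUCKET = "agency-quarantine"   # failed files, held for review
EXPECTED_COLUMNS = ["case_id", "agency", "submitted_at"]  # assumed data contract

def handler(event, context):
    for record in event["Records"]:
        key = record["s3"]["object"]["key"]
        body = s3.get_object(Bucket=LANDING_BUCKET, Key=key)["Body"].read()

        # Accept the file only if its header matches the agreed schema.
        reader = csv.reader(io.StringIO(body.decode("utf-8")))
        header = next(reader, [])
        dest = TRUSTED_BUCKET if header == EXPECTED_COLUMNS else QUARANTINE_BUCKET

        s3.copy_object(
            Bucket=dest,
            Key=key,
            CopySource={"Bucket": LANDING_BUCKET, "Key": key},
        )
        s3.delete_object(Bucket=LANDING_BUCKET, Key=key)
```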

Nara Space uses AWS to improve satellite image quality up to three times with deep learning

Nara Space Technology is a South Korea-based startup that builds nano satellite constellations and provides satellite data services, helping customers quickly identify and address issues such as changing climate conditions and disaster recovery to improve life on Earth. Nara Space provides solutions for nano satellite and small spacecraft system design, integration, development, and testing; enables satellite data analytics based on deep learning; and improves the visual quality of standard satellite imagery with its Super Resolution core technology. To do this, Nara Space uses AWS for secure, flexible, scalable, and cost-efficient cloud solutions.

How Skillshare increased their click-through rate by 63% with Amazon Personalize

Skillshare is the largest global online learning community for creativity, offering thousands of inspiring classes for creative and curious people on topics including illustration, design, photography, video, freelancing, and more. Skillshare wanted members to discover relevant content easily through seamless, personalized recommendations. To make these data-fueled recommendations with machine learning, Skillshare decided to test Amazon Personalize from AWS. This blog post describes Skillshare’s Amazon Personalize solution architecture, its AWS Step Functions process, and the results of the experiment.
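
For context, serving results from a trained Amazon Personalize campaign is a single runtime API call, as in the minimal sketch below; the campaign ARN and user ID are hypothetical placeholders.

```python
# Sketch of requesting recommendations from an Amazon Personalize campaign,
# the kind of call a recommendation service might make per member. The
# campaign ARN and user ID are hypothetical placeholders.
import boto3

personalize_runtime = boto3.client("personalize-runtime")

def recommend_classes(user_id: str, num_results: int = 10) -> list[str]:
    # GetRecommendations returns item IDs ranked by predicted relevance.
    response = personalize_runtime.get_recommendations(
        campaignArn="arn:aws:personalize:us-east-1:123456789012:campaign/class-recs",
        userId=user_id,
        numResults=num_results,
    )
    return [item["itemId"] for item in response["itemList"]]

print(recommend_classes("member-42"))
```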

Enhancing content recommendations for educators at Discovery Education with Amazon Personalize

Discovery Education (DE) provides standards-aligned digital curriculum resources, engaging content, and professional learning for K-12 classrooms. Learn how DE worked with AWS to enhance its K-12 learning platform with machine learning (ML) capabilities.

Hope orbiter Mars, photo courtesy of MBRSC, United Arab Emirates

UAE Mars mission uses AWS to advance scientific discoveries

On February 9, 2021, a new object successfully began to orbit Mars: an uncrewed spacecraft called the Hope Probe. The mission has already returned its first image of Mars, taken by Hope’s Emirates eXploration Imager from an altitude of 24,700 km. Led by the Mohammed Bin Rashid Space Centre (MBRSC), the Hope Probe is the first interplanetary mission for the United Arab Emirates, which became the fifth country in history to reach the Red Planet. Hope will also be the first spacecraft to capture a complete picture of the Martian atmosphere and its layers at different times of day and across different seasons over one complete Martian year. Once the data transmitted by the Hope Probe reaches the scientific teams on Earth, MBRSC will use advanced AWS technologies to process and analyze the vast amounts of data and imagery, helping researchers better understand the Martian atmosphere and its layers.

Modern data engineering in higher ed: Doing DataOps atop a data lake on AWS

Modern data engineering covers several key components of building a modern data lake. Most databases and data warehouses do not lend themselves well to a DevOps model, and DataOps grew out of the frustration of trying to build scalable, reusable data pipelines in an automated fashion. DataOps applies DevOps principles on top of data lakes to help build automated solutions in a more agile manner. With DataOps, users process data on the data lake, curating and collecting the transformed data for downstream processing. One reason DevOps has been hard to apply to databases is that testing is difficult to automate on such systems. At the California State University Chancellor’s Office (CSUCO), we took a different approach, keeping most of our logic in a programming framework that lets us build a testable platform. Learn how to apply DataOps in ten steps.
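
To illustrate the testability point, here is a minimal sketch, assuming a hypothetical record shape: transformation logic kept in a plain Python function, independent of any database, so a unit test can exercise it before it ever runs against the data lake.

```python
# Sketch of the testability idea behind DataOps: keep transformation logic in
# plain, framework-agnostic functions so it can be unit tested (e.g., with
# pytest) before deployment. The record shape here is hypothetical.
def normalize_enrollment(record: dict) -> dict:
    """Curate one raw enrollment record for downstream processing."""
    return {
        "student_id": record["student_id"].strip(),
        "campus": record["campus"].strip().upper(),
        "units": float(record.get("units", 0) or 0),
    }

def test_normalize_enrollment():
    # The transformation can be verified without a database or data lake.
    raw = {"student_id": " 0042 ", "campus": "fresno", "units": "3"}
    assert normalize_enrollment(raw) == {
        "student_id": "0042",
        "campus": "FRESNO",
        "units": 3.0,
    }
```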