AWS Public Sector Blog
Tag: analytics
Solving medical mysteries in the AWS Cloud: Medical data-sharing innovation through the Undiagnosed Diseases Network
It takes a medical village to discover and diagnose rare diseases. The National Institutes of Health’s Undiagnosed Diseases Network (UDN) is made up of a coordinating center, 12 clinical sites, a model organism screening center, a metabolomics core, a sequencing core, and a biorepository. For many years prior to the UDN, the experts at these sites were limited by antiquated data-sharing procedures. The UDN leadership realized that if they wanted to scale up and serve as many patients as possible, they needed to transform how they process, store, and share medical data—which led the UDN to the AWS Cloud.
Balanced budgets and enhanced constituent services: ERP beyond infrastructure in state and local governments
State and local governments (SLGs) are constantly looking for ways to improve the lives and well-being of their constituents. They strive for greater efficiency in delivering existing government services and for potential new services, such as those related to assistance with food, health, education, affordable housing, weatherization, and more. Enterprise resource planning (ERP) solutions, along with ancillary business systems that support performance-based budgeting, play a key role in providing data insights. Balancing tight budgets against the delivery of government services depends on fiscal health, sound policies, and data insights. ERP systems deployed on the cloud can give governments the capabilities and technology tools they need to make existing services more efficient and to deploy new ones.
New Performance Dashboard on AWS makes delivering open, responsive government simple
Data is at the heart of showing citizens how public services are working, and it enables the public sector to improve policy and operational delivery. Citizens expect accessible and useful services, and the public sector aims to demonstrate success through data. To build trust in this relationship and promote accountability, public sector organizations need to communicate the data-driven performance of the services they provide. To help address these challenges, AWS is releasing Performance Dashboard on AWS, a new open source solution that helps you measure and share what’s important in one place at minimal cost, and that you can have up and running in a matter of minutes.
NUS Urban Analytics Lab scales research globally with AWS
The Urban Analytics Lab at the National University of Singapore (NUS) spearheads research in geospatial data analysis and 3D city modelling. The lab’s work underpins the development of smart cities and provides scientists, architects, urban planners, and real estate developers with data insights. These insights help parties make informed decisions about projects ranging from energy modelling to urban farming. To meet rising global demand for its data analytics and planning tools, Urban Analytics Lab turned to Amazon Web Services (AWS).
Accelerating nonprofit and education sector impact through data insights with Salesforce and AWS
Nonprofits and education institutions of all sizes rely on large amounts of data to serve their stakeholders, programs, and governance. For many organizations, the first step in a technology transformation is centralizing data that is siloed across a variety of mission-critical systems. In support of these goals, Salesforce.org and Amazon Web Services (AWS) are working together to help nonprofits and education institutions derive actionable insights from their data.
Sharing SAS data with Athena and ODBC
If you share data with other researchers, especially researchers using a different tool, you can quickly run into version issues and lose track of which file is the most current. Rather than sending data files back and forth, you can use AWS to store your data in one central location, read it into SAS, and still share it with colleagues. In this blog post, I explain how to export your data, store it in AWS, and query it using SAS.
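The post walks through this workflow in SAS over ODBC, but the moving parts can be sketched in Python for readers who want to see the general pattern: upload a file to a central Amazon S3 location, then query it with Amazon Athena. The bucket, database, and table names below are hypothetical, and the table is assumed to already be defined in the AWS Glue Data Catalog; in SAS itself you would connect through the Athena ODBC driver rather than boto3.

```python
import time
import boto3

# Hypothetical names for illustration only
BUCKET = "my-research-data-bucket"
DATABASE = "research_db"
OUTPUT = "s3://my-research-data-bucket/athena-results/"

# 1. Upload the exported data file to one central S3 location
s3 = boto3.client("s3")
s3.upload_file("patients.csv", BUCKET, "shared/patients.csv")

# 2. Run a SQL query against the shared table with Athena
#    (assumes a "patients" table is already cataloged over this data)
athena = boto3.client("athena")
query = athena.start_query_execution(
    QueryString="SELECT * FROM patients LIMIT 10",
    QueryExecutionContext={"Database": DATABASE},
    ResultConfiguration={"OutputLocation": OUTPUT},
)
query_id = query["QueryExecutionId"]

# 3. Poll until the query finishes, then print the result rows
while True:
    status = athena.get_query_execution(QueryExecutionId=query_id)
    state = status["QueryExecution"]["Status"]["State"]
    if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
        break
    time.sleep(1)

if state == "SUCCEEDED":
    results = athena.get_query_results(QueryExecutionId=query_id)
    for row in results["ResultSet"]["Rows"]:
        print([col.get("VarCharValue") for col in row["Data"]])
```

Because every collaborator queries the same S3-backed table instead of a copy of the file, the "which version is current?" problem largely disappears.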
Combating illicit activity by tracking flight data via the cloud
Many organizations, including the intelligence community, security organizations, law enforcement, regulatory bodies, news organizations, and non-governmental organizations, work together to disrupt transnational crime networks. Their missions include combating illicit trade; disrupting human, animal, and narcotics trafficking; detecting money laundering; and exposing political corruption. This community needs rapid analysis of large, diverse streams of information about air transportation networks, because air transportation is the fastest way to conduct illicit trade internationally. The nonprofit Center for Advanced Defense Studies (C4ADS) built the Icarus Flights application to meet this need. By building on AWS with managed cloud services, C4ADS spends less time and energy managing infrastructure, freeing the team to focus on the innovative analytics and alerting services its user community needs.
Modern data engineering in higher ed: Doing DataOps atop a data lake on AWS
Modern data engineering covers several key components of building a modern data lake. Most databases and data warehouses do not lend themselves well to a DevOps model, largely because testing is hard to automate on such systems. DataOps grew out of the frustration of trying to build scalable, reusable data pipelines in an automated fashion; it applies DevOps principles on top of data lakes to build automated solutions in a more agile manner. With DataOps, users process data on the data lake to curate and collect the transformed data for downstream processing. At the California State University Chancellor's Office (CSUCO), we took a different approach by placing most of our logic in a programming framework that allows us to build a testable platform, as sketched below. Learn how to apply DataOps in ten steps.
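The argument hinges on testability: when transformation logic lives in code rather than in a stored procedure, it can be exercised by ordinary unit tests on every commit, which is the core of the DataOps feedback loop. The following is a minimal Python sketch of that idea, not CSUCO's actual framework; the function, column names, and sample data are hypothetical.

```python
import pandas as pd

def curate_enrollments(raw: pd.DataFrame) -> pd.DataFrame:
    """Curate raw enrollment records for downstream processing.

    Keeps only active records, normalizes campus codes, and drops
    duplicates -- the kind of transformation that is hard to unit
    test when it is buried inside a database.
    """
    curated = raw[raw["status"] == "active"].copy()
    curated["campus"] = curated["campus"].str.upper().str.strip()
    return curated.drop_duplicates(subset=["student_id", "term"])

# A test that CI can run automatically against the pipeline logic
def test_curate_enrollments_filters_and_normalizes():
    raw = pd.DataFrame(
        {
            "student_id": [1, 1, 2],
            "term": ["2021A", "2021A", "2021A"],
            "campus": [" sac ", " sac ", "la"],
            "status": ["active", "active", "inactive"],
        }
    )
    out = curate_enrollments(raw)
    assert list(out["campus"]) == ["SAC"]
    assert len(out) == 1
```

Because the curation step is a plain function, it can be versioned, reviewed, and tested like any other application code before the data lands in the lake.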
Enabling warfighters and intelligence mission success
In a world where data is produced and handled at unprecedented speeds and quantities, the need for effective methods to securely store, analyze, and interpret this data is more important now than ever. As agencies within the U.S. Department of Defense and Intelligence Community adopt the cloud, they can bring new capabilities closer to the tactical edge and accelerate their digital transformation. Agencies can leverage new technologies such as artificial intelligence (AI), machine learning (ML), and data analytics to free up time and resources so warfighters and analysts can focus on mission-critical tasks.
Addressing emergencies and disruptions to create business continuity
While disruptive events are challenging for any organization, sudden and large-scale incidents such as natural disasters, IT outages, pandemics, and cyberattacks can expose critical gaps in technology, culture, and organizational resiliency. Even smaller, unexpected events such as water damage to a critical facility or electrical outages can negatively impact your organization if there is no long-term resiliency plan in place. These events can have significant consequences for your employees, stakeholders, and mission, and can result in long-term financial losses, lost productivity, loss of life, a deterioration of trust with citizens and customers, and lasting reputational damage.