AWS Public Sector Blog
Tag: storage
Deploying AWS Modular Data Center: From ordering to delivery and installation
The Amazon Web Services (AWS) Modular Data Center (MDC) is a service that enables rapid deployment of AWS managed data centers for running location- or latency-sensitive applications in locations with limited infrastructure. It reduces deployment time in remote areas and supports up to five racks of AWS Outposts or AWS Snow Family devices. In this post, we guide you through the end-to-end process of deploying the MDC at your site.
5 best practices for accelerating research computing with AWS
Amazon Web Services (AWS) works with higher education institutions, research labs, and researchers around the world to offer cost-effective, scalable, and secure compute, storage, and database capabilities to accelerate time to science. In our work with research leaders and stakeholders, users often ask us about best practices for leveraging the cloud for research. In this post, we dive into five common questions we field from research leaders as they build the academic research innovation centers of the future.
St. Louis University uses AWS to make big data accessible for researchers
The research team at SLU’s Sinquefield Center for Applied Economic Research (SCAER) required vast quantities of anonymized cell phone data to study the impacts of large-scale social problems. SCAER needed to store, clean, and process 450 terabytes of data, so it worked with Amazon Web Services (AWS) to create a fast, cost-effective solution for managing its growing quantities of data.
Announcing new AWS data connector for popular nonprofit CRM: Blackbaud Raiser’s Edge NXT
The AWS for Nonprofits team announced a new Amazon AppFlow data connector that enables nonprofits to transfer valuable data from Blackbaud Raiser’s Edge NXT to AWS services and other destinations. In this blog post, learn some common nonprofit use cases that can be addressed by integrating your data with other AWS services and commercially available software-as-a-service (SaaS) applications.
Optimizing operations for ground-based, extremely large telescopes with AWS
Ground-based, extremely large telescopes (ELTs), such as the Giant Magellan Telescope (GMT), will play a crucial role in modern astronomy by providing observations of the universe with remarkable clarity and detail. However, managing the vast amount of data generated by these instruments and supporting optimal performance can be a challenging task. AWS provides a suite of cloud-based solutions that can help address these challenges and streamline ELT operations. Learn how various AWS services can be used to optimize data storage, management, and processing, as well as advanced monitoring and remote continuity techniques, leading to improved overall performance and efficiency for ELTs.
How Digithurst and Telepaxx built a secure and scalable radiology solution chain using AWS
Medical software development companies Digithurst and Telepaxx worked together to create an end-to-end cloud solution chain that handles administration of patient data and radiological scans, viewing and editing of scans, and long-term archiving. To develop a scalable, secure, and cost-effective solution chain supporting further innovations, the companies turned to the AWS Cloud.
Addressing federal record retention in mobile device messaging
Virtually all federal, state, and local government agencies are subject to various data retention and records management policies, regulations, and laws. AWS Wickr provides federal agencies with an innovative solution that can help them build public trust by protecting sensitive communications, while supporting the capture and management of records.
How KHUH built a long-term storage solution for medical image data with AWS
King Hamad University Hospital (KHUH) and Bahrain Oncology Center is a 600-bed hospital in Bahrain. Over the years, KHUH faced constraints with the exponential growth of their on-premises storage needs, particularly for the medical images stored by their picture archiving and communication system (PACS). KHUH turned to AWS to develop a cost- and time-effective long-term storage solution that reduced storage costs by 40 percent, without making changes to their existing PACS.
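Long-term archiving patterns like the one KHUH describes are commonly implemented with Amazon S3 lifecycle rules that move aging images to colder storage classes. The fragment below is a minimal sketch of such a rule; the bucket prefix, rule name, and day thresholds are illustrative assumptions, not details from KHUH's solution:

```json
{
  "Rules": [
    {
      "ID": "archive-pacs-images",
      "Status": "Enabled",
      "Filter": { "Prefix": "pacs/" },
      "Transitions": [
        { "Days": 90, "StorageClass": "GLACIER" },
        { "Days": 365, "StorageClass": "DEEP_ARCHIVE" }
      ]
    }
  ]
}
```

A configuration like this is applied to a bucket with the S3 `put-bucket-lifecycle-configuration` API; because objects keep the same keys as they transition between storage classes, the archiving happens without changes to the system writing the data.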
Introducing 10 minute cloud tutorials for research
Ten Minute Tutorials for Research provides a way for researchers to quickly learn about topics and tools that are specific to their unique needs, covering the basics on how to get started and providing helpful links to get more in-depth information and support—all in ten minutes. The series is led by AWS solutions architects and AWS research business development specialists who work closely with researchers. Many of the presenters are former researchers themselves, and the content is specifically geared to a research audience.
Modeling clouds in the cloud for air pollution planning: 3 tips from LADCO on using HPC
In the spring of 2019, environmental modelers at the Lake Michigan Air Directors Consortium (LADCO) had a new problem to solve. Emerging research on air pollution along the shores of the Great Lakes in the United States showed that properly simulating pollution episodes in the region required applying models at a finer spatial granularity than the computational capacity of LADCO's in-house HPC cluster could handle. The LADCO modelers turned to AWS ParallelCluster to access the HPC resources needed to do this modeling faster and scale for their member states.
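An AWS ParallelCluster environment of the kind LADCO used is defined in a cluster configuration file. The sketch below uses the current ParallelCluster 3.x YAML schema (the 2019 post predates this version); instance types, counts, and the subnet ID are illustrative assumptions, not details from LADCO's setup:

```yaml
# Minimal ParallelCluster 3.x sketch: a Slurm cluster that scales
# compute nodes up for model runs and back to zero when idle.
Region: us-east-2
Image:
  Os: alinux2
HeadNode:
  InstanceType: c5.xlarge
  Networking:
    SubnetId: subnet-01234567890abcdef   # placeholder subnet ID
Scheduling:
  Scheduler: slurm
  SlurmQueues:
    - Name: compute
      ComputeResources:
        - Name: hpc-nodes
          InstanceType: c5n.18xlarge
          MinCount: 0     # no nodes while the queue is empty
          MaxCount: 32    # burst capacity for finer-grained runs
      Networking:
        SubnetIds:
          - subnet-01234567890abcdef     # placeholder subnet ID
```

With `MinCount: 0`, compute nodes launch only while Slurm jobs are queued, which is the pay-for-use pattern that lets a consortium burst past the fixed capacity of an in-house cluster.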