AWS Public Sector Blog
Tag: Machine Learning
BriBooks improves children’s creative writing with generative AI, powered by AWS
Generative artificial intelligence (generative AI) has the potential to play several important roles in education, transforming the way we teach and learn. This blog post looks at how one EdTech startup, BriBooks, is leveraging generative AI to assist young children with creative writing.
Maximizing satellite communications usage with Amazon Forecast
This walkthrough explores how to use Amazon Forecast to derive valuable business insights for satellite communications use cases. Operations teams can quickly see accurate satellite capacity forecasts on a per-beam basis. The benefits include lower costs, from provisioning just the right amount of bandwidth, and a smoother customer experience, since users are less impacted by weather or surge events.
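To illustrate the kind of per-beam query the walkthrough describes, here is a minimal sketch (not taken from the original post) that queries an existing Amazon Forecast predictor with boto3; the forecast ARN and the "beam-12" item ID are hypothetical placeholders, and the sketch assumes a predictor and forecast have already been trained on per-beam capacity data.

```python
# Minimal sketch: query an existing Amazon Forecast forecast for one beam.
# The forecast ARN and item ID below are hypothetical placeholders.
import boto3

forecastquery = boto3.client("forecastquery", region_name="us-east-1")

response = forecastquery.query_forecast(
    ForecastArn="arn:aws:forecast:us-east-1:123456789012:forecast/satellite-capacity",
    Filters={"item_id": "beam-12"},  # one forecast item per satellite beam
)

# Each prediction entry contains a timestamp and a forecasted capacity value
for point in response["Forecast"]["Predictions"]["p50"]:
    print(point["Timestamp"], point["Value"])
```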
How to detect wildfire smoke using Amazon Rekognition
Since wildfires can double in size and intensity every three to five minutes, early detection and reduced response times are essential. Cloud technologies, including artificial intelligence (AI) and machine learning (ML), can help. Learn about a high-level AWS architecture for a solution that uses AI to identify and classify wildfire smoke imagery and then rapidly alerts first responders with the location and condition of a fire incident.
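As a rough illustration of the classification step, the sketch below calls Amazon Rekognition Custom Labels against a camera frame stored in Amazon S3. It assumes a smoke-detection model has already been trained and its project version started; the ARN, bucket, and image key are hypothetical placeholders, not values from the original post.

```python
# Minimal sketch: classify a camera frame with a (hypothetical) Rekognition
# Custom Labels model trained to detect wildfire smoke.
import boto3

rekognition = boto3.client("rekognition", region_name="us-west-2")

response = rekognition.detect_custom_labels(
    ProjectVersionArn="arn:aws:rekognition:us-west-2:123456789012:project/wildfire-smoke/version/1",
    Image={"S3Object": {"Bucket": "camera-feed-frames", "Name": "tower-07/frame-0142.jpg"}},
    MinConfidence=70,
)

# Print detected labels and confidence scores, e.g. "Smoke: 91.2%"
for label in response["CustomLabels"]:
    print(f'{label["Name"]}: {label["Confidence"]:.1f}%')
```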
Top highlights from the 2023 IMAGINE: Education, State, and Local Leaders conference
On July 11-12, 2023, the IMAGINE: Education, State, and Local Leaders conference from AWS gathered public sector leaders from around the world in Sacramento, CA to reimagine the future of state and local government and education. In the keynote address, Kim Majerus, vice president of global education and US state and local government at AWS, shared stories of education and state and local organizations using cloud technology to provide meaningful services for their communities. Three guest speakers joined her on stage to share how they’re using AWS to protect education services against the rising threat of cyberattacks, advance cutting-edge research, promote innovative citizen services, and more.
36 new or updated datasets on the Registry of Open Data: AI analysis-ready datasets and more
This quarter, AWS released 36 new or updated datasets. As July 16 is Artificial Intelligence (AI) Appreciation Day, the AWS Open Data team is highlighting three unique datasets that are analysis-ready for AI. What will you build with these datasets?
A framework to mitigate bias and improve outcomes in the new age of AI
Artificial intelligence (AI) and machine learning (ML) technologies are transforming many industries. But although public sector organizations are realizing the benefits of these technologies, remaining challenges, including bias and a lack of transparency, limit wider adoption and keep organizations from unlocking the full potential of AI and ML. In this post, learn about a high-level framework for how AWS can help you address these challenges and provide better outcomes for constituents.
Largest metastatic cancer dataset now available at no cost to researchers worldwide
The NYUMets team, led by Dr. Eric Oermann at NYU Langone Medical Center, is collaborating with AWS Open Data, NVIDIA, and the Medical Open Network for Artificial Intelligence (MONAI) to develop an open science approach that supports researchers in helping as many patients with metastatic cancer as possible. With support from the AWS Open Data Sponsorship Program, the NYUMets: Brain dataset is now openly available at no cost to researchers around the world.
Optimizing your nonprofit mission impact with AWS Glue and Amazon Redshift ML
Nonprofit organizations focus on a specific mission to impact their members, communities, and the world. In the nonprofit space, where resources are limited, it’s important to optimize the impact of your efforts. Learn how you can apply machine learning with Amazon Redshift ML to public datasets to support data-driven decisions that optimize your impact. This walkthrough focuses on using open data to support food security programming, but the solution can be applied to many other initiatives in the nonprofit space.
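As a rough illustration of how such a model could be created, the sketch below issues a Redshift ML CREATE MODEL statement through the Redshift Data API. The cluster name, database, table, columns, and S3 bucket are hypothetical placeholders, not values from the original walkthrough.

```python
# Minimal sketch: train a Redshift ML model from SQL via the Redshift Data API.
# Cluster, database, table, columns, and bucket names are hypothetical.
import boto3

redshift_data = boto3.client("redshift-data", region_name="us-east-1")

create_model_sql = """
CREATE MODEL food_demand_model
FROM (SELECT site_id, month, households_served, demand FROM food_distribution)
TARGET demand
FUNCTION predict_food_demand
IAM_ROLE default
SETTINGS (S3_BUCKET 'nonprofit-redshift-ml-artifacts');
"""

# Kick off model training; Redshift ML handles the SageMaker Autopilot job behind the scenes
redshift_data.execute_statement(
    ClusterIdentifier="nonprofit-dw",
    Database="analytics",
    DbUser="awsuser",
    Sql=create_model_sql,
)
```

Once training completes, the generated SQL function (here, predict_food_demand) can be called directly in queries to score new rows.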
Optimizing operations for ground-based, extremely large telescopes with AWS
Ground-based extremely large telescopes (ELTs), such as the Giant Magellan Telescope (GMT), will play a crucial role in modern astronomy by providing observations of the universe with remarkable clarity and detail. However, managing the vast amount of data these instruments generate and sustaining optimal performance can be challenging. AWS provides a suite of cloud-based solutions that can help address these challenges and streamline ELT operations. Learn how various AWS services can optimize data storage, management, and processing, and support advanced monitoring and remote continuity of operations, leading to improved overall performance and efficiency for ELTs.
Decrease geospatial query latency from minutes to seconds using Zarr on Amazon S3
Geospatial data, including many climate and weather datasets, are often released by government and nonprofit organizations in compressed file formats such as the Network Common Data Form (NetCDF) or GRIdded Binary (GRIB). As the complexity and size of geospatial datasets continue to grow, it is more time- and cost-efficient to leave the files in one place, query the data virtually, and download only the subset that is needed locally. Unlike legacy file formats, the cloud-native Zarr format is designed for virtual, efficient access to compressed chunks of data saved in a central location such as Amazon S3. In this walkthrough, learn how to convert NetCDF datasets to Zarr using an Amazon SageMaker notebook and an AWS Fargate cluster, and how to query the resulting Zarr store, reducing the time required for time series queries from minutes to seconds.
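The minimal sketch below shows the general convert-and-query pattern with xarray and s3fs, independent of the post's SageMaker and Fargate setup; the NetCDF file name, S3 bucket path, and variable name ("t2m") are hypothetical placeholders.

```python
# Minimal sketch: convert a NetCDF file to a Zarr store on Amazon S3, then
# query a point time series from it. Paths and variable names are hypothetical.
import s3fs
import xarray as xr

# Open a local NetCDF dataset and rechunk it so each chunk holds a full time series
ds = xr.open_dataset("era5_temperature.nc").chunk({"time": -1, "latitude": 50, "longitude": 50})

# Write the dataset as a Zarr store directly to S3
fs = s3fs.S3FileSystem()
store = s3fs.S3Map(root="my-zarr-bucket/era5/temperature.zarr", s3=fs)
ds.to_zarr(store, mode="w", consolidated=True)

# Later, read back only the chunks needed for a single-point time series
ds_zarr = xr.open_zarr(store, consolidated=True)
series = ds_zarr["t2m"].sel(latitude=40.0, longitude=-105.0, method="nearest").load()
print(series)
```

Because only the chunks covering the selected point are fetched from S3, the query touches a small fraction of the dataset instead of downloading whole files.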