AWS Public Sector Blog
Tag: Amazon Redshift
Data is helping EdTechs shape the next generation of solutions
Forrester estimates that data-driven businesses are growing at an average of more than 30 percent annually. This is also happening at education technology companies. New data sources are emerging, including real-time streaming data from virtual classrooms, mobile engagement, unique usage, and new learners. These sources are shaping the next generation of EdTech products that engage learners around the world in meaningful ways. Learn how four AWS EdStart Members are using data to power their solutions.
How Skillshare increased their click-through rate by 63% with Amazon Personalize
Skillshare is the largest global online learning community for creativity. They offer thousands of inspiring classes for creative and curious people on topics including illustration, design, photography, video, freelancing, and more. Skillshare wanted members to discover relevant content easily through a seamless process of personalized recommendations. Skillshare decided to test Amazon Personalize from AWS to make these data-fueled recommendations for members with machine learning. This blog post describes their Amazon Personalize solution architecture, their AWS Step Functions process, and the results of their experiment.
Top re:Invent 2021 announcements for K12 education
AWS announced over 85 new services and features at re:Invent 2021, with something to offer for every industry — including K12 education. These new services and features unlock new use cases and lower the barrier to entry for schools looking to adopt cloud technology to better serve their students, parents, and staff. Read on for highlights of some of the key AWS announcements from re:Invent 2021 that can help K12 education.
Australian Bureau of Statistics runs 2021 Census on the AWS Cloud
Earlier this year, the Australian Bureau of Statistics (ABS) ran the Australian Census, the agency’s most significant workload, on Amazon Web Services (AWS). The Census is the most comprehensive snapshot of the country, and includes around 10 million households and over 25 million people. With the COVID-19 pandemic causing lockdowns across the country, ABS needed a digital option for the Census that was accessible and reliable for millions of people. They turned to the cloud.
Elevating cloud security to address regulatory requirements for security and disaster recovery
Learn how you can build a foundation of security objectives and practices, including a business continuity and disaster recovery plan, that can be adapted to meet a dynamic policy environment and support the missions of national computer security incident response teams (CSIRTs), operators of essential services (OES), digital service providers (DSPs), and other identified sector organizations.
AWS Jam Lounge and virtual workshops offer hands-on learning at AWS Public Sector Summit Online
Join us at the upcoming AWS Public Sector Summit Online (April 15-16, 2021), where attendees will have the opportunity to test their knowledge and learn new skills. Put your skills to the test in the AWS Jam Lounge (sponsored by Intel and Fortinet), and learn something new by attending virtual workshops.
How Times Higher Education accelerated their journey with the AWS Data Lab
Times Higher Education (THE) is a data-driven business that, with the help of AWS, is now realising the value of their data, which enables them to be better informed and make faster decisions for customers. THE provides a broad range of services to help set the agenda in higher education, and their insights help universities improve through performance analysis. THE worked with the AWS Data Lab to create a centralised repository of their data. Launching a data lake provided a cost-effective platform and catalogued their data, so they could understand it and design new products that make use of it.
Modern data engineering in higher ed: Doing DataOps atop a data lake on AWS
Modern data engineering covers several key components of building a modern data lake. Most databases and data warehouses do not lend themselves well to a DevOps model. DataOps grew out of frustrations trying to build a scalable, reusable data pipeline in an automated fashion. DataOps was founded on applying DevOps principles on top of data lakes to help build automated solutions in a more agile manner. With DataOps, users apply principles of data processing on the data lake to curate and collect the transformed data for downstream processing. One reason that DevOps was hard on databases is that testing was hard to automate on such systems. At the California State University Chancellor's Office (CSUCO), we took a different approach by placing most of our logic within a programming framework that allows us to build a testable platform. Learn how to apply DataOps in ten steps.
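The core idea above — keeping data logic in plain code so it can be tested automatically, rather than buried in stored procedures — can be sketched with a minimal example. The function and field names below are illustrative assumptions, not part of CSUCO's actual platform:

```python
# A hypothetical pure transformation function, in the spirit of the
# DataOps approach described above. Because the logic lives in ordinary
# code rather than inside a database, it can be exercised by an
# automated test suite as part of a CI pipeline.

def normalize_enrollment(record: dict) -> dict:
    """Standardize a raw enrollment record before it lands in the curated zone."""
    return {
        "student_id": str(record["student_id"]).strip(),
        "term": record["term"].upper(),
        "units": float(record.get("units", 0)),
    }

# The same function can be called from a unit test, a local script,
# or a data lake processing job, which is what makes it testable:
raw = {"student_id": " 1001 ", "term": "fall2023", "units": "12"}
clean = normalize_enrollment(raw)
assert clean == {"student_id": "1001", "term": "FALL2023", "units": 12.0}
```

A pipeline built from small, pure functions like this one is what lets DevOps-style automated testing apply to a data lake.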
Scaling to share unprecedented volume of election donation data, quickly and cost-effectively
Campaign contributions have grown exponentially in the United States. In 1980, there were around 500,000 contributions made; in 2020 alone, the Federal Election Commission (FEC) expects 500 million contributions. Meanwhile, the evolution of technology has changed the way Americans contribute to political campaigns, making it easier to make many small contributions. To meet unprecedented demand for data transparency, the FEC turned to the cloud.
Optimizing your call center to improve citizen services with the cloud
Public sector organizations are experiencing a high volume of requests for information ranging from health to finances to municipal services. At a time when in-person interaction is limited, citizens can call into contact centers to get the insights they need to make real-time decisions about their health and safety. Many organizations are turning to the cloud to quickly scale and deploy a contact center. But understanding your cloud contact center at a granular level can help you better serve your constituents.