AWS Public Sector Blog
Tag: AWS CodePipeline
Canada’s Federal Geospatial Platform supports decision-making using AWS
Data has become a new global currency in the digital age, and the capacity to turn it into useful information is increasingly important. The Canadian government collects and uses geospatial data to support goals such as economic growth, environmental management, and social well-being.
NHS Digital launches NHS login with AWS
NHS Digital, working with AWS among other suppliers, launched NHS login, a serverless identity platform that gives residents in England access to a range of health and care apps. Using the AWS Cloud, NHS Digital achieves the scale, high availability, and security needed for citizens accessing these services, and helps users reach NHS services more quickly and simply. NHS login is one of a number of services NHS Digital is hosting on the cloud as part of the UK government’s ‘Cloud First’ policy.
One small team created a cloud-based predictive modeling solution to improve healthcare services in the UK
How do you predict and prepare for your citizens’ health and wellness needs during the COVID-19 pandemic? Healthier Lancashire and South Cumbria Integrated Care System (ICS) quickly scaled a platform on AWS to support the 1.8 million people in its region with Nexus Intelligence, an interactive health intelligence application with a suite of predictive models covering various measures of need and health outcomes. Nexus Intelligence not only supported the ICS response to the pandemic, but is also expected to help the ICS reconfigure and reinvest in services to improve the health and well-being of the population and reduce health inequalities.
Modern data engineering in higher ed: Doing DataOps atop a data lake on AWS
Modern data engineering covers several key components of building a modern data lake. Most databases and data warehouses do not lend themselves well to a DevOps model. DataOps grew out of the frustration of trying to build a scalable, reusable data pipeline in an automated fashion. DataOps applies DevOps principles on top of data lakes to help build automated solutions in a more agile manner. With DataOps, users apply data processing principles on the data lake to curate and collect the transformed data for downstream processing. One reason DevOps was hard to apply to databases is that testing was hard to automate on such systems. At the California State University Chancellor’s Office (CSUCO), we took a different approach, housing most of our logic in a programming framework that allows us to build a testable platform, as sketched below. Learn how to apply DataOps in ten steps.
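The testability argument is the heart of this approach: transformation logic that lives in code, rather than in database stored procedures, can be unit tested automatically in a CI pipeline. The sketch below illustrates that general pattern only, under assumptions of our own (pandas for the transformation, pytest-style assertions); the function and column names are hypothetical, not CSUCO's actual pipeline.

```python
# Minimal illustration of keeping curation logic in a testable programming framework.
# Hypothetical example; assumes pandas and a pytest-style test runner.
import pandas as pd


def curate_enrollments(raw: pd.DataFrame) -> pd.DataFrame:
    """Hypothetical transformation: standardize columns and drop invalid rows."""
    curated = raw.rename(columns=str.lower)          # normalize column names
    curated = curated.dropna(subset=["student_id"])  # discard rows missing a key
    curated["term"] = curated["term"].str.upper()    # standardize term codes
    return curated


def test_curate_enrollments():
    raw = pd.DataFrame(
        {"STUDENT_ID": ["s1", None], "TERM": ["fall2020", "spring2021"]}
    )
    curated = curate_enrollments(raw)
    assert list(curated.columns) == ["student_id", "term"]
    assert len(curated) == 1                 # the row with a missing key is dropped
    assert curated["term"].iloc[0] == "FALL2020"
```

Because the transformation is a plain function over data frames, it runs the same way on a laptop, in CI, and against the data lake, which is what makes automated testing practical.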
T-Digital shares lessons learned about flexibility, agility, and cost savings using AWS
T-Digital, a division of Tshwane University Technology Enterprise Holding (TUTEH) in South Africa, built TRes, a digital platform for students living in student housing and for accommodation providers. TRes connects students with available housing from verified and approved property owners, addressing student accommodation needs, helping those owners fully allocate their residences, and alleviating administrative burden. With help from AWS Professional Services, T-Digital gained flexibility and agility and realized cost savings.
Driving sustainability through youth engagement
Today, speakers at the 24th World Scout Jamboree (WSJ) will introduce nano, a new gamified app developed on the Amazon Web Services (AWS) Cloud that encourages youth around the world to adopt daily sustainability practices to help improve our planet. The Scouts are also launching #ScoutsRecycle, a campaign that leverages the nano app for sustainability. Through nano, scouts will be able to measure the impact of their efforts by calculating the CO2e emissions avoided through #ScoutsRecycle.
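For illustration only, the kind of back-of-the-envelope estimate behind "CO2e emissions avoided" could look like the sketch below; the emission factors are hypothetical placeholders, not the figures the nano app actually uses.

```python
# Illustrative sketch only: estimate CO2e avoided as items recycled multiplied by a
# per-item avoided-emissions factor. The factors below are placeholders, not the
# values used by the nano app.
KG_CO2E_AVOIDED_PER_ITEM = {      # hypothetical factors, kg CO2e per item recycled
    "plastic_bottle": 0.08,
    "aluminum_can": 0.14,
}


def co2e_avoided(recycled_items: dict[str, int]) -> float:
    """Return estimated kilograms of CO2e avoided for a tally of recycled items."""
    return sum(
        count * KG_CO2E_AVOIDED_PER_ITEM.get(item, 0.0)
        for item, count in recycled_items.items()
    )


# Roughly 1.5 kg CO2e with the placeholder factors above.
print(co2e_avoided({"plastic_bottle": 10, "aluminum_can": 5}))
```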