Enhance the traveler experience and optimize airport operations with a data management solution built on AWS
This Guidance helps you build data management systems that can both monitor airport operations and enhance the traveler experience. Data is collected from traditional systems of record, securely stored, managed, and transformed into reportable data sets. Purpose-built databases and AWS services are used to build near real-time operational dashboards and traveler applications that are flexible, agile, easy to deploy, and resilient.
Please note: the Disclaimer at the end of this page applies to this Guidance.
Architecture Diagram
Enhanced Traveler Experience
This diagram shows how to build a data management system that will generate meaningful insights with information from airlines and vendors to enhance the traveler experience.
Step 1
Build an operational data management system with systems of record such as schedules, near real-time flight information, traveler check-ins, loyalty transactions, lounge usage, and transactions in the airport.
Step 2
Collect addresses, demographics, and other public data sets with AWS Data Exchange. Ingest private data sets with AWS Storage Gateway and AWS Transfer Family.
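As a concrete illustration of the private-data path, the sketch below creates an AWS Transfer Family SFTP endpoint backed by Amazon S3 and registers a partner user. The role ARN, bucket, user name, and public key are placeholders, not values prescribed by this Guidance.

```python
import boto3

# A minimal sketch: create an SFTP endpoint with AWS Transfer Family so
# partners (airlines, vendors) can drop private data sets directly into
# Amazon S3. All names and ARNs below are placeholders.
transfer = boto3.client("transfer")

server = transfer.create_server(
    Protocols=["SFTP"],
    IdentityProviderType="SERVICE_MANAGED",  # Transfer Family manages the users
    Domain="S3",                             # uploaded files land in Amazon S3
)

# Map one partner to a prefix in the staging bucket.
transfer.create_user(
    ServerId=server["ServerId"],
    UserName="example-airline",                                  # hypothetical partner
    Role="arn:aws:iam::111122223333:role/TransferS3AccessRole",  # placeholder role
    HomeDirectory="/example-airport-staging/raw/airline",        # placeholder bucket
    SshPublicKeyBody="ssh-rsa AAAA...",                          # partner's public key
)
```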
Step 3
Optionally, enhance the baggage system with a radio-frequency identification (RFID) sortation and tracking system, or leverage the existing bar code scanners and baggage sortation events. Combine AWS IoT Greengrass, AWS IoT Core, and Amazon Kinesis to ingest sortation events.
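To make this step concrete, here is a minimal sketch of publishing a sortation event to AWS IoT Core; an AWS IoT rule can then route the topic into Amazon Kinesis. In practice the scanner or an AWS IoT Greengrass component would publish over MQTT, and the topic name and event fields below are assumptions, not a prescribed schema.

```python
import json
import boto3

# A minimal sketch of publishing a baggage sortation event to AWS IoT Core.
# The topic and payload fields are illustrative assumptions.
iot = boto3.client("iot-data")

event = {
    "bagTagId": "0074123456789",   # hypothetical IATA-style bag tag
    "readPoint": "sorter-07",      # hypothetical scanner location
    "eventType": "SORTED",
    "timestamp": "2024-01-01T12:00:00Z",
}

iot.publish(
    topic="baggage/sortation/events",  # hypothetical topic name
    qos=1,
    payload=json.dumps(event),
)
```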
Step 4
Provide staging for ingesting all batch and near real-time data using cost-effective storage classes in Amazon Simple Storage Service (Amazon S3).
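A minimal sketch of landing a raw batch extract in the staging bucket with the S3 Intelligent-Tiering storage class, so that infrequently accessed objects move to lower-cost tiers automatically. The bucket and key names are placeholders.

```python
import boto3

# A minimal sketch: stage a raw batch extract in Amazon S3 using a
# cost-effective storage class. Bucket and key names are placeholders.
s3 = boto3.client("s3")

with open("extract.json", "rb") as body:
    s3.put_object(
        Bucket="example-airport-staging",  # placeholder bucket
        Key="raw/flight-info/2024/01/01/extract.json",
        Body=body,
        StorageClass="INTELLIGENT_TIERING",
    )
```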
Step 5
Use Amazon EMR and AWS Glue to transform your data. Use open standards to build the data lake from the same data as the operational data management system. Use a schema-on-read pattern to make the raw data and curated data readily available to all user roles.
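The transform could look like the following PySpark sketch, runnable on Amazon EMR (a comparable AWS Glue job would use the same logic): read raw JSON from staging, derive a partition column, and write open-format Parquet to the curated zone for schema-on-read access. The paths and deduplication keys are assumptions.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# A minimal PySpark sketch of the Step 5 pattern: raw JSON in, partitioned
# Parquet out, so Athena and Redshift can query the data in place.
spark = SparkSession.builder.appName("curate-flight-events").getOrCreate()

raw = spark.read.json("s3://example-airport-staging/raw/flight-info/")

curated = (
    raw.withColumn("event_date", F.to_date("timestamp"))
       .dropDuplicates(["flightId", "timestamp"])  # hypothetical keys
)

(curated.write
    .mode("overwrite")
    .partitionBy("event_date")
    .parquet("s3://example-airport-datalake/curated/flight-info/"))
```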
Step 6
Build all reportable data sets in Amazon S3, and leverage Amazon Redshift, Amazon Athena, and Amazon QuickSight for analytics.
Optionally, build data marts in Amazon Redshift for heavily used analytics. For miscellaneous requirements, publish the AWS Glue Data Catalog and use Athena to query the data lake built in Step 5.
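For the Athena path, a query against the curated data lake might look like the following sketch. The database, table, columns, and results location are hypothetical; the actual tables would come from your AWS Glue Data Catalog.

```python
import boto3

# A minimal sketch: run an ad hoc Athena query against the data lake.
# Database, table, and output location are placeholders.
athena = boto3.client("athena")

query = """
SELECT flight_id, COUNT(*) AS checkins
FROM traveler_checkins
WHERE event_date = DATE '2024-01-01'
GROUP BY flight_id
ORDER BY checkins DESC
LIMIT 20
"""

execution = athena.start_query_execution(
    QueryString=query,
    QueryExecutionContext={"Database": "airport_datalake"},        # placeholder
    ResultConfiguration={"OutputLocation": "s3://example-athena-results/"},
)
print("Query execution ID:", execution["QueryExecutionId"])
```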
Step 7
Use Amazon SageMaker to provide standard artificial intelligence and machine learning (AI/ML) models for operational analytics. You can also use SageMaker to build your own models on top of the data.
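As one example of using a deployed model, the sketch below calls a SageMaker endpoint from an operational workload, for instance to score the risk that a connecting bag misses its flight. The endpoint name and feature payload are hypothetical; training and deploying the model is a separate SageMaker workflow.

```python
import json
import boto3

# A minimal sketch: invoke a deployed SageMaker endpoint for operational
# scoring. Endpoint name and features are hypothetical.
runtime = boto3.client("sagemaker-runtime")

features = {"connection_minutes": 38, "terminal_change": 1, "bag_count": 2}

response = runtime.invoke_endpoint(
    EndpointName="bag-connection-risk",   # hypothetical endpoint
    ContentType="application/json",
    Body=json.dumps(features),
)
print("Risk score:", response["Body"].read().decode())
```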
Step 8
Use purpose-built databases like Amazon DynamoDB, and serverless services like AWS Lambda and Amazon API Gateway, to deliver microservices and events for operational data stores. Build near real-time operational dashboards and customer applications using these microservices.
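A minimal sketch of such a microservice: an AWS Lambda handler behind Amazon API Gateway (proxy integration) that reads a flight-status operational data store in DynamoDB. The table and attribute names are assumptions for illustration.

```python
import json
import boto3

# A minimal sketch of a flight-status microservice backed by DynamoDB.
# Table and attribute names are hypothetical.
dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table("FlightStatus")  # hypothetical table


def handler(event, context):
    # API Gateway proxy integration passes path parameters here.
    flight_id = event["pathParameters"]["flightId"]

    item = table.get_item(Key={"flightId": flight_id}).get("Item")
    if item is None:
        return {"statusCode": 404, "body": json.dumps({"error": "not found"})}

    # default=str handles the Decimal values DynamoDB returns for numbers.
    return {"statusCode": 200, "body": json.dumps(item, default=str)}
```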
Step 9
Leverage Amazon DynamoDB Streams and AWS Step Functions to publish flight and customer events to downstream systems, like baggage reconciliation and ground transportation.
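One way to wire this up: a Lambda function subscribed to the table's DynamoDB stream starts a Step Functions execution for each insert or update, and the workflow fans the event out to downstream systems. The state machine ARN is a placeholder, and the sketch assumes the stream is configured to include new images.

```python
import json
import boto3

# A minimal sketch: publish flight changes downstream by starting a
# Step Functions execution per DynamoDB stream record.
sfn = boto3.client("stepfunctions")

STATE_MACHINE_ARN = (  # placeholder ARN
    "arn:aws:states:us-east-1:111122223333:stateMachine:PublishFlightEvents"
)


def handler(event, context):
    for record in event["Records"]:
        if record["eventName"] not in ("INSERT", "MODIFY"):
            continue
        new_image = record["dynamodb"]["NewImage"]  # DynamoDB JSON format
        sfn.start_execution(
            stateMachineArn=STATE_MACHINE_ARN,
            input=json.dumps({"flightEvent": new_image}),
        )
```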
Optimizing Operations
This diagram shows how to build a data management system to monitor airport operations in near real-time. It can be used to predict costs, revenue, turnaround times, and potential delays using open data standards, purpose-built databases, and an extensive serverless architecture.
Step 1
Build the data management system with systems of record such as airline schedules, near real-time flight information, weather forecasts, aircraft position data, airport resources, and billing.
Step 2
Leverage AWS Data Exchange to create the data collection from public sources such as weather and aircraft position. Ingest private data sets with Storage Gateway and Transfer Family.
Step 3
Use AWS Panorama, or follow the Aircraft Turn Tracking guidance, to passively collect and leverage aircraft gate turn events.
Step 4
Provide staging for ingesting all batch and near real-time data using cost-effective storage classes in Amazon S3.
Step 5
Transform your data with Amazon EMR and AWS Glue. Use open standards to build a data lake from the same data as the operational data management system. Use a schema-on-read pattern to make the raw data and curated data readily available to all roles.
Step 6
Build all reportable data sets in Amazon S3. Build data marts in Amazon Redshift for heavily used analytics (see the sketch below). For miscellaneous requirements, publish the Data Catalog and use Athena and QuickSight to analyze the data lake.
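A data mart refresh could be as simple as the following sketch, which loads curated Parquet from the data lake into Redshift with the Redshift Data API. The cluster, database, user, role, and table names are placeholders.

```python
import boto3

# A minimal sketch: refresh a Redshift data mart from the curated zone
# using the Redshift Data API. All identifiers are placeholders.
redshift = boto3.client("redshift-data")

redshift.execute_statement(
    ClusterIdentifier="example-airport-cluster",  # placeholder cluster
    Database="ops_marts",
    DbUser="etl_user",
    Sql="""
        COPY turnaround_facts
        FROM 's3://example-airport-datalake/curated/turnarounds/'
        IAM_ROLE 'arn:aws:iam::111122223333:role/RedshiftCopyRole'
        FORMAT AS PARQUET;
    """,
)
```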
Step 7
Use SageMaker to provide standard AI/ML models for operational analytics, or use SageMaker to build your own models on top of the data.
Step 8
Use purpose-built databases like DynamoDB and serverless services like Lambda and API Gateway to deliver microservices and events for operational data stores. Build near real-time operational dashboards and customer applications leveraging these microservices.
Step 9
Leverage DynamoDB Streams and Step Functions to publish flight and aircraft movement events to keep the Airport Operating Database and airport display systems current.
Well-Architected Pillars
The AWS Well-Architected Framework helps you understand the pros and cons of the decisions you make when building systems in the cloud. The six pillars of the Framework allow you to learn architectural best practices for designing and operating reliable, secure, efficient, cost-effective, and sustainable systems. Using the AWS Well-Architected Tool, available at no charge in the AWS Management Console, you can review your workloads against these best practices by answering a set of questions for each pillar.
The architecture diagram above is an example of a Solution created with Well-Architected best practices in mind. To be fully Well-Architected, you should follow as many Well-Architected best practices as possible.
Operational Excellence
To safely operate this Guidance, use AWS CloudFormation to deploy it in your cloud environment. CloudFormation helps you scale your infrastructure and manage your resources through a single operation. We also recommend you use Amazon CloudWatch to increase your observability with application and service-level metrics, personalized dashboards, and logs.
Security
For secure authentication and authorization, this Guidance uses AWS Identity and Access Management (IAM) roles that control access for both people and machines. In addition, file transfers into Amazon S3 are secured through the services' native security features.
To protect resources in this Guidance, all data is encrypted both in transit and at rest. You can also use customer managed AWS Key Management Service (AWS KMS) keys for encryption.
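For example, a minimal sketch of writing an object under a customer managed KMS key; the bucket name and key ARN are placeholders.

```python
import boto3

# A minimal sketch: server-side encryption with a customer managed KMS key.
s3 = boto3.client("s3")

s3.put_object(
    Bucket="example-airport-staging",  # placeholder bucket
    Key="raw/loyalty/2024/01/01/transactions.csv",
    Body=b"...",
    ServerSideEncryption="aws:kms",
    SSEKMSKeyId="arn:aws:kms:us-east-1:111122223333:key/example-key-id",
)
```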
Reliability
The serverless components in this Guidance are highly available and automatically scale based on usage, ensuring you have a reliable application-level architecture. With Amazon S3 Select, you can use structured query language (SQL) to filter the contents of an Amazon S3 object and retrieve only the subset of data that you need. Athena allows you to analyze data wherever it lives, while QuickSight powers you with unified business intelligence at hyperscale.
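A minimal sketch of the S3 Select pattern described above: filter a CSV object server-side so only the rows you need are returned. The bucket, key, and column names are hypothetical.

```python
import boto3

# A minimal sketch: use S3 Select to retrieve only a subset of an object.
s3 = boto3.client("s3")

response = s3.select_object_content(
    Bucket="example-airport-datalake",  # placeholder bucket
    Key="curated/flights/2024-01-01.csv",
    ExpressionType="SQL",
    Expression="SELECT s.flight_id, s.gate FROM s3object s WHERE s.status = 'DELAYED'",
    InputSerialization={"CSV": {"FileHeaderInfo": "USE"}},
    OutputSerialization={"JSON": {}},
)

# The response is an event stream; Records events carry the filtered rows.
for event in response["Payload"]:
    if "Records" in event:
        print(event["Records"]["Payload"].decode())
```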
Performance Efficiency
To optimize this Guidance, consider adjusting the data input with direct integrations into your own systems through Lambda or AWS Marketplace connectors. A connector is an optional code package that assists with access to data stores that you can subscribe to. You can subscribe to several connectors offered in AWS Marketplace. You can also add more data relevant to your business needs through AWS Data Exchange and adjust the AWS Glue crawler to construct modified data sets to use for forecasting.
Cost Optimization
With this Guidance, you benefit from Amazon S3 for inexpensive storage. And with the serverless applications, such as AWS Glue and Lambda, you are charged only for usage. The managed serverless services in this Guidance offer a pay-as-you-go approach where you pay only for the individual services you need, for as long as you need them. The AWS Billing Console and AWS Budgets can help you monitor spending and control costs.
Sustainability
By default, the resources in this Guidance are only activated when there are changes in Amazon S3 buckets, ensuring that this Guidance scales to continually match the load with only the minimum resources required. And with the managed services and dynamic scaling that this Guidance deploys, you minimize the environmental impact of the backend services.
Implementation Resources
A detailed guide is provided for you to experiment with and use within your AWS account. It walks through each stage of the Guidance, including deployment, usage, and cleanup.
The sample code is a starting point. It is industry validated, prescriptive but not definitive, and a peek under the hood to help you begin.
Related Content
Disclaimer
The sample code; software libraries; command line tools; proofs of concept; templates; or other related technology (including any of the foregoing that are provided by our personnel) is provided to you as AWS Content under the AWS Customer Agreement, or the relevant written agreement between you and AWS (whichever applies). You should not use this AWS Content in your production accounts, or on production or other critical data. You are responsible for testing, securing, and optimizing the AWS Content, such as sample code, as appropriate for production grade use based on your specific quality control practices and standards. Deploying AWS Content may incur AWS charges for creating or using AWS chargeable resources, such as running Amazon EC2 instances or using Amazon S3 storage.
References to third-party services or organizations in this Guidance do not imply an endorsement, sponsorship, or affiliation between Amazon or AWS and the third party. Guidance from AWS is a technical starting point, and you can customize your integration with third-party services when you deploy the architecture.