Category: Amazon Redshift
Driving Big Data Innovation on AWS – ISV Highlights, April 2017
Introduction by Terry Wise, Global Vice President, Channels & Alliances at AWS
What can your data do for you? More importantly, how can insights derived from your data help you drive additional value for end customers?
Our APN partners offer services and solutions that complement what AWS has to offer. As an example, many customers are choosing to build a data lake on AWS. NorthBay is a Big Data Competency Consulting partner that helped architect and implement a data lake on AWS for Eliza Corporation. You can read details of the solution they built here. Today, I want to tell you a bit about four of our AWS Big Data Competency ISVs and what makes them unique: Alteryx, Databricks, SnapLogic, and Treasure Data.
Alteryx
AWS Big Data Competency Holder in Data Integration
How is your time spent when you embark on a new data analytics project? For many, the time required to gather, prepare, and process their data cuts into the time they can spend actually analyzing and learning from their data. Alteryx’s mission is to change the game for these analysts through the company’s self-service data analytics platform. “Alteryx Analytics provides analysts the unique ability to easily prep, blend, and analyze all of their data using a repeatable workflow, then deploy and share analytics at scale for deeper insights in hours, not weeks. Analysts love the Alteryx Analytics platform because they can connect to and cleanse data from data warehouses, cloud applications, spreadsheets, and other sources, easily join this data together, then perform analytics – predictive, statistical, and spatial – using the same intuitive user interface, without writing any code,” says Bob Laurent, VP of product marketing at Alteryx. The company’s products are used by a number of AWS customers, including Chick-fil-A, Marketo, and The National Trust.
Alteryx integrates with Amazon Redshift and provides support for Amazon Aurora and Amazon S3. Using Alteryx on AWS, users can blend data stored in the AWS Cloud, such as data stored in Redshift, with data from other sources using Alteryx’s advanced analytic workflow. Earlier this year, the company virtualized its Alteryx Server platform to make it easy for users to deploy on AWS through the AWS Marketplace. “Organizations can deploy our Alteryx Server platform in the AWS Cloud within minutes, while maintaining the enterprise-class security and scalability of our popular on-premises solution. This gives organizations a choice for how they want to quickly share critical business insights with others in their organization,” explains Laurent.
See Alteryx in action by downloading a free 14-day trial of Alteryx Designer here, or launch Alteryx Server from the AWS Marketplace here. If you’re interested in becoming an Alteryx Partner, click here. To learn more about Alteryx, visit the company’s AWS-dedicated site.
Databricks
AWS Big Data Competency Holder in Advanced Analytics
Are you looking for an efficient way to run Apache® Spark™ as you seek to create value from your data and build a sophisticated analytics solution on AWS? Then take a look at Databricks, founded by the team who created the Apache Spark project. “Databricks provides a just-in-time data platform, to simplify data integration, real-time experimentation, and robust deployment of production applications,” says John Tripier, Senior Director, Business Development at Databricks. The company’s mission is to help users of all types within an organization, from data scientists to data engineers to architects to business analysts, harness and maximize the power of Spark. Users can also take advantage of a wide range of BI tools and systems that integrate with the platform, including Tableau, Looker, and Alteryx. The company works with companies across a wide range of industries, including Capital One, 3M, NBC Universal, Edmunds.com, Viacom, and LendUp.
Databricks is hosted on AWS, and takes advantage of Amazon EC2 and Amazon S3. “Databricks is a cloud-native platform that deploys Spark clusters within the AWS accounts of our 500+ customers. We leverage the compute, storage, and security resources offered by AWS. We find AWS is a reliable and secure environment and enables fast implementation of infrastructure in regions all over the world,” says Tripier.
Want to give Databricks a spin? The company offers a free trial of their software here. Learn more about the Databricks platform here. And if you’re a Consulting Partner interested in learning more about becoming a Databricks Partner, click here. Databricks deploys in all regions, including AWS GovCloud, and is also an AWS Public Sector Partner.
SnapLogic
AWS Big Data Competency Holder in Data Integration
Where does your data come from? For most companies, particularly enterprises, the answer is, a lot of places. SnapLogic is focused on helping enterprises easily connect applications, data, and things between on-premises, cloud, and hybrid environments through its Enterprise Integration Cloud (EIC). True to its name, the company provides Snaps, which are modular collections of integration components built for a specific data source, business application, or technology. “We help customers automate business processes, accelerate analytics, and drive digital transformation,” says Ray Hines, director of strategic partners and ISVs at SnapLogic. The company works with hundreds of customers, including Adobe, Box, and Earth Networks.
The SnapLogic Enterprise Integration Cloud integrates with Amazon Redshift, Amazon DynamoDB, and Amazon RDS. “We provided pre-built integrations with these services because our customers are rapidly adopting them for their cloud data warehousing needs,” explains Hines. The company’s solution can help simplify the onboarding process for Redshift, DynamoDB, and RDS customers. For instance, Snap Patterns provide pre-built data integrations for common use cases and a number of other features (learn more here).
Care to try out SnapLogic for AWS? Click here for a 30-day free trial of SnapLogic Integration Cloud for Redshift or download the data sheet here. You can request a custom demo here. Consulting Partners, learn more about becoming a SnapLogic Partner here.
Treasure Data
AWS Big Data Competency Holder in Data Management
Are you a marketer looking for the ability to use data to provide great experiences to end customers? Are you in sales operations, looking to create a centralized dashboard for real-time sales data? Give life to your customer data through Treasure Data. “Treasure Data simplifies data management. Our Live Customer Data platform keeps data connected, current, and easily accessible to the people and algorithms that drive business success,” says Stephen Lee, vice president of business development at Treasure Data. “We provide a turnkey solution that collects data from 300+ sources, stores the data at scale, and provides users the tools to analyze and activate their data in their application of choice.” The company works with customers across industries including Grindr, Warner Brothers, and Dentsu.
“We deployed our solution on AWS because of the scalability, reliability, and global footprint of the AWS Cloud and the ability to deploy without having capital expenditures. With AWS, we can easily deploy our solution in new regions. We’ve also found there to be a strong support ecosystem,” says Lee. Treasure Data’s Live Customer Data Platform integrates with Amazon Redshift, Amazon S3, and Amazon Kinesis, along with many other solutions including Tableau, Chartio, Qlik, Looker, and Heroku (see all integrations and learn some nifty integration recipes). Getting started with Treasure Data is easy. “Our Solution Architects work with our new customers to get their initial data sources set up, after which our customers can be up and running in minutes,” explains Lee.
You can request a custom demo here, or simply email the team directly at info@treasuredata.com. Consulting Partners interested in becoming a Treasure Data partner can visit the company’s partner page here.
Want to learn more about big data on AWS? Click here. Bookmark the AWS Big Data Blog for a wealth of technical blog posts you can look to as you begin to take advantage of AWS for big data.
This blog is intended for educational purposes and is not an endorsement of the third-party products. Please contact the firms for details regarding performance and functionality.
How We Built a SaaS Solution on AWS, by CrowdTangle
The following is a guest post from Matt Garmur, CTO at CrowdTangle, a startup and APN Technology Partner who makes it easy for you to keep track of what’s happening on social media. Enjoy!
Horses were awesome.
If you had a messenger service 150 years ago, using horses was so much better than the alternative, walking. Sure, you had to hire people to take care of horses, feed them, and clean up after them, but the speed gains you got were easily worth the cost. And over time, your skills at building a business let you create systems that could handle each of these contingencies extremely efficiently.
And then cars came around, and you were out of luck.
Not immediately, of course. The first car on the street didn’t put you out of business. Even as cars got more mainstream, you still had the benefit of experience over startup car services. But once the first company grew up that was built with the assumption that cars existed, despite all your knowledge, you were in big trouble.
At CrowdTangle, we build some of the best tools in the world for helping people keep track of what’s happening on social media. We have a team of engineers and account folks helping top media companies, major league sports teams, and others find what they care about in real time (and we’re hiring!). Importantly, we started our company in 2011, which meant that AWS had been around for 5 years, and we could, and did, confidently build our entire business around the assumption that it would exist.
AWS was our car.
It may seem like an exaggeration, but it’s not. We were able to build an entirely different type of organization on AWS than we could have built five years prior. Specifically, it has impacted us in four critical ways: business model, hiring, projections and speed, which of course are all different ways of saying, “cost,” and thus, “survival.”
First is the business model. When we started developing our company, we didn’t consider producing physical media to hold our software, nor did we consider installing it on-premises. By making our model Software as a Service (SaaS), we got a lot of immediate benefits: we were able to allow users to try our product with no more effort than going to a website; we could push features and fixes dozens of times a day; and we could know that everyone would get the same controlled experience. But taking on the hosting ourselves would have required a significant capital outlay at the start simply to deliver our product. Having AWS to build on without those initial costs made SaaS a viable option for our growing startup.
Next is hiring. AWS has Amazon Relational Database Service (Amazon RDS), a managed database service, which means I don’t need to hire a DBA, since it’s coder-ready (and on Intel Xeon E5s, so we’re certainly not sacrificing quality). AWS has Elastic Beanstalk, a service that makes it simple for us to deploy our application on AWS, which means I can set up separate environments for front- and back-end servers, and scale them independently at the push of a button. Amazon DynamoDB, the company’s managed NoSQL database service, relieves me of the need to have four full-time engineers on staff keeping my database ring up and running. We keep terabytes of real-time data, get single-digit millisecond response times, and from my perspective, it takes care of itself. My team can stay focused on what matters: driving the growth of our business. We don’t need to spend a single hire on keeping the lights on.
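To give a sense of how little operational code a managed store like DynamoDB demands, here is a minimal sketch. The table name, key schema, and attribute names are hypothetical illustrations, not CrowdTangle’s actual schema; the helper only builds the request parameters, and the boto3 call it would feed is shown in a comment.

```python
# Hypothetical sketch: table name, keys, and attributes are illustrative only.

def build_put_item_request(table, post_id, timestamp, payload):
    """Build the parameter dict for a DynamoDB PutItem call.

    DynamoDB's low-level API types every attribute explicitly:
    "S" for string, "N" for number (numbers are sent as strings).
    """
    return {
        "TableName": table,
        "Item": {
            "post_id": {"S": post_id},    # partition key
            "ts": {"N": str(timestamp)},  # sort key, epoch seconds
            "payload": {"S": payload},    # raw social-media record
        },
    }

# With boto3, the dict would be passed straight through:
#   import boto3
#   boto3.client("dynamodb").put_item(**build_put_item_request(
#       "social_posts", "post-123", 1472688000, '{"likes": 42}'))

if __name__ == "__main__":
    req = build_put_item_request("social_posts", "post-123", 1472688000, "{}")
    print(req["Item"]["ts"])  # {'N': '1472688000'}
```

There is no cluster to size, no replication to configure, and no failover runbook: the request above is the whole operational surface the application team sees.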
Third is projections. If you’re in the horse world, your purchasing model for computers is to run as close to capacity as possible until it’s clear you need a capital outlay. Then you research the new machine, contact your supplier, spend a lot of money at once, wait for shipping, install it, and when it goes out of service, try to resell it and recover some of the cost. In the car world, if I think we might need more machinery, even for a short time, I request an instance, have it available immediately, and start paying pennies or dollars by the hour. If I’m done with that instance? Terminate and I stop paying for it. If I need a bigger instance? I simply provision a bigger instance on the spot.
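The request-terminate loop described above is small enough to sketch. The AMI ID and instance type below are placeholders; the helpers build the parameter dicts that a boto3 EC2 client would accept, shown in comments rather than executed.

```python
# Hypothetical sketch: the AMI ID and instance type are placeholder values.

def build_run_instances_request(ami_id, instance_type, count=1):
    """Parameter dict for an EC2 RunInstances call.

    With boto3: boto3.client("ec2").run_instances(**request)
    Billing starts when the instances launch.
    """
    return {
        "ImageId": ami_id,
        "InstanceType": instance_type,
        "MinCount": count,
        "MaxCount": count,
    }

def build_terminate_request(instance_ids):
    """Parameter dict for TerminateInstances -- billing stops on termination."""
    return {"InstanceIds": list(instance_ids)}

if __name__ == "__main__":
    # Need a bigger machine for an afternoon? Request it...
    print(build_run_instances_request("ami-0123456789abcdef0", "m4.xlarge"))
    # ...and when the experiment is done, let it go.
    print(build_terminate_request(["i-0abc123"]))
```

The contrast with the purchasing model is the point: the entire "research, order, ship, install, resell" cycle collapses into two API calls, priced by the hour.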
Finally, I want to talk about speed. Because of our choice to build our solution on AWS, we have a lean team that can provision resources faster, and can constantly work on fun projects rather than having to focus on simple maintenance. Not only can we move quickly on the scoped projects, but we can do cheap R&D for the moonshots. Every new project could be a bust or our next million-dollar product, but they start the same — have an idea, clone an existing environment, put your project branch on it, trot it out for clients to play with, and spin it down when done.
We recently decided that an aggregation portion of our system was slower than we liked, and we researched moving it to Amazon Redshift. To do so, we spun up a small Redshift instance (note: no projections), did initial testing, then copied our entire production database into Redshift (note: R&D speed). “Production” testing proved the benefits, so now we have an entire secondary Amazon Kinesis-Redshift managed pipeline for our system (note: no hiring, despite adding systems), and the speed increase has opened the door for new products that weren’t possible for us under the prior method. How much would that experimentation cost in the horse world? What would it have taken to execute? Would any of those projects have been small enough to be worth taking a chance on? We place small bets all the time, and that’s what helps us remain a leader in our field.
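A Kinesis-to-Redshift pipeline of the kind described above can be sketched in outline. The stream, table, bucket, and IAM role names are invented for illustration (CrowdTangle’s actual pipeline is not public): records go onto a Kinesis stream, get staged to S3, and are bulk-loaded into Redshift with a COPY statement.

```python
import json

def build_put_record(stream_name, record, partition_key):
    """Parameter dict for a Kinesis PutRecord call.

    With boto3: boto3.client("kinesis").put_record(**request)
    """
    return {
        "StreamName": stream_name,
        "Data": json.dumps(record).encode("utf-8"),
        "PartitionKey": partition_key,
    }

def build_copy_statement(table, s3_path, iam_role):
    """Redshift COPY statement for bulk-loading staged JSON records from S3."""
    return (
        f"COPY {table} FROM '{s3_path}' "
        f"IAM_ROLE '{iam_role}' FORMAT AS JSON 'auto';"
    )

if __name__ == "__main__":
    print(build_put_record("post-stream", {"post_id": "p-1", "likes": 42}, "p-1"))
    print(build_copy_statement(
        "aggregates",
        "s3://example-bucket/staged/",
        "arn:aws:iam::123456789012:role/RedshiftCopyRole",
    ))
```

Spinning up a small Redshift cluster to test exactly this flow, against a full copy of production data, is the kind of cheap experiment the paragraph above describes.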
Your next competitor will have grown up in the age of cars. How can you compete when you have horses?
To learn more about CrowdTangle, click here.
The content and opinions in this blog are those of the third party author and AWS is not responsible for the content or accuracy of this post.
APN Partner Webinar Series – AWS Database Services
Want to dive deep and learn more about AWS database offerings? This webinar series will provide you with an exclusive deep dive into Amazon Aurora, Amazon Redshift, and Amazon DynamoDB. These webinars feature technical sessions led by AWS solutions architects and engineers, live demonstrations, customer examples, and Q&A with AWS experts.
Check out these upcoming webinars and register to attend!
Amazon Aurora Architecture Overview
September 26, 2016 | 11:30am-12:30pm PDT
This webinar provides a deep architecture overview of Amazon Aurora. Partners attending this webinar will learn how Amazon Aurora differs from other relational database engines, with special focus on features such as high availability (HA) and up to 5x the performance of MySQL.
Understanding the Aurora Storage Layer
October 3, 2016 | 11:30am-12:30pm PDT
This webinar will dive deep into the Amazon Aurora Storage Layer. Attendees will receive a technical overview of performance and availability features as well as insights into future enhancements.
Amazon Aurora Migration Best Practices
October 10, 2016 | 11:30am-12:30pm PDT
This webinar will cover best practices for migrating from Oracle to Amazon Aurora. Partners attending this webinar will learn about common migration opportunities, challenges, and how to address them.
Selecting an AWS Database
October 17, 2016 | 11:30am-12:30pm PDT
Amazon Aurora, Amazon Redshift, and Amazon DynamoDB are managed AWS database offerings well-suited for a variety of use cases. In this webinar, partners will learn best practices for selecting a database and how each offering fits into the broader AWS portfolio of database services.
Amazon RDS PostgreSQL Deep Dive
October 24, 2016 | 11:30am-12:30pm PDT
Amazon RDS makes it easy to set up, operate, and scale PostgreSQL deployments in the cloud. Amazon RDS manages time-consuming administrative tasks such as PostgreSQL software upgrades, storage management, replication, and backups. This webinar will dive deep into the technical and business benefits of RDS PostgreSQL, including best practices for migrating from SQL Server and Oracle.
We’ll be hosting more educational webinars for APN Partners throughout the end of the year. Stay tuned to the APN Blog for more information!
Upcoming Webinar: How To Drive Exponential Growth Using Unconventional Data Sources with Chartio, Segment, and AWS
Check out an upcoming webinar from Advanced APN Technology Partner and AWS Big Data Competency Partner, Chartio, entitled, “How To Drive Exponential Growth Using Unconventional Data Sources”.
Dan Ahmadi, Director of Growth at Meteor, a large open source platform for building web and mobile apps, will discuss how Meteor uses Segment Sources, Amazon Redshift, and Chartio to combine multiple data sources to create a single-customer view.
In this webinar you will learn:
- How Meteor uses Segment to combine several data sources in Amazon Redshift
- How Meteor uses Chartio to analyze their GitHub data
- The analyses Meteor uses to track:
  - Product: trends across repositories for the Stargazer media player
  - Adoption: insights on Meteor framework usage
  - Engagement: top community contributors and what drives them
  - Growth: daily commercial growth patterns and anomalies
The webinar is on September 1st, 2016, 10:00am PT. It features Dan Ahmadi, Director of Growth at Meteor; AJ Welch, Data Engineer at Chartio; and JJ Nguyen, Product Marketing Manager at Segment.
Chartio was recently featured in a guest post on the Big Data Blog entitled, “How SmartNews Built a Lambda Architecture on AWS to Analyze Customer Behavior and Recommend Content”. Read the post here.
(Note: the register link will take you to a third party site, off the APN Blog. If you register for the webinar you are registering with a third party, not AWS)
PH Tech’s Move to Offer Data-as-a-Service – A Webinar with APN Partner 47Lining
Do you want to learn some best practices for using Amazon Redshift for your data warehousing and business intelligence workloads? Check out an upcoming webinar from Advanced APN Consulting Partner and AWS Big Data Competency Partner 47Lining entitled, “Redshift Jumpstart: PH Tech’s Move to Offer Data-as-a-Service”.
In this webinar, you will learn about:
- How to jumpstart your data warehousing and business intelligence workload using Amazon Redshift.
- Design patterns for ingest, transform, load, and visualization flows.
- How to get started in your adoption lifecycle – from PoCs to managed services to performance optimization and system maintenance best practices.
The webinar is on July 20, 2016, 10:00am PT. It features Mick Bass, CEO, 47Lining, and Chad Casady, VP, Information Technology, PH Tech.
47Lining recently published a guest post on the APN Blog entitled, “Why Our Customers Love Amazon Machine Learning” – read more here.
(Note: the register link will take you to a third party site, off the APN Blog. If you register for the webinar you are registering with a third party, not AWS)
Learn about Amazon Redshift in Our New Data Warehousing on AWS Class
As you continue to use data to help drive your mission forward, finding a way to simply and cost-effectively leverage analytics is becoming increasingly important. That is why we’re excited to announce the upcoming availability of Data Warehousing on AWS, a course that helps you use the AWS Cloud as a platform for data warehousing solutions.
Data Warehousing on AWS is a new three-day course designed for database architects, database administrators, database developers, and data analysts/scientists. It introduces you to concepts, strategies, and best practices for designing a cloud-based data warehousing solution using Amazon Redshift. The course demonstrates how to collect, store, and prepare data for the data warehouse by using other AWS services such as Amazon DynamoDB, Amazon EMR, Amazon Kinesis, and Amazon S3, and how to use business intelligence tools to perform analysis on your data. Organizations that are looking to get more out of their data by implementing a data warehousing solution, or expanding their current data warehousing practice, are encouraged to sign up.
Accessing the Courses
These classes (and many more) are available through AWS and our Training Partners. Find upcoming classes in our global schedule or learn more at AWS Training.