Opinion

Are Cloud-based, Serverless Database Platforms Challenging the Traditional On-premises Monoliths?

Humans have been storing information systematically for a long time. From the days when businesses used punch cards for input, output and data storage, to today's on-demand, auto-scaling configurations where a database automatically starts up, shuts down and scales capacity to match an application's needs, the technology has gone through a lot of churn. Today's 'serverless computing' lets enterprises run databases in the cloud without having to manage any instances. Isn't that amazing?
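To make that 'starts up, shuts down and scales on its own' idea a little more concrete, here is a minimal sketch of provisioning an Aurora Serverless cluster with the AWS SDK for Python (boto3). The cluster identifier, credentials and capacity numbers are illustrative assumptions rather than a production recipe; capacity is expressed in Aurora Capacity Units (ACUs) and adjusted automatically within the bounds you set.

```python
import boto3

rds = boto3.client("rds", region_name="us-east-1")

# Create a MySQL-compatible Aurora cluster in serverless mode: there are no
# database instances to manage, and compute scales between the ACU bounds.
rds.create_db_cluster(
    DBClusterIdentifier="demo-serverless-cluster",  # hypothetical name
    Engine="aurora",
    EngineMode="serverless",
    MasterUsername="admin",
    MasterUserPassword="change-me",                 # placeholder credential
    ScalingConfiguration={
        "MinCapacity": 2,               # scale down to 2 ACUs under light load
        "MaxCapacity": 16,              # scale up to 16 ACUs under heavy load
        "AutoPause": True,              # shut compute down entirely when idle...
        "SecondsUntilAutoPause": 300,   # ...after five minutes of inactivity
    },
)
```

Once the cluster pauses, compute stops being billed; the next connection wakes it up again, which is exactly the 'automatically starts up and shuts down' behaviour described above.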

High-profile workloads that play a critical role in engaging and retaining customers, once hosted on-premises, are today migrating to serverless environments, giving users far greater flexibility. This migration lets them replace legacy platforms, with their performance challenges and clumsy licensing, with cloud-based, highly scalable solutions where the burden of managing infrastructure is minimised. Such services also eliminate the need to manage storage, freeing up time and budget for business innovation.

Cloud-based, serverless databases have certainly waged war on traditional on-premises databases. While it is not easy for an enterprise to make this switch, even at the end of a product's lifecycle, the former promises both ease of use and ease of management, which makes it a compelling option.

At AWS re:Invent 2018, held in Las Vegas late last month, I got an opportunity to interact with Anurag Gupta, Vice President for Analytics, Aurora and Relational Database Services at AWS, about how data has moved from being merely transactional to becoming an asset so valuable that it is called the new oil, new gold or new currency. Anurag, who has been involved in designing AWS Aurora, a MySQL- and PostgreSQL-compatible relational database built for the cloud, also spoke about the changing value of data, the future of serverless computing and how it will directly impact customer experience.

Below are excerpts from the conversation:

DynamicCIO (DC): What's your take on data growth, data quality, trust and the other key factors that are increasingly becoming directly proportional to an organisation's growth?

Anurag Gupta (AG): Data grows approximately 10x every five years, and that has been true for the last three decades. Any data platform lives for about 15 years, which means it has to survive three of those 10x cycles, roughly 1000x growth, before you replace it with a new platform. Most technology leaders miss this calculation and end up planning for just two to three years; by the time they finish the implementation, it is already time for a technology refresh. Meanwhile, the meaning of data is changing too. There was a time when the business transactions inside your ERP were the data. Now, that is just one part of it.

(Anurag paused for a while. There was a need to establish the facts through an anecdotal example. And what better example could there be than Amazon.com itself!)

AG (continues): I'd like to take up an inside view of how data acts as the new fuel at Amazon.com. Think about Amazon Echo and Amazon Alexa. They look like any other device we use, but what's interesting about them is the feedback loop, created by the analysed data, about what questions you ask, which accents the device wasn't able to understand, and so on. The device gets better because of that data processing in the background.

Now, think about an Amazon Go store where, unlike in a normal store, we know the customers pretty intrinsically: what they picked up, what they put back, whether they looked at the front and back of a product, and so on. It's almost like mapping the digital journey people take through a website on the path to purchase and applying that to the physical world.

Yet another interesting example is Amazon Prime's one-hour delivery. It generally takes an hour to get off your couch, go to the store and come back just to buy an item. Now imagine that item being delivered within the hour by someone who makes six other deliveries along the same route in the same timeframe. Isn't that extraordinary?

There are organisations that now collect data from your social exhaust to understand your preferences and integrate them into their systems to customise your experience. That's like data dancing to your tune. Not only the tenets of data but also its applications have changed. Companies that leverage data to enrich the experience for their customers are the most successful ones today.

From the stage when data was transactional to now, when it creates valuable relationships and interactions, there has been a great deal of evolution. The key is to collect data in a way that customers appreciate and value, not in a way that intrudes into their personal space. As long as organisations use that data to simplify a customer's life, concerns about data privacy or customer privacy don't arise.

The discussion so far wasn't about the nuts and bolts of technology. With just two or three key examples from within Amazon, Anurag was able to explain how data has moved from recording transactions to creating value. He was of the opinion that 'technology is the servant of experiences'. I was now intrigued to know how the new data platforms have evolved to make these experiences better and more engaging.

DC: How are enterprises deriving business value from data by providing an enhanced customer experience? What has changed from the past?

AG: I strongly feel that the key tenets, reporting, analysis, data modelling and planning, are the same as they were earlier; they won't really go away. What I seriously recommend to enterprise data managers is that they look at the world around them and ask: what would it look like if a company built a relationship with its customers as if it really understood them, like a best friend? That's the shift we need to adopt. You need to graduate to a level where data helps you make customer interactions as intrinsic and as deep as those between best friends. The assumption underneath is that if you provide a better experience, you gain a greater share of knowledge and you gain happier customers who then choose to stick with you.

We were back to yet another example from Amazon.com, mostly because it was a convenient and convincing one.

AG (continues): While customers do care about aspects of a website such as page-load and refresh times, there's another key aspect to it: providing sensible recommendations once you have got to know the customer. For example, if you are ordering a book, it makes sense to recommend similar books by the same author, or perhaps by other authors who write similar material. But if you are ordering a 65-inch TV, you probably don't want me to suggest yet another 65-inch TV, because you don't buy multiple TVs. That's where the value of data has changed, and that's where modern database platforms play a critical role.

That brought us to the nuts and bolts of the technology. It was now time to get to the core of how momentum has started shifting from monolithic on-premises databases to more elastic, cloud-based solutions.

DC: What was the thought behind AWS Aurora? Why do you think that was required?

 AG: It’s necessary to understand a few key things here. Firstly, there’s a general orientation towards reinvention. So, if I ask myself: “The time when databases were first being created and made available, (and AWS wasn’t available), to now, how different has it been? How would the design be different keeping in mind the growth in business complexities and diversities? As a technology creator, it is important to challenge the rules that have been established because they may (or may not) be applicable/relevant in today’s context. Secondly, if you think about the cost of providing databases to people, there’s a portion of it that’s the hardware, the cost of which has continuously declined. Then there’s the cost of managing the environments through engineering. By applying managed services, that cost can also be brought down. What hits hard is the software license cost, which escalates at every renewal.

How can we help our customers overcome this issue?

We advocate that they use open source. But there is a huge gap between the ease of use of open source and the availability and performance of a commercial database.

Can customers get something that provides the best of both worlds?

That’s where we identified the gap. There was an opportunity to really bring in a lot innovation here that wasn’t thought of earlier. It brought in the additional energy in the database industry where more people believed that something was possible and they started doing that ‘something’ as well. That was the genesis behind AWS Aurora.

But the fact is that users are still sceptical about the availability, resiliency and robustness of open-source databases, and are therefore more inclined towards commercial databases like Oracle, which creates a dependency of sorts that is hard to switch away from. Why would a user want to migrate? How can AWS Aurora help customers overcome the challenges they face with traditional databases? These were real questions that needed answers.

DC: Do you think these new innovations have captured customer challenges well enough to offer a better solution?

AG: One of the things we do is collect feedback all the time. Our primary goal is to listen to what customers need that we aren't yet providing, then go back and build it. Last year at AWS re:Invent, customers told us they wanted disaster recovery with a very low recovery time objective (RTO) and recovery point objective (RPO). This year we announced global databases, where replication lag between regions across the United States is in the 100 to 200 millisecond range. That's a lower number than most people can even imagine. It may not be as good as real time, but it is close, and the amount of data at risk is minuscule compared with losing a couple of days' worth. Those are the kinds of customer concerns we try to solve through product innovation. Another example: customers are keen to understand how data will flow under the new GDPR guidelines, and how they should manage lineage and all the compliance issues. The challenge for us, as a provider of technology, is to solve these issues not only for the most complex customers but also for the individual developer running a start-up.

Does the individual developer-based start-up not want security?

Of course, it does.

Does the large, sophisticated company not want simplicity?

Of course, it does.

So the challenge for us, as a technology company, is: how do we provide both?

Last year we announced Aurora Serverless, and this year we announced APIs for Serverless. The fact is that developer-based start-ups are not the same as enterprise companies. So even if Serverless is primarily a 'cost simplicity' statement, it paints with a broad brush in terms of the value delivered.
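The 'APIs for Serverless' mentioned here appear to refer to the Data API for Aurora Serverless, which lets an application run SQL over HTTPS instead of holding open database connections. The sketch below, again using boto3, shows the general shape of such a call; the cluster ARN, Secrets Manager ARN, database name and query are hypothetical placeholders, not details from the interview.

```python
import boto3

# With the Data API the database is reached like any other web service:
# no drivers, no connection pools, just an HTTPS call signed with IAM.
client = boto3.client("rds-data", region_name="us-east-1")

response = client.execute_statement(
    resourceArn="arn:aws:rds:us-east-1:123456789012:cluster:demo-serverless-cluster",  # hypothetical
    secretArn="arn:aws:secretsmanager:us-east-1:123456789012:secret:demo-db-creds",    # hypothetical
    database="demo",
    sql="SELECT id, title FROM books WHERE author = :author",
    parameters=[{"name": "author", "value": {"stringValue": "Jane Doe"}}],
)

for record in response["records"]:
    print(record)
```

Because each call is stateless, the same interface serves a lone developer's start-up and a large enterprise alike, which is part of the breadth of value being described.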

DC: Do you think enterprises, at some stage, will seriously look at the serverless database option as an alternative?

AG: Yes, I strongly believe so. If you think about it, we are increasingly taking layers of the stack and making them look more like the way people already do programming. In web programming, you work against a sea of resources; you don't target a particular machine when you submit an order. But somehow, in the enterprise world, that's exactly what we do. When you interact with WhatsApp, it's just an API, so why shouldn't the rest of my stack look like that? Why is the consumer world so much easier to interoperate with than the business world, where I pay money?

DC: A natural question arises at this stage: is Aurora really mature enough to do the heavy lifting and handle high-end functionality, and will that make it the choice of customers looking to switch from commercial databases?

AG: We have several very high-end customers who have already migrated to AWS Aurora; Expedia, Ticketmaster, Netflix, Dow Jones and Capital One are some examples. Amazon.com aims to migrate fully from Oracle databases to Aurora by 2020, and during the high-traffic season from Thanksgiving to New Year these workloads have performed very well on Aurora. These are customers that would not make such a technology shift if there were no incentive for them, including a better experience for their own customers.

As we came close to the end of our discussion, there were still a few questions left to establish the case for an open-source, scalable, dependable database. One of those issues, and the biggest drain on resources in any organisation, is managing the mammoth piece of technology called the database. The complexities grow at every stage. How does AWS Aurora address those challenges?

DC: From a manageability point of view, what does Aurora offer? What feedback do you get when you talk to clients about the manageability of the database?

AG: We have been in managed databases for quite some time, and Aurora, of course, is the engine behind that. We do patching, provisioning, failovers, backups, restores, scaling of storage and compute, monitoring and the whole nine yards of database management. Those are all things that just come with the base platform. And because Aurora has its own custom storage layer, we can scale up, scale down and rebalance I/Os without any human involvement.

Then there is the advantage of fleet-wide telemetry: we don't look at individual queries but at the shapes of queries and how many of each are run. We are not trying to optimise for some benchmark; we are trying to optimise for what our customers are actually doing, and since we have visibility into that, we can make those corrections. Additionally, since we have so much data coming in, we can start moving towards machine learning models that decide and predict what people should use, removing the need for someone to do those things on their own. Gradually, that's where we will move.
