
How do you balance ever-increasing amounts of data with the need to optimise operational costs?


Ann Schlemmer, CEO of Percona, a leader in open source database software, support and services.


Data has become one of the world’s most valuable resources. Companies create more data every day around and for their customers – according to IDC, there will be 175 zettabytes of data worldwide by 2025. To put that figure into context, a zettabyte is a trillion gigabytes. Now multiply that by 175.

David Reinsel, Senior Vice President at IDC, describes the staggering figure this way: “If one were able to store 175ZB onto Blu-ray discs, then you’d have a stack of discs that can get you to the moon 23 times. Even if you could download 175ZB on today’s largest hard drive, it would take 12.5 billion drives. And as an industry, we ship a fraction of that today.”

However, while banks are usually at the cutting edge of technology, are they actually leading the way when it comes to working with data? According to McKinsey, the majority of banks are at the beginning of their advanced analytics journeys – 75 percent of banks surveyed have only just started experimenting with advanced analytics deployments.

For banks, keeping up with growing amounts of data and managing the cost of running those operations remains a core challenge. To improve operational efficiency and make those experiments successful, bank IT teams have to look at their approach to handling data.

Why is there a problem today?

For decades, customer interactions with banking services relied on in-person visits to a branch or a trip to the ATM. The growth of online banking, mobile banking and mobile apps – plus the introduction of Open Banking and API integrations that share customer data with third-party service providers – has increased the volume of transactions and touchpoints that bank IT systems have to support. Rather than checking your account once a month in a branch, you can now interact with your account at any time, using any device, at far greater levels of detail. Each transaction then creates multiple data points, from entries in customer accounts through to service use and anti-fraud measures.

Alongside creating this data, banks and financial services organisations have further challenges to deal with. They have long-term compliance requirements to meet, and complex high availability needs too: services must stay available 24/7, and organisations also have to run effective disaster recovery operations. All of these deployments have to be supported, yet every company faces pressure to lower its operational infrastructure costs. Add in the economic downturn, and the need to reduce operating costs becomes even more pressing.

To compete successfully in this crowded, highly regulated market, fintechs need to know what is happening in their business as it happens, so they can respond with speed and relevance. Given the complex challenges banks face around changing customer behaviours, regulatory change, and data storage and quality, banks and fintech companies have to work smarter with their data.


What approaches can you take?

Fortunately, there are steps banks and financial institutions can take to optimise their operational costs around data.

The first area is cloud. Many companies use cloud services as part of their overall strategy, and public cloud providers generally let users pay for what they use rather than over-provisioning. However, from a long-term cost perspective, cloud services may not be the right option for every application. Sarah Wang and Martin Casado of Andreessen Horowitz found that global cloud spend has passed $100 billion, and that many companies could actually save on operational costs by running their own data centres and infrastructure, even accounting for the cost of the teams needed to run them.

There is no one-size-fits-all model for how to support applications, and the same applies to data. Financial institutions appreciate the elasticity of the cloud, but most often focus primarily on using it to scale up. However, these institutions can cut up to one-third of their costs simply by “rightsizing” – matching infrastructure spending to actual demand.

To check that you have the right approach in place, start by auditing your systems and establishing what you really need and what is not used. Some companies find that they have additional workloads and database implementations that can simply be turned off, particularly around testing.
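As a minimal sketch of what that audit can look like in practice – assuming a MySQL-compatible database with performance_schema enabled and the mysql-connector-python client, and using placeholder connection details – the following flags tables that have seen no I/O since the last server restart:

```python
# Sketch: flag tables with no recorded I/O since the last MySQL restart.
# Assumes MySQL 5.6+ with performance_schema enabled and the
# mysql-connector-python package; connection details are placeholders.
import mysql.connector

UNUSED_TABLES_QUERY = """
    SELECT object_schema, object_name
    FROM performance_schema.table_io_waits_summary_by_table
    WHERE count_star = 0
      AND object_schema NOT IN
          ('mysql', 'performance_schema', 'information_schema', 'sys')
    ORDER BY object_schema, object_name
"""

conn = mysql.connector.connect(
    host="db.example.internal",  # placeholder host
    user="audit_user",           # a read-only account is enough
    password="...",              # use a secrets store in practice
)
cursor = conn.cursor()
cursor.execute(UNUSED_TABLES_QUERY)
for schema, table in cursor.fetchall():
    print(f"No I/O recorded since restart: {schema}.{table}")
cursor.close()
conn.close()
```

Because these counters reset when the server restarts, treat the output as a starting point for investigation over a representative period rather than a definitive list of what to switch off.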

Alongside removing old and unnecessary workloads, consider how to consolidate the workloads you keep. In the past, you might have split out additional replicas or cluster nodes to process specific types of workloads, such as reporting. If demand has since decreased, you may no longer need them: consolidate them now and add them back later if needed.


From a cloud perspective, you can pick and choose the instances that you run based on how much compute, storage and RAM your images actually need. Default machine templates can make it easier to get new services and applications up and running, but those defaults may not be what you actually need. Moving to smaller instances decreases the cost of each machine you create and run over time, reducing operating costs considerably.
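As an illustration of how such a review can be automated – assuming AWS with the boto3 library and configured credentials, and noting that the 20 percent CPU threshold and two-week window here are illustrative rather than prescriptive – a sketch like the following flags instances that may be candidates for a smaller size:

```python
# Sketch: flag EC2 instances whose average CPU suggests they are oversized.
# Assumes AWS credentials are configured and boto3 is installed; the 20%
# threshold and two-week window are illustrative, not prescriptive.
from datetime import datetime, timedelta, timezone
import boto3

ec2 = boto3.client("ec2")
cloudwatch = boto3.client("cloudwatch")
end = datetime.now(timezone.utc)
start = end - timedelta(days=14)

for reservation in ec2.describe_instances()["Reservations"]:
    for instance in reservation["Instances"]:
        instance_id = instance["InstanceId"]
        stats = cloudwatch.get_metric_statistics(
            Namespace="AWS/EC2",
            MetricName="CPUUtilization",
            Dimensions=[{"Name": "InstanceId", "Value": instance_id}],
            StartTime=start,
            EndTime=end,
            Period=86400,          # one datapoint per day
            Statistics=["Average"],
        )
        points = stats["Datapoints"]
        if not points:
            continue
        avg_cpu = sum(p["Average"] for p in points) / len(points)
        if avg_cpu < 20.0:  # illustrative rightsizing threshold
            print(f"{instance_id} ({instance['InstanceType']}): "
                  f"{avg_cpu:.1f}% average CPU - candidate for a smaller size")
```

CPU is only one signal; memory, storage throughput and seasonal peaks should feed into the same review before any instance is resized.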


In addition, hoarding digital data can be expensive. While you may need copies of customer data for resiliency, redundancy and compliance reasons, you may not need all of those copies forever. Reviewing your architecture for unnecessary copies can help you cut the overall amount of data you store without jeopardising services or compliance requirements, and can lead to big savings.
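As one hedged example of acting on that review – assuming backup copies sit in an Amazon S3 bucket and using boto3, where the bucket name, prefix and 90-day window are placeholders that must be matched to your own compliance obligations – a lifecycle rule can expire ageing copies automatically:

```python
# Sketch: expire ageing backup copies automatically with an S3 lifecycle rule.
# Assumes boto3 with AWS credentials configured; the bucket name, prefix and
# 90-day window are placeholders - retention must match your compliance rules.
import boto3

s3 = boto3.client("s3")
s3.put_bucket_lifecycle_configuration(
    Bucket="example-bank-backups",  # placeholder bucket
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "expire-old-backup-copies",
                "Filter": {"Prefix": "nightly-backups/"},
                "Status": "Enabled",
                "Expiration": {"Days": 90},  # keep only 90 days of copies
            }
        ]
    },
)
```

Automating retention this way means the saving recurs every month, rather than depending on someone remembering to clean up.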

Hitting the right notes with performance tuning

One area that often gets overlooked is simple database tuning. With many companies relying on Database-as-a-Service options and outsourcing their workloads to the cloud, knowledge of how queries work and where they can be optimised is often lost. Database optimisation can speed up your instances and processes significantly. By optimising your code and fine-tuning your database infrastructure, you can improve the flexibility and performance of your database environment, especially if you’re using cloud resources.
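As a small, hypothetical example of the kind of win tuning delivers – assuming a MySQL-compatible database and the mysql-connector-python client, with an invented transactions table and query – checking a hot query’s execution plan often reveals a missing index:

```python
# Sketch: a classic tuning win - checking whether a frequent query uses an
# index. The schema, table and query are hypothetical examples; assumes a
# MySQL-compatible database and the mysql-connector-python package.
import mysql.connector

conn = mysql.connector.connect(
    host="db.example.internal",  # placeholder connection details
    user="dba",
    password="...",
    database="bank_app",         # placeholder schema
)
cursor = conn.cursor(dictionary=True)

# A hypothetical hot query from an account-statement service.
cursor.execute(
    "EXPLAIN SELECT * FROM transactions "
    "WHERE account_id = 42 AND created_at >= '2023-01-01'"
)
for row in cursor.fetchall():
    # type == 'ALL' means a full table scan - a strong hint to add an index.
    print(row["table"], row["type"], row["possible_keys"], row["rows"])

# If the plan shows a full scan, a composite index usually fixes it:
# cursor.execute(
#     "CREATE INDEX idx_tx_account_created "
#     "ON transactions (account_id, created_at)"
# )
cursor.close()
conn.close()
```

Turning a full table scan into an index lookup on a table of millions of rows is exactly the kind of change that shrinks both response times and the instance sizes needed to support them.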

Although it may take a few days or weeks, tuning and optimising your environment eases your workload and can substantially decrease your overall costs. With proper database management, it’s possible to cut your bills in half. With so many database instances in place, and more being added all the time, this specialist knowledge can pay off incredibly quickly. Whether you retain your own database administrators (DBAs) or bring in outside experts or managed service providers to help, that expertise can have a powerful impact.

Alongside looking at database management skills, you can also see a fast return from investing in your employees by expanding their skill sets. For example, involving DBAs in development or system administration duties helps them become more well-rounded as they experience the challenges and processes that other teams face. Equally, giving your application developers and DevOps teams some grounding in database theory and management can remove some of the fear that can exist around managing database instances.

In addition, training employees to step into site reliability engineering (SRE) roles can build teams that better understand the touchpoints between the business units of a financial institution. This insight across business processes not only makes it easier to build the applications and services that customers want, it also flags up areas where cost savings can be made.

As banks make more of their data and look to implement more advanced analytics services for customers, their IT teams will have to manage how that data is created, used, stored and eventually removed. The whole data lifecycle has to be considered. To make savings here, banks can look at how efficient their operations are today and where they can improve their processes in future.