by Martin James, VP EMEA, Aerospike
Data is crucial in allowing any government or public sector organisation to make informed decisions and drive mission-critical applications. Whether it is improving the supply chain, connecting cities for more efficient traffic flow, or responding swiftly to terrorist threats, real-time data is the linchpin. The trick is to process and manage data so that operational costs, efficiency, and outcomes improve while performance is maintained at scale.
Many organisations rely on legacy systems to manage data rather than embarking on a digital transformation journey to a modern data architecture that offers greater efficiency. The Organising for Digital Delivery report from the Digital Economy Council found that, in 2019, the government faced cumulative annual costs of £2.3bn to maintain legacy IT. This represented almost half of the £4.7bn spent on IT across government that year. The irony is that rather than starting the journey to modernise their data architecture, many public and private sector organisations simply add more servers to legacy systems, adding cost and complexity.
Data is proliferating at an explosive rate as public sector organisations increasingly rely on cloud-native architectures. The amount of data created over the next three years is projected to exceed the amount created over the past three decades, according to IDC's Worldwide Global DataSphere Forecast 2020-2024. UK citizens, accustomed to instant digital experiences in other areas of their lives, expect nothing less when interacting with the public sector. The good news is that it is becoming easier to meet this expectation.
Dealing with existing challenges
It’s important to acknowledge the key challenges that public sector IT departments face. First, it is harder to scale to meet demand if additional data cannot be added to existing workloads efficiently to support increased transactions, analysis, or operations. Second, it is costly to add memory and storage to existing mainframes, and without that extra capacity, the monitoring and automation tools that provide observability and seamless processing remain out of reach. Third, unless costly extra bandwidth is added, legacy technologies remain slow, compromising the processing of high-volume transactions for critical services such as fraud detection.
Modern real-time data platforms meet these challenges and support expanding workloads without impacting performance. As more data becomes available, the modern database must scale to ensure each customer experience is instant and accurate. This must happen in sub-millisecond times, whether for 100 transactions or a billion, to meet customer expectations. To do this, data must be optimised for better movement and storage, and easily managed automation tools must make it simple to add data clusters and replicate data across the data centre ecosystem.
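To make the application side of this concrete, the minimal sketch below uses the Aerospike Python client to write and read a single record against a cluster. The article does not prescribe a specific product or API, so the host address, namespace, and set names here are assumptions chosen purely for illustration; the point is that distribution and replication are the cluster's job, not the application's.

```python
import aerospike

# Connect through any seed node; the client discovers the rest of the cluster.
config = {"hosts": [("10.0.0.10", 3000)]}  # hypothetical seed node address
client = aerospike.client(config).connect()

# A record key is (namespace, set, user key); the cluster decides which nodes
# hold the record and keeps the configured number of replicas in sync.
key = ("citizen_data", "permits", "application-42")  # hypothetical names

# Write and read the record; each call is a single request to the owning node.
client.put(key, {"status": "approved", "updated": "2023-05-01"})
_, _, record = client.get(key)
print(record)  # {'status': 'approved', 'updated': '2023-05-01'}

client.close()
```

The same two calls behave identically whether the cluster holds one node or dozens, which is what allows capacity to be added without rewriting applications.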
Due to compatibility and scaling issues, adding memory to legacy mainframes is costly and complex. A modern real-time data platform that leverages SSDs or flash memory combined with DRAM in a hybrid memory architecture improves performance and delivers a favourable total cost of ownership. Furthermore, a modern real-time data platform must always be available, with fail-safe mechanisms in place to ensure a seamless experience regardless of power outages, weather events, or other issues that would interrupt the performance of many legacy systems.
The process of modernising a data architecture should not be the costly, resource-draining endeavour many CTOs worry about. With careful planning and the right vendor, the impact on ongoing production should be minimal.
An example from the private sector illustrates this well. Based in Paris, Criteo is a global ad tech company that needed to “future-proof” its data architecture to meet growing demand to serve billions of advertisements in real time. Further, to meet its clients’ demands, the ads had to match each prospective buyer’s needs. Re-platforming, the process of moving from one data platform to another, is usually exhausting and expensive, but not in this case. Criteo chose the right vendor, and the move met the intense speed and scale needs of ad tech while saving the company millions. Criteo reduced its server count by 80 percent annually in spite of increased data loads, while dramatically reducing CO2 emissions.
Proving the case for TCO
To test the proposition, Forrester Research worked with us to model a ‘composite’ organisation based on several companies that had previously re-platformed with us. These included an international conglomerate with an extensive user base and a well-known brand. The composite organisation ran several hundred servers and had a team of developers.
After analysis, Forrester found a reduced server footprint of 55% to 75% on average each year. Prior to using the modern real-time data platform, additional servers were continually added to legacy architectures to meet performance expectations. With the modern real-time data platform, the overall number of required servers fell by 50% to 70% in the first year. By the third year, server reductions had risen to 60% to 80% as more existing workloads were migrated. The cost savings ranged from $2.4M to $3.3M over the three-year investment period.
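To make the arithmetic concrete, the back-of-the-envelope sketch below applies those first-year and third-year reduction ranges to a hypothetical fleet. The fleet size and per-server annual cost are illustrative assumptions, not figures from the Forrester study.

```python
# Hypothetical inputs for illustration only; not figures from the study.
LEGACY_SERVERS = 300              # size of the legacy server fleet
ANNUAL_COST_PER_SERVER = 10_000   # fully loaded cost per server, per year (USD)

# Server-reduction ranges reported for year one and year three.
reductions = {"Year 1": (0.50, 0.70), "Year 3": (0.60, 0.80)}

for year, (low, high) in reductions.items():
    saved_low = LEGACY_SERVERS * low * ANNUAL_COST_PER_SERVER
    saved_high = LEGACY_SERVERS * high * ANNUAL_COST_PER_SERVER
    print(f"{year}: {low:.0%}-{high:.0%} fewer servers, "
          f"roughly ${saved_low:,.0f}-${saved_high:,.0f} saved per year")
```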
Embrace change now to provide services for the future
The relentless growth of data will not slow, as more data and more connected devices beget yet more data. Nor will the demand for instant, accurate outcomes diminish; it will only increase. To meet these realities, public sector organisations must do more than add servers. It is imperative to choose a data platform vendor that provides persistent performance at unlimited scale along with a lower total cost of ownership now and in the future.