By Graham Jarvis, Freelance Business and Technology Journalist
Early in 2019 Brian Chidester, Senior Industry Lead for Public Sector at OpenText, wrote about the trends that will affect the public sector this year. In his blog for the company, titled ‘2019 technology trends driving IT modernisation in the Public Sector’, he says that IT security is of paramount importance to the public sector, alongside the need to improve service delivery and achieve cost savings.
Referring to a Gartner report about understanding cloud adoption in government, he comments: “As governments continue to migrate to the cloud, organisations must ensure their shift to the cloud is both cost-effective and secure. This may mean reconsidering how to think about the cloud, improve security and leverage implementation options.”
In the United States, this all amounts to a change in strategy from being Cloud First, to becoming Cloud Smart. “In 2019, shifting from a Cloud First to a Cloud Smart focus will help public sector organisations make sound decisions that will drive modernisation. While ‘Cloud Smart’ is a U.S. government initiative, it focuses on the integration of cloud security, procurement and workforce strategies — it’s a logical next step for governments around the globe. Thinking through the details will help agencies create better and more flexible strategies for implementation, security and acquisition.”
Think strategically
David Trossell, CEO and CTO of Bridgeworks, agrees that it’s important to think strategically about what’s required, because the push towards digital government through digital transformation can still lead to mistakes being made. These mistakes can also be quite costly: the larger the project, the greater the cost and the potential reputational damage. However, by being smarter and more flexible, it should be possible to avoid these consequences.
Cloud computing is, therefore, the backbone of this digital agenda because it’s seen as a way to reduce capital expenditure in favour of operational expenditure. Government and public sector organisations have also learnt to accept cloud computing, despite past concerns over security. While bodies such as the DVLA and even the NHS are still prone to hacking attacks and data breaches, the fear of cloud computing has largely faded as the fear of the unknown has dissipated. Nevertheless, this doesn’t mean that the threat has gone away.
He adds: “Security is not the responsibility of the cloud provider because that still resides with the user. Seeding data is a big problem for large data sets. Many are still using the cloud as another storage level; or using it to provide that back-up and an offsite disaster recovery facility.” There’s also the risk that government and public sector organisations could find themselves locked in by some of their cloud service providers – despite the need for flexibility.
Broader perspectives
Deloitte also comments in its article, ‘Tech Trends 2019: UK Government and Public Services Perspective’, that government and public sector IT perspectives are growing broader and more complex, to the point that IT leaders in the sector are under pressure to use the latest technological advances while also learning from past decades. It says cloud, analytics and the creation of a digital experience have become the new normal, with potential still to be leveraged within the sector.
The article comments: “The importance of technology to the business continues to increase. Teams should evolve their capabilities and practices to take advantage of the mechanisms to improve delivery, transforming their core as well as the public-facing services, adopting agility across the enterprise.”
It also cites the role that artificial intelligence is playing in the public sector, referring to how it is being used in the UK’s National Health Service: “The NHS is using AI and robotics to help put doctors back on the front line, instead of performing back office administrative duties. Hospitals have begun by exploring how to automate parts of referral administration, speed up triage processes and to calculate reimbursement. Initial proofs of concept are already producing strong results, suggesting AI has a key role to play in the future of the NHS.”
Reflecting on the push for digital transformation, it adds: “To make the most of technology adoption, public bodies are finding useful lessons in the private sector — and vice versa. Cashierless stores could serve as models for care exchanges. The NHS can use AI-enabled verification of eligibility that is now becoming common in the insurance sector. Public bodies could use AI and other digital techniques to screen recruits like the private sector is increasingly doing.”
Increasing data ingestion
The Internet of Things will also play an increasing role, which OpenText believes will inevitably lead to increasing data ingestion management requirements: “Governments embracing IoT isn’t just about making life better for citizens, it also opens new opportunities for cities. For example, London was recently listed as the top smart city government in the world by the Eden Strategy Institute for their ability to gather, process and act upon data and information. Once cities start ingesting data in a “smart” way, they can continually improve processes and further extend tax revenues — giving citizens more for their money.”
Yet, behind all of this is the need to have a fast, efficient and reliable wide area network. WAN optimisation and SD-WANs are usually seen as the answer to mitigating the impact of latency, packet loss and jitter. However, they quite often don’t live up to the promises made by their vendors.
SD-WAN limitations
Trossell comments: “SD-WANs are the new kid on the block. They are a great tool, and in the right place they are a great asset, but they don’t fix all the network issues when working with the cloud and datacentres. SD-WAN has many advantages over traditional, dedicated network links, as it has the ability to combine low-cost broadband and non-MPLS WAN connections. This includes the 100 Mb/s or greater broadband connectivity that many cities now have. SD-WAN can also segregate different traffic down the most economical connection path, or combine many paths to increase bandwidth.” MPLS is expensive in comparison – a key factor when government and public sector IT departments are still expected to deliver technology that does more for less.
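To make the path-selection idea Trossell describes concrete, here is a minimal sketch of how an SD-WAN-style policy might choose the cheapest link that still meets a traffic class’s requirements. The link names, costs and thresholds are illustrative assumptions, not figures from any real deployment or vendor product.

```python
# Minimal sketch of SD-WAN-style path selection: pick the cheapest link
# that still meets a traffic class's latency and loss requirements.
# All link names, costs and thresholds are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class Link:
    name: str
    cost_per_gb: float   # relative cost of sending traffic down this path
    latency_ms: float    # measured round-trip latency
    loss_pct: float      # measured packet loss

@dataclass
class TrafficClass:
    name: str
    max_latency_ms: float
    max_loss_pct: float

LINKS = [
    Link("mpls",        cost_per_gb=0.90, latency_ms=12, loss_pct=0.01),
    Link("broadband-1", cost_per_gb=0.05, latency_ms=28, loss_pct=0.40),
    Link("broadband-2", cost_per_gb=0.05, latency_ms=35, loss_pct=0.90),
]

def select_path(tc: TrafficClass, links=LINKS) -> Link:
    """Return the cheapest link that satisfies the traffic class's limits."""
    eligible = [l for l in links
                if l.latency_ms <= tc.max_latency_ms
                and l.loss_pct <= tc.max_loss_pct]
    if not eligible:
        # Fall back to the best-performing link if nothing meets the limits.
        return min(links, key=lambda l: (l.loss_pct, l.latency_ms))
    return min(eligible, key=lambda l: l.cost_per_gb)

if __name__ == "__main__":
    voice = TrafficClass("voice", max_latency_ms=30, max_loss_pct=0.5)
    backup = TrafficClass("bulk-backup", max_latency_ms=100, max_loss_pct=2.0)
    print(select_path(voice).name)    # broadband-1: cheapest link within limits
    print(select_path(backup).name)   # broadband-1: any link qualifies, cost wins
```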
He adds: “Nevertheless, SD-WAN does not fix the two biggest factors affecting WAN performance: latency and packet loss (especially if SD-WAN utilises broadband connections). Many think WAN optimisation, which is often part of SD-WAN products, will solve these issues.”
“However, this can only work with compressible data – any data that is already compressed, deduped or encrypted, which should be the default for all public services, cannot benefit from WAN optimisation. So, for most organisations WAN optimisation will have no real effect, because it merely masks the effect of latency and packet loss by caching data locally. The only answer to poor WAN performance is to layer WAN data acceleration over the top of SD-WAN.”
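Trossell’s point about compressible data can be checked with a few lines of Python: plain text shrinks considerably, while data that is already compressed, or encrypted (approximated here with random bytes), does not shrink at all. This is an illustrative sketch, not part of any vendor’s product.

```python
# Illustrative check of why a compression stage cannot help with data that
# is already compressed or encrypted. Random bytes stand in for ciphertext,
# which is statistically indistinguishable from noise.

import os
import zlib

def ratio(data: bytes) -> float:
    """Compressed size as a fraction of the original size."""
    return len(zlib.compress(data, level=9)) / len(data)

plain = (b"Government and public sector organisations generate highly "
         b"repetitive log and document data. " * 1000)
already_compressed = zlib.compress(plain, level=9)
pseudo_encrypted = os.urandom(len(plain))   # stand-in for encrypted data

print(f"plain text:          {ratio(plain):.2f}")              # well below 1.0
print(f"already compressed:  {ratio(already_compressed):.2f}") # ~1.0, no gain
print(f"encrypted (random):  {ratio(pseudo_encrypted):.2f}")   # ~1.0, no gain
```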
New approach
“WAN data acceleration approaches the problem from a completely different angle,” claims Trossell, before commenting: “Rather than trying to squish the data down to give the illusion of a faster WAN, it tackles the issues of latency and packet loss and leaves the data alone. Latency is governed by the speed of light, and nobody can change that; the way to improve performance is to make the best use of your pipe.”
He then explains that if you fill a pipe – the network – with large amounts of data using parallelisation, you still have the same latency but can achieve data throughput of up to 98% of the available bandwidth. Packet loss is minimised and handled by using artificial intelligence to adjust the packet size and the number of parallel connections. Government and public sector organisations should therefore support their digital transformation projects with WAN data acceleration, particularly as data volumes are ever-increasing. This doesn’t mean that they have to buy new infrastructure, because much can be achieved with what they already have – including their SD-WANs – by creating a WAN data acceleration overlay.
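A rough way to see why parallelisation helps is the bandwidth-delay product: a single TCP stream’s throughput is capped at roughly its window size divided by the round-trip time, so on a long, high-capacity link many concurrent streams are needed to fill the pipe. The figures below are assumed example values, not Bridgeworks measurements.

```python
# Back-of-the-envelope illustration of why parallel streams help fill a
# high-latency link: one TCP stream is limited to roughly
# window_size / round_trip_time, regardless of how fat the pipe is.
# Link speed, RTT and window size are assumed example values.

LINK_GBPS = 10.0          # WAN link capacity
RTT_MS = 40.0             # e.g. between widely separated DR sites
WINDOW_BYTES = 4 * 2**20  # 4 MiB TCP window per stream

def stream_throughput_gbps(window_bytes: float, rtt_ms: float) -> float:
    """Upper bound on one stream's throughput: window / RTT."""
    return (window_bytes * 8) / (rtt_ms / 1000) / 1e9

single = stream_throughput_gbps(WINDOW_BYTES, RTT_MS)
print(f"one stream:  {single:.2f} Gbit/s of a {LINK_GBPS:.0f} Gbit/s link")

# Add parallel streams until the aggregate approaches the link capacity.
streams = 1
while streams * single < 0.98 * LINK_GBPS:
    streams += 1
aggregate = min(streams * single, LINK_GBPS)
print(f"{streams} parallel streams ≈ {aggregate:.2f} Gbit/s "
      f"({100 * aggregate / LINK_GBPS:.0f}% utilisation)")
```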
SME innovation
Yet there is still a tendency to go for the large, trusted OEMs. Trossell cites the old adage that nobody got fired for buying IBM. He questions whether this still rings true today, given that many smaller vendors are creating the technological innovation required by private, government and public sector organisations.
The UK government is, to a degree, recognising the potential of smaller IT vendors by pushing for 25% of government procurement to go to SMEs. “Much of the innovation around networks, WAN and other areas is coming from the smaller, nimbler companies that can enter a market segment with innovative products that large OEMs would not see as cost-effective to enter,” explains Trossell. Bridgeworks is one of those innovative companies, and it can help government and public sector organisations to become smarter with WAN data acceleration.
Disaster Recovery Tips
David Trossell, CEO and CTO of Bridgeworks, finds that the quicker you can move data, the more options you have: “Cloud back-up is a cost-effective method of providing that offsite data security. However, the SLA for cloud providers is not guaranteed and so you need to play the cost-game against them. It’s cheap to put data into the cloud so seed your initial data to at least 3 different cloud providers.”
He also advises that, while storage costs are relatively low, pulling data out of the cloud is expensive, so doing this should be the exception. By using three different cloud providers you gain flexibility: if one suffers an outage, it should be possible to restore from another provider that’s acting as a disaster recovery site. WAN data acceleration broadens the opportunity to place data in disaster recovery locations that are many miles apart, whereas the traditional way to tackle latency is to place disaster recovery sites relatively close to each other – often within the same circle of disruption.
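The economics of that advice can be sketched in a few lines: keeping seeded copies with three providers is cheap, while a full restore is where the egress charges bite, so you restore from whichever healthy copy is cheapest to pull. All prices and names below are placeholder assumptions, not real provider tariffs.

```python
# Sketch of the multi-provider back-up strategy described above: keep a
# seeded copy with three cloud providers, then restore from whichever
# healthy copy is cheapest to pull. Prices are placeholder assumptions.

from dataclasses import dataclass

@dataclass
class Provider:
    name: str
    storage_gb_month: float   # cost to keep data at rest, per GB per month
    egress_per_gb: float      # cost to pull data out, per GB
    healthy: bool = True

PROVIDERS = [
    Provider("cloud-a", storage_gb_month=0.010, egress_per_gb=0.09),
    Provider("cloud-b", storage_gb_month=0.012, egress_per_gb=0.08),
    Provider("cloud-c", storage_gb_month=0.008, egress_per_gb=0.12),
]

DATASET_GB = 50_000  # example: 50 TB of seeded back-up data

monthly_at_rest = sum(p.storage_gb_month * DATASET_GB for p in PROVIDERS)
print(f"Keeping three copies at rest: ~${monthly_at_rest:,.0f}/month")

def restore_source(providers) -> Provider:
    """Pick the cheapest healthy provider to restore from; egress dominates."""
    healthy = [p for p in providers if p.healthy]
    return min(healthy, key=lambda p: p.egress_per_gb)

# Simulate an outage at one provider and choose where to restore from.
PROVIDERS[1].healthy = False
src = restore_source(PROVIDERS)
print(f"Restore from {src.name}: ~${src.egress_per_gb * DATASET_GB:,.0f} egress")
```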