Like many other industries, organisations in the public sector have been keen to make use of the flexibility offered by cloud computing, but are now observing unpredictable and rising costs, much of which can be mitigated through careful planning and on-premises infrastructure.
Government guidance now recommends choosing the most appropriate solution for each problem, rather than a one-size-fits-all approach of shifting every application to the cloud.
In this blog we will explore some of the challenges encountered by public sector organisations, and the steps they can take to ensure cost-effectiveness, scalability and compliance.
Cloud computing has taken the world by storm over the past two decades. The scalability, flexibility, and on-demand nature of public clouds are unmatched, but at what cost?
Public clouds provide an easy way to match inconsistent computing demands and usage, yes, but what about storage? Storage doesn’t ebb and flow like the demands placed on CPUs and memory by high-traffic events or end-of-month billing runs. Storage is persistent: it needs to be retained and can’t simply be switched off when not in use. Even when data is not being accessed, there are still charges for holding onto it.
All public cloud storage offerings have multiple tiers and methods of moving data between them, so what’s the catch? Whilst putting data into cloud storage is largely free, making use of it, or getting it back out, has variable costs which are hard to predict.
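To make this concrete, here is a minimal sketch of a cost model. The per-GB prices are hypothetical placeholders, not quotes from any specific provider, but the shape of the problem holds: the same stored dataset produces very different bills depending on how much of it is read back out in a given month.

```python
# A minimal sketch comparing predictable storage charges with variable
# retrieval/egress charges. All prices are hypothetical placeholders,
# not rates from any specific cloud provider.

STORAGE_PRICE_PER_GB_MONTH = 0.023   # hypothetical "hot tier" rate
EGRESS_PRICE_PER_GB = 0.09           # hypothetical data-transfer-out rate

def monthly_bill(stored_gb: float, egress_gb: float) -> float:
    """Return the estimated monthly bill in currency units."""
    return stored_gb * STORAGE_PRICE_PER_GB_MONTH + egress_gb * EGRESS_PRICE_PER_GB

# The same 100 TB dataset costs very different amounts depending on
# how much of it is retrieved in a given month.
stored = 100_000  # 100 TB expressed in GB
for egress in (0, 10_000, 100_000):
    print(f"egress {egress:>7} GB -> bill {monthly_bill(stored, egress):>10.2f}")
```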
Predictability is king when it comes to running IT infrastructure, but with ever-tightening budgets, unexpected bills can cause significant headaches for any organisation. This is felt especially acutely in the public sector, where budgets are tightly controlled and there is a heightened sense of responsibility to the communities being served.
On top of that, there are the features and functionality needed to ensure data is stored safely and for the correct length of time, which can drive the cost of cloud computing higher still. It is important to find a balance between what is required and how it can be delivered most cost-effectively. A self-hosted or co-located system can provide all of these features with far more predictable costs than a public cloud.
Public sector bodies, including government agencies and healthcare systems, have to store data safely for many, many years. In fact, the data needs to be retained well beyond the lifetime of some of the hardware that is initially deployed. That’s why it is important to select a storage system that not only keeps data safe using replication or erasure coding, but can also go through multiple hardware refreshes over its lifetime without risking the safety or availability of the data.
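As a rough illustration of the trade-off between those two data protection approaches, the sketch below compares the raw-capacity overhead of 3-way replication with an 8+3 erasure coding scheme. Both figures are illustrative choices, not recommendations.

```python
# A minimal sketch of the raw-capacity overhead of two data protection
# schemes: N-way replication and k+m erasure coding. The parameters
# (3-way replication, 8+3 erasure coding) are illustrative examples.

def replication_overhead(copies: int) -> float:
    """Raw bytes stored per usable byte with N-way replication."""
    return float(copies)

def erasure_coding_overhead(k: int, m: int) -> float:
    """Raw bytes stored per usable byte with k data + m parity chunks."""
    return (k + m) / k

print(f"3-way replication:  {replication_overhead(3):.2f}x raw capacity")
print(f"8+3 erasure coding: {erasure_coding_overhead(8, 3):.2f}x raw capacity")
# 3-way replication tolerates two device failures, while 8+3 erasure
# coding tolerates three, yet the erasure-coded pool needs far less
# raw disk for the same usable space.
```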
The more complex any system is to manage, the higher the operational costs will be. Administrators will need specialised training and have to be on call to handle any issues. Users should therefore look for systems that are highly available and self-healing, so that inevitable hardware failures are handled transparently and failed disks can be replaced in batches, rather than through immediate, reactive maintenance.
As the population grows, the number of people a public sector organisation has to serve will grow with it. It is therefore important that a storage system lets organisations transparently grow their storage capacity without downtime or disruption.
The opposite is also useful: when a project finishes, a cluster can be scaled down, again non-disruptively, so that the hardware can be reused elsewhere.
Some of the datasets held by public sector organisations also carry requirements to prove that stored data is still original and has not been tampered with. For example, consider the records of births and deaths held by a government: this data should never change, and when recalled it must be exactly as it was originally recorded. Similarly, in law enforcement scenarios, digital evidence such as body camera footage or crime scene records needs to be preserved through trial and post-trial.
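One common way to demonstrate that recalled data matches what was originally recorded is to store a cryptographic digest at ingest time and recompute it on retrieval. This is a general technique rather than a feature of any particular storage product; the sketch below uses SHA-256 from Python’s standard library, and the file name is purely illustrative.

```python
# A minimal sketch of tamper-evidence via cryptographic hashing: record
# a SHA-256 digest when data is first stored, then recompute it on
# recall to show the bytes are unchanged. The file path is illustrative.

import hashlib

def sha256_of(path: str, chunk_size: int = 1 << 20) -> str:
    """Stream a file through SHA-256 and return its hex digest."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

recorded_at_ingest = sha256_of("bodycam_2024-03-01.mp4")
# ...years later, before the evidence is presented, verify it matches:
assert sha256_of("bodycam_2024-03-01.mp4") == recorded_at_ingest, "data changed!"
```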
Snapshots, point-in-time versions of a volume created using metadata, are one way to ensure that original data can be read back exactly as it was written. For object storage, object versioning is the equivalent feature: when an object is overwritten, the older copy is transparently retained so that it can be retrieved at a later date.
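As a sketch of how object versioning behaves in practice, here is an example against an S3-compatible endpoint using boto3. The endpoint URL, bucket name, and object key are placeholders, and credentials are assumed to come from the environment.

```python
# A minimal sketch of object versioning against an S3-compatible store,
# using boto3. Endpoint, bucket, and key names are placeholders.

import boto3

s3 = boto3.client("s3", endpoint_url="https://objects.example.internal")
bucket = "public-records"

# Enable versioning: overwrites now retain the previous copy.
s3.put_bucket_versioning(
    Bucket=bucket,
    VersioningConfiguration={"Status": "Enabled"},
)

s3.put_object(Bucket=bucket, Key="register/entry-42", Body=b"original record")
s3.put_object(Bucket=bucket, Key="register/entry-42", Body=b"accidental overwrite")

# Every historical version remains retrievable by its version ID.
versions = s3.list_object_versions(Bucket=bucket, Prefix="register/entry-42")
for v in versions["Versions"]:
    obj = s3.get_object(Bucket=bucket, Key=v["Key"], VersionId=v["VersionId"])
    print(v["VersionId"], obj["Body"].read())
```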
Traditionally, tape backups have been used to isolate an immutable copy of a dataset, but with the increasing use of digital evidence, tape recall times are becoming challenging.
All of these challenges can be addressed in the public cloud, but is that the most cost-effective approach? Our recent whitepaper shows that for certain use cases, a co-located storage solution adjacent to a public cloud can provide savings of 2-3x or more, even when outsourced as a fully managed service. Find out more below: