Cloud computing has become one of the key developments in IT over the past several years, and pretty much everyone is betting that it will become more important in the future.
In fact, it is hard to find anyone who is not a user of cloud services in some way, whether simply as a consumer of software-as-a-service such as cloud-hosted email, or in something more involved such as building your own cloud-based applications or migrating on-premises IT functions to a cloud service provider.
However, many companies have held back from taking the plunge because of fears that cloud could prove to be yet another source of technology lock-in. Those with long memories of the mainframe recall how being too tied to a particular vendor's platform can cost your company dearly, not just in paying for the technology itself, but in losing the flexibility that might have been afforded if a different path had been followed.
These kinds of worries dog IT chiefs to this day, and have possibly chilled the full growth potential of cloud services as organisations feared backing the wrong platform and being left behind by their rivals. It is easy to dismiss such fears, but current trends in the cloud world show why they are justified.
It all started with virtualisation, still the foundation of most cloud platforms. As well as enabling consolidation of servers onto fewer physical hardware nodes, virtualisation meant that applications could be moved around from one host node to another, or theoretically from one data centre to another, if required.
The problem was that different vendors had different ways of packaging virtual machines - and the applications inside them - as image files. VMware had its format, Microsoft had its format, and it looked like never the twain would meet, until common sense prevailed and the industry agreed a standard of sorts in the Open Virtualisation Format (OVF).
But all of a sudden, virtual machines are not the vehicle of choice for deploying and managing applications in a cloud-based infrastructure. Developers have come to regard them as too monolithic and inflexible for so-called 'cloud-native' applications that are capable of dynamically scaling out to meet changing demands.
Currently, containers are the technology of choice, as these are more lightweight than a full virtual machine, typically comprising just the application itself and any dependencies it needs to run. In fact, developers have gone further than this, breaking applications down into specific functions that are encapsulated in containers so they can be scaled up and down independently from other parts of the complete application. This is the so-called microservices approach.
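To make that concrete, a container image is typically defined by a short manifest listing a base image, the application and its dependencies. The sketch below is illustrative only - the file names (app.py, requirements.txt) and base image are assumptions for the example, not details from any product discussed here:

```dockerfile
# Illustrative Dockerfile: the image holds just the app and its dependencies,
# rather than a full operating system image as a virtual machine would
FROM python:3.12-slim          # minimal base layer (hypothetical choice)
WORKDIR /app
COPY requirements.txt .        # dependency manifest (hypothetical file)
RUN pip install -r requirements.txt
COPY app.py .                  # the application itself (hypothetical file)
CMD ["python", "app.py"]
```

Because the image carries only these layers, it starts in seconds and can be replicated cheaply, which is what makes the independent scaling of microservices practical.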
But guess what? Different vendors in the container space have once again come up with rival approaches, such as Docker's Swarm and Google's Kubernetes. These differ not just in how containerised apps are packaged, but in fundamentally different approaches to the way they orchestrate sets of containers to deliver the end result.
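As a rough sketch of what orchestration means in practice, a Kubernetes Deployment manifest declares a desired number of container replicas and leaves the platform to schedule, restart and scale them. The names and image below are hypothetical, chosen purely for illustration:

```yaml
# Illustrative Kubernetes Deployment: declares desired state; the
# orchestrator keeps this many replicas running (names are hypothetical)
apiVersion: apps/v1
kind: Deployment
metadata:
  name: web-frontend
spec:
  replicas: 3                  # change this to scale out or in
  selector:
    matchLabels:
      app: web-frontend
  template:
    metadata:
      labels:
        app: web-frontend
    spec:
      containers:
      - name: web
        image: example/web-frontend:1.0   # hypothetical image name
        ports:
        - containerPort: 8080
```

A Docker Swarm service definition expresses a similar intent in a quite different way, which is exactly the divergence the column describes.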
Whose approach is best? Take your pick. Or rather, back your horse and hope that it emerges as the industry standard rather than its rivals. Then there is the OpenStack Magnum project, which aims to act as an abstraction layer for integrating any container platform into the open source cloud framework.
But the main problem with containers is that they are quite different from the established virtual machine way of deploying and managing applications. Implementing containers for customers that have already deployed a cloud infrastructure based on virtual machines means either making big changes to the existing stack or investing in a new, parallel software stack that may or may not integrate well with what is already in place.
So, choose your platform carefully seems to be the moral of the story, or else be prepared to ditch everything you may have invested in so far if something better comes along. It seems that caveat emptor still rules in the IT industry.