By: Dave Russell, Vice President of Enterprise Strategy at Veeam
New technologies attract a lot of hype. Words such as ‘revolutionary’ and ‘ground-breaking’ have lost their impact through overuse. This culture of overpromising makes technologists and customers alike cynical when they don’t see an immediate or significant impact from new tech deployments. However, there are numerous examples of technologies that were met with scepticism early on but have gone on to become staple parts of the digital economy.
From touchscreen interfaces to the Internet of Things (IoT), this path is so well trodden that Gartner produces its annual hype cycle, which theorises that new technologies move from early adopters’ enthusiasm to inflated expectations before a sense of disillusionment sets in. As understanding of the technology matures, a more realistic judgement of its value can be made as more viable applications are discovered and deployed.
There are many reasons why new technologies can initially flatter to deceive. A technology can simply be executed in the wrong way, possibly because the skills do not yet exist to design solutions and troubleshoot problems. Digital transformation is one such example, where businesses feel held back by a lack of skills to implement new technology, with almost half (49%) of IT decision makers surveyed citing this as a concern according to the Veeam Data Protection Report 2021. It can also be that a technology is simply ahead of its time and the complementary technologies that give it a clear place in the world do not yet exist. Returning to the consumer example of touchscreen devices, the early efforts by Palm and Microsoft to launch personal tablets were flawed by their inability to connect wirelessly to the Internet or sync with PCs and laptops. It was only when wireless technology and cloud computing reached maturity that smartphones and tablets came of age.
Finally, a technology can work perfectly well but not solve a big enough problem to warrant significant investment. That’s why you often hear talk of ‘killer apps’ or use-cases that will give a new technology purpose and meaning. QR codes are an example of a technology the world thought had infinite uses, yet they struggled to take off until they found their calling in mobile boarding passes and ticketing applications. Experience therefore tells us that just because a new technology might not change the way things are done tomorrow, it doesn’t mean it won’t have a big impact long-term. With that said, it’s fine to get excited by the potential of a new technology. But as an industry we must learn to temper our expectations, and those of our customers, about how quickly and how far new technologies will create radical and lasting change.
Contain your excitement
Even for those technologies which solve a real problem, are enabled by the right complementary technologies, and are well enough understood to be successfully tested and deployed, there are other challenges. Any enterprise IT deployment requires investment, upskilling and cultural change from business leaders and employees. That means it can take years to build a compelling enough business case to convince budget holders to incorporate new deployments into their strategy. And once a clear business case has been established, there are regulatory, cybersecurity and data protection requirements to throw into the mix. Given the value modern businesses rightly place on their data and the consequences of failing to manage and protect it, this is something which must be considered as early in the tech lifecycle as possible. If you cannot confidently protect and manage data within an IT service or application, don’t deploy it.
An example of a technology that is moving through the various phases of the hype cycle at a rate of knots is containers, seen by many as a natural evolution of the virtualised environment, designed to give IT managers greater control and flexibility over their applications. As recently as two years ago, containers had already begun their slide into Gartner’s so-called trough of disillusionment, the phase when businesses have begun to act on the hype but been disappointed by the lack of immediate outcomes.
However, fast-forward to 2021 and containers are already a critical component of DevOps-led infrastructure and application modernisation, with Kubernetes emerging as the dominant container orchestration platform. The business case for containers enabled by Kubernetes is becoming well established as microservices-based architectures gain traction within the enterprise. This opens up new possibilities when it comes to protecting data within containerised environments. A general rule to live and die by is that if you can’t manage data, you can’t protect it. Deploying Kubernetes adds the vital orchestration layer, meaning there is now a significant opportunity for a single data protection platform that spans virtual, physical, cloud and containerised environments. Establishing more advanced data protection and backup credentials is one of the advancements that will help containers go from IT side project to achieving the return on investment businesses crave.
At the edge of reason
Another technology which presents certain data protection challenges is edge computing. Currently, Gartner places edge computing right at the peak of inflated expectations on its Hype Cycle. As hyperscalers look to extend their ever-expanding data volumes and workloads to the edge, and as the shift towards remote working creates a greater sense of urgency for businesses looking to transform digitally, the case for edge computing looks compelling. It is a confluence of events that makes edge computing more relevant than ever.
However, alongside digital transformation, there are other words on CIOs’ minds: data protection, cybersecurity, cost optimisation and digital skills to name a few. All of these are relevant when it comes to taking edge computing from an overhyped proof of concept to a core hybrid infrastructure service. To manage and protect data at the edge, businesses must be able to identify the data they need, back it up and secure it. Not only does this require backup and replication capabilities, it also requires specific skills, which are so often in short supply when it comes to relatively new technologies. Businesses looking to capitalise on edge computing at this stage need to work with specialist partners to ensure their deployments are not just conducted successfully but are done so without putting data at risk or allowing cloud storage costs to spiral out of control. Taking the time to define your business’s Cloud Data Management strategy will provide direction and clear objectives, allowing you to measure the success of introducing edge computing to the data management mix.
Taking a strategic view of where technologies you have not deployed before sit within your wider business objectives is crucial for building the business case for them and securing the buy-in from budget holders to invest in complementary solutions and onboard the necessary skills. For enterprises locked in a race to transform digitally, evolving customer demands along with an increased reliance on cloud and connectivity are forcing their hand. Implementing the latest and greatest technologies to achieve the desired outcomes of digital transformation requires investment in the skills and the data management and protection capabilities needed to do so successfully, cost-effectively and securely.