With virtualisation comes the potential to reduce costs and enhance productivity, realising savings across server count, carbon footprint, power consumption and cooling requirements. Whether a CIO is looking to go-green-for-green’s-sake, enhance brand image or improve competitiveness, virtualisation is an attractive proposition.
At its core, virtualisation enables organisations to make the most efficient use of available system resources by consolidating applications onto fewer physical servers. As demands on data centre infrastructure change, or in response to traffic spikes, physical resources that aren't immediately required can be powered down automatically, enabling a more efficient, environmentally friendly use of resources.
To execute a successful virtualisation deployment, CIOs must first be clear about what they wish to achieve and determine if the technology is a good fit for the business. Within any organisation, disruptive or revolutionary initiatives have the highest chance of failure. By approaching virtualisation as a step-by-step evolution, organisations can boost their success rate.
Moving to a virtual environment
Virtualisation is becoming a commodity, and should be treated accordingly. As the market matures, CIOs are becoming savvier, waking up to the potential and limitations of the technology and to what can be expected from a vendor. It is a vendor's responsibility to offer guidance on best practice, and organisations should not be afraid to ask tough questions and demand answers.
Determining what percentage of an organisation's workload can realistically be virtualised is a good first step. We seldom see organisations immediately migrate the majority of workloads to a virtual environment; instead, it is common practice to start with less critical workloads, gain experience with the platform and then extend the scope to include mission-critical workloads as well.
It is advisable for organisations to migrate further workloads once they are confident with the platform and support offered by a vendor.
Rather than view virtualisation as a standalone project, organisations should take a deeper look at the internal processes that adopting the technology might affect, such as provisioning and change management. Otherwise, customers may find that the expected benefits of virtualisation, especially improved IT agility, are not achievable. In other words, the biggest hurdle to the successful adoption of virtualisation is often not the technology itself, but the processes that surround it.
Within those processes, each application involved might compete for computing resources, so it is important for organisations to deploy software that grants greater visibility into the IT architecture to determine how applications are running. Greater visibility allows administrators to predict conflicts and monitor performance, ensuring critical applications receive priority and performance levels are met.
The true cost of virtualisation
Cost savings are the primary reason enterprises look to move to a virtual environment. Reducing costs while driving productivity is in line with any business objective, and virtualisation is a budget-saving technology. That is not to say there are no upfront costs: initial investment is necessary, and return on investment (ROI) will come from reductions in data centre footprint, hardware, maintenance, personnel and management costs. Virtualisation technology alone, however, may not reduce OPEX without the process changes described above.
Going green not only enhances a brand’s image but can also reap substantial cost savings. By reducing the number of physical servers, organisations can greatly reduce the amount of rack space required which equates to substantial savings, especially if an organisation is renting space from a data centre provider. In an organisation’s own data centre, decreased power and cooling coupled with recouped floor space can often see savings of up to 50%.
Administration of the virtual environment is critical, but maintenance hours are vastly reduced. Administrators will need to support a smaller footprint of physical servers, and the virtual system allows them to add applications remotely, while the system is still running, which helps limit the need for application downtime.
Deploying applications to a physical server often takes a lot of time and resources. This process can be reduced from days or hours to minutes in a virtual environment, allowing organisations to realise cost savings through reduced administration time and travel costs, and increased productivity. Virtualisation reduces personnel costs as organisations no longer need to support sprawling physical hardware, allowing administrators to focus their attention on more important tasks.
Is my data secure?
Virtualisation raises a new set of security concerns. As organisations migrate further workloads into a virtual environment, they expand their technology footprint and the amount of data exposed. In a non-virtual environment, a hacker might gain access to just one server, but in a virtual environment, a compromised server exposes every virtual instance running on it.
Security risks can arise from personnel within an organisation or from third-party attackers. With more employees wishing to connect personal devices to the office network, it is imperative for enterprises to ensure that access to data from personal devices is not abused. Vendors can enable organisations to keep all data off desktops and end-user devices so that data never leaves the virtual environment. Employees will be able to view data on their monitors but will be unable to copy or export it. In fact, this is a frequent scenario in the deployment of Virtual Desktop Infrastructure (VDI).
One way to ensure security is to opt for a vendor that offers a kernel-based security policy enforcement infrastructure. A strict security policy guarantees isolation between virtual machines, and between each machine and the hypervisor. The Linux Kernel-based Virtual Machine (KVM), in combination with SELinux technology, brings military-grade security to the virtualisation space and can help harden the system against bugs in the hypervisor that might otherwise be used as an attack vector against the host or another virtualised guest.
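As an illustration of how this isolation is expressed in practice: in a KVM deployment managed through libvirt, SELinux-based guest isolation (the sVirt model) is declared in the guest's domain definition. The fragment below is a minimal sketch following the libvirt domain XML format; the guest name is hypothetical. With the dynamic model, libvirt assigns each guest a unique SELinux category label at start-up, so a process that escapes one guest still cannot touch another guest's disk images or memory.

```xml
<domain type='kvm'>
  <name>guest01</name>
  <!-- sVirt: libvirt generates a unique MCS category label for this guest
       when it starts, isolating it from other guests and from the host -->
  <seclabel type='dynamic' model='selinux' relabel='yes'/>
</domain>
```

Because the label is generated per guest rather than shared, the mandatory access control is enforced by the kernel even if the hypervisor process itself is compromised.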
Risk of vendor lock-in
For too long, vendors have offered closed proprietary technology stacks, with little or no focus on interoperability with other market players. With no pre-defined open standards, virtualisation can become the mother of all lock-ins.
Vendor lock-in can have a real impact on both an organisation's CAPEX and its IT efficiency. The inability to move workloads across different platforms, and difficulty in extracting data from virtual environments, can restrict business. With infrastructure defined in a way that is friendly to IT vendors rather than customers, once users are stuck in one proprietary technology it is hard for them to move. Some of the leading vendors in this space are moving towards stricter licensing models that include significant charges for high-density workloads, limiting the amount of memory that can be allocated per CPU based on the customer's licence. This defies one of the founding values of virtualisation: flexibility. With fluctuating workloads, nobody can predict long-term requirements.
By opting for an open source and open standards policy, focusing on interoperability and portability to end vendor lock-in, enterprises can ensure they remain in control. By harnessing the flexibility offered by virtualisation technology, organisations can react to immediate business needs and accelerate the time to market of new initiatives.
Virtualisation is the perfect component of a forward-thinking IT strategy, offering a natural progression to cloud computing by allowing organisations to take existing virtualised workloads and move them into a cloud environment. Virtualisation can be seen as preparing businesses for a move to the cloud, and as the ideal platform from which to migrate mission-critical workloads.