As IT organisations build their storage plans and create a vision for the future, cloud storage and the paradigm on which it depends should be a priority. Cloud storage technologies can help IT organisations cope with dynamic demands and the deluge of data while delivering competitive IT storage services. Implementations can take advantage of both private and public cloud capabilities but, when doing so, organisations must carefully weigh the pitfalls of the public cloud.
Public cloud storage continues to gain attention from enterprise IT organisations. Although interest is keen, many organisations are stuck in the “investigation” stage and are reluctant to move large-scale workloads into public cloud storage environments. The reluctance is well founded and often drives IT organisations to build their own private cloud storage environments instead. Although appealing, private cloud infrastructures also require a leap of faith due to the scarcity of software and hardware that can unify an environment into a solid cloud storage implementation that is true to the cloud paradigm.
In either case — public or private — it is likely that a cloud storage implementation includes both private infrastructure and public services. A private infrastructure offers a comforting level of control over the data in the storage cloud, but can be difficult to implement in a manner that satisfies capacity elasticity demands. Alternatively, public services offer easy scalability, but also introduce difficult-to-solve data integrity issues and risks that are hard to measure.
Since both public and private cloud infrastructures are likely to coexist, it is important to consider the significant issues that arise with public cloud storage. Each issue requires consideration and either dismissal as a relevant factor or, more likely, a mitigation plan to minimise its negative impact on a cloud storage implementation.
Public cloud-storage service providers will occasionally fail to provide data availability
Since data “unavailability” events happen infrequently, identifying remediation steps ahead of time is difficult, particularly when the provider insists on the resiliency of its environment. Unfortunately, unless an IT organisation is a large customer of a public cloud provider, it should not expect any special treatment. Instead, plan to live with surprise outages as a rare, but real, fact of life in the cloud storage industry. Large-scale customers wield more influence and can demand specific data availability steps to avoid outages but, even in those situations, vendors might be reluctant to customise their SLAs.
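One mitigation worth planning before an outage strikes is to keep an independently accessible copy of critical data and fail over to it when the primary service is down. The Python sketch below illustrates the pattern; the client objects and exception type are hypothetical stand-ins, not any provider’s actual SDK.

```python
# Sketch of an outage-mitigation pattern: fall back to a secondary
# copy when the primary cloud store cannot serve a request.
# StorageUnavailableError and the client objects are hypothetical.

class StorageUnavailableError(Exception):
    """Raised when a storage endpoint cannot serve a request."""

def read_object(key, primary, secondary):
    """Try the primary provider first; fall back to a replica."""
    try:
        return primary.get(key)
    except StorageUnavailableError:
        # A surprise outage at the primary: serve the last
        # replicated copy instead of failing the application.
        return secondary.get(key)
```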
As large amounts of data move to a public service provider, a de facto vendor lock-in occurs
Customers should demand clarity on pricing policies, road maps for the evolution of the provider’s services, and an alternative-vendor migration process. Although the movement of large amounts of data to a service provider can make it impractical to move to another vendor, and changes in pricing policies can cause unfavourable economics, related lock-in issues should also be considered, such as metadata, application programming interfaces (APIs) and data migration.
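One way to blunt API lock-in is to keep application code behind a thin, provider-neutral interface, so that switching vendors means rewriting one adapter rather than every caller. The following minimal Python sketch assumes this adapter approach; the class and method names are illustrative, not any vendor’s real API.

```python
from abc import ABC, abstractmethod

class ObjectStore(ABC):
    """Provider-neutral interface. Application code depends only
    on this, so a vendor change means swapping one adapter."""

    @abstractmethod
    def put(self, key: str, data: bytes) -> None: ...

    @abstractmethod
    def get(self, key: str) -> bytes: ...

class InMemoryStore(ObjectStore):
    """Stand-in adapter for testing; a production adapter would
    translate these calls into a specific vendor's SDK."""

    def __init__(self):
        self._objects = {}

    def put(self, key: str, data: bytes) -> None:
        self._objects[key] = data

    def get(self, key: str) -> bytes:
        return self._objects[key]
```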
Not all data is appropriate for cloud storage
Organisations should segregate business-critical data from less important data to judge whether each class of data is appropriate for placement in public storage facilities. Moving data outside an organisation’s direct control raises implications that must be weighed against the criticality of that data.
Tracking and identifying a public cloud service supplier’s SLA compliance may be difficult due to lack of management tools
Tools to prove billing, SLA compliance and the successful completion (and data integrity) of a backup operation to a public cloud service can be minimal or nonexistent. Thus, an IT organisation must accept the risk that a cloud storage provider could fail to deliver, or even be unaware that it has failed to deliver, services consistent with the expectations described in an SLA.
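Where provider tooling is thin, clients can at least verify data integrity themselves by recording a cryptographic digest before upload and recomputing it on restore. The Python sketch below shows the idea; the store object is a hypothetical client, not a real service API.

```python
import hashlib

def fingerprint(data: bytes) -> str:
    """SHA-256 digest of an object's contents."""
    return hashlib.sha256(data).hexdigest()

def verified_restore(store, key: str, expected_digest: str) -> bytes:
    """Fetch an object and confirm it matches the digest recorded
    at backup time, rather than trusting the provider's reporting."""
    data = store.get(key)
    if fingerprint(data) != expected_digest:
        raise ValueError(f"Integrity check failed for {key}")
    return data
```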
Legal and procedural issues can compromise access to data
Issues such as a cloud storage client failing to pay consumption fees, or a supplier’s financial or legal troubles, can limit data availability. Data crossing national boundaries, data intermingling with the tainted data of others and supplier bankruptcies have yet to be played out on a legal stage. For example, how do we determine when a file is deleted? How can a deletion be proved? What happens if a deleted file ends up in the hands of a malicious party? Such potential indiscretions require careful scrutiny by an IT organisation’s legal team before data moves to a public cloud storage provider.
Cloud storage suppliers will mingle data to minimise cost through high storage utilisation
Low-cost offerings tend to drive the sharing of storage equipment between storage customers and the commingling of data. For some, this will pose an unacceptable risk in terms of data privacy and ownership in a multi-tenancy environment. IT organisations must realise that requiring a supplier to separate data onto isolated hardware will increase costs and may entail upfront charges to maintain a separate storage hardware infrastructure, potentially eliminating any price advantage.
The performance of cloud storage is unlikely to match an internal storage infrastructure
The physical separation of users from their data, combined with the generic nature of cloud storage, delivers lowest-common-denominator performance. Because the user is physically distant from the storage equipment, moving large amounts of data across that distance can be impractical. In addition, users will be unable to tweak public service capabilities to suit their specific requirements. Considerations include latency and bandwidth, as well as usage tuning.
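A back-of-the-envelope calculation shows why bulk transfers over a WAN link are often impractical. The figures in this Python sketch are illustrative assumptions, not measurements of any particular provider.

```python
def transfer_days(data_tb: float, link_mbps: float,
                  efficiency: float = 0.8) -> float:
    """Days needed to move data_tb terabytes over a link_mbps
    link, assuming the given effective link utilisation."""
    bits = data_tb * 8e12                      # decimal TB -> bits
    seconds = bits / (link_mbps * 1e6 * efficiency)
    return seconds / 86400

# Assumed example: 50 TB over a 100 Mbps link at 80% utilisation
print(f"{transfer_days(50, 100):.0f} days")    # roughly 58 days
```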
Cloud storage forces the re-examination of project management procedures to keep pace with on-demand storage
By its nature, cloud storage improves responsiveness and agility when delivering storage services. In the absence of cloud storage, data centres and IT organisations often plan new service implementations on a time scale measured in months. Identifying specific requirements, defining architectures, engaging vendors and provisioning storage equipment all take time. Cloud storage short-circuits this sometimes lengthy rollout of new storage services and thus compels organisations to rethink project timeline assumptions. Rather than dealing with the lengthy process of bolting in new storage hardware, cloud storage capacity is there for the taking.
Providers are starting small and simple as they attempt to satisfy their customers’ needs
Services such as endpoint backup, individual server backup or email archiving are common, while in contrast, services to manage health information for hospitals are rare due to regulatory and trust concerns. Providers tend to tailor solutions to small and medium-sized business (SMB) IT environments, thereby refining capabilities and growing their infrastructures as they go. The SMB market also offers extensive sales opportunities for providers searching for a customer base and quick revenue. The smaller-scale environments of SMBs typically measure in terabytes instead of petabytes and, thus, better suit small cloud providers’ scalability, service, support and distribution capabilities.
Large enterprises often demand more than storage service providers can offer
Large IT organisations can scale to the multiple-petabyte range, with data centres distributed around a country or the world. With greater scale come more regulatory concerns that require service providers to pay close attention to, for example, the geographic placement of data. Both Amazon and Microsoft Windows Azure are ramping up to service large customers, but most of the cloud storage service industry leaves much to be desired as far as large enterprises are concerned. Large enterprises are attracted to the promise of cloud storage, but will quickly find that its capabilities are lacking. However, for smaller projects or those not encumbered by compliance regulations, cloud storage providers can help.
About the author:
Gene Ruth is a research director at Gartner, with a focus on enterprise storage technologies.