Overall demand for storage capacity is growing by about 60 percent per year, according to IDC. Another research company, Enterprise Strategy Group, pegs the annual growth rate of data between 30 percent and 60 percent.
“Organizations are having a hard time getting their arms around all that data,” said ESG analyst Lauren Whitehouse. Economic woes are making it even harder: budgets are frozen or scaled back, yet the downturn isn't expected to significantly slow data growth next year.
Stuck in that bind, organizations don't want to have to roll out a gigabyte of capacity in their own data centers for every new gigabyte that's created, analysts said.
“What we'll see more of in companies is a focus on efficiency,” IDC analyst Rick Villars said. They're seeking to increase the utilization of their storage capacity as well as other IT resources.
A big part of that effort is virtualization of storage, which often goes hand in hand with server virtualization and became a mainstream technology in 2008, according to analyst John Webster of Illuminata. Storage vendors are offering more virtualization products and seeing more demand for them, he said. A virtualization capability such as thin provisioning, which lets administrators assign storage capacity to a new application without having to figure out how much it ultimately will need, helps make better use of resources, Webster said.
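To make the concept concrete, here is a minimal Python sketch of the idea behind thin provisioning; it is an illustration only, not any vendor's implementation. A volume advertises a large logical size to its application, but physical capacity is drawn from a shared pool only when data is actually written.

# Illustrative sketch of thin provisioning (not tied to any vendor's product):
# volumes advertise a large logical size, but physical capacity is consumed
# from the shared pool only when blocks are actually written.

class StoragePool:
    def __init__(self, physical_gb):
        self.physical_gb = physical_gb
        self.allocated_gb = 0

    def allocate(self, gb):
        if self.allocated_gb + gb > self.physical_gb:
            raise RuntimeError("Pool exhausted: add physical capacity")
        self.allocated_gb += gb


class ThinVolume:
    def __init__(self, pool, logical_gb):
        self.pool = pool
        self.logical_gb = logical_gb   # what the application sees
        self.used_gb = 0               # what has actually been written

    def write(self, gb):
        if self.used_gb + gb > self.logical_gb:
            raise RuntimeError("Volume full from the application's view")
        self.pool.allocate(gb)         # physical space consumed only now
        self.used_gb += gb


pool = StoragePool(physical_gb=1000)
# Three applications are each promised 500 GB, over-committing the 1,000 GB pool.
volumes = [ThinVolume(pool, logical_gb=500) for _ in range(3)]
volumes[0].write(120)
volumes[1].write(80)
print(pool.allocated_gb)  # 200 -- only written data consumes the pool

In this toy example, the applications collectively "see" 1,500GB, but only the 200GB actually written draws on physical capacity, which is the efficiency gain Webster describes.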
But in addition to the trend toward decoupling logical resources from physical ones, a handful of acquisitions this year signaled other trends in the storage world.
1. Brocade-Foundry
On Dec. 19, Brocade Communications and Foundry Networks completed a deal they had announced in July, after navigating the roughest waters the financial and credit markets have seen in a generation. The merger, now valued at $2.6 billion, is intended to address a coming convergence of SAN (storage area network) and LAN technology.
SAN builders have long relied on Fibre Channel, a specialized networking technology designed not to drop packets. But in most cases, the rest of the enterprise network is based on Ethernet, which is cheaper than Fibre Channel and now available at higher speeds. Maintaining both requires more adapters on storage equipment and adds to an IT department's workload. The two types of networks are headed toward gradual consolidation under the FCOE (Fibre Channel over Ethernet) standard, which is intended to make Ethernet reliable enough for storage networks. Then, Ethernet can be the network of choice across data centers and keep getting faster.
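Conceptually, FCOE carries Fibre Channel frames inside Ethernet frames. The Python sketch below illustrates only that encapsulation step, using the EtherType value registered for FCOE (0x8906); it omits the real protocol's headers, padding and checksums and is not a working protocol stack.

from dataclasses import dataclass

FCOE_ETHERTYPE = 0x8906  # EtherType registered for FCoE traffic


@dataclass
class FibreChannelFrame:
    source_id: str       # Fibre Channel port addresses
    dest_id: str
    payload: bytes       # SCSI command or data being transported


@dataclass
class EthernetFrame:
    src_mac: str
    dst_mac: str
    ethertype: int
    payload: bytes


def encapsulate(fc_frame: FibreChannelFrame, src_mac: str, dst_mac: str) -> EthernetFrame:
    # Wrap a Fibre Channel frame in an Ethernet frame, as FCOE does conceptually.
    inner = f"{fc_frame.source_id}->{fc_frame.dest_id}".encode() + fc_frame.payload
    return EthernetFrame(src_mac, dst_mac, FCOE_ETHERTYPE, inner)


frame = encapsulate(
    FibreChannelFrame("0x010203", "0x0a0b0c", b"SCSI READ ..."),
    src_mac="00:11:22:33:44:55",
    dst_mac="66:77:88:99:aa:bb",
)
print(hex(frame.ethertype))  # 0x8906 -- storage traffic riding the Ethernet fabric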
Brocade wasn't the only company thinking this way. Cisco, which will be the main competitive target of the merged company, bought out Nuova Systems in April and simultaneously announced a line of routing switches designed to connect the whole data center. The flagship Nexus 7000, which Cisco has positioned as one of its most important products ever, is built to scale to 15T bps (bits per second) and has a virtualized version of IOS (Internetwork Operating System) called NX-OS. Like the combination of Brocade and Foundry, the Nexus line is likely to help enterprises virtualize their storage and computing resources and eventually streamline networking and management.
EMC and NetApp also introduced FCOE products this year. But the protocol is not expected to be in widespread use until 2010.
2. IBM-Diligent
In April, IBM acquired Diligent Technologies, which specializes in data de-duplication for large enterprise storage systems. The company didn't reveal how much the acquisition cost, but it was a key move in a market that could grow to US$1 billion in annual revenue by 2009, according to research company The 451 Group.
De-duplication systems find identical pieces of data in a storage system, treat them as redundant, and eliminate them. So if there are several nearly identical copies of a document, the system keeps just one copy plus the portions that are unique to each of the others.
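Here is a minimal Python sketch of the idea, assuming simple fixed-size chunks and content hashing (real products typically use more sophisticated, often variable-length chunking): each unique chunk is stored once, and duplicate chunks become references to the copy already on disk.

import hashlib

CHUNK_SIZE = 4096  # fixed-size chunks for simplicity; real systems often vary this


class DedupStore:
    def __init__(self):
        self.chunks = {}   # hash -> chunk bytes; each unique chunk stored once
        self.files = {}    # filename -> list of chunk hashes (references)

    def add(self, name, data):
        refs = []
        for i in range(0, len(data), CHUNK_SIZE):
            chunk = data[i:i + CHUNK_SIZE]
            digest = hashlib.sha256(chunk).hexdigest()
            if digest not in self.chunks:   # only new content consumes space
                self.chunks[digest] = chunk
            refs.append(digest)
        self.files[name] = refs

    def stored_bytes(self):
        return sum(len(c) for c in self.chunks.values())


store = DedupStore()
doc = b"quarterly report " * 1000
store.add("report_v1.txt", doc)
store.add("report_v2.txt", doc + b"one new paragraph")  # nearly identical copy
# Logical data is roughly 2x the document, but physical storage holds the
# shared chunks only once plus the small unique tail of the second copy.
print(store.stored_bytes(), len(doc) * 2)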
The Diligent deal was an early move in a year full of de-duplication activity. In June, Hewlett-Packard introduced a suite of de-duplication systems for small and medium-sized businesses and added some features to its HP StorageWorks backup line. And in November, EMC, Quantum and Dell said they would use a common software architecture for data de-duplication products. Dell will enter the de-duplication business next year. It is already a major reseller of EMC gear, under a partnership that in December was extended until 2013.
Data de-duplication can reduce the amount of storage capacity an enterprise requires by as much as two thirds, said ESG's Whitehouse. The technology was available before, but this year vendors started to integrate it with storage arrays or sell it in appliances, bringing it closer to a turnkey solution, she said. They also established data de-duplication as a technology customers could trust, at least for archived material.
“If you eliminate a block of data that somehow negates the value of that data when you recover it … that's a really scary prospect for some companies,” Whitehouse said.
So far, most enterprises are using it only for secondary storage, or the archived information that's backed up for safekeeping, she said. The next step will be to embrace de-duplication for primary storage, the data that applications are using in real time. Users will start to trust the technology enough for that next year, she said. In July, NetApp enhanced its V-Series storage virtualization products so they can perform de-duplication on primary storage systems from third parties such as EMC, Hitachi and HP.
3. EMC-Pi
In late February, enterprise storage giant EMC bought Pi, a provider of software and online services for consumers to keep track of personal information stored locally or online. The deal, which followed the company's 2007 buyout of online backup provider Mozy, was one sign of growing interest in cloud storage.
Handing off personal or corporate data to a third party's hard drives and accessing it via the Internet can be a less expensive alternative to provisioning all that capacity in your data center or home network. It may be used in conjunction with cloud-based applications, but also just for archiving or disaster recovery, Illuminata's Webster said. In many cases, the cloud-storage service can be set up as a target when data is being backed up. The information can be sent to the cloud only or to the cloud and a dedicated tape backup system simultaneously, he said.
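As a rough illustration of that setup, the sketch below uses hypothetical CloudTarget and TapeTarget classes (not any vendor's actual API) to show a backup job writing to the cloud only, or fanning out to the cloud and a tape system simultaneously.

# Illustrative only: hypothetical backup targets, not any vendor's actual API.

class CloudTarget:
    def __init__(self, endpoint):
        self.endpoint = endpoint

    def write(self, name, data):
        # In a real service this would be an upload over the Internet.
        print(f"uploading {name} ({len(data)} bytes) to {self.endpoint}")


class TapeTarget:
    def __init__(self, drive):
        self.drive = drive

    def write(self, name, data):
        # In a real system this would stream to the tape library.
        print(f"writing {name} ({len(data)} bytes) to tape drive {self.drive}")


def run_backup(name, data, targets):
    # Send the same backup set to every configured target.
    for target in targets:
        target.write(name, data)


backup_set = b"... archived files ..."

# Cloud only:
run_backup("nightly-2008-12-22", backup_set, [CloudTarget("storage.example.com")])

# Cloud and a dedicated tape system simultaneously:
run_backup("nightly-2008-12-22", backup_set,
           [CloudTarget("storage.example.com"), TapeTarget("drive0")])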
With the economy weakening, cloud storage will be big next year, Webster believes. Paying for additional capacity on a monthly basis moves that expense out of the IT department's capital budget and into its operational budget, which tends to be easier to fund when times are tough, he said. It's also relatively quick because nothing needs to be purchased or installed, he added.
A related option, managed services, may also take off in the coming year, Webster said. While keeping their own storage systems in-house, enterprises can pay a vendor such as Brocade or IBM to manage them remotely. The vendor can monitor alerts through an appliance at the customer's site and respond if needed. If IT staff needs to be cut back, this may be one way to maintain service levels for the rest of the company, Webster said.