Monday 31 January 2011

It’s time to solve the “Big Problem” stalling today’s server and desktop virtualization projects

Deployment and operation of a successful virtualization strategy depends on several core components: the right people, the right solution, a strong grip on infrastructure investment, and a clear strategy.

Yet whether it’s a straightforward data consolidation project or a larger virtualization-based migration and integration exercise, most organizations must overcome flaws in those core components that, if left unaddressed, will hinder a successful project. Research organizations such as Gartner have attested that this is a big challenge for companies, but it is not an insurmountable one.

First and foremost, it’s critical that the IT department keeps on top of managing and controlling storage assets. Poor use of available storage, guerrilla purchasing and deployment of storage at the workgroup level, and disconnected storage silos are among the biggest problems. All of these strain IT spending, add significant storage management overhead, and bake in inefficiency.

However, solving this does not have to entail a widespread rip-and-replace of existing hardware. Far from it, in fact. Intelligent use of software lets you consolidate your storage estate virtually, rather than just physically, so the business can draw on unused capacity across the entire resource instead of trapping it in local silos and workgroups. The result is money saved and utilization improved in one fell swoop.
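To see why pooling matters, consider a minimal back-of-the-envelope sketch. The silo names and capacity figures below are invented for illustration and have nothing to do with any DataCore API; the point is simply that a virtual pool exposes the sum of all free space, whereas isolated silos limit any single workload to the free space on one device.

```python
# Hypothetical example: free capacity in pooled vs. siloed storage.
# All names and numbers are made up for illustration.

def pooled_free_capacity(silos):
    """Total unused capacity (GB) when silos are virtualized into one pool."""
    return sum(s["total_gb"] - s["used_gb"] for s in silos)

def largest_local_free(silos):
    """Biggest single allocation possible (GB) when each silo stands alone."""
    return max(s["total_gb"] - s["used_gb"] for s in silos)

silos = [
    {"name": "finance-array", "total_gb": 2000, "used_gb": 1700},  # 300 GB free
    {"name": "dev-nas",       "total_gb": 1000, "used_gb": 400},   # 600 GB free
    {"name": "branch-san",    "total_gb": 500,  "used_gb": 350},   # 150 GB free
]

print(pooled_free_capacity(silos))  # 1050 GB usable by any workload once pooled
print(largest_local_free(silos))    # only 600 GB usable per workload without pooling
```

The same 1,050 GB of capacity exists either way; virtualization just makes all of it reachable instead of stranding most of it.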

This is the Software Advantage we focus on. Software-based storage virtualization infrastructure delivers transparency and flexibility, and enables businesses to tackle current and future needs while at the same time bypassing the physical constraints of a hardware-centric storage solution that can hinder workflow.

Consolidating storage is not just a space utilization issue; it is also one of cost saving. A consolidated environment serving virtual and physical machine storage requirements can, if done properly, deliver a range of operational efficiencies from data and back-up management improvements to maximizing availability, all of which reduce the need for costly physical intervention to solve storage resource issues.

When done well, and without a massive investment in new hardware to fit the strategy, storage virtualization software can deliver significant cost savings in both the short and the long term. That makes it possible to actually achieve the projected return on investment for your overall virtualization initiative, while also unlocking the asset value of your data and reducing the risk of data loss and data management errors.

This Software Advantage in storage virtualization is at the heart of what we do at DataCore. With that in mind, today marks the launch of DataCore’s SANsymphony-V, the newest version of our storage virtualization software. SANsymphony-V delivers the Software Advantage by freeing customers from the high costs, inadequate performance, inflexibility, and vendor lock-in inherent in a hardware-centric approach to storage. It is an open software platform that “future proofs” your business against changing storage requirements driven by server and desktop virtualization initiatives, and it enables customers to repurpose existing resources more efficiently and choose lower-cost alternatives when adding new resources. In short, SANsymphony-V solves today’s Big Problem stalling desktop and server virtualization projects: the storage problem.
