Many organizations are comfortable virtualizing infrastructure for backend or business-process applications, but balk when it comes to top-tier transaction or mission-critical ones. Part of the reason is that virtual infrastructure has a habit of overwhelming I/O resources, leaving crucial, time-sensitive data in limbo. While this may seem like a networking issue, it is in fact a storage problem as well, according to DataCore's Augie Gonzalez. Through intelligent software, advanced caching, flash memory and improved SAN technology, the storage side of the house is quickly catching up to servers and networking.
Cole: Enterprises are still hesitant to trust their virtual infrastructure with mission-critical applications. What needs to happen to overcome this fear?
Gonzalez: Education and proof points. Enterprises need to see first-hand examples of colleagues who confronted the roadblocks to virtualizing their Tier 1 apps and overcame them without spending exorbitant sums on the solutions. Traditional thinking holds that virtualization will bottleneck and slow down critical business applications, yet the productivity and cost-saving advantages are driving the move anyway. The move is inevitable, and intelligent virtualization software can overcome the performance dilemma by harnessing the latest CPU and memory technologies to dramatically increase performance.
IT has to be more forthcoming and public about its successes. Many are apprehensive about doing so, fearing it will disclose trade secrets to competitors. We at DataCore have been fortunate in that many of our customers are openly discussing their approach to supporting mission-critical applications in virtual environments. We need to get the word out about the enterprises that are successfully virtualizing their Tier 1 applications. I say, share a little — learn a lot.
Read the rest of the interview: