Tuesday 29 May 2012

2 virtualization worries that keep admins up at night

System downtime and slow application performance related to storage in virtualized environments are the primary concerns of IT administrators surveyed in a new report on the state of private clouds.
IT administrators are more concerned about performance issues and less about the cost of storage than they were a year ago, according to the report, “2012 State of the Private Cloud Survey” by DataCore Software, a provider of storage virtualization software.
DataCore’s survey, which polled IT administrators in the private and public sectors worldwide, revealed that 63 percent of respondents consider system downtime and slow application performance to be their primary storage-related virtualization concerns, up from 36 percent in 2011.

IT administrators still consider the rising cost of storage to be a problem with virtualization initiatives, but overall it is declining as a major concern, with just over half (51 percent) describing increasing storage costs as one of their biggest problems (down from 66 percent in 2011), the report states.

While increasing storage costs may be less of an issue than last year, storage-related costs continue to comprise a significant portion of virtualization budgets, with 44 percent of respondents saying that storage costs represent more than a quarter of their total budget for virtualization, the report states.

Many companies are allocating more money for storage, with 37 percent saying their storage budgets have increased this year, while just 13 percent say they have been cut.

Despite this, more than one in three respondents (34 percent) admit they underestimated the impact server/desktop virtualization would have on their storage costs. For those deploying a private cloud, more than one in four (28 percent) underestimated how storage costs would be affected.

“More companies are virtualizing more servers than ever before, but a notable faction of users – about a third – are underestimating the storage costs associated with server and desktop virtualization projects as well as the storage costs associated with private clouds,” said Mark Peters, senior analyst with Enterprise Strategy Group.

Throwing more money at storage has not reduced performance concerns for companies that have embraced server and desktop virtualization. Even with an increase in the average storage budget, more companies reported significant problems with storage-related performance, bottlenecks, downtimes and business continuity in 2012, the report states.

“Rather than simply continuing to expand traditional storage solutions, IT managers would be well advised to consider addressing performance and downtime issues with a storage virtualization solution that enables them to apply a simplified management approach to manage their storage resources in a single, logical storage pool,” Peters said.

“As virtualization moves from theory to practice, storage-related performance and availability are becoming of greater concern to businesses, but cost concerns haven’t gone away,” said George Teixeira, president and CEO of DataCore Software, who recommends the use of storage hypervisors.

Storage hypervisors ensure high performance and availability in the storage infrastructure through features such as auto-tiering, device interchangeability, thin provisioning and continuous data protection, Teixeira noted. “A storage hypervisor solves the cost issue by enabling enterprises to make greater use of existing storage infrastructure, while reducing the need for large-scale storage hardware,” he said.

Read full article at: http://gcn.com/articles/2012/05/01/datacore-storage-virtualization-survey-top-concerns.aspx

Tuesday 22 May 2012

DataCore Software integrates enterprise-wide Storage Hypervisor Management with VMware vCenter; Empowers vSphere Admins to manage VMs and storage from a single console

"The new DataCore Storage Hypervisor Plug-in for VMware vCenter Server combines two industry-leading management solutions to offer customers the best of both worlds through a single view infrastructure management," said Parag Patel, vice president, Global Strategic Alliances, VMware. "Customers can better manage and realise the full benefits of their desktop, server and storage resources in a VMware virtual environment."
DataCore Software today announced that it has released new plug-in software to integrate the powerful enterprise-wide storage management capabilities of DataCore’s SANsymphony™-V Storage Hypervisor with VMware vCenter™ Server. The management plug-in software is available immediately for download to all SANsymphony-V customers.

VMware vCenter is the de facto standard for managing VMware virtual infrastructures, and the SANsymphony-V Storage Hypervisor combines powerful storage virtualisation with enterprise-wide storage management. Together, these capabilities are seamlessly integrated to allow a VMware administrator to efficiently manage all their virtual machines (VMs) and storage resources from a single console.

DataCore and VMware – A Combination Delivering Compelling Benefits

VMware administrators are now able to provision, monitor and manage their storage and server resources from a single pane of glass, and to perform common storage tasks and complex scheduling workflows to clone, snapshot and restore datastores and VMs without having to become storage experts.

The vCenter Plug-In helps VMware administrators increase productivity and respond faster to user needs in a number of ways, including:

• Enabling an end-to-end view from VMware to storage, from a virtual machine, host or virtual disk perspective, with central monitoring and management of storage resources

• Rapidly provisioning virtual disks to VMware vSphere®/ESX® hosts

• Creating and restoring snapshots of datastores where VMs are housed

• Scheduling tasks to coordinate workflows between vSphere and SANsymphony-V servers

• Taking consistent snapshots of a datastore for validation or before making changes to the environment

• Scheduling recovery clones (full copy) or efficient differential snapshots to rapidly restore VMs

• Applying high-availability mirror protection to virtual disks

• Visualising current conditions in easy-to-understand terms to simplify troubleshooting and root cause analysis for virtual machines and their storage

• Lowering OpEx and enhancing administrator productivity – through automated workflows, task wizards and less need for storage training and specialised skill sets

Simplify Difficult Storage Tasks: Snapshots, Clones and Restorations
The software’s built-in wizards and simple command menus take the complexity out of creating, scheduling or capturing snapshots of VMs. Administrators simply select the datastore housing the specific VMs of interest. The plug-in signals the VMware vCenter Server to quiesce those VMs and then triggers SANsymphony-V to take snapshots of the corresponding datastore. Both full clone and differential (lightweight) snapshots are supported. The known-good state captured in those online snapshots may be used to set up new VMs or to quickly recover from malware and user errors that have damaged a VM’s integrity.
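
For readers who want to see the workflow as logic rather than prose, here is a minimal Python sketch of the quiesce-then-snapshot sequence described above. It is purely illustrative: the VCenterClient and SANsymphonyClient classes, their method names and behaviour are hypothetical stand-ins, not VMware’s or DataCore’s published APIs.

```python
"""Illustrative sketch of the quiesce-then-snapshot sequence described above.

VCenterClient and SANsymphonyClient are hypothetical stand-ins for whatever
interfaces the plug-in drives internally; they are not published APIs.
"""
from dataclasses import dataclass


@dataclass
class VCenterClient:
    """Placeholder for a vCenter connection (hypothetical)."""
    host: str

    def vms_on_datastore(self, datastore: str) -> list[str]:
        # A real client would query the vCenter inventory for this datastore.
        return ["vm-app01", "vm-app02"]

    def quiesce(self, vm: str) -> None:
        # Ask the guest tools to flush buffers so the snapshot is consistent.
        print(f"quiescing {vm}")


@dataclass
class SANsymphonyClient:
    """Placeholder for the storage hypervisor's management interface (hypothetical)."""
    host: str

    def snapshot_datastore(self, datastore: str, differential: bool = True) -> str:
        # Differential snapshots are lightweight; full clones copy every block.
        kind = "differential" if differential else "full-clone"
        snap_id = f"{datastore}-{kind}-snap"
        print(f"created {kind} snapshot {snap_id}")
        return snap_id


def snapshot_datastore_consistently(vc: VCenterClient,
                                    ssy: SANsymphonyClient,
                                    datastore: str) -> str:
    """Quiesce every VM housed on the datastore, then snapshot the datastore."""
    for vm in vc.vms_on_datastore(datastore):
        vc.quiesce(vm)
    return ssy.snapshot_datastore(datastore, differential=True)


if __name__ == "__main__":
    snap = snapshot_datastore_consistently(
        VCenterClient("vcenter.example.local"),
        SANsymphonyClient("sansymphony.example.local"),
        datastore="Datastore-Prod-01",
    )
    print("restore point available:", snap)
```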

With a click of the mouse, snapshots may be scheduled at regular intervals, for instance Saturday morning at 3:00 a.m., when they will not disrupt daily operations. The workflow can also define how long the snapshots should be retained before being deleted.
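
To make the scheduling and retention idea concrete, the short sketch below shows how such a recurring job might be expressed in Python. The field names and the pruning rule are invented for illustration; they are not the plug-in’s actual configuration schema.

```python
# A minimal sketch of a recurring snapshot job with a retention window.
# The field names below are illustrative, not the plug-in's real schema.
from dataclasses import dataclass
from datetime import datetime, timedelta


@dataclass
class SnapshotSchedule:
    datastore: str
    weekday: str          # e.g. "Saturday"
    time_of_day: str      # e.g. "03:00"
    retention_days: int   # snapshots older than this are candidates for deletion


def expired_snapshots(snapshots: dict[str, datetime],
                      schedule: SnapshotSchedule,
                      now: datetime | None = None) -> list[str]:
    """Return the snapshot IDs whose age exceeds the retention window."""
    now = now or datetime.now()
    cutoff = now - timedelta(days=schedule.retention_days)
    return [snap_id for snap_id, taken_at in snapshots.items() if taken_at < cutoff]


if __name__ == "__main__":
    weekly = SnapshotSchedule("Datastore-Prod-01", "Saturday", "03:00", retention_days=28)
    existing = {
        "snap-old": datetime.now() - timedelta(days=40),
        "snap-new": datetime.now() - timedelta(days=3),
    }
    print("eligible for deletion:", expired_snapshots(existing, weekly))  # ['snap-old']
```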

VMware administrators want to focus on VMs, not storage complexity
Transitioning to a virtualised environment creates new requirements for effectively managing the underlying storage resources that are virtualised. VMware administrators are faced with requests to provision new VMs, set up datastores and manage cloning and snapshot copy operations. They often lack in-depth knowledge of storage systems and end-to-end management tools, leaving them with an incomplete view of how physical resources in the virtual environment are used. This, in turn, can increase the amount of time and effort required to manage the environment. VMware administrators therefore need to be prepared with the right management tools to take on these tasks and, as a result, are turning to VMware vCenter to manage their servers.

The SANsymphony-V Plug-in for VMware vCenter provides a rich set of storage management capabilities. From the plug-in, vSphere administrators can fulfill the dynamic storage needs of their vSphere/ESX clusters, hosts and virtual machines without having to become storage experts or leave their familiar central console.

DataCore and VMware: The Fully Virtualised Data Centre
“VMware customers around the globe running VMware vSphere rely on DataCore’s solutions to deliver the critical ‘third dimension’ of virtualisation – storage. DataCore has architected its software to cost-effectively shape the shared storage infrastructure required by virtualisation,” states Carlos Carreras, vice president of alliances and business development for DataCore Software. “The SANsymphony-V VMware vCenter Plug-in makes the storage management job significantly easier for VMware administrators. They now have a ‘one-stop’ administrative console to monitor and manage the full set of server and storage resources across their entire virtual data centre.”

For more information:
For more information, please see the DataCore SANsymphony™-V Plug-in for VMware vCenter or email DataCore at info@datacore.com.

Data sheet: http://pages.datacore.com/SANsymphony-VforVMwarevCenter.html

Thursday 17 May 2012

Storage hypervisor makes cloud storage fast, affordable and reliable

http://searchcloudprovider.techtarget.com/podcast/Storage-hypervisor-makes-cloud-storage-fast-affordable-and-reliable
Cloud providers like to tout how quickly and easily they can scale their environments to meet customer demands, but nothing puts that to the test like cloud storage growth. Even outside of the cloud, storage demands can easily spiral out of control, and there comes a point when it's no longer economical for a provider to keep adding boxes. Anticipating that challenge before launching its cloud services in 2009, Host.net deployed DataCore Software's storage hypervisor, SANsymphony-V, a platform that Host.net credits for providing its customers with 900 days of consecutive, uninterrupted uptime.

In this case-study podcast, SearchCloudProvider.com site editor Jessica Scarpati gets a crash course on storage hypervisors from George Teixeira, CEO of DataCore Software, before diving into the details of Host.net's deployment with Jeffrey Slapp, vice president of virtualization services at the Florida-based cloud and managed services provider.

Wednesday 16 May 2012

N-TEC and DataCore Software partner to deliver easy-to-use SAN Appliances

Built-in DataCore Storage Hypervisor optimizes storage performance and Business Continuity


Powered by DataCore, the N-TEC RapidCore Storage Appliance makes it practical for small and mid-size businesses to virtualize and meet increasing high-availability and performance demands

DataCore Software announced that N-TEC, a European system builder and storage vendor, will build and deliver a range of storage appliances under the new “DataCore Ready” program, which is designed to help system builders grow their storage business and serve the increasing market demand for easy-to-use storage solutions. The “DataCore Ready” program makes it easy for system builders to develop turnkey storage systems and appliances based on DataCore’s SANsymphony™-V storage hypervisor. As one of the first partners to announce solutions under this new program, N-TEC has launched a new RapidCore Storage Appliance product line which comes pre-installed with DataCore’s storage hypervisor software. The N-TEC RapidCore Storage Appliance provides small and mid-size businesses with a cost-effective, easy-to-use turnkey solution for managing storage growth and for meeting the growing demand, driven by virtualization projects, for greater business continuity and performance.

The “Powered by DataCore” N-TEC RapidCore Storage Appliance is now available as a one-stop turnkey offering from N-TEC, with pricing for hardware, software, installation, support and services all included. The flexibility and simplicity provided by these pre-configured appliances make them an ideal SAN solution for small and mid-size enterprises. The built-in storage hypervisor software ensures high availability for business continuity, fast performance, optimal resource efficiency and ease of administration.

“Businesses are demanding affordable, reliable, universal and scalable storage solutions. The storage hypervisor software from DataCore is an ideal complement to the N-TEC portfolio. We are very excited about the collaboration and the rapid certification of the solution,” says Sven Meyerhofer of N-TEC. “With the N-TEC RapidCore Storage Appliance ‘powered by DataCore’ we now offer a cost-effective, turnkey high-availability solution that integrates the industry’s most advanced storage technologies in an appliance form factor that anyone can easily use.”

The N-TEC RapidCore Storage Appliance is available in three versions ranging in capacity from two to 64 terabytes. Additional storage devices can be easily integrated to expand capacity on the fly. The solution supports SATA, SAS or SSD storage devices, and both iSCSI and FC connectivity are available, either included or as an option. High availability via a continuous mirroring design prevents downtime and data loss. In addition, features such as thin provisioning, automatic failover and failback, multipathing (MPIO) and asynchronous replication are integrated within the appliances. Management of the appliance is based on an intuitive GUI that makes it easy for anyone to administer and perform operations via wizards, automation and simple mouse clicks.

“Over many years, N-TEC has established itself as a major manufacturer of storage solutions in the German and wider European market, which makes N-TEC an ideal first partner to launch our ‘DataCore Ready’ program for system builders. The RapidCore Storage Appliance is a high-performance solution that is cost-effectively priced to address the growing market for virtualisation projects in small to mid-size business environments. With this partnership, DataCore opens up the larger-volume market of customers who demand a one-stop turnkey SAN appliance solution,” says Siegfried Betke, Director of Business Development and CE Strategic Channel Sales at DataCore Software.

The DataCore Ready Program

The DataCore Ready Program identifies solutions that are trusted to enhance DataCore SANsymphony-V Storage Hypervisor-based infrastructures. While DataCore solutions interoperate with common open and industry-standard products, those listed as DataCore Ready have completed additional verification testing, giving customers confidence in joint solution compatibility. The DataCore Ready designation is awarded to third-party products that have successfully met the verification criteria set by DataCore through the successful execution of a functional test plan.

Through the program, third-party products and solutions can gain exposure to DataCore customers and resellers. The DataCore Ready designation enables partners to generate new revenue opportunities, improve customer satisfaction and increase mindshare by leveraging the DataCore brand. The DataCore Ready Program is designed for software vendors, hardware vendors, hosting service providers and cloud technology firms that have demonstrated product and solution compatibility with DataCore products. The DataCore Ready logo helps customers quickly identify partner products and solutions that are tested and optimized for DataCore SANsymphony-V.

About N-TEC: N-TEC GmbH is headquartered in Munich and specializes in the development and marketing of storage and server solutions.

In addition to standard products, N-TEC develops and produces customized solutions. For more information, visit http://www.n-tec.eu

Friday 11 May 2012

Can Storage Hypervisors Enable BYOD for Data Storage?

By George Teixeira, CEO & President DataCore Software

I’m sure you have seen the wider use of the term BYOD (Bring Your Own Device) in our industry, and while I know it mainly refers to mobile devices, it got me thinking about why this is an inevitable trend – fueled by customers wanting greater buying power and flexibility. Therefore, why shouldn’t this same trend affect their storage devices?

In reality, with storage hypervisors it already has. DataCore’s SANsymphony-V software empowers users with hardware interchangeability (another way to say BYOD) and a powerful, enterprise-wide feature set that works with whatever the existing or latest storage devices happen to be. For the skeptics out there, try it yourself: Download a Storage Hypervisor or give it a Test Drive on our hosted virtual environment.

Why do I say it’s inevitable? Because it has already happened with server hypervisors, and they make the point dramatically. Think how the world has changed in terms of server platforms. With VMware and Microsoft Hyper-V software in place, we now take for granted that the hardware they run on has become largely irrelevant and, for the most part, a matter of personal preference.

Think about it. When you deploy VMware vSphere or Hyper-V, do you really care if it is running on a Dell, HP, IBM or Intel server platform? Sure, you may have a preference, but it is clearly a secondary consideration. Instead, other factors like company buying practices, best price or the personal vendor choice of your boss often drive the decision. After the advent of server virtualization, we can take it for granted that software architecture is what truly matters and that the underlying hardware will continuously change. The software-based hypervisor architecture is what endures and enables new technologies to be incorporated over time. Specific hardware devices (or brands) are like fads – they will continue to “come and go.” The faster pace of change and the need to continue to drive costs down will make this trend inevitable. BYOD in a corporate sense means greater purchasing power and greater freedom to incorporate “best value” storage solutions when and where they are needed.

Don’t take my word for it. Thousands of customers have already deployed DataCore SANsymphony-V and are realizing the compelling benefits of hardware independence. Find out more by checking out the SANsymphony-V analyst lab reports and the latest storage hypervisor capabilities like automated storage tiering that let you not only bring new devices into your infrastructure but get the most out of them.

Friday 4 May 2012

Big Smart Storage for a Big Data World - Top 3 Issues to Address, Virtualization and Auto-tiering

http://www.storagereview.com/big_smart_storage_for_a_big_data_world
In 2012, the phrase “big data” is on everyone’s lips, just like virtualization was five years ago. And just as with virtualization and cloud computing, there is a lot of hype obscuring the more important discussions around IT challenges, the skill sets needed and how to actually build an infrastructure that can support effective big data technologies.

When people waxed poetic about the wonders of virtualization, they failed to recognize the new performance demands and enormous increase in storage capacity required to support server and desktop initiatives. The need for higher performance and continuous availability came at a high price, and clearly, the budget-busting impact subsequently delayed or scuttled many projects.

These problems are now being solved through the adoption of storage virtualization that harnesses industry innovations in performance, drives greater commoditization of expensive equipment, and automates storage management with new technologies like enterprise-wide auto-tiering software.

That same dynamic is again at work, only this time with big data.

Business intelligence firms would have you believe that you simply slap their analytics platforms on top of a Hadoop framework and BINGO – you have a big data application. The reality is, Hadoop production deployments are practically non-existent in large enterprises. There are simply not many people who understand and can take advantage of these new technologies. In addition, the ecosystem of supporting technologies and methodologies is still too immature, which means the industry is going through a painful process of trial and error, just as it did with virtualization five years ago. Again, underestimating storage requirements is emerging as a major barrier to adoption, as the age-old practice of “throwing hardware at the problem” creates often-untenable cost and complexity.

To put it bluntly, big data does not simply require big storage – it requires big smart storage. Unfortunately, the industry is threatening to repeat major mistakes of the past!

Before I turn to the technology, let me point out the biggest challenge being faced as we enter the world of big data: we need to develop the people who will drive and implement useful big data systems. A recent Forbes article indicated that our schools have not kept pace with the educational demands brought about by big data. I think the following passage sums up the problem well: “We are facing a huge deficit in people to not only handle big data, but more importantly to have the knowledge and skills to generate value from data — dealing with the non-stop tsunami. How do you aggregate and filter data, how do you present the data, how do you analyze them to gain insights, how do you use the insights to aid decision-making, and then how do you integrate this from an industry point of view into your business process?”

Now, back to the product technologies that will make a difference, especially those related to managing storage for this new paradigm.

As we all know, there has been an explosion in the growth of data, and traditional approaches to scaling storage and processing have begun to reach computational, operational and economic limits. A new tack is needed to intelligently manage and meet the performance and availability needs of rapidly growing data sets. Just a few short years ago, it was practically unheard of for organizations to be talking about scaling beyond several hundred terabytes; now discussions deal with several petabytes. Also in the past, companies chose to archive a large percentage of their content on tape. However, in today’s on-demand, social-media-centric world, that is no longer feasible. Users now demand instantaneous access and will not tolerate delays to restore data from slow and/or remote tape vaults.

Instant gratification is the word of the day, and performance is critical.

New storage systems must automatically and non-disruptively migrate data from one generation of a system to another to effectively address long-term archiving. Adding to this problem is the need to distribute storage among multiple data centers, either for disaster recovery or to place content closer to the requesting users in order to keep latency at a minimum and improve response times.

This is especially important for the performance of applications critical to the day-to-day running of a business such as databases, enterprise resource planning and other transaction oriented workloads. In addition to the data these resource-heavy applications produce, IT departments must deal with the massive scale of new media such as Facebook and YouTube. Traditional storage systems are not well suited to solve these problems, leaving IT architects with little choice but to attempt to develop solutions on their own or suffer with complex, low performing systems.

Storage systems are challenged in big data and cloud environments because they were designed to be deployed on larger and larger storage arrays, typically located in a single location. These high-end proprietary systems have come at a high price in terms of capital and operational costs, as well as agility and vendor lock-in. They provide inadequate mechanisms to create a common, centrally managed resource pool. This leaves an IT architect to solve the problem with a multitude of individual storage systems, usually composed of varied brands, makes and models, requiring a different approach to replication across sites and some form of custom management.

With this in mind, there are major issues to address:
  • Virtual Resource Management and Flexibility: It takes smart software to transition the current landscape of monolithic, independent solutions into a virtual storage resource pool that incorporates and harnesses the large variety of storage hardware and devices available today.
  • High Capital and Operational Costs: To reduce operational costs, achieving greater productivity through automation and centralized management is essential. Just as importantly, empowering companies to commoditize and interchange hardware is critical to cost-effective storage scalability. No one can afford “big bucks” to throw more expensive “big box” solutions at a big data world. Hypervisors such as VMware and Microsoft Hyper-V have opened up purchasing power for users; today, the choice of a Dell, HP, IBM or Intel server has become largely a matter of company preference. The same approach is required for storage, and it has driven the need for storage hypervisors.
  • Scalability Demands Performance and Continuous Availability: As data storage grows and more applications, users and systems need to be serviced, the requirements for higher performance and availability rise in parallel. It is no longer about how much can be contained in a large box, but about how we harness new technologies to make storage faster and more responsive. Likewise, no matter how reliable any single storage box may be, it does not compare to many systems working together across different locations to ensure the highest level of business continuity, regardless of the underlying hardware.
  • Dynamic Versus Static Storage: It is also obvious that we can no longer manually keep pace with optimizing enterprise data for maximum performance and cost trade-offs. Software automation is now critical. Technologies such as thin provisioning that span multiple storage devices are necessary for greater efficiency and better storage asset utilization. Powerful enterprise-wide capabilities that work across a diversity of storage classes and device types are becoming a must-have. Automated storage tiering is required to migrate data from one class of storage to another. As a result, sophisticated storage hypervisors are critical, regardless of whether the data and I/O traffic is best placed in high-speed dynamic RAM memory, solid state drives, disks or even cloud storage providers (a simple sketch of such a tiering policy follows this list).
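
As a rough illustration of the automated tiering idea referenced above, the Python sketch below promotes or demotes volumes between tiers based on recent access rates. The tier names and thresholds are invented for the example; a real storage hypervisor applies far finer-grained, policy-driven heuristics.

```python
# Illustrative auto-tiering decision logic: volumes are promoted or demoted
# between tiers based on observed access frequency. Tier names and thresholds
# are invented for this example.
from dataclasses import dataclass

TIERS = ["ssd", "sas", "sata", "cloud"]   # fastest/most expensive -> slowest/cheapest


@dataclass
class Volume:
    name: str
    tier: str
    io_per_hour: float   # observed access rate over the last sampling window


def recommend_tier(io_per_hour: float) -> str:
    """Map an observed access rate to a target tier (thresholds are illustrative)."""
    if io_per_hour > 10_000:
        return "ssd"
    if io_per_hour > 1_000:
        return "sas"
    if io_per_hour > 10:
        return "sata"
    return "cloud"        # cold data can be pushed out through a cloud gateway


def rebalance(volumes: list[Volume]) -> list[tuple[str, str, str]]:
    """Return (volume, from_tier, to_tier) moves needed to match the policy."""
    moves = []
    for vol in volumes:
        target = recommend_tier(vol.io_per_hour)
        if target != vol.tier:
            moves.append((vol.name, vol.tier, target))
    return moves


if __name__ == "__main__":
    pool = [
        Volume("erp-db", tier="sata", io_per_hour=25_000),   # hot: promote to ssd
        Volume("archive-2010", tier="sas", io_per_hour=2),   # cold: demote to cloud
    ]
    for name, src, dst in rebalance(pool):
        print(f"migrate {name}: {src} -> {dst}")
```
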
Intensive applications now need a storage system that can start small, yet scale easily to many petabytes. It must serve content rapidly and handle the growing workload of thousands of users each day. It must also be exceedingly easy to use, automate as much as possible and avoid requiring a high degree of specialized expertise to deploy or tune. Perhaps most importantly, it must change the paradigm of storage from being cemented to a single array or location to being a dispersed, yet centrally managed system that can actively store, retrieve and distribute content anywhere it is needed. The good news is that storage hypervisors like DataCore’s SANsymphony-V deliver an easy-to-use, scalable solution for the IT architect faced with large amounts of data running through a modern-day company.

Yes, big data offers big potential. Let’s just make sure we get storage right this time to avoid repeating the big mistakes of the past.

Thursday 3 May 2012

DataCore Software 2012 Private Cloud and Storage Virtualization Survey: Performance Bottlenecks and Downtime Top Storage-Related User Concerns

A Perfect Storm of Possibilities for Storage Virtualization

http://vmblog.com/archive/2012/05/01/datacore-software-2012-private-cloud-and-storage-virtualization-survey-performance-bottlenecks-and-downtime-top-storage-related-user-concerns.aspx

According to Mark Peters, senior analyst, Enterprise Strategy Group, “If the DataCore survey shows anything, it's that the time is ripe for storage virtualization, both to meet the business objectives associated with virtualization projects and to reduce the risks associated with those initiatives. More companies are virtualizing more servers than ever before, but a notable faction of users – about a third – are underestimating the storage costs associated with server and desktop virtualization projects as well as the storage costs associated with private clouds.

“Rather than simply continuing to expand traditional storage solutions, IT managers would be well advised to consider addressing performance and downtime issues with a storage virtualization solution that enables them to apply a simplified management approach to manage their storage resources in a single, logical storage pool. For the respondents in DataCore's survey – and others – who are not using storage virtualization, I would say that logic, availability, and need are all aligned to say it's time to take a serious look.”


Government Computer News: 2 virtualization worries that keep admins up at night


System downtime and slow application performance related to storage in virtualized environments are the primary concerns of IT administrators surveyed in a new report on the state of private clouds.

IT administrators are more concerned about performance issues and less about the cost of storage than they were a year ago, according to the report, “2012 State of the Private Cloud Survey” by DataCore Software, a provider of storage virtualization software.

DataCore’s survey, which polled 289 IT administrators in the private and public sectors worldwide, revealed that 63 percent of respondents consider system downtime and slow application performance to be their primary storage-related virtualization concerns, up from 36 percent in 2011. About 10.8 percent of the respondents state they work with government.

IT administrators still consider the rising cost of storage to be a problem with virtualization initiatives, but overall it is declining as a major concern, with just over half (51 percent) describing increasing storage costs as one of their biggest problems (down from 66 percent in 2011), the report states.

While increasing storage costs may be less of an issue than last year, storage-related costs continue to comprise a significant portion of virtualization budgets, with 44 percent of respondents saying that storage costs represent more than a quarter of their total budget for virtualization, the report states.

Many companies are allocating more money for storage, with 37 percent saying their storage budgets have increased this year, while just 13 percent say they have been cut.

Despite this, more than one in three respondents (34 percent) admit they underestimated the impact server/desktop virtualization would have on their storage costs. For those deploying a private cloud, more than one in four (28 percent) underestimated how storage costs would be affected.

...“As virtualization moves from theory to practice, storage-related performance and availability are becoming of greater concern to businesses, but cost concerns haven’t gone away,” said George Teixeira, president and CEO of DataCore Software, who recommends the use of storage hypervisors.

Storage hypervisors ensure high performance and availability in the storage infrastructure through features such as auto-tiering, device interchangeability, thin provisioning and continuous data protection, Teixeira noted. “A storage hypervisor solves the cost issue by enabling enterprises to make greater use of existing storage infrastructure, while reducing the need for large-scale storage hardware,” he said.

The online survey was conducted in March 2012. The survey asked a series of questions about virtualization and its impact on storage.

Wednesday 2 May 2012

DataCore Software Presents the Storage Hypervisor at "IDC Virtualisation and Cloud Conferences"; Also on 22 May, DataCore will join IDC to present at the "Evolution of the Datacentre Conference" in London.

Live from the “IDC Virtualisation and Cloud Conference” in Dublin, DataCore Software, one of the industry’s premier providers of storage virtualisation software, is presenting the latest innovations of its storage hypervisor, SANsymphony-V.

DataCore has additionally been invited to host a customer testimonial slot featuring the Irish League of Credit Unions, detailing how they successfully use DataCore’s SANsymphony-V storage hypervisor to virtualise, manage and protect the data storage at 500 credit unions.

Later this month, on 22 May, DataCore will also join IDC to present at their "Evolution of the Datacentre Conference" in London.

The IDC London Conference takes place on 22 May 2012 in the Grand Connaught Rooms. For more information on the IDC Evolution of the Datacentre Conference in London, visit http://www.cvent.com/events/evolution-of-the-datacentre-conference/event-summary-76ca2804ef284446827655acfff38e0d.aspx .

Partnership and attendance of DataCore Software at IDC Europe Conferences is a natural fit for both organisations. The results of a recent European IDC survey conducted at the end of 2011 showed that 35.8 per cent of European companies continue to invest in their infrastructure despite a flat economic climate. Storage and virtualisation are cited by IDC as the top investment categories. Wider infrastructure challenges such as private cloud, “big data” and storage efficiency are identified as factors that will dramatically change short-term market adoption of storage virtualisation.

This is why DataCore’s Regional Manager, Rupert Collier, will lead the discussion among attending IDC delegates on the virtues of changing the way organisations view storage, promoting effective data management through the deployment of a portable, independent storage hypervisor to alleviate their data centre issues.

Rupert’s slot will be followed by a customer testimonial from the Irish League of Credit Unions, who provide infrastructure services for 500 credit unions (not-for-profit regional banks) across Ireland and who have successfully deployed DataCore’s SANsymphony-V in a mirrored environment to guarantee high availability, business continuity, DR and better performance.

“It’s great to be invited to the IDC conferences to present customer success stories,” noted Rupert Collier, Regional Manager, DataCore Software. “Given the IDC European state-of-the-market survey late last year, having DataCore in attendance addresses delegates’ concerns about how to identify alternative ways to improve usage of existing storage assets and how to avoid proprietary vendor lock-in when expansion inevitably occurs. Most of our customers experience an immediate return on investment by implementing the storage hypervisor, which delivers better utilisation and performance for their storage. It is the perfect complement to dynamic, virtualised infrastructures and cloud environments, which cannot afford failures or degradations in performance.”

DataCore’s SANsymphony-V storage hypervisor centralises, manages and optimises the use of heterogeneous storage devices – regardless of price, performance, model or hardware – and integrates flash memory, SSDs, hard drives and cloud storage. The auto-tiering functionality automates the promotion and demotion of data across different storage devices based on performance and cost criteria. Also, by integrating a cloud array gateway, DataCore extends auto-tiering to the cloud level, so non-critical backup and archiving data can be outsourced.

IDC’s Dublin Conference occurs on 1 May 2012 at Dublin’s Burlington Hotel. For more on the IDC Virtualisation and Cloud Conference in Dublin, visit http://www.cvent.com/events/virtualisation-and-cloud-ireland-conference/event-summary-a80359b954cd4d858cb334bb739f0db4.aspx .