
Monday 23 December 2013

DataCore deployed at over 10,000 customer sites and selected as a software-defined storage vendor in the Advantage Phase of Gartner's “IT Market Clock”

DataCore Surpasses 10,000 Customer Sites Globally As Companies Embrace Software-Defined Storage

DataCore, a leader in software-defined storage, today announced that the company experienced significant customer adoption of its ninth-generation SANsymphony-V platform in 2013. As the company surpassed 10,000 customer sites globally, new trends materialised around the need for businesses to rethink their storage infrastructures, with software architecture becoming the blueprint for the next wave of data centres.
“The remarkable increase in infrastructure-wide deployments that DataCore experienced throughout 2013 reflects an irreversible market shift from tactical, device-centric acquisitions to strategic software-defined storage decisions. Its significance is clear when even EMC concedes the rapid commoditization of hardware is underway. Their ViPR announcement acknowledges the ‘sea change’ in customer attitudes and the fact that the traditional storage model is broken,” said George Teixeira, president and CEO at DataCore. “We are clearly in the age of software defined data centres, where virtualisation, automation and across-the-board efficiencies must be driven through software. Businesses can no longer afford yearly ‘rip-and-replace’ cycles, and require a cost-effective approach to managing storage growth that allows them to innovate while getting the most out of existing investments.”
In addition to this broad customer adoption, DataCore was recently named as a software-defined storage vendor in Gartner’s “IT Market Clock for Storage, 2013,” published September 6, 2013. The report, by analysts Valdis Filks, Dave Russell, Arun Chandrasekaran et al., identifies software-defined storage vendors in the Advantage Phase and recognises two main benefits of software-defined storage: 
“First, in the storage domain, the notion of optimising, perhaps often lowering, storage expenses via the broad deployment of commodity components under the direction of robust, policy-managed software has great potential value. Second, in the data centre as a whole, enabling multi-tenant data and workload mobility among servers, data centres and cloud providers without disrupting application and data services, would be transformational.”
Three major themes in 2013 shaped the software-defined storage market and defined the use cases of DataCore’s new customers:
Adoption and Appropriate Use of Flash Storage in the Data Centre
As more companies rely on flash to achieve greater performance, a new challenge has arisen in re-designing storage architectures. While the rule of thumb is that only about five percent of workloads require top-tier performance, flash vendors are doing their best to convince customers to go all-flash, despite the low ROI. Instead, businesses have turned to auto-tiering software to ensure applications share flash and spinning disk according to the need to optimise both performance and investment. Going beyond other implementations, DataCore has redefined automation and mobility of data storage with a policy-managed paradigm that makes auto-tiering a true ‘enterprise-wide’ capability, working across multiple vendor offerings and the many levels and varied mix of flash devices and spinning disks.
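The general idea behind such policy-managed auto-tiering can be illustrated with a short sketch. The Python below is a hypothetical, simplified model of promoting hot blocks to flash and demoting cold ones to spinning disk; the class name, thresholds and tier labels are invented for illustration and do not represent DataCore’s implementation.

```python
# Minimal sketch of policy-managed auto-tiering (illustrative only; the
# class, thresholds and tier names are invented, not DataCore's code).
# Blocks that are hit often get promoted to flash; cold blocks are
# demoted back to spinning disk at each rebalancing interval.

from collections import defaultdict

class AutoTierer:
    def __init__(self, promote_threshold=100, demote_threshold=10):
        self.access_counts = defaultdict(int)    # block_id -> hits this interval
        self.tier = {}                           # block_id -> "flash" or "disk"
        self.promote_threshold = promote_threshold
        self.demote_threshold = demote_threshold

    def record_io(self, block_id):
        # Called on every read/write to track how hot a block is.
        self.access_counts[block_id] += 1
        self.tier.setdefault(block_id, "disk")   # new blocks start on cheap disk

    def rebalance(self):
        # Run periodically: apply the promotion/demotion policy to every block.
        for block_id, current in self.tier.items():
            hits = self.access_counts.get(block_id, 0)
            if hits >= self.promote_threshold and current == "disk":
                self.tier[block_id] = "flash"    # hot block: promote to flash
            elif hits <= self.demote_threshold and current == "flash":
                self.tier[block_id] = "disk"     # cold block: demote to spinning disk
        self.access_counts.clear()               # start a fresh measurement interval
```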
Tim Olsson, IT manager of the world’s largest construction project, the Femern Tunnel connecting Scandinavia to Germany, is one such advocate of intelligent flash use:
“What we have achieved here with DataCore storage virtualisation software sets us on the road to affordable, flexible growth, with synchronous mirroring to eliminate storage-related downtime. Add the blistering speed of Fusion-io acceleration and we have created a super-performing, auto-tiered storage network that does what the tunnel itself will do: connect others reliably, super fast and without stoppages,” said Tim Olsson, IT Manager, Femern A/S, Denmark.
Virtualising Storage while Accelerating Performance for Tier-One Applications
Demanding business applications like databases, ERP and mail systems create bottlenecks in any storage architecture due to their rapid activity and intensive I/O and transactional requirements. To offset this, many companies buy high-end storage systems while leaving terabytes of storage unused. Now, though, businesses are able to combine all of their available storage and virtualise it, independent of vendor – creating a single storage pool. Beyond virtualisation and pooling, DataCore customers report faster application response times and significant performance increases – accelerating I/O speeds up to five times.
One such company is Quorn Foods, part of the Marlow Foods group, which experienced significant performance increases with DataCore’s SANsymphony-V platform, bringing peak data-mining times down from 20 minutes to 20 seconds.
Fred Holmes, head of IT at Quorn Foods, summarises: “Like all things in IT, dramatic improvements to the infrastructure remain invisible to the user, who only notices when things go wrong. But in this instance, no one could fail to notice the dramatic leaps in performance that were now afforded. This is in no small part down to the way that DataCore’s SANsymphony-V leverages disk resources, assigning I/O tasks to very fast server RAM and CPU to accelerate throughput and to speed up response when reading and writing to disk.”
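The caching mechanism Holmes describes, using fast server RAM in front of slower disk, can be sketched generically as a least-recently-used read cache. This is an illustrative model only; the names and the LRU policy are assumptions, not SANsymphony-V internals.

```python
# Illustrative least-recently-used (LRU) read cache held in server RAM in
# front of slower disk reads. A generic sketch of the caching idea only.

from collections import OrderedDict

class RamReadCache:
    def __init__(self, backing_read, capacity_blocks=1024):
        self.backing_read = backing_read         # function: block_id -> bytes (slow disk path)
        self.capacity = capacity_blocks
        self.cache = OrderedDict()               # block_id -> data, kept in LRU order

    def read(self, block_id):
        if block_id in self.cache:
            self.cache.move_to_end(block_id)     # hit: refresh recency, serve from RAM
            return self.cache[block_id]
        data = self.backing_read(block_id)       # miss: fall through to disk
        self.cache[block_id] = data
        if len(self.cache) > self.capacity:
            self.cache.popitem(last=False)       # evict the least recently used block
        return data
```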
Software Management of Incompatible Storage Devices and Models
Many data centres feature a wide variety of storage arrays, devices and product models from a number of different vendors – including EMC, NetApp, IBM, Dell and HP – none of which are directly compatible. Interestingly, DataCore customers report that the issue of incompatibility generally surfaces more when dealing with different hardware models from the same vendor than between different vendors, and thus have turned to management tools that treat all hardware the same.
Take the US’s third-largest independent teaching hospital, the Maimonides Medical Center in New York, with more than 800 doctors relying on its information systems to care for patients around the clock.
“Over the past 12 years, our data centre has featured eight different storage arrays and various other storage devices from three different vendors,” said Gabriel Sandu, chief technology officer at Maimonides Medical Center. “By using DataCore’s SANsymphony software and currently with its latest iteration of SANsymphony-V R9, we have been able to seamlessly go from one storage array to the next with no downtime to our users. We are able to manage our SAN infrastructure without having to worry or be committed to any particular storage vendor. DataCore’s technology has also allowed us to use midrange storage arrays to get great performance – thereby not needing to go with the more expensive enterprise-class arrays from our preferred manufacturers. DataCore’s thin provisioning has also allowed us to save on storage costs as it allows us to be very efficient with our storage allocation and makes sure no storage goes unused.”
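Thin provisioning, which Sandu credits for the storage savings, works by advertising a large virtual disk while allocating physical capacity only when data is first written. A minimal sketch of that idea follows, with a hypothetical chunk size and API rather than DataCore’s actual interface.

```python
# Sketch of thin provisioning (generic illustration; the class, chunk
# size and methods are hypothetical, not DataCore's API). The volume
# advertises a large virtual size but allocates physical chunks only
# when data is first written.

class ThinVolume:
    CHUNK = 4 * 1024 * 1024                      # assumed 4 MiB allocation unit

    def __init__(self, virtual_size):
        self.virtual_size = virtual_size         # size presented to the host
        self.chunks = {}                         # chunk_index -> bytearray, allocated lazily

    def write(self, offset, data):
        # For simplicity this sketch assumes a write never crosses a chunk boundary.
        idx = offset // self.CHUNK
        start = offset % self.CHUNK
        assert start + len(data) <= self.CHUNK
        chunk = self.chunks.setdefault(idx, bytearray(self.CHUNK))  # allocate on first write
        chunk[start:start + len(data)] = data

    def physical_usage(self):
        # Only chunks that have actually been written consume real capacity.
        return len(self.chunks) * self.CHUNK
```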

Sunday 22 December 2013

QLogic certifies adapters for software-defined storage; announces Fibre Channel Adapters and FabricCache are DataCore Ready

QLogic FlexSuite Gen 5 Fibre Channel and FabricCache Adapters certified DataCore Ready
The emerging software-defined storage space hit a new milestone when QLogic announced that it had added support for DataCore’s SANsymphony-V virtualization offering.
http://siliconangle.com/blog/2013/12/18/qlogic-certifies-adapters-for-software-defined-storage/

QLogic® FlexSuite™ 2600 Series 16Gb Gen 5 Fibre Channel adapters and FabricCache™ 10000 Series server-based caching adapters are now certified as DataCore Ready, providing full interoperability with SANsymphony-V storage virtualisation solutions from DataCore Software.


DataCore SANsymphony-V is a comprehensive software-defined storage platform that solves many of the difficult storage-related challenges raised by server and desktop virtualisation in data centres and cloud environments. The software significantly improves application performance and response times, enhances data availability and protection to provide superior business continuity and maximises the utilisation of existing storage investments. QLogic FabricCache adapters and FlexSuite Gen 5 Fibre Channel adapters, combined with SANsymphony-V, allow data centres to maximise their network infrastructure for a competitive advantage.

“QLogic channel partners and end-users can now confidently deploy award-winning QLogic adapters with SANsymphony-V to optimise network performance and make the most of their IT investments,” said Joe Kimpler, director of technical alliances, QLogic Corp. “Customers can choose the best QLogic solution—FabricCache adapters for high-performance clustered caching or FlexSuite Gen 5 adapters for ultra-high performance—to best handle their data requirements.”


“DataCore has a long history of collaborating with QLogic to help solve the storage management challenges of our mutual customers,” said Carlos M. Carreras, vice president of alliances and business development at DataCore Software. “QLogic high-performance Gen 5 Fibre Channel adapters and innovative, server-based caching adapters combine with SANsymphony-V to cost-effectively deliver uninterrupted data access, improve application performance and extend the life of storage investments, while providing organisations with greater peace of mind.”

Wednesday 11 December 2013

DataCore Software Defined Storage and Fusion-io Reduce Costs and Accelerate ERP, SQL, Exchange, SharePoint Applications

BUHLMANN GRUPPE, a leader in steel piping and fittings headquartered in Bremen, Germany, has implemented a storage management and virtual SAN infrastructure based on DataCore’s SANsymphony-V software. SANsymphony-V manages and optimizes the use of both conventional spinning disks (i.e. SAS drives) and the newly integrated flash memory-based Fusion-io ioDrives through DataCore’s automatic tiering and caching technology. With the new DataCore solution in place, the physical servers, the VMware virtual servers and, most importantly, the critical applications needed to run the business – including Navision ERP software, Microsoft Exchange, SQL and SharePoint – are now fail-safe and run faster.

DataCore and Fusion-io = Turbo Acceleration for Tier 1 Applications
After successful testing, the migration of the physical and virtual servers onto the DataCore-powered SAN was carried out. A number of physical servers, the Microsoft SQL and Exchange systems and other file servers now access the high-performance DataCore storage environment. In addition, DataCore now manages, protects and boosts the performance of storage serving 70 virtual machines under VMware vSphere that host business-critical applications – including the ERP system from Navision, Easy Archive, Easy xBase, Microsoft SharePoint and BA software.

"The response times of our mission-critical Tier 1 applications have improved significantly; performance has been doubled by the use of DataCore and Fusion-io," says Mr. Niebur. "The hardware vendor independence provides storage purchasing flexibility. Other benefits include the higher utilization of disk space, the performance of flash based hardware, as well as faster response times to meet business needs that we experience today – and in the future – combined they save us time and money. Even with these new purchases involved, we have realized saving of 50 percent in costs – compared to a traditional SAN solution."

Read the full Case Study:
http://www.datacore.com/Libraries/Case_Study_PDFs/Case_Study_-_Buhlmann_Gruppe_US.sflb.ashx

Wednesday 4 December 2013

A Defining Moment for the Software-Defined Data Center

Article from: http://elettronica-plus.it/a-defining-moment-for-the-software-defined-data-center/

George Teixeira
For some time, enterprise IT heads heard the phrase “get virtualized or get left behind”, and after they kicked the tires, the benefits couldn’t be denied and the rush was on. Now there’s a push to create software-defined data centers. However, there is some trepidation about whether these ground-breaking, more flexible environments can adequately handle the performance and availability requirements of business-critical applications, especially when it comes to the storage part of the equation.

While decision-makers had good reason for concern, they now have an even better reason to celebrate, as new storage virtualization platforms have proven able to overcome these I/O obstacles.

Just as server hypervisors provided a virtual operating platform, a parallel approach to storage is quickly transforming the economics of virtualization for organizations of all sizes by offering the speed, scalability and continuous availability needed to realize the full benefits of software-defined data centers. Additional benefits being widely reported include:
  • Eliminating storage-related I/O bottlenecks in virtualized data centers
  • Harnessing flash storage resources effectively for even greater application performance
  • Ensuring applications stay fast and always available without a major storage investment
Performance slowdowns caused by I/O bottlenecks and downtime attributed to storage-related outages are two of the foremost reasons why enterprises have held back from virtualizing their tier-1 applications, like SQL Server, Oracle, SAP and Exchange. This fact comes across clearly in the recent Third Annual State of Virtualization Survey conducted by my company.

The survey found that 42 percent of respondents cited performance degradation or an inability to meet performance expectations as an obstacle preventing them from virtualizing more of their workloads. Yet effective storage virtualization platforms are now successfully overcoming these issues by using device-independent adaptive caching and performance-boosting techniques to absorb wildly variable workloads, enabling applications to run faster once virtualized.

To further increase tier-1 application responsiveness, companies often spend excessively on flash memory-based solid state disks (SSDs). The survey also reveals that 44 percent of respondents found disproportionate storage-related costs were an obstacle to virtualization. Again, effective storage virtualization platforms are now providing a solution with such features as auto-tiering, which optimizes the use of these premium-priced resources alongside more modestly priced, higher-capacity disk drives.

Such a software platform constantly monitors I/O behavior and can auto-select between server memory caches, flash storage and traditional disk resources in real time, ensuring that the most suitable class or tier of storage device is assigned to each workload based on priority and urgency. As a result, a software-defined data center can now deliver unmatched tier-1 application performance with optimum cost efficiency and maximum return on existing storage investments.
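As a rough illustration of such per-workload tier selection, the hypothetical routine below routes each request to RAM cache, flash or disk based on workload priority and observed access frequency; the policy and thresholds are invented for the example, not taken from any vendor’s product.

```python
# Hypothetical per-I/O tier selection: route each request to RAM cache,
# flash or spinning disk based on workload priority and observed access
# frequency. The policy and thresholds are invented for illustration.

def select_tier(priority: str, recent_hits: int) -> str:
    """priority: 'critical', 'normal' or 'background'; recent_hits: I/Os this interval."""
    if priority == "critical" or recent_hits > 500:
        return "ram_cache"   # hottest data is served from server memory
    if priority == "normal" and recent_hits > 50:
        return "flash"       # warm data lands on flash/SSD
    return "disk"            # cold or background workloads stay on spinning disk
```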

Once I/O-intensive tier-1 applications are virtualized, the storage virtualization platform ensures high availability. It eliminates single points of failure and disruption through application-transparent physical separation, stretched across rooms or off-site, with full auto-recovery capabilities for the highest levels of business continuity. The right platform can effectively virtualize whatever storage is on a user’s floor, whether direct-attached or SAN-connected, to achieve the robust and responsive shared storage environment necessary to support highly dynamic virtual IT environments.
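The synchronous mirroring behind this kind of high availability can be sketched simply: a write is acknowledged to the application only once both physically separated copies have persisted it, so either side can fail without data loss. The class below is a hypothetical illustration, assuming each node object exposes a write(offset, data) method.

```python
# Sketch of synchronous mirroring (hypothetical API): a write is sent to
# two physically separated nodes in parallel and acknowledged only after
# both have persisted it, so either side can fail without data loss.

from concurrent.futures import ThreadPoolExecutor

class MirroredVolume:
    def __init__(self, primary, secondary):
        self.nodes = [primary, secondary]        # each node exposes write(offset, data)
        self.pool = ThreadPoolExecutor(max_workers=2)

    def write(self, offset, data):
        # Issue the write to both copies concurrently...
        futures = [self.pool.submit(node.write, offset, data) for node in self.nodes]
        # ...and acknowledge the application only once both succeed;
        # an exception from either node propagates instead of a silent ack.
        for future in futures:
            future.result()
        return "ack"
```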

Yes, the storage virtualization platform marks a defining moment for the software-defined data center. The performance, speed and high availability needed for mission-critical databases and applications in a virtualized environment have been realized. Barriers have been removed, and there’s a clear and supported path to greater cost efficiency.

Still, selecting the right platform is critical to a data center. Technology that is full-featured and has been proven “in the field” is essential. Also, it’s important to go with an independent, pure software virtualization solution in order to avoid hardware lock-in, and to take advantage of the future storage developments that undoubtedly will come.