Tuesday 30 April 2013

The Irish League of Credit Unions uses DataCore SANsymphony-V storage virtualisation software

ILCU achieves an optimised virtual environment with disaster recovery and high-performing applications, at a 60% cost reduction compared with alternative storage environments.

The Irish League of Credit Unions (ILCU) has adopted DataCore’s™ SANsymphony-V storage hypervisor solution. The ILCU represents 484 financial institutions across Ireland, with combined assets of €13.6 billion and a membership of over 3 million credit union members.
Malcolm Moir, Infrastructure Services Manager at ILCU’s Dublin HQ: “Our very ethos is about providing equitable, reliable, shared and stable assets to our user base. We knew that there had to be a better way to serve them with a highly available environment that also offered disaster recovery.”
“We started this project with the main aims of decreasing costs, modernising our environment and achieving reliable disaster recovery. Undoubtedly, DataCore’s SANsymphony-V is the cornerstone of this achievement. We are proud of the service we can now offer the business,” he summarises.

As DataCore’s SANsymphony-V solution has been running for 12 months, we will leave the final comments on reliability to Malcolm: “The stability of the DataCore storage hypervisor has impressed me. It’s simply an install-and-forget product that we rarely need to change. And at ILCU, it’s proven. We did have a failover, but it was completely transparent to the business: storage failed over automatically, with no interruption of services and a simple notification by email. That’s massively reassuring.”

To read the full write-up, see:  

Wednesday 24 April 2013

Are you a DCIE? Please join the DataCore Certified Technical Professionals Group on LinkedIn

As a DataCore Certified Implementation Engineer (DCIE) you may join the DataCore LinkedIn group to network and exchange best practices with other DCIEs.

Also, as a DCIE you are eligible for additional membership benefits, including:

· Use of the DCIE logo on your business cards, letterhead, résumés and social media profiles

· Opportunities to network, share and exchange ideas with other DCIE members

· The chance to promote yourself as a DataCore storage virtualization professional

· Exclusive invitations to DCIE round tables and technical events

Please connect with other DCIE professionals by joining: DataCore Certified Technical Professionals Group

Monday 22 April 2013

How a virtualized storage infrastructure improves performance

Is using flash a good way to improve performance in a virtual environment? Can a virtualized storage infrastructure be an option?

By Jon William Toigo. Read the full article at:

Flash does some wonderful things. I like flash as cache in the hybrid role, personally. You can expedite a certain number of disks by taking files -- or data sets -- that are being accessed a lot on the disk drive and writing them to a flash device temporarily, then serving those requests out of the flash layer. That's a perfectly acceptable use of flash, and I've seen it used very wisely…. I'm not down on flash completely, but I think there's a lot of oversell right now around the idea of tier-zero arrays and the idea of having all-flash arrays.

For the money, I could probably do things a lot faster by virtualizing my storage. And that sounds weird; I'm saying don't virtualize or be cautious about how you virtualize your servers, but I'm not saying be cautious about how you virtualize your storage. Storage is a lot easier to virtualize than the workloads that run on servers…

Now, if I can go above all that, go above the layers of value-added software and just go across the storage hardware itself, everybody is selling a box with Seagate hard drives. There's no difference between brand X and brand Y at the hardware level. So we can virtualize that. We can surface those value-added functions that maybe you want to preserve at that virtualization layer and basically spread that goodness to all the storage rigs, not only the ones that have the name X, Y or Z on the side of the box. That drops your storage costs considerably.

The best implementation of a virtualized storage infrastructure I've seen is at DataCore Software in Fort Lauderdale. I use their DataCore SANsymphony R9 product on about 4 petabytes of storage that I have in my lab. So basically, what we do is we virtualize the storage, which is nothing but aggregating it all under the control of the software controller. I have a dual redundant server that runs this controller software so there's failover in case one of the server heads dies, and I carve virtual volumes out of all the massive amount of disk I've got and read and write to them through a layer of memory cache.

And then, instead of using flash, I use DRAM, which is much more resilient than flash and doesn't lose half of its performance when you rewrite it. The first time you're writing to your flash card, you're going to get full speed out of it. The second time, you have to erase the cells that have been written before you write to them again. So, you have a decrease of 50% in the performance of a flash card.

There have been some other kinds of technologies introduced to try and spoof that a little bit, but the bottom line is that's how flash works. So flash is a bit of an oversell…
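The rewrite penalty Toigo describes can be sketched as a toy model: a flash page can be programmed once at full speed, but must be erased before it can be programmed again. The timings below are illustrative assumptions, not real device specifications (real NAND erases whole blocks, and erase is typically slower than program), but the model captures the halving effect he mentions.

```python
# Toy model of the NAND flash rewrite overhead described above: a page can be
# programmed once, but must be erased before it can be programmed again.
# Timings are illustrative assumptions, not real device specs.

PROGRAM_US = 200  # assumed time to program a page (microseconds)
ERASE_US = 200    # assumed time to erase a page before reprogramming

def write_cost_us(page_written_before: bool) -> int:
    """Cost of writing one page: a rewrite pays for an erase first."""
    return (ERASE_US if page_written_before else 0) + PROGRAM_US

first_write = write_cost_us(page_written_before=False)  # 200 us
rewrite = write_cost_us(page_written_before=True)       # 400 us

# The rewrite takes twice as long, i.e. effective write throughput halves --
# the "decrease of 50%" once a card's pages have all been written once.
print(first_write, rewrite)
```

In practice, controllers hide some of this with wear-levelling and background garbage collection, which is the "spoofing" mentioned below, but the erase-before-write behaviour is inherent to the medium.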

So do the math. Go into it with your eyes wide open and tell the vendor you want a test first. The other nice thing about a virtual storage infrastructure is that when you want to move workloads around using vMotion or something, you can move the data that goes with it around too -- that virtual volume can move with the workload, so it's going to save you a lot of money in terms of your basic storage spend, and a lot of money in terms of how many times you need to replicate the same data to take advantage of all those cool things they talked about in the VMware brochure.
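The "do the math" advice above can be made concrete with a back-of-envelope comparison. All figures here are hypothetical illustrations (the prices, capacity and 10% hot-data fraction are assumptions, not vendor numbers); the point is only the shape of the arithmetic: a small flash tier in front of cheap disk versus an all-flash array.

```python
# Hypothetical "do the math" comparison: an all-flash array versus a
# virtualized tier of ~10% flash (for hot data) plus 90% spinning disk.
# All prices and fractions are illustrative assumptions.

FLASH_PER_GB = 5.00     # assumed $/GB for enterprise flash
DISK_PER_GB = 0.25      # assumed $/GB for spinning disk
CAPACITY_GB = 100_000   # 100 TB usable capacity

all_flash = CAPACITY_GB * FLASH_PER_GB

hot_fraction = 0.10  # assume ~10% of data is hot enough to need flash speed
tiered = (CAPACITY_GB * hot_fraction * FLASH_PER_GB
          + CAPACITY_GB * (1 - hot_fraction) * DISK_PER_GB)

print(f"all-flash: ${all_flash:,.0f}")  # $500,000
print(f"tiered:    ${tiered:,.0f}")     # $72,500
```

Under these assumed numbers the tiered design costs roughly a seventh of the all-flash one, which is why testing whether your workload's hot set really needs an all-flash tier matters before you buy.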

Partners - Last chance to jump on board Wednesday's webinar -

"How, When and Why to Install software storage virtualisation - quick revenue opportunities."

Joint Webinar with DataCore's Kevin Davids and Mark Walker from Commtech: 
Learn:
1. How the installation of DataCore SANsymphony-V removes storage-related roadblocks to virtualisation deals.
2. How the installation of SANsymphony-V can dramatically speed up the adoption and standardisation of virtual infrastructures.
3. Why, when budgets have changed, installation of SANsymphony-V means that you do not have to compromise the solution or, most importantly in today's climate, decrease your profit margin to be successful.

Friday 19 April 2013

DataCore adds new distributor, Commtech, in the UK and Ireland to sell the Storage Hypervisor and new VDI solutions -
Commtech, a leading independent VAD in Ireland and the UK, has been appointed to resell the SANsymphony-V and Virtual Desktop Server (VDS) VDI software solutions. The announcement follows an extensive 2012 overview of DataCore’s distribution strategy and reach in Northern Europe and represents the final part of the review process. Bjarne Poulsen, Regional Director, DataCore Software, Northern Europe, comments: “Commtech is an established route to market for storage and virtualisation vendors in Ireland and has more recently been building an effective capability in the UK.” He continued: “Commtech’s regional influence, strong storage and virtualisation expertise and their deep desire to work with us to promote the new DataCore Virtual Desktop Server (VDS) will enrich their partners with a cost-effective virtual desktop infrastructure (VDI) solution at a 75% reduced cost compared to alternative offerings. This will prove critical in the current Irish and UK economic climate and will kick-start cash-strapped VDI projects.”
Mark Walker, UK Country Manager for Commtech, agrees. “Developing a relationship with DataCore is key to our growth strategy. We are focused on developing the market for next generation storage and data management technologies, such as flash memory arrays and cloud-integrated storage. A number of our existing vendors are DataCore alliance partners, so there is a distinct synergy between the profile of our existing resellers and DataCore’s target partners. It’s a positive announcement for our existing customers, who will now have access to DataCore’s SANsymphony-V as the integrating software layer to underpin and enable their storage virtualisation projects. VDI is also coming of age for our partners, and we expect the uptake on DataCore’s Virtual Desktop Server (VDS) to be immediate.”
Training of partners commences immediately with a series of joint webinars throughout Q2 2013.
Commtech joins Hammer and Avnet as one of the three official DataCore distribution partners in the UK. Go to

Thursday 18 April 2013

Software-defined Storage Makes Economic Sense and Frees You From Hardware-defined Lock-in!

-George Teixeira, CEO & President, DataCore Software 

What does “software-defined” really mean?      Download: "Software-defined Storage" Paper

Beware of Storage Hardware Vendors’ Claims That They Are “Software-defined”
It has become “it’s about the software, dummy” obvious. But watch what the sales pitches claim. You’ll see storage hardware heavyweights leap onto the “we are really software” bandwagon, claiming that they are “software-defined storage” and hoping to slow the wheels of progress through their marketing talk. But it’s the same old song they sing every year. They want you to ignore the realities driving today’s data centers and diverse IT infrastructures – they want you not to change your past buying practices – they want you to buy more hardware! They want you to forget that the term “software-defined” is being applied selectively to what runs only on their storage hardware platforms, and that when you buy their feature set it will not work across other components and other vendors’ storage systems. Beware: their clever sales pitches may sound like “software-defined,” but the end objective is clear: “buy more hardware.”

Software is the basis for flexibility, and smart storage virtualization and management software can improve the utilization of storage resources so that you optimize and right-size to meet your needs. Hardware-defined is by definition rigid and inflexible; it therefore leads to purchasing more than you need, since you don’t want to underestimate your requirements. Software can also allow the latest innovations, like flash-memory SSDs, to be easily incorporated into your infrastructure without having to “rip and replace” your existing storage investments.

In other words, hardware-defined is the mantra for storage hardware vendors who want you to “buy more hardware” and repeat the same process every year versus getting the most value from your investments and “future-proofing” your infrastructure. Software-defined means optimize what you already have, whereas “Hardware-defined = Over Provisioning and Oversizing.”

Software Is What Endures Beyond Hardware Devices that “Come and Go”
Think about it. Why would you want to lock yourself into this year’s hardware solution, or have to buy a specific device just to get a software feature you need? This is old thinking; before virtualization, this was how the server industry worked. The hardware decision drove the architecture. Today, with software-defined computing exemplified by VMware or Hyper-V, you think about how to deploy virtual machines rather than about whether they are running on a Dell, HP, Intel or IBM system. Storage is going through this same transformation, and it will be smart software that makes the difference in a “software-defined” world.

So What Do Users Want From “Software-defined Storage,” and Can You Really Expect It to Come From a Storage Hardware Vendor?
The move from hardware-defined to a software-defined virtualization-based model supporting mission-critical business applications is inevitable and has already redefined the foundation of architectures at the computing, networking and storage levels from being “static” to “dynamic.” Software defines the basis for managing diversity, agility, user interactions and for building a long-term virtual infrastructure that adapts to the constantly changing components that “come and go” over time.

Ask yourself, is it really in the best interest of the traditional storage hardware vendors to go “software-defined” and avoid their platform lock-ins?

Remember One Thing: Hardware-defined = Over Provisioning and Oversizing
Fulfilling application needs and providing a better user experience are the ultimate drivers for next-generation storage and software-defined storage infrastructures. Users want flexibility, greater automation, better response times and “always on” continuous availability. IT shops are therefore clamoring to move all their applications onto agile virtualization platforms for better economics and greater productivity. The business-critical Tier 1 applications (ERP, databases, mail systems, SharePoint, OLTP, etc.) have proven to be the most challenging, and storage has been the major roadblock to virtualizing them. Moving storage-intensive workloads onto virtual machines (VMs) can greatly impact performance and availability, and as the workloads grow, these impacts increase, as do costs and complexity.

The result is that storage hardware vendors have to over-provision, over-size for performance and build in extra levels of redundancy within each unique platform to ensure users can meet their performance and business continuity needs.

The costs needed to accomplish the above negate the bulk of the benefits. In addition, hardware solutions are sized for a moment in time rather than providing long-term flexibility. Enterprises and IT departments are therefore looking for a smarter and more cost-effective approach, realizing that the traditional “throw more hardware at the problem” solutions are no longer feasible.

Tier 1 Apps Are Going Virtual; Performance and Availability Are Mission Critical
To address these storage impacts, users need the flexibility to incorporate whatever storage they need to do the job at the right price, whether it is available today or comes along in the future. For example, to help with the performance impacts, such as those encountered in virtualizing Tier 1 applications, users will want to incorporate and share SSD, flash-based technologies. Flash helps here for a simple reason: electronic memory technologies are much faster than mechanical disk drives. Flash has been around for years, but only recently has it come down far enough in price to allow for broader adoption.

Diversity and Investment Protection; One Size Solutions Do Not Fit All
But flash storage is better suited to read-intensive applications than to write-heavy, transaction-based traffic, and it is still significantly more expensive than spinning disk. It also wears out: taxing applications that prompt many writes can shorten the lifespan of this still-costly solution. So it makes sense to have other choices for storage alongside flash, keeping flash reserved for where it is needed most and using the other storage alternatives for their most efficient use cases, then optimizing the performance and cost trade-offs by placing and moving data to the most cost-effective tier that can deliver acceptable performance. Users will need solutions to share and tier their diverse storage arsenal – and manage it together as one – and that requires smart and adaptable software.
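The tiering decision described above can be sketched as a simple placement policy: hot, read-heavy data earns the flash premium; write-heavy or cooler data goes to cheaper tiers. The thresholds, tier names and function below are illustrative assumptions, not DataCore's actual auto-tiering algorithm.

```python
# Minimal sketch of an auto-tiering placement policy, as described above:
# keep flash for hot, read-heavy data; demote write-heavy or cold data to
# cheaper disk. Thresholds and tier names are illustrative assumptions.

def choose_tier(accesses_per_day: int, read_ratio: float) -> str:
    """Pick the cheapest tier that still delivers acceptable performance."""
    if accesses_per_day >= 1000 and read_ratio >= 0.7:
        return "flash"      # hot and read-heavy: worth the premium (and wear)
    if accesses_per_day >= 100:
        return "fast-disk"  # warm, or write-heavy enough to spare the flash
    return "capacity-disk"  # cold data: the cheapest tier is good enough

print(choose_tier(5000, 0.9))  # flash
print(choose_tier(5000, 0.3))  # fast-disk (write-heavy, spares flash wear)
print(choose_tier(10, 0.9))    # capacity-disk
```

A real tiering engine would track access statistics per block over time and migrate data in the background, but the trade-off it optimizes is the one this toy policy encodes.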

And what about existing storage hardware investments – does it make sense to throw them away and replace them with this year’s new models when smart software can extend their useful life? Why “rip and replace” each year? Instead, these existing storage investments and the newest flash hardware devices, disk drives and storage models can easily be made to work together in harmony within a software-defined storage world.

Better Economics and Flexibility Make the Move to “Software-defined Storage” Inevitable
Going forward, users will have to embrace “software-defined storage” as an essential element to their software-defined data centers. Virtual storage infrastructures make sense as the foundation for scalable, elastic and efficient cloud computing. As users have to deal with the new dynamics and faster pace of today’s business, they can no longer be trapped within yesterday’s more rigid and hard-wired architecture models.

“Software-defined” Architecture and Not the Hardware is What Matters
Clearly the success of software-defined computing solutions from VMware and Microsoft Hyper-V has proven the compelling value proposition that server virtualization delivers. Likewise, the storage hypervisor and the use of virtualization at the storage level are the key to unlocking the hardware chains that have made storage an anchor holding back next-generation data centers.

“Software-defined Storage” Creates the Need for a Storage Hypervisor
We need the same thinking that revolutionized servers to impact storage. We need smart software that can be used enterprise-wide to be the driving force for change; in effect, we need a storage hypervisor whose main role is to virtualize storage resources and to achieve the same benefits – agility, efficiency and flexibility – that server hypervisor technology brought to processors and memory.

Virtualization has transformed computing and therefore the key applications we depend on to run our businesses need to go virtual as well. Enterprise and cloud storage are still living in a world dominated by physical and hardware-defined thinking. It is time to think of storage in a “software-defined” world. That is, storage system features need to be available enterprise-wide and not just embedded to a particular proprietary hardware device.

Be cautious and beware of vendors’ claims that deliver hardware but talk “software-defined.”

Monday 15 April 2013

DataCore Software launches “DataCore Ready” in response to growing worldwide demand for SANsymphony-V storage hypervisor software.
The DataCore Ready Programme identifies solutions trusted to strengthen SANsymphony™-V-based infrastructures. While DataCore solutions interoperate with common open and industry standard products, the DataCore Ready designation ensures that these solutions have successfully executed a functional test plan and additional verification testing to meet a superior level of joint solution compatibility. Customers who leverage DataCore Ready offerings benefit from quality assurance, reduced risk and lower integration costs. The DataCore Ready logo helps customers quickly identify products and solutions that are optimized for SANsymphony-V.

DataCore Ready Partners Speak to the Benefits of the Programme for Organizations Worldwide

 · Fusion-io: “The Fusion ioMemory platform and DataCore storage virtualization software achieve new levels of performance, high availability and energy efficiency when deployed together. For Fusion-io, DataCore Ready status gives customers the confidence to get up and running as quickly as possible with cost-effective, high-performance solutions,” said Fusion-io Vice President of Alliances, Tyler Smith. Fusion-io accelerates databases, virtualization, cloud computing, big data and mission-critical applications through its integrated hardware and software solutions.

· Microsoft: Microsoft has been a strategic platform partner for more than a decade. DataCore solutions are designed to work hand-in-hand with Microsoft technology and the company is an active member in the Microsoft Partner Solutions Center and Microsoft System Center Alliance. DataCore’s line of SANsymphony-V solutions fully support Microsoft Windows Server 2012 platforms. In addition, Microsoft System Center 2012, SQL Server 2012 and Exchange 2012 are all optimized and certified as DataCore Ready.

· Dot Hill Systems Corp.: A provider of world-class storage solutions and software, Dot Hill recently announced that its AssuredSAN™ 3000 series storage arrays are DataCore Ready and provide full interoperability with SANsymphony-V. This combination enables end users to simplify storage management, boost performance and greatly improve data availability. As noted by Dot Hill’s Senior Director of Marketing, Jim Jonez, “SANsymphony-V software from DataCore complements the Dot Hill AssuredSAN 3000 series storage hardware to deliver an affordable solution boasting a wide range of features normally found only on high-end enterprise solutions. Working in close collaboration with leading technology partners, such as DataCore, we can deliver more powerful storage solutions to the end user at a very compelling price.”

· TwinStrata: TwinStrata delivers cloud-integrated storage solutions that seamlessly combine the flexibility of cloud-based technologies with the robustness of traditional storage. DataCore’s SANsymphony-V integrates seamlessly with TwinStrata’s CloudArray® cloud storage appliance. The DataCore Ready solution offers a simple, transparent and cost-effective way to manage, offload and augment on-premise storage environments with space- and power-saving storage located in the cloud. According to TwinStrata CEO, Nicos Vekiarides: “We are pleased to achieve DataCore Ready qualification for CloudArray. Leveraging a hybrid cloud strategy allows customers to have ultimate flexibility and greater purchasing power when it comes to storage demands.”
Growing List of Industry Partners Already On-Board with DataCore Ready
A sampling of DataCore Ready partners can be found at:
A few additional names include:
· SSD/Flash: LSI, Samsung and STEC
· Servers/Storage: Dell, HP, Intel, Nexsan, and X-IO
· Networking: Brocade, Cisco, Emulex and Qlogic
· Software: Citrix, VirtualSharp and VMware

DataCore Ready Programme Builds Greater Customer Confidence

“This programme benefits partners and vendors by offering the highest level of confidence that DataCore’s SANsymphony-V seamlessly integrates with their offerings for optimal performance,” said Carlos M. Carreras, vice president of alliances & business development for DataCore Software. “Today, DataCore works with thousands of organizations worldwide that have realized the range of benefits we offer through our powerful software-defined storage solution. The DataCore Ready designation extends this value proposition to the products and services that are tested and proven to work well with DataCore.”

Thursday 11 April 2013

A primer on availability in virtualized storage environments

Check out the recent: A primer on availability in virtualized storage environments
written by Richard Jenkins of DataCore Software.
Why is storage so important to any infrastructure, and why should it be highly available and virtualized?
For businesses looking toward the era of “software-defined storage,” it means that the solution to the most common problems faced by anyone administering storage (performance, flexibility, availability, manageability, scalability) is already here; you just have to look.

Read the full post:

The evolution of virtual storage: A movement...

Richard Jenkins of DataCore Software recently explored IT evolution in relation to virtual storage environments, noting that virtual server provisioning has emerged as a cost-effective way to create highly available systems. It has also enabled companies to make more strategic hardware choices by raising utilization rates. Rather than invest in a completely new server, data center operators can now deploy a new VM to handle more workloads, and tasks are more easily shifted from one VM to another to maximize reliability. Although storage has always been an essential component of these systems, Jenkins pointed out that it took longer for companies to begin exploring virtualization in this area.

Wednesday 10 April 2013

Real World Storage Virtualization Use Cases: Companies Realize Faster Performance and Five-Nines Reliability for their Tier-1 Business Critical Applications and Databases

DataCore SANsymphony-V Storage Hypervisor Supercharges I/O Intensive Virtualized Tier-1 Applications; SQL, Exchange and SharePoint Lead the Way 

DataCore Software is being used by thousands of companies and organizations around the globe. In this post we showcase three companies that are realizing the benefits of the software-defined data center while simultaneously improving the performance of their Tier-1, mission-critical business applications. DataCore's SANsymphony-V storage hypervisor boosts the speed, throughput and availability of their virtualized, I/O-intensive Tier-1 applications such as Microsoft SQL Server, SharePoint and Exchange, as well as SAP, Oracle and others. These and many more customers report two to five times faster application performance, and better than 99.999 percent uptime, after virtualizing their existing storage with SANsymphony-V.

Three Real World Use Cases:

San Gorgonio Memorial Hospital: 500 Percent Improvement in Microsoft SQL Database Performance; Better Response Times Enable More Precise Treatment and Care

Download full customer presentation:

San Gorgonio Memorial Hospital in Banning, California has achieved a completely software-centric data center and is reaping the benefits of having virtualized its critical healthcare and database applications. Richard Trower, senior programmer and database administrator, is able to deliver non-stop business operations serving more than 500 physicians, nurses and healthcare professionals who rely upon the system’s availability to fulfill San Gorgonio’s mission to “provide safe, high quality, personalized healthcare services.”

When San Gorgonio migrated its data center to a new facility, Trower took the opportunity to virtualize his infrastructure. He took all the physical servers offline, then virtualized and restored everything on virtual machines, with all of the hospital’s primary servers now running on VMware and SANsymphony-V. The storage hypervisor-powered SANs support the hospital’s mission-critical databases: McKesson-Paragon, McKesson HPF and all of the supporting virtual servers. The McKesson-Paragon database runs on a high-performance RAID-10 array, while the others are on RAID-5. According to Trower, with his DataCore™-powered RAID-10, tasks like defragmentation of indexes, which previously took him an hour and a half, can be completed in about three minutes.

“I cannot stress how absolutely paramount uptime and speed are in healthcare,” stated Trower. “When systems go down, our ability to deliver services to our patients is affected, which is unacceptable. By virtualizing San Gorgonio’s infrastructure with VMware and DataCore, we can now assure non-stop performance and availability to the healthcare professionals who rely on IT to provide proper care and treatment for our patients.”

Trower further explains that in moving to a software-defined data center environment, he was not only able to reduce the infrastructure’s physical footprint by half, but also gained greater freedom of choice and flexibility in future hardware investments. “SANsymphony-V is the backbone of our storage infrastructure,” adds Trower. “Virtualization has allowed us to increase efficiencies, performance and availability, which not only makes our faculty happy, but also our business office. This freedom of choice allows us to make wiser purchasing decisions and gain truly maximum ROI from our hardware investments.”

Great Plains Communications Powers Its Applications with DataCore Storage Virtualization and Fusion-io to Achieve Unmatched Performance and No Downtime

Telecommunications and internet service provider Great Plains Communications is enhancing the performance and resiliency of virtualized corporate applications using VMware and DataCore’s SANsymphony-V. The company’s software-defined data center is split between two facilities, located 10 miles apart. Their multi-tiered storage infrastructure incorporates cutting-edge, high-performance flash drives from Fusion-io, along with a variety of other SAS and SATA drive devices. Corporate applications such as Oracle, Microsoft Exchange, Lync and SQL Server are powered by VMware vSphere, and user mailboxes are hosted by LinuxMagic MagicMail.
Chris Jones, information technology manager, shares that Great Plains Communications’ infrastructure is 90 to 95 percent virtualized. “Our databases, applications, web servers and email servers are a critical part of our business and we rely upon them to provide services to our tens of thousands of customers,” explains Jones. “Our ability to deliver on-demand internet, emails and unified communications to not only our employees, but more importantly our customers, is positively critical. In this industry, there is plenty of competition and service downtime and outages are the fast track to lost customers.”

“In today’s competitive service provider market,” continues Jones, “response times and always-on availability are king. Having a storage environment powered by the SANsymphony-V Storage Hypervisor and Fusion-io has had an enormous impact on performance, and our business-critical applications respond at a simply blinding speed. I feel we have the fastest I/O performance available on the market, and running various reports has gone from taking several minutes to mere seconds.”
Thorntons Accelerates Microsoft SQL, Exchange; Reduces 10-Hour Backup Times to 3.5 Hours; Continuous Availability Keeps Businesses Moving and Drives Sales

Download the full case study:

DataCore customer Thorntons operates 167 gasoline and convenience retail stores, car washes and travel plazas in Kentucky, Illinois, Ohio, Tennessee, and Florida. Thorntons dove into server virtualization in 2007 and has now virtualized more than 90 percent of the company’s systems. Senior Network Engineer, James Haverstock, reports that Thorntons has virtualized all of their production SQL servers and data. Six production SQL VMs maintain around 150 SQL databases consuming several terabytes (TB) of storage. Their Exchange server also runs on virtual machines, all powered by the SANsymphony-V virtualized storage platform and hypervisor.

“We have many Tier-1 applications virtualized through VMware and DataCore,” said Haverstock. “Of particular importance to us were the uptime and performance of Exchange and SQL. Once we made the move to a SANsymphony-V powered storage environment, our data and system response improved dramatically. Needless to say, our user community, which depends upon the availability of these applications, was very pleased.”

Bottom-line: The deployment of SANsymphony-V significantly boosted overall systems performance. SANsymphony’s RAM cache vastly decreased latency for all of Thorntons’ storage, resulting not only in faster application performance but also much faster backups. A monthly profit/loss report backup previously required about 10 hours to run. “We’ve got it down to about three and a half now,” explained Haverstock. “Overall backup times, whether for VMs, SQL databases or full system, were cut nearly in half.”

Learn More:
For more on virtualizing and running I/O intensive Tier-1 applications, go to: