
Friday, 20 January 2017

DABCC: Hyper-converged Storage Podcast with Doug Brown and Sushant Rao

In episode 268, Douglas Brown interviews Sushant Rao, Sr. Director of Product & Solutions Marketing at DataCore Software. Sushant and Douglas discuss DataCore Software’s hyper-converged storage solution. Sushant is another deep technical expert from DataCore and does a great job diving deep into hyper-convergence, DataCore, storage, parallel processing and much more! This is a very technical deep dive from one of the industry's true experts.

This is the 4th in a series of podcasts with DataCore Software; if you missed the previous episodes, look no further.

Tuesday, 17 January 2017

Predictions - Parallel Processing Software Will be a 'Productivity Disrupter' and Game Changer in 2017


VMblog 2017 Virtualization and Cloud Prediction 
Contributed by George Teixeira, President and CEO, DataCore Software
Despite all of the incredible technology advancements that have occurred, an enormous amount of computing power still sits idle. In 2017, the time is right for parallel processing software to go mainstream and unleash the immense processing power of today's multicore systems, positively disrupting the economics and productivity of what computing can do and where it can be applied.
New software innovations will make 2017 a breakout year for parallel processing. The key is that the software must become simple to use and non-disruptive to applications, so it can move from specialized use cases to general application usage. The impact will be massive: application performance, enterprise workloads and consolidation densities on virtual platforms and in cloud computing, long stifled by the growing gap between compute and I/O, will no longer be held back. New parallel I/O software technologies are now available that are easy to use, require no changes to applications and fully leverage the power of multicore processors to dramatically increase productivity and overcome the I/O bottleneck that has been holding back our industry; this is the catalyst of change.
Parallel processing software can now go beyond the realm of specialized uses such as HPC and areas like genomics that have focused primarily on computation, and impact the broader world of applications that require real-time responses and interactions. This includes mainstream applications and storage that drive business transactions, cloud computing, databases, data analytics, as well as the interactive worlds of machine learning and the Internet of Things (IoT).
The real driver of change is the economic and productivity disruption. Today, many new applications such as analytics are not practical because they require hundreds if not thousands of servers to get the job done; yet each server is becoming capable of supporting hundreds of multi-threading compute cores, most of which sit idle, waiting for work to do. We are ushering in an era where one server will do the work of 10, or even 100, servers of the past. This will be the result of parallel processing software that unlocks the full utilization of multicores, leading to a revolution in productivity and making a new world of applications affordable to mainstream IT in 2017.
The Impact on Real-time Analytics and Big Data Performance will be Profound
The combination of faster response times and the multiplying impact on productivity through parallelization will fuel the next step forward in 'real-time' analytics, big data and database performance. DataCore sees this taking shape in 2017. Our background in parallel processing, real-time I/O and software-defined storage makes the company uniquely well positioned for the next big challenge: a world that requires far more interactions and transactions, at a far faster pace and with much faster response times.
The ability to do more work by doing it in parallel, and to react quickly, is the key. DataCore sees itself as helping to drive the step-function change needed to make real-time analytics and big data performance practical and affordable. The implications for productivity and business decision-making based on insights from data in areas such as financial services, banking, retail, fraud detection, healthcare and genomics, as well as machine learning and Internet of Things applications, will be profound.
The Microsoft Impact Arrives: Azure Stack, Hybrid Cloud, Windows and SQL Server 2016
The success and growth of Microsoft's Azure Cloud has already become evident; however, the real impact lies in the larger strategy of how Microsoft has worked to reconcile the worlds of on-premises and cloud computing. Microsoft was one of the first cloud vendors to recognize that the world is not just public clouds, but will continue to be a mix of on-premises and cloud. Microsoft's Azure Stack continues to advance in making it seamless to get the benefits of cloud-like computing whether in the public cloud or within a private cloud, and it has become the model for hybrid cloud computing. Likewise, Microsoft continues to integrate its Windows and server solutions to work more seamlessly with cloud capabilities.
While Windows and Azure get most of the attention, one of the most dramatic changes at Microsoft has been how it has reinvented and transformed its database offerings into a true big data and analytics platform for the future. It is time to take another look at SQL Server 2016; it is far more powerful and capable, and now deals with all types of data. As a platform, it is primed to work with Microsoft's large ecosystem of marketplace partners, including DataCore with its parallel processing innovations, to redefine what is possible in the enterprise, in the cloud, and in big data performance and real-time analytics use cases, from traditional business applications to emerging use cases in machine learning, cognitive computing and the Internet of Things.
Storage has Transformed; It's Servers + Software-Defined Infrastructure!
We are in the midst of an inevitable and accelerating trend in which servers are defining what storage is. Escalating this trend, DataCore has used parallel I/O software to power off-the-shelf multicore servers that drive the world's fastest storage systems in terms of performance, latency and price-performance. Traditional storage systems can no longer keep up and are on the decline; as a result, they are increasingly being replaced by commodity servers and software-defined infrastructure solutions that leverage their power to solve the growing data storage problem. The storage function and its associated data services are now driven by software, becoming another "application workload" running on these cost-efficient server platforms, and this wave of flexible server-based storage systems is already having a disruptive impact on the industry.
Marketed as server SANs, virtual SANs, web-scale, scale-out and hyper-converged systems, they are collections of standard off-the-shelf servers, flash cards and disk drives, but it is the software that truly defines their value and differentiation. Storage has become a server game. Parallel processing software, and the ability to leverage multicore server technology, is the major game-changer. In combination with software-defined infrastructure, it will lead to a productivity revolution and further solidify "servers as the new storage." For additional information, see the following report: http://wikibon.com/server-san-readies-for-enterprise-and-cloud-domination/
What's Beyond Flash?
Remember when flash was the next big thing? Now it's here. What is the next step; how do we go faster and do more with less? The answer is obvious: if flash is here and yet performance and productivity are still an issue for many enterprise applications, especially database workloads, then we need to parallelize the I/O processing. Why? Many compute engines working in parallel can process and remove bottlenecks and queuing delays higher up in the stack, near the application, so as much device-level I/O as possible is avoided and performance and response times go far beyond any single device-level optimization that flash/SSD alone can deliver. The power of the 'many' far exceeds what only 'one' can do: combining flash and parallel I/O enables users to run more applications faster, do more work and open up applications and use cases that were previously impossible.
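As a rough illustration of the 'many versus one' point, here is a generic sketch in Python (not DataCore's implementation; the request count, per-request latency and worker count are invented for the example) comparing draining an I/O queue serially with draining it across a pool of workers:

# Illustrative only: a generic sketch of serial vs. parallel I/O dispatch,
# not DataCore's implementation; request count, latency and worker count are invented.
from concurrent.futures import ThreadPoolExecutor
import time

def handle_io(request_id):
    # Simulate one small I/O request taking roughly 1 ms at the device.
    time.sleep(0.001)
    return request_id

requests = range(1000)

# Serial: one engine drains the queue, so total time is about N x per-request latency.
start = time.perf_counter()
for r in requests:
    handle_io(r)
serial_seconds = time.perf_counter() - start

# Parallel: many engines drain the queue concurrently, collapsing queuing delays.
start = time.perf_counter()
with ThreadPoolExecutor(max_workers=16) as pool:
    list(pool.map(handle_io, requests))
parallel_seconds = time.perf_counter() - start

print(f"serial: {serial_seconds:.2f}s  parallel: {parallel_seconds:.2f}s")

On a multicore machine the parallel run finishes many times faster; that is the same leverage the article describes applying to storage I/O ahead of the device layer.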
Going Beyond Hyper-Convergence: Hyper-Productivity is the Real Objective
As 2017 progresses, hyper-converged software will continue to grow in popularity, but to cement its success, users will need to be able to take full advantage of its productivity promise. The incredible power of parallel processing software will let users get the most out of what their hardware and software can do (see this video from ESG as an example).
Hyper-converged systems today are in essence servers plus a software-defined infrastructure, but they are often severely restricted in terms of performance and use cases, and too often lack the flexibility and integration path needed within the larger IT environment (for instance, not supporting Fibre Channel, which is often key to enterprise and database connectivity). Powerful software-defined storage technologies that can do parallel I/O effectively provide a higher level of flexibility and leverage the power of multicore servers, so fewer nodes are needed to get the work done, making them more cost-effective. Likewise, the software can incorporate existing flash and disk storage without creating additional silos; migrate and manage data across the entire storage infrastructure; and effectively utilize data stored in the cloud.
Data infrastructures, including hyper-converged systems, can all benefit from these advances: parallel I/O software can dramatically increase their productivity by tapping into the power that lies unused within standard multicore servers. While hyper-converged has become the buzzword of the day, let's remember that the real objective is to achieve the most productivity at the lowest cost; better utilization of one's storage and servers to drive applications is therefore the key.
The Next Giant Leap Forward - Leveraging the Multiplier Impact of Parallel Processing on Productivity
This combination of powerful software and servers will drive greater functionality, more automation, and comprehensive services to productively manage and store data across the entire data infrastructure. It will lead to a new era where the benefits of multicore parallel processing can be applied universally. These advances (which are already before us) are key to solving the problems caused by slow I/O and inadequate response times that have been responsible for holding back application workload performance and cost savings from consolidation. The advances in multicore processing, parallel processing software and software-defined infrastructure, collectively, are fundamental to achieving the next giant leap forward in business productivity.

Thursday, 12 January 2017

Channel EMEA: DataCore Software Appoints Carrie Reber Vice President of Worldwide Marketing


Industry Veteran Joins Executive Team to Expand Core Business and Capitalise on Parallel Processing Potential
DataCore Software, a leading provider of Hyper-converged Virtual SAN, Software-Defined Storage and Parallel I/O Processing Software, today announced the appointment of Carrie Reber as vice president of worldwide marketing. Reber will drive the company’s sales and channel initiatives to capitalise on the compelling productivity and performance benefits of parallel processing software. She is a well-known technology marketing and communications veteran with extensive experience building and leading marketing teams, developing and managing strategic positioning and communications plans, and raising industry visibility for innovative B-to-B software companies.
With more than 20 years of experience in the technology industry, Reber most recently served as vice president, international marketing at Datto. Prior to that, she worked in senior positions at Infinio, and built the marketing organisation and go-to-market initiatives that supported Veeam’s rapid growth from startup to $100 million in revenue. Earlier, she served as director of product public relations and analyst relations at Quest Software, and worked in various marketing capacities for companies including Aelita Software, Legent Corp. and CompuServe.
As DataCore’s revolutionary Parallel I/O technology continues to drive record performance and price-performance for its DataCore™ Hyper-converged Virtual SAN and SANsymphony™ Software-Defined Storage solutions, the company has experienced double-digit growth in key regions in the Americas, EMEA and APAC. A key focus of Reber’s will be to continue to build DataCore’s global brand recognition while spotlighting the advantages and value proposition of Parallel I/O, which unleashes the power and productivity of parallel processing to drive storage performance and record response time for database, analytics, real-time and transactional processing applications.
“I am delighted that someone of Carrie’s caliber and experience is joining our team,” said George Teixeira, president and CEO of DataCore. “As a company with more than 10,000 successful customer deployments, we need to promote our achievements and reach out to new customers to grow our core business. Strategically, we need to capitalise on our breakthrough parallel processing technology and exploit the new opportunities it will open in 2017. Carrie’s strong execution and management will be instrumental in helping DataCore make these strategic initiatives successful.”

Tuesday, 10 January 2017

AccelStor's NeoSapphire All-Flash Arrays Now Certified DataCore Ready

AccelStor, the software-defined all-flash array provider, is proud to announce that its NeoSapphire 3605 and NeoSapphire 3611 have passed DataCore Ready certification. These Fibre Channel all-flash arrays achieved remarkable results under DataCore's software-defined storage platform, SANsymphony. Users now have even better options for storage virtualization under this new partnership.
AccelStor's NeoSapphire 3605 and NeoSapphire 3611 performed excellently, positioning them as tier 1 storage in the DataCore environment. In DataCore's testing with SANsymphony, powered by DataCore™ Parallel-I/O technology, these all-flash arrays performed well even under severe stress, achieving I/O response times of less than 1 ms while sustaining very high IOPS. Moreover, AccelStor's FlexiRemap technology performs well in real-world DataCore deployments, improving the overall user experience of the system.
The NeoSapphire series proved to have far superior performance compared to its competition, especially in the most difficult scenario: 4 KB random write testing. The marriage between AccelStor and DataCore can take IT infrastructure performance to the next level. DataCore SANsymphony and DataCore Hyper-converged Virtual SAN software are used at over 10,000 customer sites. The advantage of storage virtualization software is its considerable latitude in supporting various applications and multiple types of storage, easily streamlining systems that use flash storage as well as legacy disks. It works by organizing all available storage pools into tiers from 1 to 15. As data is accessed, frequently-used data gradually moves to the fastest tier, tier 1, where NeoSapphire arrays can enhance the performance of the whole system.
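To make the tiering idea concrete, here is a toy Python sketch of access-frequency-based promotion across tiers 1 through 15; the thresholds and promotion policy are invented for illustration and are not DataCore's actual algorithm:

# Toy model of auto-tiering: hot blocks drift toward tier 1, cold blocks toward tier 15.
# The policy and thresholds below are invented for illustration only.
from collections import defaultdict

FASTEST_TIER, SLOWEST_TIER = 1, 15

class AutoTieringPool:
    def __init__(self):
        self.tier = {}                        # block id -> current tier
        self.access_count = defaultdict(int)  # block id -> accesses this period

    def record_access(self, block_id):
        self.access_count[block_id] += 1
        self.tier.setdefault(block_id, SLOWEST_TIER)

    def rebalance(self, hot_threshold=100):
        # Periodically promote frequently accessed blocks and demote idle ones.
        for block_id, current in self.tier.items():
            if self.access_count[block_id] >= hot_threshold:
                self.tier[block_id] = max(FASTEST_TIER, current - 1)   # promote
            elif self.access_count[block_id] == 0:
                self.tier[block_id] = min(SLOWEST_TIER, current + 1)   # demote
        self.access_count.clear()

pool = AutoTieringPool()
for _ in range(250):
    pool.record_access("db-index-block")   # hot data, accessed constantly
pool.record_access("report-archive")       # touched once, then idle
pool.rebalance()
print(pool.tier)   # the hot block has moved one tier closer to tier 1

In a real deployment the fastest tier would sit on the all-flash arrays, so the most frequently accessed data ends up served at flash latencies while colder data remains on slower, cheaper devices.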
The NeoSapphire 3605 and 3611 all-flash arrays use Fibre Channel interfaces to minimize latency for Storage Area Networks (SANs). Compatible with 16G Fibre Channel, they can also integrate conveniently with existing SANs via 8G backwards compatibility. Multipathing features provide redundancy for I/O load balancing, failover, and seamless fault recovery. These features not only boost IT staff productivity and make deployment easier, but place AccelStor's all-flash array solutions in a strong position for storage virtualization applications.
"AccelStor is pleased to be a strategic technology partner with DataCore," said Alan Lin, Senior Manager of AccelStor, "we're thrilled to have our products as the infrastructural backbone for critical systems using DataCore's SANsymphony environment. Both AccelStor and DataCore share the same vision of providing our customers with leading technology and product enhancements to satisfy their every need."
"Our DataCore Ready Program identifies first-class storage solutions that are verified to enhance DataCore software-defined storage infrastructures. Today, we are delighted to pass the DataCore Ready Certification to a leading all-flash array provider like AccelStor. The intensive testings proofed that the combination of AccelStor's NeoSapphire all-flash arrays and SANsymphony with the unique Parallel-I/O technology play perfectly together, takes software-defined storage to a whole new level and provide extreme performance at ultra-low latency ", said Alexander Best, Director Technical Business Development at DataCore.
For product details and specifications, visit the NeoSapphire 3605 and NeoSapphire 3611 product pages at https://www.accelstor.com/overview.php?id=16 and https://www.accelstor.com/overview.php?id=17.

Monday, 9 January 2017

Major University School of Business Chooses DataCore - Virtualization Review Interview on the Reasons Why

I always enjoy the opportunity to communicate with decision makers to learn what their organization does, the problems it faced, the technology considered to address those problems, what was finally selected and why, and what advice they'd offer to others.
Recently, I had the opportunity to exchange messages with Keith Kunkel, Sr. Information Processing Consultant at the University of Wisconsin-Milwaukee, Lubar School of Business, concerning their choice of a DataCore storage virtualization solution. I wanted to understand a bit more about why they chose DataCore's technology.
Please introduce yourself and your organization.
My name is Keith Kunkel and I'm a Sr. Information Processing Consultant at the University of Wisconsin-Milwaukee, Lubar School of Business. I manage the Lubar data center, which consists of approximately 40 physical and virtual servers. I also manage Lubar's Active Directory, Oracle and SQL installations. In addition, I provide desktop support on an as-needed basis.
The Lubar School of Business has been in operation for over 50 years and offers bachelor's, master's and doctoral degrees. We have over 100 faculty and staff with a student population of approximately 4,200. Lubar also houses an SAP University Competence Center (UCC), one of only six UCCs in the world. The center provides hosting services and technical support to universities that participate in SAP's University Alliance.
What problem did you need to solve?
I was looking for a way to reduce our data center CAPEX and OPEX. Many of our physical servers were at or very close to end of life, and our technology budget couldn't handle replacing all of them in the same budget year. The other issue was trying to stay current with the firmware and driver updates for all of these servers.
We have several servers that are relatively new with significant internal storage capacity that I wanted to take advantage of. I also was looking for a way to maximize the utilization of our existing storage devices. I have 2 NAS devices and 2 SAN devices that were no longer being used for their original purposes but had a couple of years left before end of life.
Tell me more about the reduction in CAPEX and OPEX.
First, I want to reiterate that this is just for the Lubar School of Business and not the entire university. The CAPEX reduction comes from not having to purchase as many physical servers to support Lubar's mission.
As an example, our average physical server costs $7,000 - $9,000, whereas the last server that I purchased as part of this project cost approximately $16,000 - $17,000. That server will handle 3 - 7 virtual servers depending on the resource requirements of each virtual server. If I take an average of 5 virtual servers that would each have required a physical box at an average cost of $8,000, that's $40,000 in hardware, so the single $16,000 - $17,000 host saves me roughly $23,000 in hardware costs.
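Spelled out as a quick back-of-the-envelope calculation (using midpoints of the ranges above, so all figures are approximate):

# Rough check of the consolidation savings described above; all figures approximate.
avg_physical_server_cost = 8_000      # midpoint of the $7,000 - $9,000 range
consolidated_server_cost = 16_500     # midpoint of the $16,000 - $17,000 range
vms_per_host = 5                      # average of the 3 - 7 virtual servers per host

cost_without_consolidation = vms_per_host * avg_physical_server_cost   # $40,000
savings = cost_without_consolidation - consolidated_server_cost        # $23,500
print(f"Approximate hardware savings: ${savings:,}")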
There also is the cost of the operating systems. I can buy one copy of the current Windows Server Datacenter edition and install as many virtual servers as the host server's resources will allow, whereas each physical server requires its own license. I haven't mentioned the cost of the SAN and NAS storage because those costs were already absorbed by their original projects.
The OPEX savings are in the utility costs because I have fewer servers that need cooling and electricity. Also, the fewer physical servers I have to maintain, the less time I spend updating drivers and firmware, so I have more time to spend on other priorities.
What products did you consider before making a selection?
I had looked at Atlantis, Maxta, SIOS, and StarWind.
Tell me more about the testing you conducted during your selection process.
I didn't test any of the products, including DataCore's SANsymphony. The main reason is that most of the products I evaluated wouldn't support the configuration that I wanted to test. I'm using Microsoft's Hyper-V as the hypervisor and a number of the products didn't support it. One of the products only supported internal storage, so there wasn't any point in pursuing that option any further.
Several pieces of hardware that are being used for this project weren't available at the time that I made the purchase, which is why I didn't test DataCore's product. There were a number of physical servers that needed to be retired due to the age of their operating systems and the hardware was used as an intermediate solution until I could get the virtual servers configured.
Why did you select this product?
DataCore was the only product that had the features I was looking for. I'm using Hyper-V for our hypervisor and several of the products that I reviewed didn't support it at that time. The remainder had other limiting factors such as working with internal storage only.
Other vendors sold their products as a package, so you got hardware along with the management software. I have a primary hardware vendor, and while some of the vendors would use the same hardware vendor, I didn't need new hardware for this project. In addition, many of the configurations incorporated significantly more storage capacity than I needed. DataCore's licensing process allowed me to tailor the software license to match my storage needs.
What were the features that DataCore had that the other products didn't? 
DataCore allows me to use internal and external storage. This feature was one of the primary selling points. As I mentioned earlier, the servers for this project have significant internal storage (~1 TB per server), and I have a pair of SANs and a pair of NAS devices with a combined capacity of over 12 TB, for a total of over 16 TB of storage. This amount is more than adequate for current and future storage requirements based on current growth statistics.
SANsymphony is hardware agnostic and not bundled into a package. I can use my existing servers instead of having to purchase additional hardware in order to take advantage of the hyper-convergence software.
I can fine-tune compute and storage resources to meet our data center needs. At the time of my evaluation, many of the vendors had preconfigured solutions in terms of compute and storage resources.
What tangible benefit have you received through the use of this product? 
I am able to maximize the efficiency of my existing storage. DataCore allows me to incorporate my existing servers' internal storage with the external NAS and SAN storage into a combined storage pool. The software manages the pool so I don't have to be concerned with how much storage is available on a specific device.
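As a generic illustration of what a combined pool looks like from the administrator's side (the device names and capacities below are made up, and this is not DataCore's implementation), the point is that allocation decisions are made against the aggregate, not against any one device:

# Made-up sketch of pooling capacity across internal and external devices.
class StoragePool:
    def __init__(self, devices):
        self.devices = dict(devices)   # device name -> free capacity in GB

    @property
    def free_capacity(self):
        # The administrator sees one aggregate number, not per-device capacities.
        return sum(self.devices.values())

    def allocate(self, size_gb):
        # Carve a virtual volume out of whichever devices have room.
        if size_gb > self.free_capacity:
            raise ValueError("pool exhausted")
        placement = {}
        for name, free in self.devices.items():
            take = min(free, size_gb)
            if take:
                placement[name] = take
                self.devices[name] -= take
                size_gb -= take
            if size_gb == 0:
                break
        return placement

pool = StoragePool({"server-internal": 1_000, "san-1": 4_000, "nas-1": 2_000})
print(pool.free_capacity)      # 7000 GB presented as a single pool
print(pool.allocate(1_500))    # the volume spans devices transparently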
What advice would you offer others who are facing similar circumstances? 
I would highly recommend DataCore as a solution. The packages offered by other vendors have the advantage of a single vendor for hardware and software, but based on my research, there's a premium you will pay for that. Even if I had to purchase new hardware, I feel I am further ahead because I'm able to tailor the amount of compute and storage resources I need.
Dan's Take: Flexibility Is the Key 
I've had the opportunity to speak with a number of other DataCore customers and have heard very similar stories from all of them.
They all tell me that they were trying to support complex virtualized computing environments and wanted to deploy a virtualized or software-defined storage environment in support of that computing environment. I'm often told that DataCore's technology simplified the environment and made it far easier to manage, more reliable and even more flexible.
This story, like the others, underscores my belief that virtual processing solutions should be part of an entire virtualized environment in order to allow the organization the most flexibility. The ideal environment should include access, application, processing, storage and networking virtualization, along with the management and security tools that support them.