There weren’t any telescopes on Earth 65 million years ago when the six-mile-wide asteroid that ended the reign of the dinosaurs approached. Even if there had been, there’s not much to be done about a gazillion tons of rock moving faster than a speeding bullet.
Big Data – ultra-large-scale data storage and analysis – is the data center equivalent of that big rock, but it’s not arriving without warning and it doesn’t have to be an extinction-level event for IT professionals. On the contrary, it offers a unique opportunity to re-architect your storage management infrastructure in a way that makes it more adaptable in every respect and more easily aligned to business needs.
As profiled in the New York Times last Sunday, Big Data is transforming business, government, and education. One researcher reported that a study of 179 large companies showed that those adopting the data-driven decision-making that Big Data makes possible “achieved productivity gains that were 5 percent to 6 percent higher than other factors could explain.”
This shows that Big Data is more than just big. It’s restless, too, best used when hot. Let it cool off and you lose the situational awareness that can lead to big-time financial rewards. It’s not just a matter of storing a gazillion bytes – you can’t possibly store it all, so your retention policies have to change, and the need to share it widely and as quickly as possible means your networking strategies have to change as well.
Fortunately, there’s a fundamental storage technology that can be a big help in adapting to Big Data: the storage hypervisor. Even better, the benefits of this software layer, which insulates you from all the hardware variables that Big Data can throw your way, kick in long before Big Data arrives. A storage hypervisor is an agent of change: you get the pay-off today and a future-proof storage infrastructure.
A storage hypervisor enables you to pool resources, automatically allocate space and direct traffic to the optimal tier, cache data near applications for higher performance, and manage it all centrally.
Resource pooling has the most immediate impact, because you can aggregate all of your storage capacity, without regard for brand, model, or interface, and easily reclaim unused space. Looking forward, this capability is key to integrating on-premises storage with cloud storage – a necessity to keep from getting squashed by Big Data.
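To make pooling concrete, here is a minimal sketch in Python of the principle at work: capacity from dissimilar devices is aggregated and allocated as one logical resource. The device names, sizes, and first-fit placement policy are illustrative assumptions, not DataCore’s actual implementation.

```python
# Minimal pooling sketch: heterogeneous devices are aggregated into one
# logical capacity pool, regardless of brand, model, or interface.
# Device names and sizes are hypothetical, for illustration only.

class Device:
    def __init__(self, name, capacity_gb):
        self.name = name
        self.capacity_gb = capacity_gb
        self.used_gb = 0

    def free_gb(self):
        return self.capacity_gb - self.used_gb

class StoragePool:
    """Aggregates devices and serves allocations from whichever has room."""
    def __init__(self, devices):
        self.devices = devices

    def total_free_gb(self):
        return sum(d.free_gb() for d in self.devices)

    def allocate(self, size_gb):
        # First-fit placement; real products use far smarter policies.
        for d in self.devices:
            if d.free_gb() >= size_gb:
                d.used_gb += size_gb
                return d.name
        raise RuntimeError("pool exhausted")

pool = StoragePool([Device("brand-A-array", 2000),
                    Device("brand-B-filer", 1500),
                    Device("white-box-JBOD", 4000)])
print(pool.allocate(500), "| free:", pool.total_free_gb(), "GB")
```

The point is that the caller asks the pool, not a particular box, for space; unused capacity anywhere in the pool becomes reclaimable everywhere.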
The automation offered by a storage hypervisor gives you just-in-time storage allocation for highly efficient use of disk space, and the ability to dynamically direct workloads to the right storage resource (auto-tiering), based on either access frequency or business rules, so that the hottest data gets the most attention. With auto-tiering and bridging capabilities like Cloud gateways, the right storage resource includes not only conventional disk devices but also solid state disks or flash memory devices for performance, and Cloud storage providers for virtually unlimited capacity. This makes it easy to balance data value and the need for speed against price/capacity constraints, something that Big Data is going to make ever more necessary.
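To illustrate the auto-tiering idea, here is a small sketch that maps access frequency to a storage tier. The tier names and thresholds are hypothetical assumptions; a real policy engine would also weigh business rules and cost:

```python
# Illustrative auto-tiering sketch: blocks are promoted or demoted
# across tiers based on how often they are accessed, so the hottest
# data lands on the fastest media. Thresholds are made up for the demo.

def choose_tier(accesses_per_day):
    """Map access frequency to a tier name (hypothetical cut-offs)."""
    if accesses_per_day > 1000:
        return "ssd"             # hottest data: flash/SSD
    if accesses_per_day > 100:
        return "fast_disk"       # warm data: high-performance arrays
    if accesses_per_day > 1:
        return "capacity_disk"   # cool data: cheap bulk disk
    return "cloud"               # cold data: near-limitless cloud capacity

def retier(block_stats):
    """block_stats: {block_id: accesses_per_day} -> migration plan."""
    return {blk: choose_tier(freq) for blk, freq in block_stats.items()}

print(retier({"blk-001": 5000, "blk-002": 40, "blk-003": 0.1}))
# {'blk-001': 'ssd', 'blk-002': 'capacity_disk', 'blk-003': 'cloud'}
```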
A storage hypervisor can also cache data in main memory for rapid retrieval and fast updates. This turbocharges native disk array performance, especially if it’s combined with self-tuning algorithms. The result is that even off-premises storage can look local – again, a big “win” for Big Data.
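As a sketch of the caching idea, the snippet below keeps recently read blocks in main memory with a simple LRU eviction policy; a production storage hypervisor adds write caching and self-tuning on top of this basic mechanism:

```python
# Minimal read-cache sketch: recently used blocks are served from RAM,
# so only cache misses touch the (much slower) backing disk array.
from collections import OrderedDict

class BlockCache:
    def __init__(self, capacity_blocks, backing_read):
        self.capacity = capacity_blocks
        self.backing_read = backing_read   # function: block_id -> bytes
        self.cache = OrderedDict()         # insertion order tracks recency

    def read(self, block_id):
        if block_id in self.cache:
            self.cache.move_to_end(block_id)    # mark most recently used
            return self.cache[block_id]         # fast path: served from RAM
        data = self.backing_read(block_id)      # slow path: hit the array
        self.cache[block_id] = data
        if len(self.cache) > self.capacity:
            self.cache.popitem(last=False)      # evict least recently used
        return data

cache = BlockCache(2, backing_read=lambda b: f"<contents of {b}>".encode())
cache.read("blk-1"); cache.read("blk-2")
cache.read("blk-1")   # second read of blk-1 never touches the disk
```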
Finally, with Big Data, your storage infrastructure is only going to get bigger, so centralized management of all your storage resources is a must-have. It gives you the equivalent of a universal remote for storage, no matter what it is or where it’s located, and is key to managing the mirroring and replication needed for high-availability, disaster-proof storage.
The fallout from Big Data is going to transform business computing at every level, so if you don’t want to end up a data dinosaur, now’s the time to transform your infrastructure with a storage hypervisor. A good place to start is Jon Toigo’s Storage Virtualization for Rock Stars series, beginning with Hitting the Perfect Chord in Storage Efficiency, which gives a good overview of how a storage hypervisor can help you increase engineering, operational, and financial efficiency.
[Photo Source: http://commons.wikimedia.org/wiki/File:Impact_event.jpg]
Wednesday 15 February 2012
DataCore Software Joins Microsoft Partner Solutions Center
http://www.marketwatch.com/story/datacore-software-joins-microsoft-partner-solutions-center-2012-02-14
DataCore Software announced that it has joined the Microsoft Partner Solutions Center (MPSC). Located on the Microsoft campus in Redmond, the MPSC facility brings together world-class technology companies dedicated to the rapid development of secure, end-to-end business solutions for enterprise customers and partners.
This membership further strengthens DataCore's relationship with Microsoft, following the company's recent joining of the Microsoft System Center Alliance and the introduction of the SANsymphony-V Monitoring Pack for System Center Operations Manager 2008 R2, which provides constant visibility into the health, performance, and availability of virtualized storage resources from within System Center.
"We welcome DataCore to the MPSC and our partner ecosystem," said David Hayes, Senior Director, Microsoft Partner Solutions Center. "As with all of our partners in the MPSC, DataCore brings a unique offering that helps address various infrastructure challenges our customers face today. With DataCore's storage hypervisor, customers can now better monitor and manage their dynamic storage environments, while allowing for increased availability, speed, and utilization."
Friday 10 February 2012
All Virtual is Not Enough; Managing Physical, Virtual, Old and New is Real World
Yes, we think the “all or nothing” proposition offered by vendors that can’t address both the virtual and physical world is a mistake, and an expensive one at that. Virtualization and Cloud computing come with an assumption that it is better to replace your existing investments in servers and storage and start over to meet the higher demands for performance and availability. DataCore sees this as a major obstacle and has therefore designed its storage hypervisor to work across existing storage investments, improving and supplementing them with a powerful feature set. Managing both physical and virtual environments, and the mix of old and new platforms and device types, cannot be ignored.
It is interesting to note the large number of new vendors that have jumped on the pure ‘all virtual’ model and designed their solutions solely for the virtual world. They do not manage physical devices, support migration from one device type to another, or support moving back and forth between physical and virtual environments.
These virtualization-only vendors tend to speak about an IT nirvana in which everyone and everything connected to this world is virtual, open and simple, devoid of the messy details of the physical world. Does this sound anything like most IT shops?
Virtualization solutions must work and deliver a unified user experience across both virtual and physical environments. Solutions that can't deal with the physical device world do not work in the real world where flexibility, constant change, and migrations are the norm.
Wednesday 8 February 2012
Kaspersky Labs Selects DataCore Storage Virtualization to Safeguard Ten Petabytes of Storage and Accelerate Microsoft Hyper-V
See Kaspersky Lab and DataCore Case Study:
http://www.datacore.com/Libraries/Case_Study_PDFs/Case_Study_-_Kaspersky_Lab.sflb.ashx
Kaspersky Lab, the world's largest privately-held Internet security and anti-virus company, has chosen DataCore SANsymphony™-V storage hypervisor software to virtualise and protect its build-out of 10 petabytes of disk storage located in two separate Moscow data centres.
Virtualization World Top Story: Kaspersky Lab selects DataCore Storage Hypervisor
http://virtualizationworld365.info/news_full.php?id=21064&title=Kaspersky-Lab-selects-DataCore-Storage-Hypervisor
Kaspersky Lab has chosen DataCore SANsymphony™-V storage hypervisor software. With the SANsymphony-V virtualization layer, Kaspersky Lab has unified the management of its diverse storage systems from IBM, HP and NetApp, added a new level of business continuity safeguards and delivered faster response times by accelerating the performance and flexibility of shared storage in support of its many Microsoft Hyper-V virtual machines.
“It was clearly a strategic decision for us to integrate a storage virtualization platform in order for us to benefit from greater flexibility, higher levels of business continuity and unified management,” notes Alexey Ternovsky, Senior Infrastructure Technology Researcher at Kaspersky Lab. “We evaluated a number of different virtualization approaches and found that DataCore was the perfect fit to meet our demands and help us lower the rising costs for managing and extending storage in the long run. The DataCore storage hypervisor gives us the power to manage huge amounts of data and lets us build our own storage infrastructure with our choice of features and performance characteristics.”
Kaspersky Lab’s decision was primarily driven by SANsymphony-V’s ability to leverage existing storage environments and enhance their enterprise functionality: providing a higher level of availability for business continuity; adding centralized and simplified management over all their storage assets; increasing performance; enabling fast remote-site disaster recovery; and maximizing the utilization of their current investments. Strategically, the flexibility to expand the storage infrastructure on an as-needed basis, independent of vendor and hardware, was a critical requirement.
Over the next year, the two Kaspersky locations plan to virtualize nearly ten petabytes of their data (i.e., ten thousand terabytes, or 10,000,000,000,000,000 bytes) spread over a mix of storage devices. This will enable the company to offer both centralized IT services for internal requests and new services for external customers who rely on Kaspersky Lab’s data center services and solutions. The DataCore SANsymphony-V storage hypervisor software will be installed at the two data centers and will synchronously mirror all the production data currently located on diverse storage systems, each with different characteristics tied to its vendor’s hardware-based platform.
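The rule behind synchronous mirroring is simple: a write is acknowledged to the application only after both copies commit, so either data centre always holds current data. The sketch below shows that rule and nothing more; the site names and stripped-down error handling are illustrative assumptions, not the actual SANsymphony-V mechanism:

```python
# Illustrative synchronous-mirror sketch: the application sees success
# only once both sites have committed the write.

class Site:
    def __init__(self, name):
        self.name = name
        self.blocks = {}

    def write(self, block_id, data):
        self.blocks[block_id] = data   # commit locally
        return True

def mirrored_write(primary, secondary, block_id, data):
    """Acknowledge only after both copies are durable."""
    ok1 = primary.write(block_id, data)
    ok2 = secondary.write(block_id, data)
    if not (ok1 and ok2):
        raise IOError("mirror out of sync; failing the write")
    return "ack"   # the application is unblocked only at this point

dc1, dc2 = Site("datacentre-1"), Site("datacentre-2")
mirrored_write(dc1, dc2, "blk-42", b"payload")
```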
High Performance Shared Storage for Microsoft Hyper-V
In the Kaspersky Lab environment, very high performance is critical, and performance demands will continue to grow alongside the ever-expanding environment of Microsoft Hyper-V virtual servers and desktops. Ternovsky and his team identified SANsymphony-V as the most efficient way to provide shared storage for the overall virtualized infrastructure. As performance needs continue to grow, more memory can be added to boost cache performance, and new server technology can be easily inserted without interruption to further scale performance.
SANsymphony-V helps Kaspersky Lab expand and optimize capacity on an as-needed basis without hardware vendor lock-in or having to acquire huge amounts of unused disk resources. Automated storage capabilities such as transparent auto-failover keep systems up when underlying devices fail. Moreover, powerful “Quick Serve” commands automate the management of storage provisioning. In addition, auto-tiering automates the promotion and demotion of data across different storage devices based on performance and cost criteria. Bottom line: SANsymphony-V greatly automates and simplifies the administration and storage tasks found in Kaspersky Lab’s huge SAN environment.
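In principle, transparent auto-failover means reads are quietly redirected to a surviving mirror copy when the preferred device stops responding, as in the sketch below. The two-copy layout and the exception used to signal a dead device are assumptions for illustration:

```python
# Minimal auto-failover sketch: if the preferred copy fails, the read
# is served from the mirror and the host never notices.

class Copy:
    def __init__(self, blocks, online=True):
        self.blocks = blocks
        self.online = online

    def read(self, block_id):
        if not self.online:
            raise IOError("device offline")
        return self.blocks[block_id]

def read_with_failover(block_id, primary, secondary):
    """Try the preferred copy; transparently fall back to the mirror."""
    try:
        return primary.read(block_id)
    except IOError:
        return secondary.read(block_id)

a = Copy({"blk-42": b"payload"}, online=False)   # simulated device failure
b = Copy({"blk-42": b"payload"})
assert read_with_failover("blk-42", a, b) == b"payload"
```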
“The flexibility, performance and management scalability of the DataCore solution make it an effective shared storage solution for our large number of virtual machines running under Microsoft Hyper-V,” adds Ternovsky. “We now benefit from a better cost infrastructure which allows hardware interchangeability adding a new level of savings and agility to managing our systems and their cost. Our IT administrators have a lot of know-how with Microsoft Windows and virtual machines under Hyper-V, so it was important that SANsymphony-V is not only Microsoft-certified but is intuitive and familiar – with a similar look and feel and many automated features that make storage administration in such a big environment much more practical and easy.”
About Kaspersky Lab
Kaspersky Lab is the world's largest privately-held Internet security and anti-virus company. It delivers some of the world’s most immediate protection against IT security threats, including viruses, spyware, crimeware, hackers, phishing, and spam. The company is ranked among the world’s top four vendors of security solutions for endpoint users. Kaspersky Lab products provide superior detection rates and one of the industry’s fastest outbreak response times for home users, SMBs, large enterprises and the mobile computing environment. Kaspersky® technology is also used worldwide inside the products and services of the industry’s leading IT security solution providers. Learn more at www.kaspersky.com. For the latest on antivirus, anti-spyware, anti-spam and other IT security issues and trends, visit www.securelist.com.
Tuesday 7 February 2012
Did Virtualization Beget a Monster?
As with all disruptive technology changes, virtualization brought tremendous productivity and cost savings gains, but its widespread adoption has also created a new challenge: managing what some have called the “Virtual Machine Sprawl Monster.” The ease of provisioning has dramatically increased the number of virtual machines deployed. As a consequence, even greater performance, availability and administrative management are required to support these sprawling virtual environments. These requirements will only grow as systems (both virtual and physical), platforms and applications continue to proliferate, as they most surely will.
Fundamentally, the main challenge facing IT managers is: “How can I manage and control all this complexity with a tighter budget and fewer skilled resources?” The answer is straightforward: with very smart software, a new mindset focused on architecture versus devices, and the alignment of people and processes to this new reality.
Is the real bottleneck for an IT manager the time required to get things done or the growing infrastructural chaos?
Both the lack of time and the complexity of managing dynamic infrastructures have made the IT manager’s job more difficult. Therefore, the IT manager’s role must evolve from technician to architect. Too many of the current tasks that IT managers must perform or oversee are platform- or vendor-specific or tied to purpose-built hardware devices and, therefore, require specialized training or tools to use properly. Also, legacy systems and new models don’t work well together.
Instead of simply addressing IT as a set of discrete technologies and platforms, the IT manager must create an environment in which hardware components become pooled and interchangeable and can be managed as a whole. This higher-level viewpoint is needed to cost-effectively meet the demanding and dynamic requirements for more performance, higher availability and greater productivity. For this to succeed, smart management software that works infrastructure-wide and new levels of automation are necessities. Automation is one of the things software does very well.
The good news is that the smart software required, such as storage hypervisors, is now available. These products are easy to use, enable hardware interchangeability, automate difficult repetitive tasks and manage resources as pools.
Also, to be cost-effective today, hardware must become an interchangeable component, so that you can go to the open market and get the best price for the hardware you need. Products like VMware and Hyper-V have already had a major impact on server hardware selection, and solutions like DataCore’s storage hypervisor will do the same for storage devices.
Software is where the intelligence lies, and it is the key to better management. DataCore’s storage hypervisor, for instance, offers a comprehensive architecture to address the four main challenges of storage management: meeting performance needs, ensuring data protection and disaster recovery, cost-effectively pooling and tiering storage resources, and optimizing the utilization of infrastructure-wide storage capacity.
How does DataCore address these needs?
The simple answer is that we do this by providing an architecture to manage storage – a storage hypervisor. DataCore’s software is smart and easy to use, delivering powerful automation and hardware interchangeability. It embraces and controls both the virtual and physical worlds and extends the capabilities of both existing and new storage assets.
DataCore’s storage hypervisor pools, auto-tiers and virtualizes existing and new storage devices, including the latest high-performance, premium-priced memory-based storage technologies (flash memory/SSDs) and cost-effective gateways to remotely located Cloud storage. It provides an architecture that manages the many storage price/performance trade-offs while providing a best fit of storage resources to meet the dynamic workloads and applications of the real world. From a performance standpoint, caching software and self-learning algorithms boost performance, often improving response times two- to threefold. Auto-failover and failback software provides the highest levels of availability and continuous access to storage. Auto-tiering manages the I/O traffic to ensure that data is in the right place (SSD, fast storage arrays, capacity storage, Cloud storage) to get the best performance at the lowest possible cost. Automated thin provisioning makes it simple and quick to service application disk needs and fully optimize the overall utilization of storage capacity.
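Thin provisioning, the last of those capabilities, can be sketched as follows: a virtual disk presents its full logical size to the host, but physical capacity is drawn from the shared pool only when a block is first written. The one-gigabyte block granularity and 10:1 over-commit below are hypothetical numbers, not product defaults:

```python
# Illustrative thin-provisioning sketch: capacity is consumed on first
# write, not at provisioning time, so disks can be generously sized.

class ThinDisk:
    def __init__(self, logical_gb, pool):
        self.logical_gb = logical_gb   # size the host sees immediately
        self.pool = pool               # shared physical pool (free GB)
        self.allocated = {}            # logical block -> data (1 GB each)

    def write(self, logical_block, data):
        if logical_block not in self.allocated:
            if self.pool["free_gb"] <= 0:
                raise RuntimeError("physical pool exhausted; expand it")
            self.pool["free_gb"] -= 1  # claim space on first write only
        self.allocated[logical_block] = data

pool = {"free_gb": 100}
disk = ThinDisk(logical_gb=1000, pool=pool)    # 10:1 over-commit
disk.write(0, b"...")
print(disk.logical_gb, "GB presented;", len(disk.allocated), "GB consumed")
```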
From a people and process standpoint, DataCore’s storage hypervisor provides a simple and common way to manage, pool, migrate and tier all storage resources. The software accelerates performance and automates many time-consuming tasks including data protection and disk space provisioning.