Two recent articles got me thinking about the current limitations of tiered storage, and where it’s going. Ellen Messmer at Network World reported on some comments by Gartner analyst Stanley Zaffos about the chaos in the public cloud storage market. Some vendors, such as Amazon and Nirvanix, are gung-ho on public cloud storage, while others, like Iron Mountain and EMC, have pulled out of the business. While the cost comparison looks more than favorable (Gartner’s estimate is about 75 cents to a dollar per gigabyte per month for in-house storage vs. as low as 3 cents for cloud storage), Zaffos rightly points out the many variables IT needs to consider, among them latency, limited bandwidth, and security. In his remarks, he called out a couple of alternatives to public cloud storage, including a hybrid approach mixing public and private storage, in effect making public cloud storage just another tier.
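Those headline rates are easy to sanity-check. Here’s a back-of-the-envelope sketch using the per-gigabyte figures quoted above; the 50 TB working set is a hypothetical I picked for illustration:

```python
# Back-of-the-envelope cost comparison using the rates quoted above.
# The capacity figure is hypothetical.
IN_HOUSE_PER_GB = 0.75   # low end of Gartner's in-house estimate ($/GB/month)
CLOUD_PER_GB = 0.03      # the "as low as" public cloud rate ($/GB/month)

capacity_gb = 50 * 1024  # a hypothetical 50 TB working set

in_house_monthly = capacity_gb * IN_HOUSE_PER_GB
cloud_monthly = capacity_gb * CLOUD_PER_GB

print(f"In-house: ${in_house_monthly:,.0f}/month")   # $38,400
print(f"Cloud:    ${cloud_monthly:,.0f}/month")      # $1,536
print(f"Sticker-price savings: {1 - cloud_monthly / in_house_monthly:.0%}")  # 96%
```

A 96 percent sticker-price saving is exactly why Zaffos’s caveats matter: latency, bandwidth, and security are the costs that don’t show up in the per-gigabyte rate.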
At CTOEdge, Mike Vizard has been talking for some time about the rising importance of data management and storage systems to deal with a world in which “multiple tiers of storage are soon going to be the rule rather than the exception.” In a recent blog post, he discusses building a new tier of storage for what he calls “warm data,” which requires access speeds somewhere between production applications and everything else (e.g., backup, archive, DR). He noted IBM’s new Netezza High Capacity Appliance, which isn’t so much a storage tier as a packaged storage application that gives quicker access to petabyte-sized data warehouses to satisfy compliance requirements.
These are two extremes of the storage spectrum, and a good illustration of some of the drivers behind the evolution of tiered storage, which is all about matching networked storage capabilities to application and business needs (e.g., speed, capacity, availability, security). In an ideal world, every application would have exactly the right kind of storage (i.e., infinite tiering granularity). In the real world, differences among technologies, vendors, protocols, and the like make this impractical; the granularity actually achievable with tiered storage is quite coarse, which means that each tier is a compromise.
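To make the compromise concrete, here’s a deliberately simplified sketch of tier matching. The tier menu and application profiles are all invented for illustration; the point is the “headroom” each placement wastes:

```python
# Illustrative only: with a coarse tier menu, every placement is a compromise.
# Tier attributes and application requirements are invented.
TIERS = {
    "flash": {"latency_ms": 1,   "cost_per_gb": 2.00},
    "sas":   {"latency_ms": 8,   "cost_per_gb": 0.60},
    "sata":  {"latency_ms": 15,  "cost_per_gb": 0.25},
    "cloud": {"latency_ms": 150, "cost_per_gb": 0.03},
}

def place(app_name, max_latency_ms):
    """Pick the cheapest tier that still meets the app's latency need."""
    candidates = [(name, t) for name, t in TIERS.items()
                  if t["latency_ms"] <= max_latency_ms]
    name, tier = min(candidates, key=lambda c: c[1]["cost_per_gb"])
    headroom = max_latency_ms - tier["latency_ms"]
    print(f"{app_name}: {name} ({headroom} ms of unused latency headroom)")

place("OLTP database", 2)   # flash -- nothing cheaper is fast enough
place("file shares", 20)    # sata -- 5 ms of headroom paid for but unused
place("archive", 1000)      # cloud -- the obvious fit
```

With infinite tiering granularity that headroom would always be zero; every tier you don’t have shows up as wasted capability or wasted money.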
Like Mike Vizard, I’m convinced that the evolution of storage is taking us in the direction of more and more tiers to deliver a better match to application requirements. We’re going to see low-cost cloud storage treated as just another tier, for services like archiving and long-term backup. We’re going to see increasingly capable private cloud solutions with an increasing variety of tiers within them. And we’re going to see the “fragmentation” of what today we call “managed services” into an increasingly granular menu of capabilities like high availability, disaster recovery, compliance support, and similar “tiers” of storage-based services.
Storage virtualization software, with its ability to abolish storage silos, is going to be a major part of this evolution. The key will be its adoption by more and more service providers like Host.net and External IT who will use it in their own data centers and their customers’ data centers to provide seamless access to storage that matches application and business needs at a very fine granularity, whether in-house or out in the cloud. The result will be a storage environment in which IT can basically dial in as many tiers as they need, without worrying about the cake collapsing on them.
Photo by Pink Cake Box.
News and events in the UK. Information, commentary and updates on virtualisation, FC SAN, iSCSI, high availability, remote replication, disaster recovery, storage virtualisation and SAN management solutions.
Friday 17 June 2011
Are Storage Costs Driving You Crazy?
Almost thirty years ago novelist and screenwriter Rita Mae Brown defined insanity as “doing the same thing over and over again but expecting different results.” I’m quite certain she didn’t have storage hardware purchasing in mind when she wrote that, but it’s a perfect description of the usual IT strategy in this regard: buy more.
To be fair, I’ll admit that it’s not like IT has a lot of choice. Factors like server and desktop virtualization (Gartner predicts a 600% growth in storage capacity over the next three years due to these technologies alone), increasingly onerous data retention regulatory requirements, and the growth of rich media are driving an ever-increasing demand for storage. But the situation is definitely one in which “business as usual” is simply insane. It’s time for an intervention.
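It’s worth pausing on that Gartner figure, because compounding hides how fast that really is. On the common reading that 600% growth means capacity ends at seven times its starting point, a two-line check gives the annual rate:

```python
# "600% growth over three years" read as ending at 7x the starting capacity
# (on the alternative reading of ending at 6x, the answer is still ~82%/year).
total_multiple = 7.0
annual_rate = total_multiple ** (1 / 3) - 1   # compound annual growth rate
print(f"{annual_rate:.0%} per year")          # ~91% -- capacity nearly doubles yearly
```

Either way you read it, capacity roughly doubles every year, which is why “buy more” can never be more than a holding action.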
To explain, I’m going to turn to Jon Toigo, Chairman of The Data Management Institute LLC, who is a master of the “picture worth a thousand words.”
Even a cursory glance at his infographic shows the primary reasons why storage hardware is driving IT crazy.
First off, storage accounts for between one-third and three-quarters of IT hardware spending.
Second, the annualized total cost of ownership for storage is four to five times the initial acquisition expense. Putting in the boxes is just the start of your adventure: now you have to manage the storage, protect the data (backup, mirroring, and the like), and pay for power and cooling.
Third—and this is definitely crazy-making—once you’ve got all this wonderful storage in place, you find that you can’t use it efficiently: only about 30% of your storage ends up doing anything really useful. The rest is inert, allocated but unused, orphaned, or used to store stuff that’s of no use to your business.
That means you have to buy three times as much storage as you need just to make sure you don’t run out. But that just puts you back in the same insane loop, and you won’t get different results no matter how much storage you throw at the problem.
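Put those two figures together and the arithmetic gets genuinely crazy. A minimal sketch, using a hypothetical purchase price and the ratios from Jon’s infographic:

```python
# Combining the infographic's figures: 4-5x annualized TCO and ~30% of
# capacity doing useful work. The acquisition price is hypothetical.
acquisition = 100_000     # hypothetical array purchase ($)
tco_multiplier = 4.5      # midpoint of the 4-5x annualized TCO figure
useful_fraction = 0.30    # share of capacity doing anything really useful

annual_tco = acquisition * tco_multiplier
cost_per_useful_dollar = annual_tco / (acquisition * useful_fraction)

print(f"Annual TCO: ${annual_tco:,.0f}")                                    # $450,000
print(f"TCO per dollar of USEFUL storage: {cost_per_useful_dollar:.0f}x")   # 15x
```

Fifteen dollars of annual cost for every dollar of storage that’s actually earning its keep. That’s the loop in one number.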
So you need to stage an intervention. And it can’t be a half-way effort, either.
The solution to this craziness is storage virtualization, which can boost your storage efficiency dramatically, make data protection easier, and reduce your storage management burden, all of which will drive down your storage costs as a percentage of IT spend. (You can get more details on storage virtualization and storage efficiency in Jon’s new white paper, Storage Virtualization for Rock Stars.)
But you have to take it all the way. Locking storage virtualization capabilities up in proprietary array controllers leaves you with islands of storage and puts you right back in the inefficiency loop, albeit at a higher level. You’ll be hostage to your storage hardware vendor when it comes to expanding your storage infrastructure, and you’ll find it difficult to eke the maximum useful life out of your arrays as they reach end-of-support.
Better to declare your entire independence from storage craziness with storage virtualization software that’s vendor-agnostic and endures over multiple generations of hardware. That way you can squeeze the maximum efficiency out of your existing storage infrastructure and be free to add new storage from any source, on your schedule and within your budget, while your storage virtualization software continues to handle the details of efficiency, data protection, and management as it always has.
Now that’s sanity.
Graphic Source: Toigo Partners International, LLC
Wednesday 15 June 2011
New IDC White Paper on DataCore ROI/TCO, New Jon Toigo Paper on Storage Efficiency
Check out the Updated Industry Analyst and Expert Opinions Page on the DataCore Website
Recently updated with the latest reports from industry analysts, such as those at Gartner, who are briefed on and cover DataCore Software:
http://www.datacore.com/Testimonials/What-Others-are-Saying/Analyst-Reports.aspx
White Paper on Storage Virtualization Software and Storage Efficiency
Download the latest 'Must Read' white paper on Storage Efficiency by Jon Toigo: STORAGE VIRTUALIZATION FOR ROCK STARS Part 1: Storage Efficiency
It does a great job of surveying the benefits of storage virtualization software and laying out the compelling economics.
IDC White Paper on DataCore: Achieving the Full Business Value of Virtualization with a Scalable Software-Based Storage Virtualization Solution
An excellent paper on the value proposition of virtualization software for storage by IDC’s Rick Villars, the top expert on virtualization and storage virtualization: http://www.datacore.com/Libraries/Document_Library/The_Business_Value_of_Software-based_Storage_Virtualization.sflb.ashx
IDC Opinion & Summary
Today's information-based economy demands that IT managers enhance the business value of existing and planned IT investments while simultaneously reducing the costs of IT operations. Server consolidation and broader adoption of server virtualization are some of the key strategies IT teams are counting on to meet these goals.
Dealing with storage, however, is one of the most critical technical and economic roadblocks to broad adoption of virtualization in many of these organizations. Limits include the up-front direct cost associated with replacing storage with complex networked storage systems, the added operational cost of managing networked storage systems, and the inherent inefficiencies of many of these storage systems.
Storage virtualization software such as DataCore's SANsymphony-V addresses many of these challenges. It allows organizations to make better use of "in place" storage assets while also ensuring that IT organizations can fully achieve a return on their investments in a virtualized server environment. They can achieve these goals by quickly taking advantage of the rapid cost declines and performance increases available in today's standard server platforms. IDC finds that the use of virtualized storage with solutions such as DataCore's makes it possible for companies to:
- Consolidate storage and server assets
- Increase the number of virtualized servers running on individual physical servers while doubling storage utilization rates for installed storage
- Leverage lower-cost/higher-capacity storage tiers that can significantly cut the cost of acquiring new storage assets
- Improve application and information availability while shrinking backup times
- Significantly reduce the cost to meet the performance and business continuity objectives of virtualized IT organizations
Tuesday 14 June 2011
DataCore Software Names Former Citrix Executive as Vice President of Worldwide Marketing, More...
Linda Haury to direct DataCore's strategic marketing efforts
“DataCore is positioned to become a powerhouse in one of the industry’s fastest growing software segments,” said Haury. “Coming out of the desktop and server virtualization space, I can tell you firsthand how essential they are to virtualization projects. Needless to say, I’m eager to commence this exciting role.”
Virtual Strategy Magazine:
http://www.virtual-strategy.com/2011/06/09/datacore-software-names-former-citrix-executive-vice-president-worldwide-marketing
Sandhill.com:
http://www.sandhill.com/news/exec_moves.php?id=569
IT Director:
http://www.it-director.com/business/change/news_release.php?rel=25506
SNSeurope:
http://www.snseurope.com/news_full.php?id=18505&title=DataCore-Software-names-former-Citrix-executive-as-Vice-President-of-Worldwide-Marketing
VirtualizationWorld:
http://www.virtualizationworld365.net/news_full.php?id=18505&title=DataCore-Software-names-former-Citrix-executive-as-Vice-President-of-Worldwide-Marketing
VMblog:
http://vmblog.com/archive/2011/06/09/datacore-software-names-former-citrix-executive-as-vice-president-of-worldwide-marketing.aspx
ChannelEMEA:
http://www.miamiherald.com/2011/06/09/2258385/datacore-software-names-former.html
Big Data: The Time is Now for Managing It and Leveraging the Advantages
http://www.dbta.com/Articles/Editorial/Trends-and-Applications/Big-Data-The-Time-is-Now-for-Managing-It-and-Leveraging-the-Advantages-75883.aspx
Big Data Storage
Leveraging the advantages big data has to offer in terms of global insight and better customer understanding requires smarter data management practices. Consider the storage side of the issue, another matter that makes big data perplexing to many organizations and administrators. A sizable segment of companies with fast-growing data stores in the IOUG survey spend more than one-fourth of their IT budgets on storage requirements.
Data growth is quickly spiraling out of control and drives overspending on data storage. Despite increasing IT budgets, the growing costs associated with storing more data create a data affordability gap that cannot be ignored. The response to this growth has been to continue funding expansion and add hardware. Technology advances have enabled us to gather more data faster than at any other time in our history, which has been beneficial in many ways. Unfortunately, in order to keep pace with data growth, businesses have to provision more storage capacity, which drives up costs.
While there has been a relentless push in recent years to store multiplatform data on ever-larger disk arrays, big data demands moving in a different direction. "In contrast to years past, where information was neatly compartmentalized, big data has become widely distributed, scattered across many sites on different generations of storage devices and equipment brands: some up in the clouds, others down in the basement," George Teixeira, president and CEO of DataCore Software, tells DBTA.
As a result, centralized storage is "both impractical and flawed," Teixeira says. "It's impractical because it's incredibly difficult to lump all that gargantuan data in one neat little bucket. It's flawed because doing so would expose you to catastrophic single points of failure and disruption." To address widely distributed data storage, he recommends approaches such as storage virtualization software, backed by mirror images of data kept updated in at least two different locations. "This allows organizations to harness big data in a manner that reduces operational costs, improves efficiency, and non-disruptively swap hardware in and out as it ages," he says.
CIO update: The Pros and Cons of SSD in the Enterprise
http://www.cioupdate.com/trends/article.php/11047_3933741_2/The-Pros-and-Cons-of-SSD-in-the-Enterprise.htm
Storage virtualization software helps shape the shared storage infrastructure required by virtual IT environments and takes good advantage of SSDs where appropriate, reducing the write duty-cycle by caching upstream of the cards to minimize actual writes to media. This effectively extends the useful life of these premium-priced assets.
"It can also make modest sized SSDs appear to have much larger logical capacity by thin provisioning its resources," said Augie Gonzalez, director of Product Marketing at DataCore Software. "The sage advice is consider SSDs as one element of the physical storage configuration, but put your money on device-independent storage virtualization software to take full advantage of them."
TechNet Radio: IT Time - The “Caring and Feeding” for Shared Storage in Clustered Hyper-V Environments
http://64.4.11.252/en-us/edge/technet-radio-it-time-the-caring-and-feeding-for-shared-storage-in-clustered-hyper-v-environments.aspx?query=1
It’s IT Time, and in today’s episode Blain Barton interviews Augie Gonzalez, Director of Product Marketing at DataCore Software. Tune in for this lively conversation as they discuss how DataCore’s SANsymphony-V can help solve potential storage-related issues in Hyper-V clusters.
'Virtualizing' disparate storage resources
http://searchvirtualstorage.techtarget.com/tip/Virtualizing-disparate-storage-resources
Would you like to get a point-in-time snapshot of your disparate storage resources regardless of which disk array supplier houses your data? Would you like to create a universal storage pool out of your Fibre Channel, SCSI, EIDE, and IBM SSA drives? Would you like to substantially improve performance yields from equipment already on your floor?
According to DataCore Software, these are just a few of the advantages that network storage pools powered by the company's SANsymphony are providing - without shutting down, rewiring, or overloading application servers.
"Currently, anyone who has legacy SCSI disk arrays alongside Fibre Channel and EMC boxes are having to use various, vendor-specific utilities to monitor and control their storage resources," says Augie Gonzalez, director of product marketing for DataCore Software. "With SANsymphony, a central administrator can take overall custody of disparate storage resources in a homogeneous way. No longer do they have to be overwhelmed by each particular disk vendor's administrative nuances."
Gonzalez says SANsymphony's built-in caching algorithms are adding new value to existing storage assets. "Many people have equipment on the floor that doesn't live up to its performance potential," he says. "Now you can come in after the fact and accelerate their I/O response time..."
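Strip away the product specifics and the “universal storage pool” idea is a mapping layer between virtual disks and whatever heterogeneous devices are on the floor. A minimal, vendor-neutral sketch; the device names and sizes are invented:

```python
# Minimal pooling sketch: virtual disks carved out of heterogeneous devices.
# Device names and sizes are invented; real products track far more state.
pool = [
    {"name": "fc-array-0",  "free_gb": 500},
    {"name": "scsi-legacy", "free_gb": 200},
    {"name": "eide-shelf",  "free_gb": 120},
]

def create_virtual_disk(size_gb):
    """Allocate extents from whichever devices have space, old gear included."""
    extents, needed = [], size_gb
    for dev in pool:
        take = min(dev["free_gb"], needed)
        if take:
            dev["free_gb"] -= take
            extents.append((dev["name"], take))
            needed -= take
        if needed == 0:
            return extents
    raise RuntimeError("pool exhausted")

print(create_virtual_disk(600))
# [('fc-array-0', 500), ('scsi-legacy', 100)] -- one disk, two generations of gear
```

Everything else in the pitch, from central administration to after-the-fact caching, hangs off that one indirection.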
Thursday 9 June 2011
DataCore Software Helping New Zealand's Leading Port Company to Build Virtualized Storage Infrastructure
http://virtualization.tmcnet.com/topics/virtualization/articles/183063-datacore-software-helping-new-zealands-leading-port-company.htm
A key player in storage virtualization and iSCSI SAN software, DataCore Software Corporation recently announced the deployment of its DataCore SANsymphony storage virtualization software by Ports of Auckland, New Zealand’s leading port company.
Ports of Auckland will use the software to create a dynamic and resilient virtual storage infrastructure. Deploying the SANsymphony advanced storage virtualization software requires no hardware changes, so the investment is expected to cost-effectively support the port company’s server virtualization environment as well as its business model.
One of the largest in New Zealand, Ports of Auckland supports international trade around the clock with a broad range of cargo handling and logistics services. Although the server infrastructure at Ports of Auckland is extensively virtualized, its aging Storage Area Network (SAN) was disrupting IT processing at critical hours, the company revealed in a press release. An upgrade was difficult, however, because it would have disrupted operations. By deploying DataCore SANsymphony, the IT team is now able to keep applications running while switching out equipment with just a few minutes’ notice, officials at DataCore explained in the release.
According to Peter Thompson, managing director of APAC at DataCore Software, DataCore’s storage virtualization software allows Ports of Auckland to move to a shared pool of storage and realize greater efficiency and business continuity, as well as the ability to react quickly to new and expanding workloads placed on its systems.
“With DataCore, we have seen the benefits first-hand of improved uptime and being able to do operational maintenance without affecting our business. This gives us peace of mind. We’re also getting better utilization out of our storage hardware, and need less of it to achieve greater performance. With DataCore in place, our performance has been nothing short of phenomenal,” lead systems engineer at Ports of Auckland Craig Beetlestone commented in a statement.
The deployment of DataCore SANsymphony storage virtualization software has given Ports of Auckland and the IT team confidence that they are now in control of the company’s infrastructure.
“Looking forward, it’s good knowing that SANsymphony’s advanced storage virtualization features stay in place even as the hardware underneath changes. There’s none of the throw away we’ve experienced in the past when the functional intelligence was buried in the devices,” Beetlestone further explained.
The Ports of Auckland case study and more information are available at: http://www.datacore.com/Testimonials/Ports-of-Auckland.aspx
Ports of Auckland is New Zealand’s leading port company, serving the country’s vital international trade with a broad range of cargo handling and logistics services on a 24x7, 365-day-a-year basis. Ports of Auckland is responsible for running two maritime ports and an inland port in the Auckland region.
In 2010, the ports handled cargo equivalent to 13 percent of the country’s total GDP, twice as much as any other New Zealand port. Its east-coast Auckland seaport accounts for 37 percent of New Zealand’s total container trade. Ports of Auckland connects New Zealand’s importers and exporters with more than 165 international ports in nearly 70 countries.
Ports of Auckland Embarks on Storage Virtualization with DataCore Software to Keep Business Processes Flowing Smoothly and Reduce IT Costs
http://www.virtual-strategy.com/2011/06/02/ports-auckland-embarks-storage-virtualization-datacore-software-keep-business-processes-f
http://virtualization.sys-con.com/node/1858216
Computer Technology Review: Ports of Auckland deploys DataCore Software to keep business processes smooth and reduce IT cost
http://www.wwpi.com/index.php?option=com_content&view=article&id=12833:ports-of-auckland-deploys-datacore-software-to-keep-business-processes-smooth-and-reduce-it-costs-&catid=231:storage-software&Itemid=2701181
DataCenter Journal: Storage Virtualization for Ports of Auckland
http://www.datacenterjournal.com/index.php?option=com_content&view=article&id=4742:storage-virtualization-for-ports-of-auckland&catid=142:the-daily-buzz&Itemid=100366
http://vmblog.com/archive/2011/06/02/ports-of-auckland-embarks-on-storage-virtualization-with-datacore-software-to-keep-business-processes-flowing-smoothly-and-reduce-it-costs.aspx
Tuesday 7 June 2011
DataCore Software's SANsymphony-V Takes Industry by Storm
Solving the "Big Challenges" Stalling Today's Server and Desktop Virtualization Projects, Brings DataCore Widespread Industry Accolades
http://vmblog.com/archive/2011/06/01/datacore-software-s-sansymphony-v-takes-industry-by-storm.aspx
“We implemented DataCore software to protect our VMware environment from glitches in our storage infrastructure,” states Warren D. Nisbett, senior assistant managing director, Management Information Systems Unit, St. Kitts-Nevis-Anguilla National Bank Ltd. “SANsymphony-V automatically replicates our virtual machine images and their data in real-time to our other hot site. Moreover, our system performance has greatly improved. We stand behind our selection of SANsymphony-V because it performs as advertised.”
DataCore Software announced broad praise and widespread momentum for its newest software release, SANsymphony-V. This next-generation solution eliminates storage-related barriers that prevent IT organizations from realizing the financial and operational goals of their virtualization initiatives.
“We have large data storage needs and found the flexibility we were seeking in SANsymphony-V,” states Peter Dobler, assistant vice president, Corporate MIS Dept., Northeast Health. “Moreover, SANsymphony-V’s interface is head and shoulders above what I am used to for storage management. It is very intuitive and easy to use. SANsymphony-V has made our end-to-end virtualization initiatives a reality. We virtualized our core network with Cisco and our servers with both VMware vSphere and Citrix XenServer. SANsymphony-V was the final piece. Storage virtualization with SANsymphony-V rounded the whole thing out and it has worked really well.”
According to George Teixeira, president and CEO of DataCore Software, “The industry accolades and positive customer feedback speaks to the timeliness, relevance and value implicit in SANsymphony-V. While many had looked at virtualization from the narrow perspective of servers and desktops, they now recognize how our storage virtualization software forms the critical third dimension to a hardware-independent strategy.”
Announced in January 2011, SANsymphony-V software enables data centers to use existing equipment and conventional storage devices to achieve the robust and responsive shared storage environment necessary to support highly dynamic virtual IT environments. This contrasts sharply with the expensive “rip and replace” approaches being proposed to support desktop and server virtualization projects.
“We deployed SANsymphony-V at all of our global affiliate locations,” says Jane Cebula, director, Global Infrastructure, SI Group. “We needed to protect our VMware environment by ensuring that data was synchronously mirrored and highly available. Now our data is secure because with SANsymphony-V we can replicate data in real-time to a secondary location, rather than contracting out for our D/R needs.”
In recent weeks, many independent assessments of DataCore’s impact on the virtualization segment have been conducted. These include:
Industry Analyst Recognition:
- Enterprise Strategy Group (ESG) Lab Validation Report – Results of comprehensive hands-on testing and performance benchmarking validated key value propositions of SANsymphony-V. The ESG Lab team summarized its test conclusions by stating: “ESG Lab firmly believes that it would benefit any organization considering or implementing an IT virtualization project to take a long look at DataCore SANsymphony-V R8 storage virtualization software. It is robust, flexible, and responsive and delivers major value in terms of utilization, economics, improved response times, high-availability, and easy administration.”
- Taneja Group Profile – “Building the Virtual Infrastructure with DataCore SANsymphony-V” was released earlier this year by leading industry analyst firm, Taneja Group. As part of the profile, Taneja highlights DataCore’s solutions and use cases in Microsoft’s environment and touts that “For Hyper-V users that would like to build an enterprise-capable virtual infrastructure, DataCore SANsymphony-V is an ideal fit.”
Product Reviews:
- eWEEK “Product to Watch” – eWEEK, the industry leader for strategic technology information, named SANsymphony-V one of its “Products to Watch” for 2011. Editors distinguished DataCore for using “software to reshape server and disk resources already in use in order to meet the unpredictable workloads that virtual machines and virtual desktops throw at newly consolidated data centers.” The “Products to Watch” article appears in the February 21, 2011 print edition of eWEEK.
- IAIT Independently Validates SANsymphony-V – After closely examining SANsymphony-V’s newly added features and capabilities, independent IT analysis experts at the Institute for the Analysis of IT components (IAIT) concluded that the product “offers an extremely efficient central administration tool for storage virtualization with synchronous mirroring for high availability, asynchronous replication for disaster recovery and further important function like virtual disk pooling, thin provisioning, snapshots for backups and restores as well as CDP for Continuous Data Protection.” To download the full IAIT SANsymphony-V product review, please visit: DataCore Software – Product Reviews.
Awards:
- Everything Channel’s CRN 2011 Partner Programs Guide / 5-Star Partner Rating – DataCore was named to the Everything Channel CRN 2011 Partner Program Guide and was awarded a 5-Star Partner rating for the second straight year. As the definitive authority on vendors who have robust partner programs or products that service solution providers offer directly to the IT channel, Everything Channel recognized the DataCore Partner Program as offering a program that provides the best possible partnering elements for channel success.
- 2011 Network Computing Award – DataCore gained additional recognition in Europe by winning one of the UK’s top networking accolades, scooping the Data Center Product of the Year Award at the prestigious 2011 Network Computing Awards. The awards, established in 2005 to recognize best-of-breed, easy-to-use solutions that make the lives of network administrators and managers easier and more effective, are decided by readers voting over a two-month period across several categories. The “DataCore Tackles Storage Virtualization Barrier” review appeared in the February issue of Network Computing.
More Information
Extensive reference material and supporting videos of SANsymphony-V may be found at the SANsymphony-V launch page: http://www.datacore.com/Software/Products/SANsymphony-V.aspx.
DataCore Software's SANsymphony-V Garners Widespread Momentum
http://it.tmcnet.com/topics/it/articles/181706-datacore-softwares-sansymphony-v-garners-widespread-momentum.htm
SANsymphony-V has garnered widespread momentum and praise. It is designed to remove the storage-related barriers that prevent IT organizations from realizing the financial and operational goals of their virtualization initiatives.
SANsymphony-V is a version of DataCore’s SANsymphony storage virtualization software that is tuned specifically for virtual server and virtual desktop environments. It can also be used in physical environments, as well as a mix of virtual and physical.
TMCnet recently reported that the solution is capable of solving difficult storage-related challenges introduced by server and desktop virtualization, cloud computing and more general expansion, business continuity, and disaster recovery initiatives.
Friday 3 June 2011
127 Million and Counting
That’s the number of person-hours lost each year to IT outages in North America and Europe, according to a new survey commissioned by CA Technologies. It’s an eye-catching number, equivalent to almost 64,000 people unemployed for a year, but there were other numbers in the survey that caught my attention.
For instance, a third of the organizations surveyed didn’t have a disaster-recovery policy in place, despite the fact that most of them understood that downtime can damage their reputation, staff morale, and customer loyalty, and almost a fourth of the companies said that failure to recover data would be disastrous. The numbers also revealed that employee productivity suffered almost as much during data recovery as it did during the downtime itself.
Even more interesting was the fact that something like 95% of those lost hours come from small businesses with fewer than 500 employees. Of course, that’s mostly because there are so many more small businesses than large firms, but I’m sure there’s also a “we can’t afford disaster recovery or high availability” mentality at work. How many of those lost hours could be reclaimed if businesses realized that in the era of virtualization, DR and HA no longer require specialized hardware and expensive duplication of resources?
Server virtualization combined with storage virtualization software makes it possible to use standard server hardware and disks—fancy arrays entirely optional—to make disaster recovery what it should always be—a last resort—by assuring business continuity and zero-downtime recovery with synchronous replication. It works if you have 200 TB of data spread across multiple metro clusters, and it works if you’re a small business dependent on a private cloud. Adding storage virtualization software to your virtualization playbook means you can put all your hardware to work, all the time, and stop worrying about being a downtime statistic.
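Synchronous replication has a simple contract, and it’s worth seeing how little machinery the core idea needs: the application’s write is not acknowledged until every mirror copy has it. Here’s a stripped-down sketch, with in-memory dicts standing in for real storage nodes (a real product handles failures, ordering, and resynchronization, none of which appears here):

```python
# Stripped-down synchronous mirroring: ack only after ALL copies are written.
# The "nodes" are in-memory dicts standing in for real storage targets.
class MirrorSet:
    def __init__(self, nodes):
        self.nodes = nodes            # e.g., primary site plus recovery site

    def write(self, block, data):
        for node in self.nodes:       # synchronous: every copy before the ack
            node[block] = data        # a real system handles failures here
        return "ack"                  # the application unblocks only now

site_a, site_b = {}, {}
mirror = MirrorSet([site_a, site_b])
mirror.write(7, b"payroll")
assert site_a[7] == site_b[7]         # lose either site, lose zero data
```

The price of that contract is that every write waits on the slowest mirror, which is why synchronous replication lives in metro-distance clusters rather than across continents.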
Photo by Justin Marty
Thursday 2 June 2011
Videos from Citrix Synergy 2011; Latest Whitepaper and Webcast - The Storage Virtualization for Rock Stars Webcast Series Part 1: Hitting the Perfect Chord in Storage Efficiency
Check out the latest Webcast: The Storage Virtualization for Rock Stars Webcast Series Part 1: Hitting the Perfect Chord in Storage Efficiency
http://www.datacore.com/Software/Closer-Look/webcasts.aspx
Brian Madden at Citrix Synergy 2011: A video demo of DataCore
In this video from Citrix Synergy 2011 San Francisco, DataCore’s Rob Griffin shows off SANsymphony-V and how it can be used to virtualize storage for VDI environments.
http://www.brianmadden.com/blogs/morevideos/archive/2011/05/31/citrix-synergy-2011-a-video-demo-of-datacore.aspx
VMblog interviews Matt Zucker on DataCore at Citrix Synergy 2011
http://events.vmblog.com/video-interviews/interview-datacore
and at:
http://vmblog.com/archive/2011/05/26/citrix-synergy-2011-datacore.aspx
Sabrina Gelado Interview at the Citrix Synergy Booth
http://www.citrix.com/tv/#videos/4035
DailyMotion creates a DataCore Video Playlist
http://www.dailymotion.com/playlist/x1kne2_datacorevideos_datacore-playlist-english/1#videoId=xhuyri
Are You Getting Storage Efficiency Backwards?
http://www.datacore.com/storage-virtualization-viewpoint-blog/11-05-31/Are_You_Getting_Storage_Efficiency_Backwards.aspx
SNSeurope DataCore Customer Video
http://www.snseurope.com/tv_full.php?id=322&title=DataCore-Software---Customer-Testimonial
Wednesday 1 June 2011
Are You Getting Storage Efficiency Backwards?
Private cloud storage has been much in the news lately: it’s well up along the “hype cycle.” So it was refreshing to see some cautionary words from IBM the other day at the company’s Storage Innovation Executive Summit. Although Big Blue is one of the biggest players in cloud computing—they have a stake in every aspect of it—VP Dan Galvan warned attendees about a common misconception that’s fueling a lot of the private cloud storage hype.
“Cloud computing isn't about storage efficiency. It’s backward, because storage efficiency is how you enable the private cloud.” He went on to recommend implementing technologies such as storage virtualization, automated tiering, and thin provisioning as first steps towards the efficiency needed for private cloud storage.
Exactly right. Couldn’t have said it better myself, even though I’ve had years of practice talking about just those technologies. Talking, and listening to the customers who have used them to increase their storage efficiency.
Along the way I’ve learned that storage efficiency has multiple dimensions, and you have to take them all into account in solving your storage equation. There’s the technical dimension, which involves parameters like storage utilization, performance, power consumption, and the like. There’s the business dimension, where the numbers are about total cost of ownership and ROI. And there’s the operational dimension, which, due to the rapid proliferation of data, is straight out of the Red Queen’s race in Through the Looking Glass: running as fast as you can just to stay in one place.
The only way to optimize all three of these storage efficiency dimensions is to follow the logic of virtualization all the way to the end. Just as you wouldn’t lock your server virtualization capabilities up in a single server box, you shouldn’t lock your storage virtualization capabilities up in a single array. With storage virtualization software that can run on the same standard server hardware you use for application virtualization, and which can virtualize all your storage, regardless of vendor, you’ll not only nail the technical numbers, but extend your existing storage investment for an even better ROI. And being able to manage all your storage assets from a single point, creating virtual disks with a few mouse clicks, means that IT will be doing a lot less running around.
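Of the technologies Galvan named, automated tiering is the one that most directly attacks that Red Queen problem, because it moves data to the right place without an administrator running anywhere. A toy version of the idea; thresholds, tier names, and block records are all invented:

```python
# Toy automated tiering: promote hot blocks, demote cold ones, by access count.
# Thresholds and tier names are invented for illustration.
def retier(blocks, hot_threshold=100, cold_threshold=5):
    moves = []
    for blk in blocks:
        if blk["reads"] >= hot_threshold and blk["tier"] != "flash":
            moves.append((blk["id"], blk["tier"], "flash"))   # promote
            blk["tier"] = "flash"
        elif blk["reads"] <= cold_threshold and blk["tier"] != "sata":
            moves.append((blk["id"], blk["tier"], "sata"))    # demote
            blk["tier"] = "sata"
        blk["reads"] = 0   # reset the counter for the next sampling window
    return moves

blocks = [
    {"id": "db-index",   "tier": "sata",  "reads": 4200},
    {"id": "old-backup", "tier": "flash", "reads": 1},
]
print(retier(blocks))
# [('db-index', 'sata', 'flash'), ('old-backup', 'flash', 'sata')]
```

Run that loop continuously across a virtualized pool and the hot data migrates to flash while last year’s backups drift down to cheap disk, with nobody running around at all.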
Doing it any other way is just backwards.
Photo by Ross Berteig