Virtualization

IBM Powers Up New PowerLinux Products, Strategy

IBM this week unveiled its latest generation of industry standard Linux-only servers optimized for its Power architecture along with a new strategy targeting specific x86 applications and workloads.

IBM has been a longtime Linux proponent, supporting industry standard distributions like Red Hat Enterprise Linux (RHEL) and SUSE Linux Enterprise on its Power Systems line for the last 12 years. This week’s announcement reaffirms Big Blue’s commitment to Linux and broadens its scope with offerings designed to drive more growth for the Power platform in the lucrative x86 arena. IBM will fuel this growth via its mantra, “Tuned to the task,” which emphasizes delivering higher quality and superior economics than rivals.

According to Scott Handy, vice president of IBM’s PowerLinux Strategy and Business Development, “This is an extension to our overall Power strategy to address the Linux x86 space and drive more growth for our Power Systems servers.” …


2011 in High Tech YTD Part 3: Cisco Pulls Plug on Flip, Focuses on Core Competencies

Cisco Pulls the Plug on Flip

Following two consecutive disappointing fiscal quarters, Cisco Systems shocked the industry three weeks ago with the news that it will cease to manufacture its popular Flip video camera and will lay off the division’s 550 workers, substantially reducing its consumer businesses.

Also within the past two weeks, Cisco unveiled a voluntary retirement program aimed at workers aged 50 and older whose age plus tenure at the company totals at least 60; these workers have from May 10 through June 24 to opt in. This is the first time in two years that Cisco has instituted such a cost-cutting measure.

Cisco recently named Gary Moore as Chief Operating Officer to fine-tune its refocused initiatives. …


2011 YTD in High Tech: Bold Aggressive Actions

It’s hard to believe but the first quarter of 2011 is now a memory and we’re well into spring. The tone for the year in high technology was set in early January: fast, bold, aggressive action and sweeping management changes.

In the first four months of the year, high tech vendors moved quickly and decisively to seize opportunities in established sectors (smartphones, virtualization, backup and disaster recovery) and emerging markets (cloud computing, tablet devices and unified storage management). As 2011 unfolds, it’s apparent that high technology vendors are willing to shift strategies and shed executives in order to stay one step ahead of, or at least keep pace with, competitors. The competition is cutthroat and unrelenting. No vendor, no matter how dominant its market share, how pristine its balance sheet or how deep its order backlog and book-to-bill ratio, dares relax or rest on its laurels for even a nanosecond.

Recaps of some of the year’s highlights thus far are very revealing. …


Microsoft Azure Platform, BPOS Cloud Vision Must Address Licensing

Microsoft did a very credible job at its TechEd conference in New Orleans last week, laying out the technology roadmap and strategy for a smooth transition from premises-based networks/services to its emerging Azure cloud infrastructure and software + services model.

One of the biggest challenges facing Microsoft and its customers, as it stands on the cusp of what Bob Muglia, president of Microsoft’s Server & Tools Business (STB) unit, characterized as a “major transformation in the industry called cloud computing,” is how the Redmond, Wash. software giant will license its cloud offerings.

Licensing programs and plans—even those that involve seemingly straightforward and mature software, PC- and server-based product offerings—are challenging and complex in the best of circumstances. This is something Microsoft knows only too well from experience. Constructing an equitable, easy-to-understand licensing model for cloud-based services could prove to be one of the most daunting tasks on Microsoft’s Azure roadmap.

It is imperative that Microsoft proactively address the cloud licensing issues now, and Microsoft executives are well aware of this. During the Q&A portion of one cloud-related TechEd session, Robert Wahbe, corporate vice president, STB Marketing was asked, “What about licensing?” He took a sip from his water bottle and replied, “That’s a big question.”

That is an understatement.

Microsoft has continually grappled with simplifying and refining its licensing strategy since it made a major misstep with Licensing 6.0 in May 2001, when the initial offering was complex, convoluted and potentially very expensive. It immediately met with a huge, vocal outcry and backlash. The company was compelled to postpone the Licensing 6.0 launch while it re-tooled the program to make it more user-friendly from both a technical and cost perspective.

Over the last nine years, Microsoft’s licensing program and strategy have become among the best in the high-technology industry. Microsoft offers simplified terms and conditions (T&Cs); greater discounts for even the smallest micro SMBs; a variety of add-on tools (e.g., licensing compliance and assessment utilities); and access to freebies, such as online and onsite technical service and training, for customers who purchase the company’s Software Assurance (SA) maintenance and upgrade agreement along with their Volume Licensing deals.

Licensing from Premises to the Cloud
Microsoft’s cloud strategy is a multi-pronged approach that incorporates a wide array of offerings, including Windows Azure, SQL Azure and Microsoft Online Services (MOS). MOS consists of hosted versions of Microsoft’s most popular and widely deployed server applications, such as Exchange Server, SharePoint Server and Office Communications Server. Microsoft’s cloud strategy also encompasses consumer products like Windows Live, Xbox Live and MSN.

Microsoft is also delivering a hybrid cloud infrastructure that will enable organizations to combine premises-based with hosted cloud solutions. This will indisputably provide Microsoft customers with flexibility and choice as they transition from a fixed-premises computing model to a hosted cloud model. In addition, it will allow them to migrate to the cloud at their own pace as their budgets and business needs dictate. However, the very flexibility, breadth and depth of offerings that make Microsoft products so appealing to customers, ironically, are the very issues that increase the complexity and challenges of creating an easily accessible, straightforward licensing model.

Dueling Microsoft Clouds: Azure vs. BPOS
Complicating matters is that Microsoft has dueling cloud offerings: the Business Productivity Online Suite (BPOS) and the Windows Azure Platform. As a result, Microsoft must also develop, delineate and differentiate its strategy, pricing and provisions for Azure and BPOS. It’s unclear (at least to this analyst) when and how a customer will choose one over the other, or mix and match BPOS and Azure offerings. Both are currently works in progress.

BPOS is a licensing suite and a set of collaborative end-user services that run on Windows Server, Exchange Server, and SQL Server. Microsoft offers the BPOS Standard Suite, which incorporates Exchange Online, SharePoint Online, Office Live Meeting, and Office Communications Online (OCS Online). The availability of the latter two offerings is a key differentiator that distinguishes Microsoft’s BPOS from rival offerings from Google. Microsoft also sells the BPOS Business Productivity Online Deskless Worker Suite, which consists of Exchange Online Deskless Worker, SharePoint Online Deskless Worker and Outlook Web Access Light. This BPOS package is targeted at SMBs, small branch offices or companies that want basic, entry-level messaging and document collaboration functions.

By contrast, Azure is a cloud platform offering that contains all the elements of a traditional application stack from the operating system up to the applications and the development framework. It includes the Windows Azure Platform AppFabric (formerly .NET Services for Azure), as well as the SQL Azure Database service.

While BPOS is aimed squarely at end users and IT managers, Azure targets third-party ISVs and internal corporate developers. Customers that build applications for Azure will host them in the cloud. However, it is not a multi-tenant architecture meant to host a customer’s entire infrastructure. With Azure, businesses will rent resources that reside in Microsoft datacenters. The costs are based on a per-usage model. This gives customers the flexibility to rent fewer or more resources, depending on their business needs.
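To make the per-usage model concrete, here is a minimal sketch of how a customer might estimate a monthly bill under a consumption-based plan. The rates below are placeholder assumptions for illustration only, not actual Windows Azure price points.

```python
# Minimal sketch of consumption-based pricing; the rates are illustrative
# placeholders, not actual Windows Azure prices.

def estimate_monthly_cost(compute_hours, storage_gb, bandwidth_gb,
                          compute_rate=0.12, storage_rate=0.15, bandwidth_rate=0.10):
    """Estimate a monthly bill from rented compute, storage and bandwidth."""
    return (compute_hours * compute_rate
            + storage_gb * storage_rate
            + bandwidth_gb * bandwidth_rate)

# Two instances running around the clock vs. scaling back to one:
print(estimate_monthly_cost(2 * 24 * 30, storage_gb=50, bandwidth_gb=20))  # ~182.30
print(estimate_monthly_cost(1 * 24 * 30, storage_gb=50, bandwidth_gb=20))  # ~95.90
```

Scaling from two rented instances down to one roughly halves the compute portion of the bill, which is exactly the flexibility the per-usage model is meant to provide.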

Cloud Licensing Questions
Any cloud licensing or hybrid cloud licensing program that Microsoft develops must include all of the elements of its current fixed premises and virtualization models. This includes:

1. Volume Licensing: As the technology advances from fixed premises software and hardware offerings to private and public clouds, Microsoft must find ways to translate the elements of its current Open, Select and Enterprise agreements to address the broad spectrum of users, from small and midsized businesses (SMBs) to the largest enterprises, with the associated discounts for volume purchases.
2. Term Length: The majority of volume license agreements are based on a three-year product lifecycle. During the protracted economic downturn, however, many companies could not afford to upgrade. A hosted cloud model, though, will be based on usage and consumption, so the terms should and most likely will vary.
3. Software Assurance: Organizations will still need upgrade and maintenance plans regardless of where their data resides and whether or not they have traditional subscription licensing or the newer consumption/usage model.
4. Service and Support: Provisions for after-market technical services, support and maintenance will be crucial for Microsoft, its users, resellers and OEM channel partners. ITIC survey data indicates that the breadth and depth of after-market technical service and support is among the top four items that make or break a purchasing deal.
5. Defined areas of responsibility and indemnification: This will require careful planning on Microsoft’s part. Existing premises-based licensing models differ according to whether or not the customer purchases their products directly from Microsoft, a reseller or an OEM hardware manufacturer. Organizations that adopt a hybrid premises/cloud offering and those that opt for an entirely hosted cloud offering will be looking more than ever before to Microsoft for guidance. Microsoft must be explicit as to what it will cover and what will be covered by OEM partners and/or host providers.

Complicating the cloud licensing models even further is the nature of the cloud itself. There is no singular cloud model. There may be multiple clouds, and they may be a mixture of public and private clouds that also link to fixed premises and mobile networks.

Among the cloud licensing questions that Microsoft must address and specifically answer in the coming months are:

• What specific pricing models and tiers will it offer SMBs, midsized companies and enterprises for hybrid and full cloud infrastructures?
• What specific guarantees, if any, will it provide for securing sensitive data?
• What level of guaranteed response time will it provide for service and support?
• What is the minimum acceptable latency/response time for its cloud services?
• Will it provide multiple access points to and from the cloud infrastructure?
• What specific provisions will apply to Service Level Agreements (SLAs)?
• How will financial remuneration for SLA violations be determined? (An illustrative sketch of one approach follows this list.)
• What are the capacity ceilings for the service infrastructure?
• What provisions will there be for service failures and disruptions?
• How are upgrade and maintenance provisions defined?
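The remuneration question is a good example of where Microsoft will need to be explicit. As a purely illustrative sketch (an assumption on my part, not anything Microsoft has announced), a tiered service-credit schedule tied to measured monthly uptime might look like this:

```python
# Hypothetical tiered SLA credit schedule -- an illustrative assumption,
# not Microsoft's actual remuneration terms.

CREDIT_TIERS = [       # (minimum monthly uptime %, service credit %)
    (99.9, 0),         # SLA met: no credit owed
    (99.0, 10),        # below 99.9% but at least 99%: 10% credit
    (0.0, 25),         # below 99%: 25% credit
]

def sla_credit(monthly_fee, measured_uptime_pct):
    """Return the service credit owed for one month of measured uptime."""
    for threshold, credit_pct in CREDIT_TIERS:
        if measured_uptime_pct >= threshold:
            return monthly_fee * credit_pct / 100.0
    return 0.0

print(sla_credit(10_000, 99.95))  # SLA met -> 0.0
print(sla_credit(10_000, 99.5))   # -> 1000.0
print(sla_credit(10_000, 98.0))   # -> 2500.0
```

Whatever schedule Microsoft ultimately adopts, customers will want the thresholds, the measurement methodology and the credit amounts spelled out in the contract rather than left to negotiation after an outage.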

From the keynote speeches and throughout the STB Summit and TechEd conference, Microsoft’s Muglia and Wahbe both emphasized and promoted the idea that there is no singular cloud. Instead, Microsoft’s vision is a world of multiple private, public and hybrid clouds that are built to individual organizations’ specific needs.

That’s all well and good. But in order for this strategy to succeed, Microsoft will have to take the lead on both the technology and the licensing fronts. The BPOS and Azure product managers and marketers should actively engage with the Worldwide Licensing Program (WWLP) managers and construct a simplified, straightforward licensing model. We recognize that this is much easier said than done. But customers need and will demand transparency in licensing pricing, models and T&Cs before committing to the Microsoft cloud.


Virtualization Deployments Soar, But Companies Prefer Terra Firma to Cloud for now

The ongoing buzz surrounding cloud computing – particularly public clouds – is far outpacing actual deployments by mainstream users. To date only 14% of companies have deployed or plan to deploy a private cloud infrastructure within the next two calendar quarters.
Instead, as businesses slowly recover from the ongoing economic downturn, their most immediate priorities are to upgrade legacy desktop and server hardware and outmoded applications, and to expand their virtualization deployments. Those are the results of the latest ITIC 2010 Virtualization and High Availability survey, which polled C-level executives and IT managers at 400 organizations worldwide.
ITIC partnered with Stratus Technologies and Sunbelt Software to conduct the Web-based survey of multiple choice questions and essay comments. ITIC also conducted first person interviews with over two dozen end users to obtain anecdotal responses on the primary accelerators or impediments to virtualization, high availability and reliability, and cloud computing. The survey also queried customers on whether or not their current network infrastructure and mission critical applications were adequate to handle new technologies and the increasing demands of the business.
The survey showed that, for now at least, although many midsized and large enterprises are contemplating a move to the cloud – especially a private cloud infrastructure – the technology and business model are still not essential for most businesses. Some 48% of survey participants said they have no plans to migrate to a private cloud architecture within the next 12 months, while another 33% said their companies are studying the issue but have no firm plans to deploy.

The study also indicates that private cloud deployments are outpacing public cloud infrastructure deployments by a 2 to 1 margin. However, before businesses can begin to consider a private cloud deployment they must first upgrade the “building block” components of their existing environments, e.g., server and desktop hardware, WAN infrastructure, storage, security and applications. Only 11% of businesses described their server and desktop hardware as leading edge or state-of-the-art. And just 8% of respondents characterized their desktop and application environment as leading edge.

The largest proportion of the survey participants – 52% – described their desktop and server hardware as working well, while 48% said their applications were up-to-date. However, 34% acknowledged that some of their server hardware needed to be updated. A higher percentage of users – 41% – admitted that their mission critical software applications were due to be refreshed. And a small 3% minority said that a significant portion of both their hardware and mission critical applications were outmoded and adversely impacting the performance and reliability of their networks.

Based on the survey data and customer interviews, ITIC anticipates that from now until October, companies’ primary focus will be on infrastructure improvements.

Reliability and Uptime Lag

The biggest surprise in this survey, compared with the 2009 High Availability and Fault Tolerant survey that ITIC and Stratus conducted nearly one year ago, was the decline in the number of survey participants who said their organizations required 99.99% uptime and reliability. In this latest survey, the largest portion of respondents – 38%, or nearly 4 out of 10 businesses – said that 99.9% uptime, the equivalent of 8.76 hours of per-server, per-annum downtime, was the minimum acceptable amount for their mission critical line of business (LOB) applications. This is more than three times the 12% of respondents who said that 99.9% uptime was acceptable in the prior 2009 survey. Overall, 62%, or nearly two-thirds of survey participants, indicated their organizations are willing to live with higher levels of downtime than were considered acceptable in previous years.
Some 39% of survey respondents – almost 4 out of 10 – indicated that their organizations demand high availability, which ITIC defines as four nines of uptime or greater. Specifically, 27% said their organizations require 99.99% uptime; another 6% need 99.999% uptime; and a 3% minority require even higher levels of availability.
The customer interviews found that the ongoing economic downturn, aged/aging network infrastructures (server and desktop hardware and older applications), layoffs, hiring freezes and the new standard operating procedure (SOP) of “do more with less” have made 99.9% uptime more palatable than in previous years.
Those firms that do not keep track of the number and severity of their outages have no way of gauging the financial and data losses to the business. Even a cursory comparison indicates substantial cost disparities between 99% uptime and 99.99% uptime. The monetary costs, business impact and risks associated with downtime will vary by company, as well as by the duration and severity of individual outage incidents. However, a small or midsize business, for example, which estimates the hourly cost of downtime to be a very conservative $10,000 per hour, would potentially incur losses of $876,000 per year at a data center with 99% application availability (roughly 87.6 hours of downtime). By contrast, a company whose data center operations deliver 99.9% uptime would incur losses of $87,600, one-tenth that of a firm with conventional 99% availability; at 99.99% uptime the annual exposure drops to roughly $8,760.
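A quick back-of-the-envelope calculation, using the same conservative $10,000-per-hour assumption cited above, shows how steeply the annual exposure drops with each additional “nine” of availability:

```python
# Downtime cost comparison using the article's conservative $10,000/hour assumption.

HOURS_PER_YEAR = 24 * 365  # 8,760

def annual_downtime_hours(availability_pct):
    """Hours of downtime per year implied by an availability percentage."""
    return HOURS_PER_YEAR * (1 - availability_pct / 100.0)

def annual_downtime_cost(availability_pct, hourly_cost=10_000):
    return annual_downtime_hours(availability_pct) * hourly_cost

for level in (99.0, 99.9, 99.99, 99.999):
    hours = annual_downtime_hours(level)
    cost = annual_downtime_cost(level)
    print(f"{level}% uptime -> {hours:,.2f} hrs/yr -> ${cost:,.0f}")

# Output (approximately):
# 99.0%   -> 87.60 hrs -> $876,000
# 99.9%   ->  8.76 hrs -> $87,600
# 99.99%  ->  0.88 hrs -> $8,760
# 99.999% ->  0.09 hrs -> $876
```

Organizations with higher hourly downtime costs can substitute their own figure; the ratios between availability levels stay the same.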
Ironically, the need for rock-solid network reliability has never been greater. Web-based applications and new technologies like virtualization and Service Oriented Architecture (SOA), as well as emerging public or shared cloud computing models, are designed to maximize productivity. But without the proper safeguards these new datacenter paradigms may raise the risk of downtime. The Association for Computer Operations Management/Data Center Institute (AFCOM) forecasts that one in four data centers will experience a serious business disruption over the next five years.
At the same time, customer interviews revealed that over half of all businesses – 56% – lack the budget for high availability technology. Another ongoing challenge is that 78% of survey participants acknowledged that their companies either lack the skills or simply do not attempt to quantify the monetary and business costs associated with hourly downtime. The reasons for this are well documented. Some organizations don’t routinely do this, and those that attempt to calculate costs and damages run into difficulties collecting data because the data resides with many individuals across the enterprise. Inter-departmental communication, cooperation and collaboration are sorely lacking at many firms. Only 22% of survey respondents were able to assign a specific cost to one hour of downtime, and most of them gave conservative estimates of $1,000 to $25,000 for a one hour network outage. Only 13% of the 22% of survey participants who were able to quantify the cost of downtime indicated that their hourly losses would top $175,000 or more.

Users Confident and Committed to Virtualization Technology
The news was more upbeat with respect to virtualization – especially server virtualization deployments. Organizations are both confident and comfortable with virtualization technology.
72% of respondents indicated the number of desktop and server-based applications demanding high availability has increased over the past two years. The survey also found that a 77% majority of participants run business critical applications on virtual machines. Not surprisingly, the survey data showed that virtualization usage will continue to expand over the next 12 months. A 79% majority – approximately eight out of 10 respondents – said the number of business critical applications running on virtual machines and virtual desktops will increase significantly over the next year. Server virtualization is very much a mainstream and accepted technology. The responses to this question indicate increased adoption as well as confidence. Nearly one-quarter of the respondents – 24% – say that more than 75% of their production servers are VMs. Overall, 44% of respondents say that over 50% of their servers are VMs. However, none of the survey participants indicate that 100% of their servers are virtualized. Additionally, only 6% of survey respondents …


VDI Vendor Wars Intensify

There’s no hotter market in high tech this year than Virtual Desktop Infrastructure (VDI) and you don’t need sales and unit shipment statistics to prove it. No, the best measurement of VDI’s hotness is the sudden flurry of vendor announcements accompanied by a concomitant rise in vitriol.
The main players in the VDI market are actually two pairs of allies: Citrix and Microsoft lining up against VMware and EMC for Round 2 in the ongoing virtualization wars. On March 18, Citrix and Microsoft came out swinging, landing the first potent, preemptive punches right where they hope it will hurt VMware the most: in its pocketbook.
Citrix and Microsoft unveiled a series of VDI initiatives that include aggressive promotional pricing deals and more simplified licensing models. To demonstrate just how solid and committed they are to their alliance and taking on and taking down VMware and EMC, the two firms even went so far as to combine their respective VDI graphics technologies.
At stake is the leadership position in the nascent, but rapidly expanding, global VDI market. The results of the ITIC 2010 Global Virtualization Deployment and Trends Survey, which polled 800+ businesses worldwide in the December/January timeframe, indicate that 31% of respondents plan to implement VDI in 2010; that’s more than double the 13% that said they would undertake a VDI deployment in 2009. Application virtualization is also on the rise. The same ITIC survey found that 37% of participants plan application virtualization upgrades this year, up from 15% who responded affirmatively to the same question in 2009.
The current installed base of VDI deployments is still relatively small; hence the statistics that show the number of deployments doubling year over year must be considered in that context. Nonetheless, double digit deployment figures are evidence of strengthening demand and a market that is robustly transitioning from niche to mainstream. The spate of announcements from Microsoft and Citrix was clearly intended to capitalize on the growth spurt in VDI. At the same time, the companies threw down the gauntlet with initiatives aimed at solidifying and expanding their base of current VDI customers while serving the dual purpose of luring VMware customers away from that company’s VDI platform. They include:
• “VDI Kick Start”: This wide-ranging sales promotion, which runs from March 18 through December 31, 2010, seeks to jump-start VDI deployments by lowering the entry-level pricing for customers purchasing Microsoft and Citrix technologies. As part of this deal, existing Microsoft client access licensing (CAL) customers will pay $28 per desktop for up to 250 users to purchase the Microsoft Virtual Desktop Infrastructure Suite, Standard edition, and Citrix’s XenDesktop VDI Edition for one year. That’s roughly a 50% discount off the list prices that corporations have paid up until now for their annual CALs (a simple cost comparison follows this list). This is crucial for cost conscious businesses, since client access licenses typically represent the lion’s share of their licensing deals and desktops outnumber servers in mid-sized and large enterprises. The two companies are also merging Microsoft’s 3-D graphics technology for virtual desktops, called RemoteFX, with Citrix’s high-definition HDX technology.

• The Microsoft Virtual Desktop Access (VDA) License Plan: Organizations that use thin client devices, which are not included or covered under Microsoft’s SA maintenance plan, can now purchase VDA licenses at a retail price of $100 per device per annum. This targets end users who travel or telecommute and need to use personal devices or public networks to access their corporate data. Microsoft also made another move towards simplifying its virtualization licensing plan. Starting July 1, Microsoft SA customers will no longer be required to purchase a separate license to access Windows via a VDI.
• The “Rescue for VMware VDI” promotion (the name says it all): This is a direct attack on VMware. Like the VDI Kick Start program, it runs from March 18 through December 31, 2010. Under the terms of this deal, any Microsoft Software Assurance licensing/maintenance customer can replace their existing VMware View licenses for free. VMware View users who opt out of that platform in favor of the Citrix and Microsoft offerings will receive up to 500 XenDesktop VDI Edition device licenses and up to 500 Microsoft VDI Standard Suite device licenses free for an entire year once they trade in their VMware View licenses.
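Using the $28 promotional price and the roughly 50% discount cited above, the implied list price works out to about $56 per desktop per year; the sketch below simply runs that arithmetic for a 250-seat shop (the $56 figure is an inference from the stated discount, not a published price):

```python
# Rough promotional vs. list CAL cost comparison for the VDI Kick Start deal.
# The $28 promo price and ~50% discount come from the announcement; the
# implied $56 list price is an inference, not a published figure.

PROMO_PRICE_PER_DESKTOP = 28    # per desktop, per year, for up to 250 users
IMPLIED_LIST_PRICE = 56         # assumed list price implied by a ~50% discount

def annual_cal_cost(desktops, price_per_desktop):
    return desktops * price_per_desktop

desktops = 250
print(annual_cal_cost(desktops, PROMO_PRICE_PER_DESKTOP))  # 7000
print(annual_cal_cost(desktops, IMPLIED_LIST_PRICE))       # 14000
```

For a 250-desktop organization, that is a difference of roughly $7,000 per year on the desktop licenses alone, before any server-side licensing is counted.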
Dai Vu, Microsoft’s director of virtualization marketing, said the announcements were all about delivering more value to desktop customers and simplifying and extending organizations’ licensing rights.
The Citrix/Microsoft announcements also cement the close working partnership and the “enemy of my enemy is my friend” relationship the firms have enjoyed for many years. By bundling their respective VDI offerings together, the two companies should also ensure integration and interoperability which are crucial components for each and every layer in a virtualized data center environment.
VMware and EMC: Not Standing Still
VMware and EMC executives have yet to publicly respond to the Microsoft/Citrix initiatives. However, it’s almost certain that VMware will have to offer its current and prospective VDI accounts incentives to counter the Microsoft/Citrix alliance. Cash strapped corporations and IT departments are all on the lookout for top notch products at bargain basement prices. And it doesn’t get much better for customers than the free Rescue for VMware VDI program.
VMware built up a commanding lead in the server virtualization arena over the last five years by virtue of being first to market and delivering leading edge features and performance in its signature ESX Server product. VMware’s competitors have spent the last several years playing catch up in server virtualization. This allowed VMware to charge a premium price for its premier offerings. Depending on the size and scope of the individual organization’s server virtualization deployment, customers paid on average 35% to as much as 75% more for VMware server-based offerings. There were surprisingly few complaints.
The emerging VDI and application virtualization markets are a different story. Only about 5% to 8% of organizations worldwide have fully virtualized their desktop infrastructure. So it’s too soon to declare a clear market winner. It’s safe to say that Citrix, Microsoft and VMware are all market leaders in this segment. This time around though, Microsoft and Citrix are determined not to let VMware and EMC run away with the race by building an insurmountable lead.
Meanwhile, VMware and EMC have not been idle. Former Microsoft executive Paul Maritz succeeded VMware co-founder Diane Greene as the company’s president and chief executive officer following her departure in 2008. Since then he has made tangible moves to bolster VMware’s position in the VDI and application virtualization arenas. Maritz and EMC CEO Joe Tucci make a formidable combination, as do EMC and VMware. EMC purchased VMware in 2004 for $635 million and it owns an 86% majority stake in the server virtualization market leader. In the past several years, VMware’s fortunes and revenues have risen faster than EMC’s. VMware’s year-over-year (YoY) quarterly revenue growth stands at 18.20%, compared with EMC’s modest 2.10% YoY quarterly sales growth. Another key indicator is net earnings, and in this regard VMware experienced negative YoY quarterly earnings growth of -49.40%. By contrast, its parent EMC recorded a very robust and positive 44.70% jump in YoY quarterly earnings. It is also worth noting that VMware’s annual revenues of $2.02 billion represent only about 15% of EMC’s annual sales of $14.03 billion. And to date, EMC’s solutions have only been related tangentially to VMware’s VDI products. For practical purposes, this may continue to be the case. From a PR standpoint though, EMC and VMware are presenting themselves as a sort of virtualization “dynamic duo.”
At an EMC Analyst event at the company’s Hopkinton, MA headquarters on March 11, Pat Gelsinger, president of EMC’s Information Infrastructure Products group described the combination of EMC and VMware – specifically with respect to storage virtualization, virtualization management and private cloud infrastructures — as the “Wild West” of the virtualization market, saying “we want to be disruptive and change the way people fundamentally think of IT.” Though Gelsinger mainly confined his comments to EMC’s core bailiwick in the storage arena, it is clear that EMC and VMware are pro-actively presenting a united front.
In February, the two firms moved to reposition some of their assets; EMC and VMware inked a deal for VMware to acquire certain software products and expertise from EMC’s Ionix IT management business in an all cash deal for $200 million. EMC does retain the Ionix brand and gets full reseller rights to continue to offer customers the products acquired by VMware. Maritz said VMware’s acquisition of the Ionix products and expertise promises to further establish VMware vCenter as the next generation management platform for private cloud infrastructures.
The agreement also calls for VMware to take control of all the technology and intellectual property of FastScale, which EMC acquired in 2009. The FastScale Composer Suite incorporates integrated software management tools to enable organizations to maintain peak performance in a virtualized environment.
Also, recently, VMware introduced ThinApp 4.5, a new version of its application virtualization package designed to simplify enterprises’ migration to Windows 7.
End Users are the Biggest Winners
What makes the latest competition for VDI market dominance noteworthy is the extreme actions the combatants are willing to take in order to retain and gain customers at their rivals’ expense. With last week’s joint announcements and deepening partnership, Citrix and Microsoft have signaled their intention to lead, but it’s still too early to call the race.
The joint Microsoft/Citrix initiatives to cut costs and simplify virtualization licensing plans remove two of the more significant barriers to VDI adoption. The largest looming challenge remains the willingness of corporations to embrace a new technology model as their organizations and IT departments continue to grapple with the lingering effects of the ongoing economic crunch. In this regard, all of the virtualization vendors in concert with OEM hardware vendors like Dell, Hewlett-Packard, IBM, Stratus Technologies and Wyse who partner with them must convince customers that transitioning to VDI will provide tangible Total Cost of Ownership (TCO) and Return on Investment (ROI) benefits. This entails providing organizations with the necessary guidance – including tools, training, documentation, Best Practices and solid technical service and support – to ensure that a conversion to VDI can be accomplished with minimal disruption. Admittedly, this is a tall order.
Hardware vendors like Dell, HP, IBM et al all have a stake in the future success of the VDI market. Organizations that migrate to VDI will seek to upgrade to newer, more powerful desktops (PCs, notebooks) and servers, which in turn, potentially boosts the hardware vendors’ individual and collective bottom lines. Additionally, both HP and IBM boast huge service and support organizations, which also stand to benefit from an uptick in VDI adoptions. So the hardware vendors have every reason to partner with Citrix, Microsoft and VMware to promote and expand the VDI market segment. Regardless of which vendor(s) prevails, the biggest winners will be the customers. When several big name vendors vie for the hearts, minds and wallets of customers, it usually means that feature-rich, reliable products get to market sooner at more competitive prices. Let’s hope the VDI race is a long one.


HP, Microsoft Still Have Some ‘Splainin’ to Do on Application-to-Infrastructure Pact

The recently announced joint Hewlett-Packard/Microsoft Application-to-Infrastructure Model Partnership has intriguing possibilities for both companies and their respective and overlapping installed customer bases. However, it remains to be seen how quickly and efficiently the two industry giants can deliver products and market the merits of the solution. A $250 million investment is huge even for two high tech powerhouses like HP and Microsoft, so we know this is a serious commitment.

To recap, HP and Microsoft said they will invest $250 million into their Frontline Partnership. The deal aims to deliver full, integrated stacks that support Microsoft’s Exchange Server and SQL Server, including management, virtualization and cloud implementations. The resulting product offerings will consist of pre-packaged application solution bundles that incorporate the aforementioned management and virtualization capabilities. The two companies said the pact calls for them to partner on engineering, R&D, marketing and channel sales.
Still, the announcement left many industry watchers with more questions than answers. As my colleagues Charles King and Merv Adrian noted in their Breaking News Review in the January 14 special edition of Charles King’s Pund-IT, HP and Microsoft “have worked closely for years, share tens of thousands of common customers and channel partners and have long supported each other’s interests.”
So what’s new about this announcement? That question should be answered during the coming months. A $250 million investment is considerable even for two high technology titans. It now remains for HP and Microsoft to execute on their promise to produce solutions that thoroughly integrate the two companies’ infrastructure and application stacks, and to ship pre-configured and optimized solutions for Microsoft’s Exchange Server and SQL Server, virtualization, cloud computing, converged infrastructure and pre-packaged application tools.
But perhaps the most immediate and daunting challenge is for HP and Microsoft to deliver a product roadmap that also includes specific details about the pricing, training and services the two firms will jointly deliver. Above all, the companies must market and sell this deal to the legions of skeptics. The high tech industry has witnessed numerous high profile partnership deals announced amidst much fanfare, never to be heard from again after the initial press releases.
Remember the Cisco Systems/Microsoft Directory Enabled Network (DEN) initiative of the late 1990s? No. Not many people do. Announced with great fanfare, this dream team was supposed to incorporate the functionality of Microsoft’s Active Directory into Cisco routers and provide network administrators with a more comprehensive means of managing various devices on their network. In reality, the Cisco/Microsoft DEN initiative was a partnership on paper only. There are dozens of similar examples. Hence, the skepticism that greets such announcements is understandable.
This is all the more reason for HP and Microsoft executives to follow up on last week’s announcement with quick, decisive action and not just more fodder for the PR Newswire. For example, when can we expect to see the first fruits of the so-called “deeply optimized machine environment” that will provide turn-key, pre-packaged and pre-integrated server, application, networking and storage solutions? Who are the specific target users and how will they benefit? How will Microsoft and HP license and service these products? Those are just a few of the questions that need to be answered.
Non-Exclusive Partnerships Sometimes Make Strange Bedfellows
The partnership also has especially intriguing implications for HP, which now has pacts in place with all of the major virtualization providers, including Microsoft’s biggest rival, VMware. The new HP/Microsoft Application-to-Infrastructure pact is a non-exclusive three-year partnership. It’s worth noting that HP already has a deal in place with VMware, whose ESX Server is the market leader in server virtualization. Microsoft also gets a boost from this deal. Microsoft’s Hyper-V has been gaining ground, particularly among small and mid-sized corporations. However, it has a long way to go to catch up to ESX Server’s installed base, particularly among large enterprises, so this pact helps keep Microsoft competitive. Additionally, HP delivers a full suite of management solutions that integrates VMware’s vCenter offering with HP’s Insight management product. HP and Microsoft intend to similarly integrate HP’s Insight and Microsoft’s Systems Center. So again, this helps Microsoft broaden the appeal of its virtualization offerings to its existing base and makes them a more attractive solution for prospective customers.
The partnership with Microsoft puts HP in the proverbial catbird seat: it now has a full line of its own servers that runs all the VMware products, and similar plans to support Microsoft’s SQL Server and Exchange Server. This gives HP the ability to offer customers a full line of integrated hardware and services and their choice of virtualization vendors, while remaining agnostic.
From Microsoft’s perspective, the partnership with HP also has immediate value: it allows Microsoft – at least on paper – to keep pace with VMware by working with HP, a top OEM hardware vendor and services provider, which is no mean feat. Former Microsoft executive Paul Maritz, who now runs VMware, is intent on rejuvenating that company, and he knows that the way to solidify and expand VMware’s influence is to increase its stake in management and applications. Just last week, VMware purchased Zimbra, the open source email and collaboration unit of Yahoo, for a rumored $100 million. Not coincidentally, Zimbra describes its Collaboration Suite as the “next generation” Microsoft Exchange Server.
Microsoft clearly felt the need to respond in kind.
The plethora of technology and partnership deals such as the HP/Microsoft Application-to-Infrastructure pact serves as a reminder of the intensity of the IT industry’s competitive landscape – particularly in burgeoning markets like virtualization and, by extension, nascent markets like cloud computing. No vendor can afford to rest on its laurels. All must continue to upgrade their product and service offerings to keep pace with the competition.
Microsoft and VMware will continue to try to top one another, and HP is the beneficiary of this ongoing rivalry. Let’s hope the end users are winners, too.


ITIC 2009-2010 Global Virtualization Deployment Trends Survey Results

Server virtualization demand and deployments are strong and will remain so for the remainder of 2009 and through 2010, despite the ongoing economic downturn.

The results of the new, independent ITIC 2009 Global Server Virtualization Survey, which polled more than 700 corporations worldwide during May/June and August, reveal that server virtualization deployments have remained strong throughout the ongoing 2009 economic downturn. It also shows that the three market leaders Citrix, Microsoft and VMware, are consolidating their positions even as the virtualization arena itself consolidates through mergers, acquisitions and partnerships.

Microsoft in particular has made big year-over-year gains in deployments and market share. Thanks to the summer release of the new Hyper-V 2.0 with live migration capabilities, the Redmond, Washington software firm has substantially closed the feature/performance gap between itself and VMware’s ESX Server. The technical advances of Hyper-V, combined with the favorable terms of Microsoft’s licensing program, make the company’s virtualization products very competitive and alluring. Three out of five survey respondents — 59% — indicated their intent to deploy Hyper-V 2.0 within the next 12 to 18 months.

Survey responses also show a groundswell of support for application and desktop virtualization deployments. These two market segments constitute a much smaller niche of deployments and installations compared to virtualized server environments. The survey results show that application virtualization (where Microsoft is the market leader) and desktop virtualization (in which Citrix is the market leader), are both poised for significant growth in the 2010 timeframe.

Another key survey revelation was that 40% of respondents, especially businesses with 500 or more end users, said they either have or plan to install virtualization products from multiple vendors. This will place more emphasis and importance on integration, interoperability, management and third-party add-on tools to support these more complex, heterogeneous virtualization environments.

Among the other key survey highlights:

  • The “Big Three,” Citrix, Microsoft and VMware, are bolstering their positions with a slew of new offerings and a plethora of partnerships due out in the 2009 summer and fall.
  • Partnerships and Alliances: The alliance between Citrix and Microsoft remains robust as these two firms believe that there’s strength in numbers, as they mount a challenge to server virtualization leader VMware’s continuing dominance.
  • Microsoft Hyper-V Closes the Gap: Microsoft made big year-over-year market share gains from 2008 to 2009. The survey data shows current Hyper-V usage at 32%, but 59% plan to adopt it in the next 12 to 18 months.
  • VMware remains the market leader in server virtualization with approximately 50% share among enterprise users; Microsoft follows with 26% share.
  • Microsoft is the current market leader in application virtualization with a 15% share; followed by Citrix with 11% and VMware with 7%. However, nearly two-thirds of businesses have not yet deployed application virtualization.
  • Citrix is the market leader in desktop virtualization with a 19% market share followed by Microsoft with 15% and VMware with 8%. But again, over 60% of corporations have not yet begun to virtualize their desktop environments.
  • Mergers and Acquisitions Raise Questions: There is confusion among the legacy Sun and Virtual Iron users as to what will happen to both the product lines and technical support in the wake of both firms’ acquisition by Oracle.
  • Apple Mac is a popular virtualization platform; nearly 30% of respondents said they use Mac hardware in conjunction with Windows operating systems to virtualize their server and desktop environments.
  • Parallels and VMware Fusion are the two leading Mac virtualization vendors with a near 50/50 split market share.
  • Time to Bargain: Despite budget cuts and reduced resources only a very small percentage of companies — 7% — have attempted to renegotiate their virtualization licensing contracts to get lower prices and better deals.
  • Server Virtualization Lowers TCO: Almost 50% of survey respondents reported that server virtualization lets them lower their total cost of ownership (TCO) and achieve faster return on investment (ROI); however, only 25% of businesses could quantify the actual monetary cost savings.
  • Users Prefer Terra Firma Virtualization to Cloud: Users are moving slowly with respect to public cloud computing migrations, which are heavily dependent on virtualization technology. To date, only 14% of survey respondents said they will move their data to a virtualized public cloud within the next six-to-12 months.

This survey identifies the trends that propel or impede server, application and desktop virtualization deployments and elucidates the timeframes in which corporations plan to virtualize their environments. ITIC advises all businesses, irrespective of size or vertical market, to conduct due diligence to determine which virtualization solution or combination of products best meets their technical and business needs in advance of any migration. And in light of the ongoing economic downturn, businesses are well advised to negotiate hard with their vendors for the best deals and to ensure that the appropriate IT managers receive the necessary training and certification to ensure a smooth, trouble-free virtualization upgrade. This will enable the business to lower TCO, accelerate ROI and minimize and mitigate risk to an acceptable level.


Corporations Prefer Terra Firma to the Cloud — For Now

Concerns about cloud computing security, and about how fast cloud providers will respond in the event technical troubles arise, are making companies hesitant to embrace cloud computing — at least within the next 12 months. An 85% majority of the IT Performance Trends survey subjects say they will not implement a public or private cloud between June 2009 and June 2010. However, of that 85%, 31% say they are studying the issue but have made no decision yet and another 7% are “Unsure.”

Security topped the list of concerns and guarantees that companies would demand from a cloud services provider, if their firms were to implement a cloud model. An overwhelming 83% of respondents said they would need specific guarantees to safeguard their sensitive mission critical data before committing to a cloud. Additionally, almost three-quarters or 73% of respondents would require guaranteed fast response time for technical service and support. Nearly two thirds (63%) of respondents want minimum acceptable latency/response times and a nearly equal number (62%) say they would need multiple access paths to and from the cloud infrastructure.

It was clear from the customer interviews and essay responses that IT managers, especially those companies with fewer than 1,000 end users, will keep their corporate data and applications firmly planted behind the corporate firewall until they have ironclad assurances regarding the security of their data and their ability to access it.

“The idea that I would trust my email, financial transactions, or other day to day business operations to cloud computing is just asking for trouble,” observed an IT manager at a midsized corporation with 500 employees in the Midwest. “I do not even want to imagine all my users being dead in the water because my link to the Internet was down,” he adds. Another manager at a retail firm with 250 employees expressed reservations about the ability of a cloud services vendor to deliver top notch service and support should the need arise.

“Downtime is the bane of an IT professional’s life,” says the network administrator at a retail firm with 250 employees. He noted that when an onsite and locally managed system fails, he and his IT team can take immediate action to replace parts, rebuild the operating system, restore data from tape backup or perform any other action required to restore services and applications. “Compare that to a failure in a cloud computing scenario, when all you can do is report the problem and hurry up and wait,” he says. “Most IT people are action oriented and they won’t respond well to being at the mercy of a cloud provider while listening to complaints and queries from users and management of ‘When will the system be back up?’ or ‘When can I get access to my data?'”

The director of IT at another midsized company with 400 users opined that he does not yet have confidence in the still-emerging cloud computing model. “We own our data, not the cloud provider, and we need to know it is movable if we need to leave the provider.”

Finally, the survey respondents indicated during first person customer interviews that they will continue to chart a conservative course that includes a very low tolerance for risk until the economy recovers and their companies can once again bolster IT staffs and provide more resources.

Analysis

Cloud computing is still in its nascent stages. It’s common for the hype among vendors, the press and the analyst community to outpace current realities in IT, especially among small and midsized businesses, which have smaller budgets and are generally more conservative and risk averse than their enterprise counterparts.

The survey results also showed that there was much more of a willingness on the part of larger enterprises to explore, test and deploy a cloud infrastructure. Among corporations with over 3,000 end users, a more convincing 57% said they will either deploy or are considering a public or private cloud implementation over the next 12 to 18 months. Even this group, though, is rightfully concerned about the uncertainties of trusting their sensitive data to a public cloud whose provider may be located in a foreign country.

Therefore, it is imperative that cloud computing vendors provide customers and prospective customers with transparency and full accountability with respect to crucial issues like security, technical service and support, the equipment and capacity of their data centers, and an overview of the technology used (e.g., specific server equipment, virtualization, management, etc.). The vendors should also provide specific SLA levels and guarantees in the event those levels are not met.

Corporations should also perform due diligence. Get informed. Thoroughly investigate and compare the services and options of the various cloud providers. Know where and how your data will be stored, secured and managed. Ask for customer references. Consult with your in-house attorneys or obtain outside counsel to review proposed contracts. Don’t be afraid to insert out clauses and penalties in the event your cloud provider fails to meet SLAs. Also, at this early stage of development, don’t be afraid to ask for discounts and caps on price hikes for the duration of your contract.


Application Availability, Reliability and Downtime: Ignorance is NOT Bliss

Two out of five businesses – 40% – report that their major business applications require higher availability rates than they did two or three years ago. However, an overwhelming 81% are unable to quantify the cost of downtime, and only a small 5% minority of businesses are willing to spend whatever it takes to guarantee the highest levels of application availability – 99.99% and above. Those are the results of the latest ITIC survey, which polled C-level executives and IT managers at 300 corporations worldwide.

ITIC partnered with Stratus Technologies in Maynard, Mass., a vendor that specializes in high availability and fault tolerant hardware and software solutions, to compose the Web-based survey. ITIC conducted this blind, non-vendor and non-product-specific survey, which polled businesses on their application availability requirements, virtualization and the compliance rate of their service level agreements (SLAs). The Web-based survey consisted of multiple choice and essay questions. ITIC analysts also conducted two dozen first person customer interviews to obtain detailed anecdotal data.

Respondents ranged from SMBs with 100 users to very large enterprises with over 100,000 end users. Industries represented: academic, advertising, aerospace, banking, communications, consumer products, defense, energy, finance, government, healthcare, insurance, IT services, legal, manufacturing, media and entertainment, telecommunications, transportation, and utilities. None of the survey respondents received any remuneration for their participation. The respondents hailed from 15 countries; 85% were based in North America.

Survey Highlights

The survey results uncovered many “disconnects” between the levels of application reliability that corporate enterprises profess to need and the availability rates their systems and applications actually deliver. Additionally, a significant portion of the survey respondents had difficulty defining what constitutes high application availability, did not specifically track downtime, and could not quantify or qualify the cost of downtime and its impact on their network operations and business.

Among the other survey highlights:

  • A 54% majority of IT managers and executives surveyed said more than two-thirds of their companies’ applications require the highest level of availability – 99.99% — or four nines of uptime.
  • Over half – 52% of survey respondents said that virtualization technology increases application uptime and availability; only 4% said availability decreased as a result of virtualization deployments.
  • In response to the question, “which aspect of application availability is most important” to the business, 59% of those polled cited the prevention of unplanned downtime as being most crucial; 40% said disaster recovery and business continuity were most important; 38% said that minimizing planned downtime to apply patches and upgrades was their top priority; 16% said the ability to meet SLAs was most important and 40% of the survey respondents said all of the choices were equally crucial to their business needs.
  • Some 41% said they would be satisfied with conventional 99% to 99.9% (the equivalent of two or three nines) availability for their most critical applications. Neither 99% nor 99.9% qualifies as a high-availability or continuous-availability solution.
  • An overwhelming 81% of survey respondents said the number of applications that demand high availability has increased in the past two-to-three years.
  • Of those who said they have been unable to meet service level agreements (SLAs), 72% can’t or don’t keep track of the cost and productivity losses created by downtime.
  • Budgetary constraints are a gating factor prohibiting many organizations from installing software solutions that would improve application availability. Overall, 70% of the survey respondents said they lacked the funds to purchase value-added availability solutions (40%); or were unsure how much or if their companies would spend to guarantee application availability (30%).
  • Of the 30% of businesses that quantified how much their firms would spend on availability solutions, 3% indicated they would spend $2,000 to $4,000; 8% said $4,000 to $5,000; another 3% said $5,000 to $10,000; 11% — mainly large enterprises indicated they were willing to allocate $10,000 to $15,000 to ensure application availability and 5% said they would spend “whatever it takes.”

According to the survey findings, just under half of all businesses – 49% – lack the budget for high availability technology and 40% of the respondents reported they don’t understand what qualifies as high availability. An overwhelming eight out of 10 IT managers – 80% — are unable to quantify the cost of downtime to their C-level executives.
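For readers unclear on what actually qualifies as high availability, the standard conversion from an availability percentage A to annual downtime is straightforward:

```latex
\text{Annual downtime (hours)} = \left(1 - \frac{A}{100}\right) \times 8760
```

So 99% availability allows roughly 87.6 hours of downtime per year, 99.9% allows 8.76 hours, 99.99% (“four nines”) allows about 53 minutes, and 99.999% (“five nines”) allows just over 5 minutes.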

To reiterate, the ITIC survey polled users on the various aspects and impact of application availability and downtime but it did not specify any products or vendors.

The survey results, supplemented by ITIC first person interviews with IT managers and C-level executives, clearly show that, on a visceral level, businesses are very aware that the need for increased application availability has grown. This is particularly true in light of the emergence of new technologies like application and desktop virtualization, cloud computing and Service Oriented Architecture (SOA). The fast growing population of remote, mobile and telecommuting end users, who rely on unified communications and collaboration applications and utilities, is also spurring the need for greater application availability and reliability.

High Application Availability Not a Reality for 80% of Businesses

The survey results clearly show that network uptime isn’t keeping pace with the need for application availability. At the same time, IT managers and C-level executives interviewed by ITIC did comprehend the business risks associated with downtime, even though most are unable to quantify the cost of downtime or qualify the impact to the corporation, its customers, suppliers and business partners when unplanned application and network outages occur.

“We are continually being asked to do more with less,” said an IT manager at a large enterprise in the Northeast. “We are now at a point, where the number of complex systems requiring expert knowledge has exceeded the headcount needed to maintain them … I am dreading vacation season,” he added.

Another executive at an Application Service provider acknowledged that even though his firm’s SLA guarantees to customers are a modest 98%, it has on occasion, been unable to meet those goals. The executive said his firm compensated one of its clients for a significant outage incident. “We had a half day outage a couple of years ago which cost us in excess of $40,000 in goodwill payouts to a handful of our clients, despite the fact that it was the first outage in five years,” he said.

Another user said a lack of funds prevented his firm from allocating capital expenditure monies to purchase solutions that would guarantee 99.99% application availability. “Our biggest concern is keeping what we have running and available. Change usually costs money, and at the moment our budgets are simply in survival mode,” he said.

Another VP of IT at a New Jersey-based business said that ignorance is not bliss. “If people knew the actual dollar value their applications and customers represent, they’d already have the necessary software availability solutions in place to safeguard applications,” he said. “Yes, it does cost money to purchase application availability solutions, but we’d rather pay now than wait for something to fail and pay more later,” the VP of IT said.

Overall, the survey results show that users’ efforts to put valid metrics and cost formulas in place to track and quantify what uptime means to their organizations are woefully inadequate, and many corporations are courting disaster.

ITIC advises businesses to track downtime, the actual cost of downtime to the organization and to take the necessary steps to qualify the impact of downtime including lost data, potential liability risks e.g. lost business, lost customers, potential lawsuits and damage to the company’s reputation. Once a company can quantify the amount of downtime associated with its main line of business applications, the impact of downtime and the risk to the business, it can then make an accurate assessment of whether or not its current IT infrastructure adequately supports the degree of application availability the corporation needs to maintain its SLAs.

