
Cloud Computing: De-Mystifying the Cloud

Every year or so the high technology industry gets a new buzzword or experiences a paradigm shift which is hyped as “the next big thing.”
For the last 12 months or so, cloud computing has had that distinction. Anyone reading all the vendor-generated cloud computing press releases and associated news articles and blogs would conclude that corporations are building and deploying both private and public clouds in record-breaking numbers. The reality is much more sobering. An independent ITIC Web-based survey, which polled IT managers and C-level professionals at 700 organizations worldwide in January 2010, found that spending on cloud adoption was not a priority for the majority of survey participants during calendar 2010. In fact, only 6 percent of participants said that private cloud spending was a priority this year, and an even smaller 3 percent minority said that public cloud spending is a priority this year.
Those findings are buttressed by the latest joint ITIC/Sunbelt Software survey data (which is still live); it indicates that just under 20 percent of organizations have implemented a public or a private cloud. When asked why, nearly two-thirds or 65 percent of the respondents said they felt no compelling business need. Translation: they feel safe inside the confines of their current datacenters here on Terra Firma.

While there is a great deal of interest in the cloud infrastructure model, the majority of midsized and enterprise organizations are not rushing to install and deploy private or public clouds in 2010.

However, that is not to say that organizations – especially mid-sized and large enterprises – are not considering cloud implementations. ITIC research indicates that many businesses are more focused on performing much needed upgrades to such essentials as disaster recovery, desktop and server hardware, operating systems, applications, bandwidth and storage before turning their attention to new technologies like cloud computing.
Despite the many articles written about public and private cloud infrastructures over the past 18 months, many businesses remain confused about cloud specifics such as characteristics, costs, operational requirements, integration and interoperability with their existing environment or how to even get started.
De-Mystifying the Cloud
But just what is cloud computing, exactly? Definitions vary. The simplest, most straightforward definition is that a cloud is a grid- or utility-style, pay-as-you-go computing model that uses the Web to deliver applications and services in real time.
Organizations can choose to deploy a private cloud infrastructure wherein they host their services on-premises from behind the safety of the corporate firewall. The advantage here is that the IT department always knows what’s going on with all aspects of the corporate data, from bandwidth and CPU utilization to all-important security issues. Alternatively, organizations can opt for a public cloud deployment in which a third party like Amazon Web Services (a division of Amazon.com) hosts the services at a remote location. This latter scenario saves businesses money and manpower hours by utilizing the host provider’s equipment and management. All that is needed is a Web browser and a high-speed Internet connection to connect to the host to access applications, services and data. However, the public cloud infrastructure is also a shared model in which corporate customers share bandwidth and space on the host’s servers.
Organizations that are extremely concerned about security and privacy issues and those that desire more control over their data can opt for a private cloud infrastructure in which the hosted services are delivered to the corporation’s end users from behind the safe confines of an internal corporate firewall. However, a private cloud is more than just a hosted services model that exists behind the confines of a firewall. Any discussion of private and/or public cloud infrastructure must also include virtualization. While most virtualized desktop, server, storage and network environments are not yet part of a cloud infrastructure, just about every private and public cloud will feature a virtualized environment.
Organizations contemplating a private cloud also need to ensure that they feature very high (near fault-tolerant) availability with at least “five nines” (99.999%) uptime or better. The private cloud should also be able to scale dynamically to accommodate the needs and demands of the users. And unlike most existing, traditional datacenters, the private cloud model should also incorporate a high degree of user-based resource provisioning. Ideally, the IT department should also be able to track resource usage in the private cloud by user, department or groups of users working on specific projects, for chargeback purposes.
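To make the chargeback idea concrete, here is a minimal Python sketch that rolls hypothetical usage records up into a per-department bill. The record fields, user and department names, and per-unit rates are all invented for illustration; an actual private cloud would feed this kind of calculation from its own metering and provisioning systems.

    from collections import defaultdict

    # Hypothetical usage records: (user, department, CPU hours, storage GB-days).
    # Every name, field and rate below is illustrative, not taken from any product.
    USAGE = [
        ("alice", "engineering", 120.0, 500.0),
        ("bob",   "engineering",  40.0, 200.0),
        ("carol", "marketing",    15.0,  80.0),
    ]

    RATE_PER_CPU_HOUR = 0.12   # dollars per CPU-hour (placeholder rate)
    RATE_PER_GB_DAY = 0.002    # dollars per GB-day of storage (placeholder rate)

    def chargeback_by_department(usage):
        """Aggregate metered usage into a dollar amount per department."""
        totals = defaultdict(float)
        for _user, dept, cpu_hours, gb_days in usage:
            totals[dept] += cpu_hours * RATE_PER_CPU_HOUR + gb_days * RATE_PER_GB_DAY
        return dict(totals)

    if __name__ == "__main__":
        for dept, cost in sorted(chargeback_by_department(USAGE).items()):
            print(f"{dept}: ${cost:,.2f}")

The same aggregation could just as easily be keyed by individual user or by project group, which is what makes usage-based chargeback attractive for IT departments that must justify the cost of shared infrastructure.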
Private clouds will also make extensive use of business intelligence and business process automation to guarantee that resources are available to the users on demand.
Given the Spartan economic conditions of the last two years, all but the most cash-rich organizations (and there are very few of those) will almost certainly have to upgrade their network infrastructure in advance of migrating to a private cloud environment. Organizations considering outsourcing any of their datacenter needs to a public cloud will also have to perform due diligence to determine the bona fides of their potential cloud service providers.
There are three basic types of cloud computing, although the first two are the most prevalent. They are:
• Software as a Service (SaaS), which uses the Web to deliver software applications to the customer. Examples include Salesforce.com, which offers one of the earliest and most popular, widely deployed cloud-based CRM applications, and Google Apps, which is experiencing solid growth. Google Apps comes in three editions – Standard, Education and Premier (the first two are free) – and provides consumers and corporations with customizable versions of the company’s applications like Google Mail, Google Docs and Calendar.
• Platform as a Service (PaaS); examples include the above-mentioned Amazon Web Services and Microsoft’s nascent Windows Azure Platform. The Microsoft Azure cloud platform contains all the elements of a traditional application stack, from the operating system up to the applications and the development framework. It includes the Windows Azure Platform AppFabric (formerly .NET Services for Azure) as well as the SQL Azure Database service. Customers that build applications for Azure will host them in the cloud. However, it is not a multi-tenant architecture meant to host your entire infrastructure. With Azure, businesses will rent resources that reside in Microsoft datacenters. The costs are based on a per-usage model, which gives customers the flexibility to rent fewer or more resources depending on their business needs.
• Infrastructure as a Service (IaaS) is exactly what its name implies: the entire infrastructure becomes a multi-tiered hosted cloud model and delivery mechanism.
Both public and private clouds should be flexible and agile: the resources should be available on demand and should be able to scale up or scale back as the businesses’ needs dictate.

Next, in Part 2: The Pros and Cons of the Cloud


Microsoft Azure Platform, BPOS Cloud Vision Must Address Licensing

Microsoft did a very credible job at its TechEd conference in New Orleans last week, laying out the technology roadmap and strategy for a smooth transition from premises-based networks/services to its emerging Azure cloud infrastructure and software + services model.

One of the biggest challenges facing Microsoft and its customers, as the company stands on the cusp of what Bob Muglia, president of Microsoft’s Server & Tools Business (STB) unit, characterized as a “major transformation in the industry called cloud computing,” is how the Redmond, Wash. software giant will license its cloud offerings.

Licensing programs and plans—even those that involve seemingly straightforward and mature software, PC- and server-based product offerings—are challenging and complex in the best of circumstances. This is something Microsoft knows only too well from experience. Constructing an equitable, easy-to-understand licensing model for cloud-based services could prove to be one of the most daunting tasks on Microsoft’s Azure roadmap.

It is imperative that Microsoft proactively address the cloud licensing issues now, and Microsoft executives are well aware of this. During the Q&A portion of one cloud-related TechEd session, Robert Wahbe, corporate vice president, STB Marketing was asked, “What about licensing?” He took a sip from his water bottle and replied, “That’s a big question.”

That is an understatement.

Microsoft has continually grappled with simplifying and refining its licensing strategy since it made a major misstep with Licensing 6.0 in May 2001, when the initial offering was complex, convoluted and potentially very expensive. It immediately met with a huge, vocal outcry and backlash. The company was compelled to postpone the Licensing 6.0 launch while it retooled the program to make it more user-friendly from both a technical and a cost perspective.

Over the last nine years, Microsoft’s licensing program and strategy has become one of the best in the high-technology industry. It offers simplified terms and conditions (T&Cs), greater discounts for even the smallest micro SMBs, and a variety of add-on tools (e.g., licensing compliance and assessment utilities), as well as access to freebies, such as online and onsite technical service and training, for customers who purchase the company’s Software Assurance (SA) maintenance and upgrade agreement along with their Volume Licensing deals.

Licensing from Premises to the Cloud
Microsoft’s cloud strategy is a multi-pronged approach that incorporates a wide array of offerings, including Windows Azure, SQL Azure and Microsoft Online Services (MOS). MOS consists of hosted versions of Microsoft’s most popular and widely deployed server applications, such as Exchange Server, SharePoint and Office Communications Server. Microsoft’s cloud strategy also encompasses consumer products like Windows Live, Xbox Live and MSN.

Microsoft is also delivering a hybrid cloud infrastructure that will enable organizations to combine premises-based with hosted cloud solutions. This will indisputably provide Microsoft customers with flexibility and choice as they transition from a fixed-premises computing model to a hosted cloud model. In addition, it will allow them to migrate to the cloud at their own pace as their budgets and business needs dictate. However, the very flexibility, breadth and depth of offerings that make Microsoft products so appealing to customers are, ironically, the same issues that increase the complexity and challenges of creating an easily accessible, straightforward licensing model.

Dueling Microsoft Clouds: Azure vs. BPOS
Complicating matters is that Microsoft has dueling cloud offerings: the Business Productivity Online Suite (BPOS) and the Windows Azure Platform. As a result, Microsoft must also develop, delineate and differentiate its strategy, pricing and provisions for Azure and BPOS. It’s unclear (at least to this analyst) when and how a customer will choose one, or mix and match BPOS and Azure offerings. Both are currently works in progress.

BPOS is a licensing suite and a set of collaborative end-user services that run on Windows Server, Exchange Server, and SQL Server. Microsoft offers the BPOS Standard Suite, which incorporates Exchange Online, SharePoint Online, Office Live Meeting, and Office Communications (OCS) Online. The availability of the latter two offerings is a key differentiator that distinguishes Microsoft’s BPOS from rival offerings from Google. Microsoft also sells the Business Productivity Online Deskless Worker Suite, which consists of Exchange Online Deskless Worker, SharePoint Online Deskless Worker and Outlook Web Access Light. This BPOS package is targeted at SMBs, small branch offices or companies that want basic, entry-level messaging and document collaboration functions.

By contrast, Azure is a cloud platform offering that contains all the elements of a traditional application stack from the operating system up to the applications and the development framework. It includes the Windows Azure Platform AppFabric (formerly .NET Services for Azure), as well as the SQL Azure Database service.

While BPOS is aimed squarely at end users and IT managers, Azure targets third-party ISVs and internal corporate developers. Customers that build applications for Azure will host them in the cloud. However, it is not a multi-tenant architecture meant to host your entire infrastructure. With Azure, businesses will rent resources that reside in Microsoft datacenters. The costs are based on a per-usage model. This gives customers the flexibility to rent fewer or more resources, depending on their business needs.
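
To illustrate how a consumption-based model differs from a traditional license purchase, the short Python sketch below estimates a month’s bill from three usage meters: compute hours, storage and outbound bandwidth. The meter names and rates are placeholders chosen for the example; they are not Microsoft’s actual Azure price list.

    # Placeholder per-usage rates for illustration only -- not Microsoft's
    # actual Azure pricing.
    COMPUTE_RATE_PER_HOUR = 0.12   # dollars per instance-hour
    STORAGE_RATE_PER_GB = 0.15     # dollars per GB stored per month
    EGRESS_RATE_PER_GB = 0.10      # dollars per GB transferred out

    def estimate_monthly_bill(instance_hours, storage_gb, egress_gb):
        """Estimate one month's consumption-based charges from three usage meters."""
        return (instance_hours * COMPUTE_RATE_PER_HOUR
                + storage_gb * STORAGE_RATE_PER_GB
                + egress_gb * EGRESS_RATE_PER_GB)

    if __name__ == "__main__":
        # Example: two instances running around the clock, 50 GB stored, 200 GB out.
        print(f"${estimate_monthly_bill(2 * 730, 50, 200):,.2f}")

The point of such a model is that the bill scales with what is actually consumed; a customer that spins down instances in a slow quarter pays less, which is precisely the flexibility the per-usage approach is meant to deliver.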

Cloud Licensing Questions
Any cloud licensing or hybrid cloud licensing program that Microsoft develops must include all of the elements of its current fixed premises and virtualization models. This includes:

1. Volume Licensing: As the technology advances from fixed premises software and hardware offerings to private and public clouds, Microsoft must find ways to translate the elements of its current Open, Select and Enterprise agreements to address the broad spectrum of users from small and midsized (SMBs) companies to the largest enterprises with the associated discounts for volume purchases.
2. Term Length: The majority of volume license agreements are based on a three-year product lifecycle. During the protracted economic downturn, however, many companies could not afford to upgrade. A hosted cloud model, though, will be based on usage and consumption, so the terms should and most likely will vary.
3. Software Assurance: Organizations will still need upgrade and maintenance plans regardless of where their data resides and whether or not they have traditional subscription licensing or the newer consumption/usage model.
4. Service and Support: Provisions for after-market technical services, support and maintenance will be crucial for Microsoft, its users, resellers and OEM channel partners. ITIC survey data indicates that the breadth and depth of after-market technical service and support is among the top four items that make or break a purchasing deal.
5. Defined areas of responsibility and indemnification: This will require careful planning on Microsoft’s part. Existing premises-based licensing models differ according to whether or not the customer purchases their products directly from Microsoft, a reseller or an OEM hardware manufacturer. Organizations that adopt a hybrid premises/cloud offering and those that opt for an entirely hosted cloud offering will be looking more than ever before to Microsoft for guidance. Microsoft must be explicit as to what it will cover and what will be covered by OEM partners and/or host providers.

Complicating the cloud licensing models even further is the nature of the cloud itself. There is no singular cloud model. There may be multiple clouds, and they may be a mixture of public and private clouds that also link to fixed premises and mobile networks.

Among the cloud licensing questions that Microsoft must address and specifically answer in the coming months are:

• What specific pricing models and tiers will it offer SMBs, midsized companies and enterprises for hybrid and full cloud infrastructures?
• What specific guarantees, if any, will it provide for securing sensitive data?
• What level of guaranteed response time will it provide for service and support?
• What is the minimum acceptable latency/response time for its cloud services?
• Will it provide multiple access points to and from the cloud infrastructure?
• What specific provisions will apply to Service Level Agreements (SLAs)?
• How will financial remuneration for SLA violations be determined?
• What are the capacity ceilings for the service infrastructure?
• What provisions will there be for service failures and disruptions?
• How are upgrade and maintenance provisions defined?

From the keynote speeches and throughout the STB Summit and TechEd conference, Microsoft’s Muglia and Wahbe both emphasized and promoted the idea that there is no singular cloud. Instead, Microsoft’s vision is a world of multiple private, public and hybrid clouds that are built to individual organizations’ specific needs.

That’s all well and good. But in order for this strategy to succeed, Microsoft will have to take the lead on both the technology and the licensing fronts. The BPOS and Azure product managers and marketers should actively engage with the Worldwide Licensing Program (WWLP) managers and construct a simplified, straightforward licensing model. We recognize that this is much easier said than done. But customers need and will demand transparency in licensing pricing, models and T&Cs before committing to the Microsoft cloud.


Virtualization Deployments Soar, But Companies Prefer Terra Firma to Cloud for now

The ongoing buzz surrounding cloud computing – particularly public clouds – is far outpacing actual deployments by mainstream users. To date only 14% of companies have deployed or plan to deploy a private cloud infrastructure within the next two calendar quarters.
Instead, as businesses slowly recover from the ongoing economic downturn, their most immediate priorities are to upgrade legacy desktop and server hardware and outmoded applications, and to expand their virtualization deployments. Those are the results of the latest ITIC 2010 Virtualization and High Availability survey, which polled C-level executives and IT managers at 400 organizations worldwide.
ITIC partnered with Stratus Technologies and Sunbelt Software to conduct the Web-based survey, which consisted of multiple-choice questions and essay comments. ITIC also conducted first-person interviews with over two dozen end users to obtain anecdotal responses on the primary accelerators or impediments to virtualization, high availability and reliability, and cloud computing. The survey also queried customers on whether or not their current network infrastructure and mission-critical applications were adequate to handle new technologies and the increasing demands of the business.
The survey showed that, for now at least, although many midsized and large enterprises are contemplating a move to the cloud – especially a private cloud infrastructure – the technology and business model are still not essential for most businesses. Some 48% of survey participants said they have no plans to migrate to a private cloud architecture within the next 12 months, while another 33% said their companies are studying the issue but have no firm plans to deploy.

The study also indicates that private cloud deployments are outpacing public cloud infrastructure deployments by a 2-to-1 margin. However, before businesses can begin to consider a private cloud deployment, they must first upgrade the “building block” components of their existing environments, e.g., server and desktop hardware, WAN infrastructure, storage, security and applications. Only 11% of businesses described their server and desktop hardware as leading edge or state of the art. And just 8% of respondents characterized their desktop and application environment as leading edge.

The largest proportion of the survey participants – 52% – described their desktop and server hardware as working well, while 48% said their applications were up to date. However, 34% acknowledged that some of their server hardware needed to be updated. A higher percentage of users – 41% – admitted that their mission-critical software applications were due to be refreshed. And a small 3% minority said that a significant portion of both their hardware and mission-critical applications were outmoded and adversely impacting the performance and reliability of their networks.

Based on the survey data and customer interviews, ITIC anticipates that from now until October, companies’ primary focus will be on infrastructure improvements.

Reliability and Uptime Lag

The biggest surprise in this survey, compared with the 2009 High Availability and Fault Tolerant survey that ITIC and Stratus conducted nearly one year ago, was the decline in the number of survey participants who said their organizations required 99.99% uptime and reliability. In this latest survey, the largest portion of respondents – 38%, or nearly 4 out of 10 businesses – said that 99.9% uptime, the equivalent of 8.76 hours of downtime per server, per annum, was the minimum acceptable amount for their mission-critical line-of-business (LOB) applications. This is more than three times the 12% of respondents who said that 99.9% uptime was acceptable in the prior 2009 survey. Overall, 62%, or nearly two-thirds of survey participants, indicated their organizations are willing to live with higher levels of downtime than were considered acceptable in previous years.
Some 39% of survey respondents – almost 4 out of 10 – indicated that their organizations demand high availability, which ITIC defines as four nines of uptime or greater. Specifically, 27% said their organizations require 99.99% uptime; another 6% need 99.999% uptime; and a 3% minority require even higher, continuous levels of availability.
The customer interviews found that the ongoing economic downturn, aged and aging network infrastructures (server and desktop hardware and older applications), layoffs, hiring freezes and the new standard operating procedure (SOP) of “do more with less” have made 99.9% uptime more palatable than in previous years.
Those firms that do not keep track of the number and severity of their outages have no way of gauging the financial and data losses to the business. Even a cursory comparison of 99%, 99.9% and 99.99% uptime indicates substantial cost disparities. The monetary costs, business impact and risks associated with downtime will vary by company, as will the duration and severity of individual outage incidents. However, a small or midsized business that estimates the cost of downtime to be a very conservative $10,000 per hour would potentially incur losses of $876,000 per year at a data center with 99% application availability (roughly 87.6 hours of downtime). By contrast, a company whose data center operates at 99.9% uptime would incur losses of $87,600, one-tenth that of a firm with conventional 99% availability, and at 99.99% uptime the annual exposure falls to roughly $8,760.
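The underlying arithmetic is simple enough to sanity-check: annual downtime is the 8,760 hours in a year multiplied by the unavailability fraction, and the exposure is that figure multiplied by an estimated hourly cost of downtime. The short Python sketch below reproduces the comparison using the conservative $10,000-per-hour figure cited above; it is a back-of-the-envelope model, not a substitute for a company-specific cost analysis.

    HOURS_PER_YEAR = 365 * 24  # 8,760 hours

    def annual_downtime_hours(availability_pct):
        """Hours of downtime per year implied by an availability percentage."""
        return HOURS_PER_YEAR * (1 - availability_pct / 100.0)

    if __name__ == "__main__":
        hourly_cost = 10_000  # conservative per-hour downtime cost cited in the article
        for pct in (99.0, 99.9, 99.99, 99.999):
            hours = annual_downtime_hours(pct)
            print(f"{pct:>7}% uptime -> {hours:8.2f} hours/year -> ${hours * hourly_cost:,.0f}")
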
Ironically, the need for rock-solid network reliability has never been greater. Web-based applications and new technologies like virtualization and Service Oriented Architecture (SOA), as well as emerging public or shared cloud computing models, are designed to maximize productivity. But without the proper safeguards, these new datacenter paradigms may raise the risk of downtime. The Association for Computer Operations Management/Data Center Institute (AFCOM) forecasts that one in four data centers will experience a serious business disruption over the next five years.
At the same time, customer interviews revealed that over half of all businesses (56%) lack the budget for high availability technology. Another ongoing challenge is that 78% of survey participants acknowledged that their companies either lack the skills or simply do not attempt to quantify the monetary and business costs associated with hourly downtime. The reasons for this are well documented. Some organizations don’t routinely do this, and those that attempt to calculate costs and damages run into difficulties collecting data because the data resides with many individuals across the enterprise. Inter-departmental communication, cooperation and collaboration is sorely lacking at many firms. Only 22% of survey respondents were able to assign a specific cost to one hour of downtime, and most of them gave conservative estimates of $1,000 to $25,000 for a one-hour network outage. Only 13% of that 22% indicated that their hourly losses would top $175,000 or more.

Users Confident and Committed to Virtualization Technology
The news was more upbeat with respect to virtualization – especially server virtualization deployments. Organizations are both confident and comfortable with virtualization technology.
72% of respondents indicated the number of desktop and server-based applications demanding high availability has increased over the past two years. The survey also found that a 77% majority of participants run business-critical applications on virtual machines. Not surprisingly, the survey data showed that virtualization usage will continue to expand over the next 12 months. A 79% majority – approximately eight out of 10 respondents – said the number of business-critical applications running on virtual machines and virtual desktops will increase significantly over the next year. Server virtualization is very much a mainstream and accepted technology, and the responses to this question indicate increased adoption as well as confidence. Nearly one-quarter of the respondents – 24% – say that more than 75% of their production servers are VMs. Overall, 44% of respondents say that over 50% of their servers are VMs. However, none of the survey participants indicate that 100% of their servers are virtualized.


HP, Microsoft Still Have Some ‘Splainin’ to Do on Application-to-Infrastructure Pact

The recently announced joint Hewlett-Packard/Microsoft Application-to-Infrastructure Model Partnership has intriguing possibilities for both companies and their respective and overlapping installed customer bases. However, it remains to be seen how quickly and efficiently the two industry giants can deliver products and market the merits of the solution. Now, $250 million is a huge investment even for two high-tech powerhouses like HP and Microsoft, so we know this is a serious commitment.

To recap, HP and Microsoft said they will invest $250 million into their Frontline Partnership. The deal aims to deliver full, integrated stacks that support Microsoft’s Exchange Server and SQL Server, including management, virtualization and cloud implementations. The resulting product offerings will consist of pre-packaged application solution bundles that incorporate the aforementioned management and virtualization capabilities. The two companies said the pact calls for them to partner on engineering, R&D, marketing and channel sales.
Still, the announcement left many industry watchers with more questions than answers. As my colleagues Charles King and Merv Adrian noted in their Breaking News Review in the January 14 special edition of Charles King’s Pund-IT, HP and Microsoft “have worked closely for years, share tens of thousands of common customers and channel partners and have long supported each other’s interests.”
So what’s new about this announcement? That question should be answered during the coming months. A $250 million investment is considerable even for two high-technology titans. It now remains for HP and Microsoft to execute on their promise to thoroughly integrate the two companies’ infrastructure and application stacks and to ship pre-configured, optimized solutions for Microsoft’s Exchange Server and SQL Server, virtualization, cloud computing, converged infrastructure and pre-packaged application tools.
But perhaps the most immediate and daunting challenge is for HP and Microsoft to deliver a product roadmap that also includes specific details about the pricing, training and services the two firms will jointly deliver. Above all, the companies must market and sell this deal to the legions of skeptics. The high-tech industry has witnessed numerous high-profile partnership deals announced amidst much industry fanfare, never to be heard from again after the initial press releases.
Remember the Cisco Systems/Microsoft Directory Enabled Network (DEN) initiative of the late 1990s? No. Not many people do. Announced with great fanfare, this dream team was supposed to incorporate the functionality of Microsoft’s Active Directory into Cisco routers and provide network administrators with a more comprehensive means of managing various devices on their network. In reality, the Cisco/Microsoft DEN initiative was a partnership on paper only. There are dozens of similar examples. Hence, the skepticism that greets such announcements is understandable.
This is all the more reason for HP and Microsoft executives to follow up on last week’s announcement with quick, decisive action and not just more fodder for the PR Newswire. For example, when can we expect to see the first fruits of the so-called “deeply optimized machine environment” that will provide turn-key, pre-packaged and pre-integrated server, application, networking and storage solutions? Who are the specific target users and how will they benefit? How will Microsoft and HP license and service these products? Those are just a few of the questions that need to be answered.
Non-Exclusive Partnerships Sometimes Make Strange Bedfellows
The partnership also has especially intriguing implications for HP, which now has pacts in place with all of the major virtualization providers, including Microsoft’s biggest rival, VMware. The new HP/Microsoft Application-to-Infrastructure pact is a non-exclusive, three-year partnership. It’s worth noting that HP already has a deal in place with VMware, whose ESX Server is the market leader in server virtualization. Microsoft also gets a boost from this deal. Microsoft’s Hyper-V has been gaining ground, particularly among small and mid-sized corporations. However, it has a long way to go to catch up to ESX Server’s installed base, particularly among large enterprises, so this pact helps keep Microsoft competitive. Additionally, HP delivers a full suite of management solutions that integrates VMware’s vCenter offering with HP’s Insight management product, and HP and Microsoft intend to similarly integrate HP’s Insight and Microsoft’s Systems Center. So again, this helps Microsoft broaden the appeal of its virtualization offerings to its existing base and makes it a more attractive solution for prospective customers.
The partnership with Microsoft puts HP in the proverbial catbird seat: it now has a full line of its own servers that runs all the VMware products, and similar plans to support Microsoft’s SQL Server and Exchange Server. This gives HP the ability to offer customers a full line of integrated hardware and services and their choice of virtualization vendors, while remaining vendor-agnostic.
From Microsoft’s perspective, the partnership with HP also has immediate value: it allows Microsoft – at least on paper – to keep pace with VMware by working with HP, a top OEM hardware vendor and services provider, which is no mean feat. Former Microsoft executive Paul Maritz, who now runs VMware, is intent on rejuvenating that company, and he knows that the way to solidify and expand VMware’s influence is to increase its stake in management and applications. Just last week, VMware purchased Zimbra, the open source email and collaboration unit of Yahoo, for a rumored $100 million. Not coincidentally, Zimbra describes its collaboration suite as the “next generation” Microsoft Exchange Server.
Microsoft clearly felt the need to respond in kind.
The plethora of technology and partnership deals such as the HP/Microsoft Application-to-Infrastructure pact serves as a reminder of the intensity of the IT industry’s competitive landscape – particularly in burgeoning markets like virtualization and, by extension, nascent markets like cloud computing. No vendor can afford to rest on its laurels; all must continue to upgrade their product and services offerings to keep pace with the competition.
Microsoft and VMware will continue to try to top one another, and HP is the beneficiary of this ongoing rivalry. Let’s hope end users are winners, too.


ITIC 2009-2010 Global Virtualization Deployment Trends Survey Results

Server virtualization demand and deployments are strong and will remain so for the remainder of 2009 and through 2010, despite the ongoing economic downturn.

The results of the new, independent ITIC 2009 Global Server Virtualization Survey, which polled more than 700 corporations worldwide during May/June and August, reveal that server virtualization deployments have remained strong throughout the ongoing 2009 economic downturn. It also shows that the three market leaders – Citrix, Microsoft and VMware – are consolidating their positions even as the virtualization arena itself consolidates through mergers, acquisitions and partnerships.

Microsoft in particular has made big year-over-year gains in deployments and market share. Thanks to the summer release of the new Hyper-V 2.0 with live migration capabilities, the Redmond, Washington software firm has substantially closed the feature/performance gap between itself and VMware’s ESX Server. The technical advances of Hyper-V, combined with the favorable terms of Microsoft’s licensing program, make the company’s virtualization products very competitive and alluring. Three out of five survey respondents – 59% – indicated their intent to deploy Hyper-V 2.0 within the next 12 to 18 months.

Survey responses also show a groundswell of support for application and desktop virtualization deployments. These two market segments constitute a much smaller niche of deployments and installations compared to virtualized server environments. The survey results show that application virtualization (where Microsoft is the market leader) and desktop virtualization (in which Citrix is the market leader), are both poised for significant growth in the 2010 timeframe.

Another key survey revelation was that 40% of respondents, especially businesses with 500 or more end users, said they either have or plan to install virtualization products from multiple vendors. This will place more emphasis and importance on integration, interoperability, management and third-party add-on tools to support these more complex, heterogeneous virtualization environments.

Among the other key survey highlights:

  • The “Big Three,” Citrix, Microsoft and VMware, are bolstering their positions with a slew of new offerings and a plethora of partnerships due out in the 2009 summer and fall.
  • Partnerships and Alliances: The alliance between Citrix and Microsoft remains robust as these two firms believe that there’s strength in numbers, as they mount a challenge to server virtualization leader VMware’s continuing dominance.
  • Microsoft Hyper-V Closes the Gap: Microsoft made big year-over-year market share gains from 2008 to 2009. The survey data shows current Hyper-V usage at 32%; but 59% plan to adopt in next 12 to 18 months.
  • VMware remains the market leader in server virtualization with approximately 50% share among enterprise users; Microsoft follows with 26% share.
  • Microsoft is the current market leader in application virtualization with a 15% share; followed by Citrix with 11% and VMware with 7%. However, nearly two-thirds of businesses have not yet deployed application virtualization.
  • Citrix is the market leader in desktop virtualization with a 19% market share followed by Microsoft with 15% and VMware with 8%. But again, over 60% of corporations have not yet begun to virtualize their desktop environments.
  • Mergers and Acquisitions Raise Questions: There is confusion among the legacy Sun and Virtual Iron users as to what will happen to both the product lines and technical support in the wake of both firms’ acquisition by Oracle.
  • Apple Mac is a popular virtualization platform; nearly 30% of respondents said they use Mac hardware in conjunction with Windows operating systems to virtualize their server and desktop environments.
  • Parallels and VMware Fusion are the two leading Mac virtualization vendors with a near 50/50 split market share.
  • Time to Bargain: Despite budget cuts and reduced resources only a very small percentage of companies — 7% — have attempted to renegotiate their virtualization licensing contracts to get lower prices and better deals.
  • Server Virtualization Lowers TCO: Almost 50% of survey respondents reported that server virtualization lets them lower their total cost of ownership (TCO) and achieve faster return on investment (ROI); however, only 25% of businesses could quantify the actual monetary cost savings.
  • Users Prefer Terra Firma Virtualization to Cloud: Users are moving slowly with respect to public cloud computing migrations, which are heavily dependent on virtualization technology. To date, only 14% of survey respondents said they will move their data to a virtualized public cloud within the next six-to-12 months.

This survey identifies the trends that propel or impede server, application and desktop virtualization deployments and elucidates the timeframes in which corporations plan to virtualize their environments. ITIC advises all businesses, irrespective of size or vertical market, to conduct due diligence to determine which virtualization solution or combination of products best meets their technical and business needs in advance of any migration. And in light of the ongoing economic downturn, businesses are well advised to negotiate hard with their vendors for the best deals and to ensure that the appropriate IT managers receive the necessary training and certification to ensure a smooth, trouble-free virtualization upgrade. This will enable the business to lower TCO, accelerate ROI, and mitigate risk to an acceptable level.


Corporations Prefer Terra Firma to the Cloud — For Now

Concerns about cloud computing security and how fast cloud providers will respond in the event technical troubles arise are making companies hesitant to embrace cloud computing — at least within the next 12 months. An 85% majority of the IT Performance Trends survey subjects say they will not implement a public or private cloud between June 2009 and June 2010. However, of that 85%, 31% say they are studying the issue but have made no decision yet, and another 7% are “unsure.”

Security topped the list of concerns and guarantees that companies would demand from a cloud services provider, if their firms were to implement a cloud model. An overwhelming 83% of respondents said they would need specific guarantees to safeguard their sensitive mission critical data before committing to a cloud. Additionally, almost three-quarters or 73% of respondents would require guaranteed fast response time for technical service and support. Nearly two thirds (63%) of respondents want minimum acceptable latency/response times and a nearly equal number (62%) say they would need multiple access paths to and from the cloud infrastructure.

It was clear from the customer interviews and essay responses that IT managers, especially those at companies with fewer than 1,000 end users, will keep their corporate data and applications firmly planted behind the corporate firewall until they have ironclad assurances regarding the security of their data and their ability to access it.

“The idea that I would trust my email, financial transactions, or other day-to-day business operations to cloud computing is just asking for trouble,” observed an IT manager at a midsized corporation with 500 employees in the Midwest. “I do not even want to imagine all my users being dead in the water because my link to the Internet was down,” he added. Another manager at a retail firm with 250 employees expressed reservations about the ability of a cloud services vendor to deliver top-notch service and support should the need arise.

“Downtime is the bane of an IT professional’s life,” says the network administrator at a retail firm with 250 employees. He noted that when an onsite and locally managed system fails, he and his IT team can take immediate action to replace parts, rebuild the operating system, restore data from tape backup or perform any other action required to restore services and applications. “Compare that to a failure in a cloud computing scenario, when all you can do is report the problem and hurry up and wait,” he says. “Most IT people are action oriented and they won’t respond well to being at the mercy of a cloud provider while listening to complaints and queries from users and management of ‘When will the system be back up?’ or ‘When can I get access to my data?'”

The director of IT at another midsized company with 400 users opined that he does not yet have confidence in the still-emerging cloud computing model. “We own our data, not the cloud provider, and we need to know it is movable if we need to leave the provider.”

Finally, the survey respondents indicated during first person customer interviews that they will continue to chart a conservative course that includes a very low tolerance for risk until the economy recovers and their companies can once again bolster IT staffs and provide more resources.

Analysis

Cloud computing is still in its nascent stages. It’s common for the hype among vendors, the press and the analyst community to outpace current realities in IT, especially among small and midsized businesses, which have smaller budgets and are generally more conservative and risk averse than their enterprise counterparts.

The survey results also showed much more willingness on the part of larger enterprises to explore, test and deploy a cloud infrastructure. Among corporations with over 3,000 end users, a more convincing 57% said they will either deploy or are considering a public or private cloud implementation over the next 12 to 18 months. Even this group, though, is rightfully concerned about the uncertainties of trusting their sensitive data to a public cloud whose provider may be located in a foreign country.

Therefore, it is imperative that cloud computing vendors provide customers and prospective customers with transparency and full accountability with respect to crucial issues: security; technical service and support; the equipment and capacity of their data centers; and an overview of the technology used (e.g., specific server equipment, virtualization, management, etc.). The vendors should also provide specific SLA levels and guarantees in the event those levels are not met.

Corporations should also perform due diligence. Get informed. Thoroughly investigate and compare the services and options of the various cloud providers. Know where and how your data will be stored, secured and managed. Ask for customer references. Consult with your in-house attorneys or obtain outside counsel to review proposed contracts. Don’t be afraid to insert out clauses and penalties in the event your cloud provider fails to meet SLAs. Also, at this early stage of development, don’t be afraid to ask for discounts and caps on price hikes for the duration of your contract.


Application Availability, Reliability and Downtime: Ignorance is NOT Bliss

Two out of five businesses – 40% – report that their major business applications require higher availability rates than they did two or three years ago. However, an overwhelming 81% are unable to quantify the cost of downtime, and only a small 5% minority of businesses are willing to spend whatever it takes to guarantee the highest levels of application availability, 99.99% and above. Those are the results of the latest ITIC survey, which polled C-level executives and IT managers at 300 corporations worldwide.

ITIC partnered with Stratus Technologies in Maynard, Mass., a vendor that specializes in high availability and fault tolerant hardware and software solutions, to compose the Web-based survey. ITIC conducted this blind, non-vendor- and non-product-specific survey, which polled businesses on their application availability requirements, virtualization and the compliance rate of their service level agreements (SLAs). None of the respondents received any remuneration. The Web-based survey consisted of multiple-choice and essay questions. ITIC analysts also conducted two dozen first-person customer interviews to obtain detailed anecdotal data.

Respondents ranged from SMBs with 100 users to very large enterprises with over 100,000 end users. Industries represented included academic, advertising, aerospace, banking, communications, consumer products, defense, energy, finance, government, healthcare, insurance, IT services, legal, manufacturing, media and entertainment, telecommunications, transportation, and utilities. The respondents hailed from 15 countries; 85% were based in North America.

Survey Highlights

The survey results uncovered many “disconnects” between the levels of application reliability that corporate enterprises profess to need and the availability rates their systems and applications actually deliver. Additionally, a significant portion of the survey respondents had difficulty defining what constitutes high application availability, did not specifically track downtime, and could not quantify or qualify the cost of downtime and its impact on their network operations and business.

Among the other survey highlights:

  • A 54% majority of IT managers and executives surveyed said more than two-thirds of their companies’ applications require the highest level of availability – 99.99% — or four nines of uptime.
  • Over half – 52% of survey respondents said that virtualization technology increases application uptime and availability; only 4% said availability decreased as a result of virtualization deployments.
  • In response to the question of which aspect of application availability is most important to the business, 59% of those polled cited the prevention of unplanned downtime as being most crucial; 40% said disaster recovery and business continuity were most important; 38% said that minimizing planned downtime to apply patches and upgrades was their top priority; 16% said the ability to meet SLAs was most important; and 40% of the survey respondents said all of the choices were equally crucial to their business needs.
  • Some 41% said they would be satisfied with conventional 99% to 99.9% (two or three nines) availability for their most critical applications. Neither 99% nor 99.9% qualifies as a high-availability or continuous-availability solution.
  • An overwhelming 81% of survey respondents said the number of applications that demand high availability has increased in the past two-to-three years.
  • Of those who said they have been unable to meet service level agreements (SLAs), 72% can’t or don’t keep track of the cost and productivity losses created by downtime.
  • Budgetary constraints are a gating factor prohibiting many organizations from installing software solutions that would improve application availability. Overall, 70% of the survey respondents said they lacked the funds to purchase value-added availability solutions (40%); or were unsure how much or if their companies would spend to guarantee application availability (30%).
  • Of the 30% of businesses that quantified how much their firms would spend on availability solutions, 3% indicated they would spend $2,000 to $4,000; 8% said $4,000 to $5,000; another 3% said $5,000 to $10,000; 11% — mainly large enterprises indicated they were willing to allocate $10,000 to $15,000 to ensure application availability and 5% said they would spend “whatever it takes.”

According to the survey findings, just under half of all businesses – 49% – lack the budget for high availability technology and 40% of the respondents reported they don’t understand what qualifies as high availability. An overwhelming eight out of 10 IT managers – 80% — are unable to quantify the cost of downtime to their C-level executives.

To reiterate, the ITIC survey polled users on the various aspects and impact of application availability and downtime but it did not specify any products or vendors.

The survey results, supplemented by ITIC first-person interviews with IT managers and C-level executives, clearly show that, on a visceral level, businesses are very aware that the need for increased application availability has grown. This is particularly true in light of the emergence of new technologies like application and desktop virtualization, cloud computing and Service Oriented Architecture (SOA). The fast-growing remote, mobile and telecommuting end-user population, which utilizes unified communications and collaboration applications and utilities, is also spurring the need for greater application availability and reliability.

High Application Availability Not a Reality for 80% of Businesses

The survey results clearly show that network uptime isn’t keeping pace with the need for application availability. At the same time, IT managers and C-level executives interviewed by ITIC did comprehend the business risks associated with downtime, even though most are unable to quantify the cost of downtime or qualify the impact to the corporation, its customers, suppliers and business partners when unplanned application and network outages occur.

“We are continually being asked to do more with less,” said an IT manager at a large enterprise in the Northeast. “We are now at a point, where the number of complex systems requiring expert knowledge has exceeded the headcount needed to maintain them … I am dreading vacation season,” he added.

Another executive at an Application Service provider acknowledged that even though his firm’s SLA guarantees to customers are a modest 98%, it has on occasion, been unable to meet those goals. The executive said his firm compensated one of its clients for a significant outage incident. “We had a half day outage a couple of years ago which cost us in excess of $40,000 in goodwill payouts to a handful of our clients, despite the fact that it was the first outage in five years,” he said.

Another user said a lack of funds prevented his firm from allocating capital expenditure monies to purchase solutions that would guarantee 99.99% application availability. “Our biggest concern is keeping what we have running and available. Change usually costs money, and at the moment our budgets are simply in survival mode,” he said.

Another VP of IT at a New Jersey-based business said that ignorance is not bliss. “If people knew the actual dollar value their applications and customers represent, they’d already have the necessary software availability solutions in place to safeguard applications,” he said. “Yes, it does cost money to purchase application availability solutions, but we’d rather pay now than wait for something to fail and pay more later,” the VP of IT said.

Overall, the survey results show that users’ ability to put valid metrics and cost formulas in place to track and quantify what uptime means to their organization is woefully inadequate, and many corporations are courting disaster.

ITIC advises businesses to track downtime and the actual cost of downtime to the organization, and to take the necessary steps to qualify the impact of downtime, including lost data and potential liability risks, e.g., lost business, lost customers, potential lawsuits and damage to the company’s reputation. Once a company can quantify the amount of downtime associated with its main line-of-business applications, the impact of downtime and the risk to the business, it can then make an accurate assessment of whether or not its current IT infrastructure adequately supports the degree of application availability the corporation needs to maintain its SLAs.


Apple Shines

Apple rang in 2009 by celebrating a trio of milestones that were impressive by any standards, including those of a company whose 32-year span has been filled with a cornucopia of noteworthy events. In quick succession, Apple posted the best financial results in its history: during the just-ended 2009 first fiscal quarter it achieved record revenues of nearly $10.2 billion on record net quarterly profits of $1.61 billion, and it sold an astounding 22.7 million iPods, another record. The icing on the cake: Apple’s flagship Mac computer celebrated its 25th birthday amidst the news that the Cupertino, California firm’s latest Mac Book and Mac Book Pro notebooks contributed to the overall financial bonanza with sales of 2.5 million units, a 34% gain in year-over-year unit shipments.

These feats would be extraordinary at any time but they offered even more cause for celebration due to their arrival during a week in which the news from almost all of Apple’s high-tech vendor counterparts ranged from disappointing to dismal to downright dire. Intel said it would shed up to 6,000 workers and close five manufacturing plants; Microsoft announced it will lay off 5,000 workers (the first such major action in its history) amidst declining demand for Windows PC solutions, and even the goliath Google saw a sharp decline in its 2009 first fiscal quarter profits.

With such a bountiful harvest, it was more than a little perplexing to read the headline of a January 22 Silicon Valley.com column proclaiming: “Mac’s influence could wane.” Granted, the headline was a bit misleading. The article itself stated that things look good for Apple and its Macs in the near term, but what about the next 25 years? Good question.

Long-term forecasts of even five years are more art or guesswork than science. But decades-long prognostications are rarities unless you’re talking about Nostradamus or the Oracle of Delphi. So we’re left to forecast with the tools at our disposal – in this case, the facts. So here, for your consideration, is our Top 10 List concerning Apple’s health and well-being. It includes some little-known facts of both a positive and a potentially negative nature.

10. Big Mac sales shrink. Apple Mac desktop sales dipped slightly even as sales of its notebooks and the lightweight Apple Mac Book Air soared. This is hardly surprising. Both American and global consumers and workforces are becoming increasingly mobile, transitioning into an era of ever more powerful notebooks, Netbooks (or minis) and PDAs. Critics argue that the commoditization of PC hardware will make it difficult for Apple or any hardware vendor to distinguish itself. As a result, Apple desktop sales may continue to contract along with those of PCs, although they won’t become obsolete for many years. Meanwhile, Apple has a wide array of Mac Book, Mac Book Pro and Mac Book Air products to take up the slack. The company also wisely cut hardware and OS X 10.x operating system prices to be more competitive with PCs.

9. iPod and iPhone. Apple sold a record 22.7 million iPods during the quarter, and the device has approximately 70% market share in the U.S. Worldwide market share percentages vary by country from 70% in Western Europe and Australia to well over 60% in Japan and over 50% in Canada. At the same time, iPhone sales in Q1 were 4.36 million units, representing 88% unit growth over the year-ago quarter. At some point, iPod and iPhone sales may reach saturation, but that won’t happen anytime soon, and when it does, Apple will most likely have another device in the offing.

8. Up, up and away. Data is no longer tied to the PC or desktop; it is moving to the cloud, and Apple is right there in the cloud. Cloud computing is the new buzzword for delivering applications as services via the Internet. The first fruits of Apple’s cloud computing initiative involve the integration of Google’s cloud computing offering, the Google App Engine, with Apple’s iPhone mobile computing platform. ITIC anticipates Apple will expand its reach into the cloud, again based on customer demand. Nearly half – 49% – of the ITIC/Sunbelt Software survey respondents said they plan to increase integration with existing Apple consumer products like the iPhone to allow corporate users to access corporate email and other applications over the next 12 months.

7. Marketing. No one does it better. From the moment that Steve Jobs stepped onstage 25 years ago and unveiled his 20 lb. baby, to the creative licensing of the Rolling Stones tune “She’s a Rainbow,” to partnering with the Irish rock group U2 to help promote iPod usage, Apple’s marketing has always been stellar. Apple uses every available channel – from the airwaves to the street – to promote its brand. There are now 251 Apple retail stores open in 10 countries, with total quarterly traffic of 46.7 million visitors.

6. New gadgets. Users and industry watchers have grown accustomed to Apple debuting revolutionary new products at MacWorld, and they are disappointed when it doesn’t happen. It is unrealistic to expect that any company, even one as inventive as Apple, can deliver an iPod or an iPhone every year. Meanwhile, users will have to “settle” for evolutionary innovations like new laptop batteries that will run for eight hours without recharging and Time Capsule, an all-in-one 802.11n wireless backup router that includes up to 1 terabyte of disk storage.

5. Leadership. It’s impossible to overstate or understate what company founder Steve Jobs has meant to Apple. His 1996 return to Apple sparked one of the greatest corporate revivals since Lazarus. An iconic figure in Silicon Valley for over 30 years, Jobs’ future is now clouded by health concerns, and investors and industry watchers are rightly nervous. Only time will tell when or if Jobs will return. If he does not, it will be a devastating loss on many levels but it will not cripple the company’s ability to thrive and survive. Still, Apple must allay customer, investor and government concerns by being truthful and forthcoming regarding Jobs and the company’s future.

4. What’s in Apple’s Wallet? Cash — $28.1 billion to be exact and $0 debt. That’s more than Google ($15.85B); Microsoft ($20.3B); IBM ($12.9B); Intel ($11.84B) or Sony ($6.05B). Apple also has double digit profit margins of 14.70% and operating margins of nearly 19%; return on assets is 10.77% while return on shareholders’ equity is a robust 24.47%. Few if any corporations can boast such a healthy balance sheet, which leaves Apple free to invest heavily in R&D, marketing initiatives and other efforts to keep ahead of competitors.

3. Apple is hot – and cool. Consumers have always loved Apple and there’s nothing to indicate that will change. Consumer enthusiasm for iPods and iPhones has fueled the resurgence of Macs and OS X 10.x in enterprises. Everyone, it seems, has or wishes they had an iPod or an iPhone. Beyond that, the latest joint ITIC/Sunbelt Software data indicates that Apple is increasing its presence in many markets thanks to the performance and reliability of the core products. Eight out of 10 businesses – 82% of the survey respondents – rated the reliability of the Mac and OS X 10.x as “excellent” or “very good,” while almost 70% of those polled gave the same high marks to the security of the Apple platform. Tellingly, 68% of the survey respondents said their firms are likely to allow more users to deploy Macs as their enterprise desktops in the next six to 12 months.

2. Enterprising. Over the past three years Apple has made a comeback in the enterprise. The latest joint ITIC/Sunbelt Software survey of 700 companies worldwide indicates that nearly 80% of businesses have Macs in their environment and 25% have significant (>30) numbers of Macs. But while enterprise users love Apple, IT managers remain divided. The biggest drawback for the Mac is the dearth of enterprise-class third-party management and performance-enhancement tools, but technical service and support is also an issue. Apple will have to address these points if the company expects or plans to challenge Microsoft’s dominance on business desktops. So far, Apple has been silent about its enterprise strategy, but a new consortium of five third-party vendors calling itself the Enterprise Desktop Alliance (EDA) is determined to promote the management, integration and interoperability capabilities of the Mac in corporate environments.

1. Mobile and agile, not fragile. The combination and plethora of Apple consumer and corporate devices makes for a powerful product portfolio with widespread appeal. Unlike many of its competitors Apple is not dependent on a single product or market segment. Hence, when sales decline in one sector, the slippage is offset by another product as we’ve seen with Mac notebooks picking up the slack for Mac desktops. This enables Apple to adjust both its technology plans and market focus accordingly, strengthening and insulating the company from cyclical downturns.

One of the hallmarks of Apple’s existence has been the ability to re-invent itself – not only changing with the times – but keeping its fingers on the pulse of an often fickle public and anticipating what its users and the industry wants. Apple is well positioned for both the near and intermediate term. It will have to stay focused, keep its edge and clearly communicate its strategy in order to maintain the same level of success it has achieved in the last 32 years.

