ITIC 2009-2010 Global Virtualization Deployment Trends Survey Results

Server virtualization demand and deployments are strong and will remain so for the remainder of 2009 and through 2010, despite the ongoing economic downturn.

The results of the new, independent ITIC 2009 Global Server Virtualization Survey, which polled more than 700 corporations worldwide during May/June and August, reveal that server virtualization deployments have remained strong throughout the ongoing 2009 economic downturn. They also show that the three market leaders, Citrix, Microsoft and VMware, are consolidating their positions even as the virtualization arena itself consolidates through mergers, acquisitions and partnerships.

Microsoft in particular has made big year-over-year gains in deployments and market share. Thanks to the summer release of the new Hyper-V 2.0 with live migration capabilities, the Redmond, Washington software firm has substantially closed the feature/performance gap between itself and VMware’s ESX Server. The technical advances of Hyper-V, combined with the favorable terms of Microsoft’s licensing program, make the company’s virtualization products very competitive and alluring. Three out of five survey respondents — 59% — indicated their intent to deploy Hyper-V 2.0 within the next 12 to 18 months.

Survey responses also show a groundswell of support for application and desktop virtualization deployments. These two market segments constitute a much smaller niche of deployments and installations compared to virtualized server environments. The survey results show that application virtualization (where Microsoft is the market leader) and desktop virtualization (in which Citrix is the market leader), are both poised for significant growth in the 2010 timeframe.

Another key survey revelation was that 40% of respondents, especially businesses with 500 or more end users, said they either have or plan to install virtualization products from multiple vendors. This will place more emphasis and importance on integration, interoperability, management and third-party add-on tools to support these more complex, heterogeneous virtualization environments.

Among the other key survey highlights:

  • The “Big Three,” Citrix, Microsoft and VMware, are bolstering their positions with a slew of new offerings and a plethora of partnerships due out in the summer and fall of 2009.
  • Partnerships and Alliances: The alliance between Citrix and Microsoft remains robust as these two firms believe that there’s strength in numbers, as they mount a challenge to server virtualization leader VMware’s continuing dominance.
  • Microsoft Hyper-V Closes the Gap: Microsoft made big year-over-year market share gains from 2008 to 2009. The survey data shows current Hyper-V usage at 32%, but 59% plan to adopt it in the next 12 to 18 months.
  • VMware remains the market leader in server virtualization with approximately 50% share among enterprise users; Microsoft follows with 26% share.
  • Microsoft is the current market leader in application virtualization with a 15% share; followed by Citrix with 11% and VMware with 7%. However, nearly two-thirds of businesses have not yet deployed application virtualization.
  • Citrix is the market leader in desktop virtualization with a 19% market share followed by Microsoft with 15% and VMware with 8%. But again, over 60% of corporations have not yet begun to virtualize their desktop environments.
  • Mergers and Acquisitions Raise Questions: There is confusion among the legacy Sun and Virtual Iron users as to what will happen to both the product lines and technical support in the wake of both firms’ acquisition by Oracle.
  • Apple Mac is a popular virtualization platform; nearly 30% of respondents said they use Mac hardware in conjunction with Windows operating systems to virtualize their server and desktop environments.
  • Parallels and VMware Fusion are the two leading Mac virtualization vendors with a near 50/50 split market share.
  • Time to Bargain: Despite budget cuts and reduced resources, only a very small percentage of companies — 7% — have attempted to renegotiate their virtualization licensing contracts to get lower prices and better deals.
  • Server Virtualization Lowers TCO: Almost 50% of survey respondents reported that server virtualization lets them lower their total cost of ownership (TCO) and achieve faster return on investment (ROI); however, only 25% of businesses could quantify the actual monetary cost savings.
  • Users Prefer Terra Firma Virtualization to Cloud: Users are moving slowly with respect to public cloud computing migrations, which are heavily dependent on virtualization technology. To date, only 14% of survey respondents said they will move their data to a virtualized public cloud within the next six-to-12 months.

This survey identifies the trends that propel or impede server, application and desktop virtualization deployments and elucidates the timeframes in which corporations plan to virtualize their environments. ITIC advises all businesses, irrespective of size or vertical market, to conduct due diligence to determine which virtualization solution or combination of products best meets their technical and business needs in advance of any migration. And in light of the ongoing economic downturn, businesses are well advised to negotiate hard with their vendors for the best deals and to ensure that the appropriate IT managers receive the necessary training and certification to ensure a smooth, trouble-free virtualization upgrade. This will enable the business to lower TCO, accelerate ROI and minimize and mitigate risk to an acceptable level.
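
Since only a quarter of respondents could quantify their virtualization savings, it is worth noting that the underlying consolidation math is simple. The sketch below is a minimal illustration of that math; every server count and cost figure in it is a hypothetical placeholder, not survey data.

```python
# Basic server-consolidation TCO math. Every figure below is a hypothetical
# placeholder; substitute your own hardware, power and licensing costs.

physical_before = 100         # standalone servers to be retired
hosts_after = 12              # virtualization hosts replacing them
server_annual_cost = 4_500.0  # assumed per-box power, space and maintenance
host_annual_cost = 9_000.0    # assumed run cost of a beefier host
hypervisor_cost = 3_000.0     # assumed per-host license and support

before = physical_before * server_annual_cost
after = hosts_after * (host_annual_cost + hypervisor_cost)
print(f"Annual run cost before consolidation: ${before:,.0f}")
print(f"Annual run cost after consolidation:  ${after:,.0f}")
print(f"Estimated annual savings:             ${before - after:,.0f}")
```

A business that tracks even these few inputs can put a defensible dollar figure on its consolidation program rather than reporting, as most survey respondents did, that the savings are real but unquantified.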


Corporations Prefer Terra Firma to the Cloud — For Now

Concerns about cloud computing security and about how fast cloud providers will respond in the event technical troubles arise are making companies hesitant to embrace cloud computing — at least within the next 12 months. An 85% majority of the IT Performance Trends survey subjects say they will not implement a public or private cloud between June 2009 and June 2010. However, of that 85%, 31% say they are studying the issue but have made no decision yet and another 7% are “Unsure.”

Security topped the list of concerns and guarantees that companies would demand from a cloud services provider, if their firms were to implement a cloud model. An overwhelming 83% of respondents said they would need specific guarantees to safeguard their sensitive mission critical data before committing to a cloud. Additionally, almost three-quarters or 73% of respondents would require guaranteed fast response time for technical service and support. Nearly two thirds (63%) of respondents want minimum acceptable latency/response times and a nearly equal number (62%) say they would need multiple access paths to and from the cloud infrastructure.

It was clear from the customer interviews and essay responses that IT managers, especially those at companies with fewer than 1,000 end users, will keep their corporate data and applications firmly planted behind the corporate firewall until they have ironclad assurances regarding the security of their data and their ability to access it.

“The idea that I would trust my email, financial transactions, or other day to day business operations to cloud computing is just asking for trouble,” observed an IT manager at a midsized corporation with 500 employees in the Midwest. “I do not even want to imagine all my users being dead in the water because my link to the Internet was down,” he added. Another manager at a retail firm with 250 employees expressed reservations about the ability of a cloud services vendor to deliver top notch service and support should the need arise.

“Downtime is the bane of an IT professional’s life,” says the network administrator at a retail firm with 250 employees. He noted that when an onsite and locally managed system fails, he and his IT team can take immediate action to replace parts, rebuild the operating system, restore data from tape backup or perform any other action required to restore services and applications. “Compare that to a failure in a cloud computing scenario, when all you can do is report the problem and hurry up and wait,” he says. “Most IT people are action oriented and they won’t respond well to being at the mercy of a cloud provider while listening to complaints and queries from users and management of ‘When will the system be back up?’ or ‘When can I get access to my data?'”

The director of IT at another midsized company with 400 users opined that he does not yet have confidence in the still-emerging cloud computing model. “We own our data, not the cloud provider, and we need to know it is movable if we need to leave the provider.”

Finally, the survey respondents indicated during first person customer interviews that they will continue to chart a conservative course that includes a very low tolerance for risk until the economy recovers and their companies can once again bolster IT staffs and provide more resources.

Analysis

Cloud computing is still in its nascent stages. It’s common for the hype among vendors, the press and the analyst community to outpace current realities in IT, especially among small and midsized businesses, which have smaller budgets and are generally more conservative and risk averse than their enterprise counterparts.

The survey results also showed much more of a willingness on the part of larger enterprises to explore, test and deploy a cloud infrastructure. Among corporations with over 3,000 end users, a more convincing 57% said they will either deploy or are considering a public or private cloud implementation over the next 12 to 18 months. Even this group, though, is rightfully concerned about the uncertainties of trusting its sensitive data to a public cloud whose provider may be located in a foreign country.

Therefore, it is imperative that cloud computing vendors provide customers and prospective customers with transparency and full accountability on crucial issues like security; technical service and support; the equipment and capacity of their data centers; and the technology used (e.g., specific server equipment, virtualization and management). The vendors should also provide specific SLA levels and remedies in the event those levels are not met.

Corporations should also perform due diligence. Get informed. Thoroughly investigate and compare the services and options of the various cloud providers. Know where and how your data will be stored, secured and managed. Ask for customer references. Consult with your in-house attorneys or obtain outside counsel to review proposed contracts. Don’t be afraid to insert out clauses and penalties in the event your cloud provider fails to meet SLAs. Also, at this early stage of development, don’t be afraid to ask for discounts and caps on price hikes for the duration of your contract.
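
The arithmetic behind such a penalty clause is worth working through before signing. The sketch below is purely illustrative; the monthly fee, the uptime guarantee and the credit schedule are hypothetical figures, not any actual provider’s terms.

```python
# Hypothetical SLA penalty clause -- the fee, the guarantee and the credit
# schedule below are illustrative examples, not any provider's actual terms.

def service_credit(monthly_fee: float, guaranteed_pct: float,
                   delivered_pct: float, credit_per_tenth: float = 5.0) -> float:
    """Credit a percentage of the monthly fee for every 0.1 percentage point
    of availability the provider falls short, capped at the full fee."""
    shortfall = max(0.0, guaranteed_pct - delivered_pct)
    credit_pct = min(100.0, (shortfall / 0.1) * credit_per_tenth)
    return monthly_fee * credit_pct / 100.0

if __name__ == "__main__":
    # Guaranteed 99.9% uptime, delivered 99.5%, on a $10,000/month contract:
    print(f"Credit owed: ${service_credit(10_000, 99.9, 99.5):,.2f}")  # $2,000.00
```

Writing the formula into the contract, rather than leaving “penalties” vague, is exactly the kind of out clause this early market still lets customers negotiate.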


Windows 7 is a make or break release for Microsoft

The long-awaited successor to Windows XP and Windows Vista will ship several months earlier than planned. Expectations are high industry-wide.

Windows 7 is crucial to Microsoft’s over-arching software business and technology strategy for the next two years. Although it is an incremental upgrade and not a major overhaul of the underlying Vista kernel, Windows 7 represents a crucial upgrade for both consumer and corporate customers.

Practically speaking, Windows 7 must do what Vista didn’t: deliver near seamless, plug and play integration and interoperability with the overwhelming majority of Microsoft and third party applications, device drivers, utilities and hardware peripherals. As a standalone operating system (OS), Vista was fine. Unfortunately, there’s no such thing as a standalone OS. The lack of backwards compatibility between Vista and third party software, and even incompatibilities in the file formats between Vista, Office 2007 and other Microsoft products, was a nightmare for corporations and consumers alike.

As a result, there is no margin for error. Windows 7 must fulfill users’ expectations and business and technology needs from the first day it ships. Microsoft will not get a second chance to make a good first impression. Failure could send customers running to rival desktop platforms like Apple’s Mac OS X 10.x and Linux distributions, or even online options such as those being pitched by Google. And if Windows 7 does not deliver the features, integration, interoperability and reliability Microsoft is promising, it may well create a domino effect that adversely impacts the upcoming releases of related solutions like Exchange Server and the Office platform.

Integration and interoperability are the most important criteria, besting even cost, when it comes to choosing a new technology. The results of ITIC’s May 2009 Application Availability survey of 300 businesses worldwide found that 60% of businesses said integration and interoperability with existing and legacy applications top the list of “must have” items in new software application and operating system purchases. Cost came in a close second with 56% of the respondents, followed by ease of use and installation (55%).

The stakes for Windows 7 are also high because of intensified competition. Rumors abound that Microsoft pushed up the release date by at least three months so that Windows 7 hits the streets in advance of the low-cost netbook version of Google’s Android. Microsoft also faces increased competition from its decades-old rival Apple. During the past two years Apple’s Mac OS X 10.x running on Apple’s Intel-based proprietary hardware has been making a comeback in corporate enterprises. Apple products do not represent a significant threat to Microsoft’s corporate desktop dominance, but they can nibble at the fringes, potentially dilute momentum for Windows 7 and take some market share. In this ongoing global economic downturn, no vendor wants to concede any revenue or even a percentage point of market share.

Microsoft of course is acutely aware of these issues. In recent months, company CEO Steve Ballmer and Senior Vice President Bill Veghte have publicly stated that users were stymied by the incompatibility issues they encountered with Vista. They intend to avoid those problems with Windows 7.

Fortunately for Microsoft, there are many factors in Windows 7’s favor. They include:

  • Pent-up Demand. To date, only 10% of the 700 survey respondents in ITIC’s 2009 Global IT and Technology Trends Survey have deployed Vista as their company’s primary desktop operating system. The results indicated that Windows XP remains the primary desktop OS for 89% of the respondents. Nearly half — 45% — of the survey respondents indicated they would skip Vista and migrate from XP to Windows 7. The main reasons for this were cost constraints associated with the bearish economy and reluctance to undertake a complex OS upgrade with manpower constraints. The prevailing sentiment among businesses is that they can afford to wait because Windows XP adequately met their business and technology needs over the last two years. ITIC believes this bodes well for Windows 7 deployments in the short and intermediate term. If 20% of the installed base of legacy Windows XP users migrate or indicate their intention to upgrade to Windows 7 within the first three or four months of shipment, Microsoft will be well-positioned. There is a reasonable likelihood of this, provided Windows 7 delivers the goods. And the advance word from customers interviewed by ITIC is generally positive.
  • New feature set. Windows 7 will have six different versions, but to minimize the confusion that accompanied the Vista launch, only the Home Premium and Professional editions will be widely sold in retail outlets. Specific versions that are designed for enterprise use or developing nations will be aggressively marketed to those specific accounts and geographic regions, thus taking the guesswork out of purchasing. Most importantly: Microsoft says that every one of the versions will include all of the capabilities and features of the edition below it, which will help to minimize upgrade woes. Corporations and consumers that want to move to a more feature rich version of Windows 7 can use Windows Anytime Upgrade to purchase the upgrade online and unlock the features of those editions from their desktops. ITIC interviewed several dozen Windows 7 beta users over the last several months and an overwhelming 9 out of 10 respondents expressed their satisfaction with improvements in many of Windows 7’s core capabilities when compared to both Windows XP and Vista. These include a faster boot sequence, better reliability, improved security, a much faster and more comprehensive search engine, and more flexible configuration options. Additionally, Microsoft bolstered the inherent security of Windows 7 with the DirectAccess and BitLocker To Go features. The DirectAccess capability is designed to provide remote, traveling and telecommuting workers with the same secure connectivity as though they were local by establishing a VPN “tunnel” to their corporate networks. BitLocker To Go extends the data encryption features introduced in Vista to cover removable storage devices such as USB thumb drives. Users can employ a password or a smart card with a digital certificate to unlock and access their data. And the devices can be used on any other Windows 7-based machine with the correct password. Users can also read, but not modify, data on older Windows XP and Vista systems.
  • Economical and feature-rich licensing contracts. Finally, the terms and conditions of Windows 7 licensing contracts promise to make upgrades easier on corporate IT budgets. In February, Microsoft said it would provide a license that will allow customers to directly upgrade from Windows XP to Windows 7. There is a caveat, though: users will have to wipe their hard drives and perform a clean install – so technically, it’s not an upgrade. Microsoft has not yet released pricing details for Windows 7 but ITIC believes the upgrade license will most likely cost 20% to 40% less than a new license (see the quick budget estimate after this list). Additionally, corporations that purchased Microsoft’s Software Assurance maintenance and upgrade plan as a standalone product or received it as part of their Enterprise Agreement (EA) licenses are entitled to free upgrades to Windows 7 since it is an incremental release. Also, in order to make life easier for users (and to engender goodwill), Microsoft is letting the Release Candidate (RC) free trial license for Windows 7 last a full year, until June 2010. And users looking for a discounted version of Windows 7 to run on low-cost minis or netbooks take note: Microsoft and Intel have agreed that in order for a device to be considered a netbook, the screen must not exceed 10.2 inches. Prior to this, Microsoft allowed customers to get the Windows XP or Vista discount for devices with screens as large as 12.1 inches.
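
What would that pricing estimate mean in practice? The short calculation below is a sketch only; the seat count and the per-seat price of a new license are hypothetical assumptions, since Microsoft had not announced pricing at this writing.

```python
# Rough Windows 7 upgrade budget under ITIC's estimated 20%-40% discount
# off a new license. The seat count and full-license price are hypothetical.

SEATS = 1_000
FULL_LICENSE = 200.0  # assumed per-seat price of a new license, in USD

low = SEATS * FULL_LICENSE * (1 - 0.40)   # best case: 40% discount
high = SEATS * FULL_LICENSE * (1 - 0.20)  # worst case: 20% discount
print(f"Estimated upgrade spend for {SEATS:,} seats: "
      f"${low:,.0f} to ${high:,.0f} (vs ${SEATS * FULL_LICENSE:,.0f} for new licenses)")
```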

In summary, all indications are that Microsoft has learned from its Vista mistakes. As a result, businesses and consumers stand ready to reap significant benefits in compatibility, features, pricing and licensing with Windows 7.


Application Availability, Reliability and Downtime: Ignorance is NOT Bliss

Two out of five businesses – 40% – report that their major business applications require higher availability rates than they did two or three years ago. However, an overwhelming 81% are unable to quantify the cost of downtime, and only a small 5% minority of businesses are willing to spend whatever it takes to guarantee the highest levels of application availability – 99.99% and above. Those are the results of the latest ITIC survey, which polled C-level executives and IT managers at 300 corporations worldwide.

ITIC partnered with Stratus Technologies in Maynard, Mass., a vendor that specializes in high availability and fault tolerant hardware and software solutions, to compose the Web-based survey. ITIC conducted this blind survey, which was neither vendor- nor product-specific, polling businesses on their application availability requirements, virtualization and the compliance rate of their service level agreements (SLAs). None of the respondents received any remuneration. The Web-based survey consisted of multiple choice and essay questions. ITIC analysts also conducted two dozen first person customer interviews to obtain detailed anecdotal data.

Respondents ranged from SMBs with 100 users to very large enterprises with over 100,000 end users. Industries represented: academic, advertising, aerospace, banking, communications, consumer products, defense, energy, finance, government, healthcare, insurance, IT services, legal, manufacturing, media and entertainment, telecommunications, transportation, and utilities. The respondents hailed from 15 countries; 85% were based in North America.

Survey Highlights

The survey results uncovered many “disconnects” between the levels of application reliability that corporate enterprises profess to need and the availability rates their systems and applications actually deliver. Additionally, a significant portion of the survey respondents had difficulty defining what constitutes high application availability, did not specifically track downtime, and could not quantify or qualify the cost of downtime and its impact on their network operations and business.

Among the other survey highlights:

  • A 54% majority of IT managers and executives surveyed said more than two-thirds of their companies’ applications require the highest level of availability – 99.99% — or four nines of uptime.
  • Over half – 52% – of survey respondents said that virtualization technology increases application uptime and availability; only 4% said availability decreased as a result of virtualization deployments.
  • In response to the question, “which aspect of application availability is most important” to the business, 59% of those polled cited the prevention of unplanned downtime as being most crucial; 40% said disaster recovery and business continuity were most important; 38% said that minimizing planned downtime to apply patches and upgrades was their top priority; 16% said the ability to meet SLAs was most important and 40% of the survey respondents said all of the choices were equally crucial to their business needs.
  • Some 41% said they would be satisfied with conventional 99% to 99.9% (the equivalent of two or three nines) availability for their most critical applications. Neither 99% nor 99.9% qualifies as a high-availability or continuous-availability solution (the sketch after this list translates each tier into the downtime it permits).
  • An overwhelming 81% of survey respondents said the number of applications that demand high availability has increased in the past two-to-three years.
  • Of those who said they have been unable to meet service level agreements (SLAs), 72% can’t or don’t keep track of the cost and productivity losses created by downtime.
  • Budgetary constraints are a gating factor prohibiting many organizations from installing software solutions that would improve application availability. Overall, 70% of the survey respondents said they lacked the funds to purchase value-added availability solutions (40%); or were unsure how much or if their companies would spend to guarantee application availability (30%).
  • Of the 30% of businesses that quantified how much their firms would spend on availability solutions, 3% indicated they would spend $2,000 to $4,000; 8% said $4,000 to $5,000; another 3% said $5,000 to $10,000; 11% – mainly large enterprises – indicated they were willing to allocate $10,000 to $15,000 to ensure application availability; and 5% said they would spend “whatever it takes.”
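
To make the “nines” in these responses concrete, the short sketch below translates each availability tier into the downtime it permits per year. It is a back-of-the-envelope aid, not part of the survey data.

```python
# Translate availability "nines" into the downtime each tier permits per year.
MINUTES_PER_YEAR = 365 * 24 * 60

for pct, label in [(99.0, "two nines"), (99.9, "three nines"),
                   (99.99, "four nines"), (99.999, "five nines")]:
    downtime = MINUTES_PER_YEAR * (1 - pct / 100)
    print(f"{pct}% ({label}): {downtime:,.1f} minutes/year (~{downtime / 60:.1f} hours)")
```

Run it and the gap is stark: two nines allows roughly 87 hours of downtime a year, while four nines allows under an hour – which is why the 41% who would settle for 99% to 99.9% are not, strictly speaking, buying high availability at all.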

According to the survey findings, just under half of all businesses – 49% – lack the budget for high availability technology, and 40% of the respondents reported they don’t understand what qualifies as high availability. An overwhelming eight out of 10 IT managers – 80% – are unable to quantify the cost of downtime to their C-level executives.

To reiterate, the ITIC survey polled users on the various aspects and impact of application availability and downtime but it did not specify any products or vendors.

The survey results, supplemented by ITIC’s first person interviews with IT managers and C-level executives, clearly show that on a visceral level businesses are well aware that the need for increased application availability has grown. This is particularly true in light of the emergence of new technologies like application and desktop virtualization, cloud computing and Service Oriented Architecture (SOA). The fast growing population of remote, mobile and telecommuting end users who rely on unified communications and collaboration applications is also spurring the need for greater application availability and reliability.

High Application Availability Not a Reality for 80% of Businesses

The survey results clearly show that network uptime isn’t keeping pace with the need for application availability. At the same time, IT managers and C-level executives interviewed by ITIC did comprehend the business risks associated with downtime, even though most are unable to quantify the cost of downtime or qualify the impact to the corporation, its customers, suppliers and business partners when unplanned application and network outages occur.

“We are continually being asked to do more with less,” said an IT manager at a large enterprise in the Northeast. “We are now at a point, where the number of complex systems requiring expert knowledge has exceeded the headcount needed to maintain them … I am dreading vacation season,” he added.

Another executive, at an application service provider, acknowledged that even though his firm’s SLA guarantees to customers are a modest 98%, it has, on occasion, been unable to meet those goals. The executive said his firm compensated clients for a significant outage incident. “We had a half day outage a couple of years ago which cost us in excess of $40,000 in goodwill payouts to a handful of our clients, despite the fact that it was the first outage in five years,” he said.

Another user said a lack of funds prevented his firm from allocating capital expenditure monies to purchase solutions that would guarantee 99.99% application availability. “Our biggest concern is keeping what we have running and available. Change usually costs money, and at the moment our budgets are simply in survival mode,” he said.

Another VP of IT, at a New Jersey-based business, said that ignorance is not bliss. “If people knew the actual dollar value their applications and customers represent, they’d already have the necessary software availability solutions in place to safeguard applications,” he said. “Yes, it does cost money to purchase application availability solutions, but we’d rather pay now than wait for something to fail and pay more later,” the VP of IT said.

Overall, the survey results show that most users lack valid metrics and cost formulas to track and quantify what uptime means to their organization, and that as a result many corporations are courting disaster.

ITIC advises businesses to track downtime and its actual cost to the organization, and to take the necessary steps to qualify downtime’s impact, including lost data and potential liability risks (e.g., lost business, lost customers, potential lawsuits and damage to the company’s reputation). Once a company can quantify the amount of downtime associated with its main line of business applications, the impact of that downtime and the risk to the business, it can then make an accurate assessment of whether or not its current IT infrastructure adequately supports the degree of application availability the corporation needs to maintain its SLAs.
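
A simple cost model is enough to get started on that quantification. The sketch below is a minimal illustration; every input (revenue per hour, headcount, wages, recovery cost) is a placeholder a business would replace with its own figures.

```python
# Back-of-the-envelope downtime cost model. Every input is a placeholder;
# substitute your own revenue, staffing and incident data.

def downtime_cost(hours_down: float,
                  revenue_per_hour: float,
                  affected_employees: int,
                  loaded_hourly_wage: float,
                  productivity_loss_pct: float = 100.0,
                  recovery_cost: float = 0.0) -> float:
    """Estimate the direct cost of an outage: lost revenue, idled labor and
    one-time recovery expenses (intangibles like reputation are excluded)."""
    lost_revenue = hours_down * revenue_per_hour
    idled_labor = (hours_down * affected_employees * loaded_hourly_wage
                   * productivity_loss_pct / 100.0)
    return lost_revenue + idled_labor + recovery_cost

if __name__ == "__main__":
    # Hypothetical: a 4-hour outage at a firm doing $25,000/hour in online
    # revenue, idling 200 employees at a $60 loaded hourly rate.
    cost = downtime_cost(4, 25_000, 200, 60,
                         productivity_loss_pct=75, recovery_cost=5_000)
    print(f"Estimated outage cost: ${cost:,.0f}")  # $141,000
```

Even a rough model like this gives IT managers what 80% of them told ITIC they lack: a dollar figure to put in front of their C-level executives.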


IBM Charts Green, Energy Efficient Course with Dynamic Infrastructure Initiatives

These days just about every high technology vendor is “keen to be green.” However, few vendors can match IBM for its pioneering efforts and long term commitment to energy efficient solutions that are both good for the planet and good for recession-racked enterprises.

This week, IBM took another giant step in its green data center efforts. It officially launched its Dynamic Infrastructure for Energy Efficiency initiative, a comprehensive, compelling set of new hardware, software and services offerings designed to help customers build, manage and maintain more energy efficient infrastructures.

IBM’s Managing Dynamic Infrastructure for Energy Efficiency initiative serves as a blueprint for vendors and corporate customers to follow and emulate in their respective efforts to reduce power consumption, utility costs and their carbon footprints in the pursuit of greater system, application and network equipment economies of scale.

Declaring that “Environmental sustainability is an imperative for 21st Century business,” Rich Lechner, IBM’s VP of Energy & Environment, outlined IBM’s ambitious plan. Lechner and Chris O’Connor, VP of Tivoli Strategy, Product Management and SWG Green, said that Big Blue worked with some 3,200 customers over the past two years to construct and validate metrics on energy usage and costs. Among the key findings from these efforts:

  • IT energy expenses are expected to increase 35% between 2009 and 2013
  • An overwhelming 80% of CEOs expect climate change regulations in five years
  • Buildings account for 40% of worldwide energy consumption

The company’s new products and services are the product of years of primary research and extensive research and development (R&D) in which the company has spared no effort or expense in its quest to “go green” and assist its customers. The initiative addresses the full spectrum of Green IT issues, including conservation, pollution prevention, consolidation, regulatory compliance for physical devices and facilities, and the use of renewable energy sources.

Managing Dynamic Infrastructure for Energy Efficiency

IBM’s Managing Dynamic Infrastructure for Energy Efficiency calls for corporations to build Green Infrastructures, Sustainable Solutions and Intelligent Systems. IBM’s plan is backed by a wide array of product offerings such as Tivoli Monitoring for Energy Management and enhancements to the existing Tivoli Business Service Manager. IBM is offering customers a free trial of Tivoli Monitoring for Energy Management.

The Tivoli Energy Management solution is supported by IBM hardware and IBM Global Services. The latter includes chargeback and accounting services and the ability to demonstrate to customers how to optimize assets (plant and facilities) and improve energy usage.

On the hardware front, IBM is embedding new energy efficiency capabilities in its x86 servers; combined with consolidation, these can result in an astounding 95% reduction in power consumption compared to servers built three or four years ago.

IBM also has a Green Infrastructure ROI analysis tool. This is an interactive Web-based assessment tool that provides businesses with benchmarks on green/energy efficiency performance. It also provides customers with specific recommendations to reduce energy consumption.

IBM also has a full set of services offerings to assist corporations in reviewing their current consumption and infrastructure and constructing customized plans for Green IT. IBM also has agreements in place with a number of technology partners – including Novell and Thunderhead – to deliver solutions that are certified to reduce environmental impact.

Going Green is Good Business

According to Lechner and O’Connor, Green IT initiatives will yield tangible benefits. Actual dollar value cost savings will vary according to the business and its specific cost cutting efforts. IBM customer Care2, for instance, cut energy consumption by 70% – a reduction of 340 megawatt hours – through proactive management. Another enterprise customer, Nationwide Insurance, anticipates it will save $15 million (US dollars) over the next three years, including an 85% to 90% reduction in server utilization rates via virtualization and an 80% decrease in its environmental costs.

Not surprisingly, Lechner and O’Connor said that IBM practices what it preaches: IBM’s Austin facility achieved a 150% capacity increase while simultaneously cutting energy consumption by 25%. Those figures were good enough for the EPA to rank IBM’s Austin facility number 31 on its list of Greenest hardware vendors.

“Four years ago when we worked with clients [regarding energy efficiency] the discussion was academic,” Lechner said. “Now they want IBM to help them with Proof of Concept (POC) initiatives. The ROI for Green IT is two years or less,” he added.

Analysis

IBM’s Managing Dynamic Infrastructure for Energy Efficiency is the real deal. It is the result of years of dedication and commitment, and it shows. As one of the founding developers of the Electronic Industry Code of Conduct (EICC) in 2004, IBM has always backed up its words with action. The EICC is a code of best practices adopted and implemented by some of the world’s major electronics brands and their suppliers. Its goal is to improve conditions in the electronics supply chain.

It is well known and well documented that demand for Green desktop and server hardware and services will increase significantly over the next one-to-five years. Governments, states, municipalities and utility firms are now offering consumers and businesses a mixture of incentives, backed by mandates, to reduce costs and power consumption and to produce hardware whose material components won’t poison the planet when it comes time to discard and/or recycle them.

Green IT initiatives are rising sharply and it’s easy to see why. The energy used to process and route server requests and transactions will exceed 100 billion kilowatt-hours (kWh) at an annual cost of $7.4 billion by the year 2011, according to the Environmental Protection Agency (EPA). PCs and servers are currently the biggest energy hogs, consuming 60% of peak power even when idle – double the energy servers used in 2006.
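
Those EPA figures imply an average rate of roughly $0.074 per kWh ($7.4 billion over 100 billion kWh). The sketch below uses that implied rate to estimate a data center’s own server energy bill and the savings consolidation can yield; the server counts, wattages and cooling overhead are hypothetical placeholders.

```python
# Estimate annual server energy cost and consolidation savings. The server
# counts, wattages and cooling overhead below are hypothetical placeholders.

HOURS_PER_YEAR = 8760
RATE_PER_KWH = 0.074  # implied by EPA's $7.4B / 100B kWh projection

def annual_energy_cost(servers: int, avg_watts: float,
                       cooling_overhead: float = 2.0) -> float:
    """Yearly electricity cost; a cooling_overhead of 2.0 doubles the draw
    to account for cooling, a common rough rule of thumb."""
    kwh = servers * avg_watts / 1000.0 * HOURS_PER_YEAR * cooling_overhead
    return kwh * RATE_PER_KWH

if __name__ == "__main__":
    before = annual_energy_cost(500, 400)  # assumed legacy server estate
    after = annual_energy_cost(50, 500)    # assumed consolidated estate
    print(f"Before: ${before:,.0f}/yr  After: ${after:,.0f}/yr  "
          f"Savings: ${before - after:,.0f}/yr")
```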

Corporations have a choice: go green voluntarily or be compelled to do so by a slew of new regulations which are now being written into law. For example, one of the mandates of the Green Building Act of 2006 requires that commercial buildings in Washington, D.C. larger than 50,000 sq. ft. must meet or exceed New Construction standards by 2012. Other measures are voluntary, like the Energy Policy Act of 2005, which allows building owners to realize a tax savings of $1.80 per sq. ft. for new commercial buildings that reduce regulated energy use by 50%.

ITIC’s own survey data indicates that 74% of corporate data centers face limitations and constraints on space, power consumption and the rising costs associated with energy and physical plant leasing/rentals. The obvious solution is to cut energy consumption and utility costs, which in turn, reduce carbon emissions and cut the greenhouse gases.

IBM’s Managing Dynamic Infrastructure for Energy Efficiency initiative is a well-conceived and powerful set of products and services. It solidifies IBM’s reputation and position as an energy efficiency pioneer. Few vendors can match IBM in this area. IBM is well positioned to help corporations achieve their goals of cutting costs, consolidating server hardware and physical plant space and ultimately becoming carbon neutral. Corporations are urged to examine IBM’s products and services and test them for themselves.


Apple Gets More Entrenched in the Enterprise

Apple Macintosh Enterprise Usage Continues to Grow

Apple Mac and OS X 10.x continue to make inroads in the enterprise.

ITIC’s 2009 Global IT and Technology Trends Survey shows that corporate enterprises continue to embrace the Apple Mac and OS X 10.x server operating system in numbers not seen since the late 1980s. ITIC polled IT managers and C-level executives at 700 corporations worldwide. Among the survey highlights:

  • Over two-thirds of the 700 survey respondents – 68% — indicated they are likely to allow their end users to deploy Macs as their corporate enterprise desktops in the next 12 months.
  • Almost one-quarter – 23% – have a significant number of Macintoshes (more than 50) present in their organizations. Apple Macs have long been a favorite of company executives, but the survey responses clearly indicate that Mac usage has filtered down to rank and file knowledge workers across the enterprise.
  • Half of all the survey respondents – 50% – said they plan to increase integration with existing Apple consumer products such as the iPhone to allow users to access corporate Email and other applications. This augurs well for the iPhone to establish itself as a viable alternative to Research In Motion’s (RIM) BlackBerry as a mobile device that allows users to access Email and other collaboration applications.

In summary, the ITIC/Sunbelt survey responses show that businesses will find themselves challenged to do more with fewer resources. The respondents also exhibited their practicality and resourcefulness in extending the lifespan of still-useful technologies like Windows XP. However, those who have the need and the budget will get an able assist from emerging technologies like virtualization and – for those that correctly configure and deploy them – Vista and the Mac with OS X 10.x.


Apple Shines

Apple rang in 2009 by celebrating a trio of milestones that were impressive by any standards, including those of a company whose 32-year span has been filled with a cornucopia of noteworthy events. In quick succession, Apple posted the best financial results in its history: during the just ended 2009 first fiscal quarter it achieved record revenues of nearly $10.2 billion on record net quarterly profits of $1.61 billion, and it sold an astounding 22.7 million iPods, another record. The icing on the cake: Apple’s flagship Mac computer celebrated its 25th birthday amidst the news that the Cupertino, California firm’s latest MacBook and MacBook Pro notebooks contributed to the overall financial bonanza with sales of 2.5 million units, a 34% gain in year-over-year unit shipments.

These feats would be extraordinary at any time but they offered even more cause for celebration due to their arrival during a week in which the news from almost all of Apple’s high-tech vendor counterparts ranged from disappointing to dismal to downright dire. Intel said it would shed up to 6,000 workers and close five manufacturing plants; Microsoft announced it would lay off 5,000 workers (the first such major action in its history) amidst declining demand for Windows PC solutions; and even the goliath Google saw a sharp decline in its 2009 first fiscal quarter profits.

With such a bountiful harvest, it was more than a little perplexing to read the headline of a January 22 SiliconValley.com column proclaiming: “Mac’s influence could wane.” Granted, the headline was a bit misleading. The article itself stated that things look good for Apple and its Macs in the near term, but what about the next 25 years? Good question.

Long term forecasts of even five years are more art or guesswork than science, and decades-long prognostications are rarities unless you’re talking about Nostradamus or the Oracle of Delphi. So we’re left to forecast with the tools at our disposal – in this case, the facts. Here for your consideration is our Top 10 List concerning Apple’s health and well-being. It includes some little known facts of both a positive and a potentially negative nature.

10. Big Mac sales shrink. Apple Mac desktop sales dipped slightly even as sales of its notebooks and the lightweight Apple MacBook Air soared. This is hardly surprising. Consumers and workforces in America and worldwide are becoming increasingly mobile, transitioning into an era of ever-more powerful notebooks, Netbooks (or minis) and PDAs. Critics argue that the commoditization of PC hardware will make it difficult for Apple or any hardware vendor to distinguish itself. As a result, Apple desktop sales may continue to contract along with those of PCs, although desktops won’t become obsolete for many years. Meanwhile, Apple has a wide array of MacBook, MacBook Pro and MacBook Air products to take up the slack. The company also wisely cut hardware and OS X 10.x operating system prices to be more competitive with PCs.

9. iPod and iPhone. Apple sold a record 22.7 million iPods during the quarter, and the device has approximately 70% market share in the U.S. Worldwide market share percentages vary by country, from 70% in Western Europe and Australia to well over 60% in Japan and over 50% in Canada. At the same time, iPhone sales in Q1 were 4.36 million units, representing 88% unit growth over the year-ago quarter. At some point, iPod and iPhone sales may reach saturation but that won’t happen anytime soon, and when it does, Apple will most likely have another device in the offing.

8. Up, up and away. Data is no longer tied to the PC or desktop; it is moving to the cloud, and Apple is right there in the cloud. Cloud computing is the new buzzword for delivering applications as services via the Internet. The first fruits of Apple’s cloud computing initiative involve the integration of Google’s cloud computing offering, the Google App Engine, with Apple’s iPhone mobile computing platform. ITIC anticipates Apple will expand its reach into the cloud, again based on customer demand. Nearly half – 49% – of the ITIC/Sunbelt Software survey respondents said they plan to increase integration with existing Apple consumer products like the iPhone to allow corporate users to access corporate Email and other applications over the next 12 months.

7. Marketing. No one does it better. From the moment that Steve Jobs stepped onstage 25 years ago and unveiled his 20-lb. baby, to the creative licensing of the Rolling Stones tune “She’s a Rainbow,” to partnering with the Irish rock group U2 to help promote iPod usage, Apple’s marketing has always been stellar. Apple uses every available channel – from the airwaves to the street – to promote its brand. There are now 251 Apple retail stores open in 10 countries, with total quarterly traffic of 46.7 million visitors.

6. New gadgets. Users and industry watchers have grown accustomed to Apple debuting revolutionary new products at MacWorld, and they are disappointed when it doesn’t happen. It is unrealistic to expect that any company, even one as inventive as Apple, can deliver an iPod or iPhone every year. Meanwhile, users will have to “settle” for evolutionary innovations like new laptop batteries that will run for eight hours without recharging and Time Capsule, an all-in-one 802.11n wireless backup router that includes up to 1 terabyte of disk storage.

5. Leadership. It’s impossible to overstate what company founder Steve Jobs has meant to Apple. His 1996 return to Apple sparked one of the greatest corporate revivals since Lazarus. An iconic figure in Silicon Valley for over 30 years, Jobs now has a future clouded by health concerns, and investors and industry watchers are rightly nervous. Only time will tell when or if Jobs will return. If he does not, it will be a devastating loss on many levels but it will not cripple the company’s ability to thrive and survive. Still, Apple must allay customer, investor and government concerns by being truthful and forthcoming regarding Jobs and the company’s future.

4. What’s in Apple’s Wallet? Cash – $28.1 billion to be exact – and $0 debt. That’s more than Google ($15.85B), Microsoft ($20.3B), IBM ($12.9B), Intel ($11.84B) or Sony ($6.05B). Apple also has double-digit profit margins of 14.70% and operating margins of nearly 19%; return on assets is 10.77% while return on shareholders’ equity is a robust 24.47%. Few if any corporations can boast such a healthy balance sheet, which leaves Apple free to invest heavily in R&D, marketing initiatives and other efforts to keep ahead of competitors.

3. Apple is hot – and cool. Consumers have always loved Apple and there’s nothing to indicate that will change. Consumer enthusiasm for iPods and iPhones has fueled the resurgence of Macs and OS X 10.x in enterprises. Everyone, it seems, has or wishes they had an iPod or an iPhone. Beyond that, the latest joint ITIC/Sunbelt Software data indicates that Apple is increasing its presence in many markets thanks to the performance and reliability of its core products. Eight out of 10 businesses – 82% of the survey respondents – rated the reliability of the Mac and OS X 10.x as “excellent” or “very good,” while almost 70% of those polled gave the same high marks to the security of the Apple platform. Tellingly, 68% of the survey respondents said their firms are likely to allow more users to deploy Macs as their enterprise desktops in the next six-to-12 months.

2. Enterprising. Over the past three years Apple has made a comeback in the enterprise. The latest joint ITIC/Sunbelt Software survey of 700 companies worldwide indicates that nearly 80% of businesses have Macs in their environment and 25% have significant (>30) numbers of Macs. But while enterprise users love Apple, IT managers remain divided. The biggest drawback for the Mac is the dearth of enterprise-class third party management and performance enhancement tools, but technical service and support is also an issue. Apple will have to address these points if the company expects or plans to challenge Microsoft’s dominance on business desktops. So far, Apple has been silent about its enterprise strategy, but a new consortium of five third party vendors calling itself the Enterprise Desktop Alliance (EDA) is determined to promote the management, integration and interoperability capabilities of the Mac in corporate environments.

1. Mobile and agile, not fragile. Apple’s plethora of consumer and corporate devices makes for a powerful product portfolio with widespread appeal. Unlike many of its competitors, Apple is not dependent on a single product or market segment. Hence, when sales decline in one sector, the slippage is offset by another product, as we’ve seen with Mac notebooks picking up the slack for Mac desktops. This enables Apple to adjust both its technology plans and market focus accordingly, strengthening and insulating the company from cyclical downturns.

One of the hallmarks of Apple’s existence has been the ability to re-invent itself – not only changing with the times but keeping its fingers on the pulse of an often fickle public and anticipating what its users and the industry want. Apple is well positioned for both the near and intermediate term. It will have to stay focused, keep its edge and clearly communicate its strategy in order to maintain the same level of success it has achieved over the last 32 years.


Microsoft Pulls Out all the Stops for SQL Server 2008

Microsoft is pulling out all the stops to support SQL Server 2008 and keep the momentum going for its latest enhanced database offering. On September 29, the company will launch the SQL Server 2008 Experience, a year-long series of in-person events designed to introduce “350,000+ customers, partners and community members” to the new features and benefits of its database offering.

Additionally, Microsoft is touting the merits of SQL Server 2008 on a new Website: http://www.moresqlserver.com. And it also just released the results of the new Transaction Processing Performance Council (TPC) performance benchmark tests for Microsoft SQL Server 2008. The TPC ranked Microsoft SQL Server 2008 #1 on price/performance on servers using Intel’s new Dunnington x64 processors, and as the top performance leader using IBM’s new System x3950 M2 server.

There’s no doubt that SQL Server 2008 boasts greatly improved features, functions, scalability, security, management and reliability compared to the 2005 version, and a more powerful, robust and manageable SQL Server 2008 is a must for Microsoft. The company is going head to head with industry powerhouses including IBM’s DB2 and Oracle’s 11g database running on Linux. So 2009 is shaping up to be an extremely competitive and crucial year for database vendors and their respective customers.

At this point, Microsoft is a strong number three behind Oracle and IBM in the database arena, according to both Gartner Group and IDC. The latest statistics show Oracle with approximately 42% market share; IBM second with about 21% and Microsoft with an estimated 19% of the database market. The financial stakes are also high: Oracle’s database revenue is well over $7 billion; IBM realizes close to $3.5 billion from database sales and Microsoft SQL Server generates close to $3 billion in annual sales.

In order to retain its existing installed base and increase its presence – particularly among SMBs and large enterprises – Microsoft must hit the ground running with SQL Server 2008. There is no margin for error from either a technical or a marketing standpoint. Hence, Microsoft is marshalling all its forces.

SQL Server 2008 incorporates a slew of new management capabilities, such as policy management, configuration servers, a data collector/management warehouse and a multiple-server query capability. Such features are crucial for database administrators, particularly those in large enterprises who are charged with overseeing complex and geographically dispersed database environments that may include hundreds or thousands of physical and virtual servers encompassing tens of thousands of databases.

The SQL Server 2008 Policy Management feature enables database administrators to create and execute configuration policies against one or more servers while the Data Collector facility obviates the need for managers to create custom solutions to cull data from their database server environments.

Data Collector lets administrators utilize the SQL Server Agent and SQL Server Integration Services (SSIS) to create a framework that collects and stores data while delivering a detailed history of error handling, auditing, and collection.
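
To see why built-in multi-server facilities matter, consider the fan-out pattern administrators have traditionally scripted by hand: run one diagnostic query against every instance and collate the results. The sketch below illustrates that pattern in Python; the server names and the query are placeholders, and it assumes the pyodbc package and a Windows ODBC driver for SQL Server are installed.

```python
# Run one diagnostic query against several SQL Server instances -- the kind
# of fan-out SQL Server 2008's multiple-server query feature builds in.
# Server names and the query are placeholders.
import pyodbc

SERVERS = ["sqlprod01", "sqlprod02", "sqlreport01"]  # hypothetical instances
QUERY = "SELECT name, state_desc FROM sys.databases"  # sample diagnostic query

def query_all(servers, query):
    """Execute the same query on each server; return rows keyed by server."""
    results = {}
    for server in servers:
        conn = pyodbc.connect(
            f"DRIVER={{SQL Server}};SERVER={server};Trusted_Connection=yes",
            timeout=5)
        try:
            results[server] = conn.cursor().execute(query).fetchall()
        finally:
            conn.close()
    return results

if __name__ == "__main__":
    for server, rows in query_all(SERVERS, QUERY).items():
        print(f"{server}: {len(rows)} databases")
```

SQL Server 2008’s configuration servers and multiple-server query feature fold this collection and collation work into the management tools themselves, which is precisely the appeal for administrators overseeing hundreds of instances.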

Just as important as SQL Server 2008’s new management functions are the accompanying documentation and training that Microsoft is making available for the database platform via its Website, TechNet and its Software Assurance maintenance and upgrade program. Vendor rivalries aside, the chief impediments to users upgrading to any new software platform are the cost and complexity of the migration. These factors are even more crucial when weighed against the cost constraints of the current economic downturn. Microsoft’s TechNet provides SQL Server 2008 customers with ample, “at-your-fingertips” documentation and troubleshooting tips as they prepare to upgrade.

In addition, customers who have purchased Microsoft’s Software Assurance will be able to get significant discounts on training as well as access to e-learning tools. The combination of TechNet and Software Assurance can save IT departments and the corporation untold thousands or even millions of dollars in capital and operational expenditures and cut upgrade time by 25% to 65%, depending on the size and scope of the deployment. And in the event that any significant bugs or performance glitches arise, Microsoft must move quickly and decisively to publicly address the problems and issue the necessary patches without dissembling or temporizing.

Overall, Microsoft has assembled all of the necessary technology and business components to make SQL Server 2008 a winner. The latest Microsoft database has the performance, scalability and management to make the upgrade path easy. The excellent documentation and technical support offered by TechNet is also a plus. Companies worried about budgetary constraints (and who isn’t?) will also find monetary relief from the inherent value of the myriad Software Assurance benefits.

