
Windows 7 is a make-or-break release for Microsoft

The long-awaited successor to Windows XP and Windows Vista will ship several months earlier than planned, and expectations are high industry-wide.

Windows 7 is crucial to Microsoft's overarching software business and technology strategy for the next two years. Although it is an incremental upgrade rather than a major overhaul of the underlying Vista kernel, Windows 7 represents a pivotal release for both consumer and corporate customers.

Practically speaking, Windows 7 must do what Vista didn't: deliver near-seamless, plug-and-play integration and interoperability with the overwhelming majority of Microsoft and third-party applications, device drivers, utilities and hardware peripherals. As a standalone operating system (OS), Vista was fine. Unfortunately, there's no such thing as a standalone OS. Vista's lack of backwards compatibility with third-party software, and even file-format incompatibilities with Office 2007 and other Microsoft products, was a nightmare for corporations and consumers alike.

As a result, there is no margin for error. Windows 7 must fulfill users' expectations and their business and technology needs from the first day it ships; Microsoft will not get a second chance to make a good first impression. Failure could send customers running to rival desktop platforms like Apple's Mac OS X 10.x and Linux distributions, or even to online options such as those being pitched by Google. And if Windows 7 does not deliver the features, integration, interoperability and reliability Microsoft is promising, it may well create a domino effect that adversely impacts upcoming releases of related products like Exchange Server and the Office platform.

Integration and interoperability are the most important criteria, besting even cost, when it comes to choosing a new technology. ITIC's May 2009 Application Availability survey of 300 businesses worldwide found that 60% of businesses said integration and interoperability with existing and legacy applications top the list of "must have" items in new software application and operating system purchases. Cost came in a close second at 56%, followed by ease of use and installation (55%).

The stakes for Windows 7 are also high because of intensified competition. Rumors abound that Microsoft pushed up the release date by at least three months so that Windows 7 hits the streets in advance of the low-cost netbook version of Google's Android. Microsoft also faces increased competition from its decades-old rival Apple. During the past two years, Apple's Mac OS X 10.x running on Apple's Intel-based proprietary hardware has been making a comeback in corporate enterprises. Apple products do not represent a significant threat to Microsoft's corporate desktop dominance, but they can nibble at the fringes, potentially dilute momentum for Windows 7 and take some market share. In this ongoing global economic downturn, no vendor wants to concede any revenue, or even a percentage point of market share.

Microsoft of course is acutely aware of these issues. In recent months, company CEO Steve Ballmer and Senior Vice President Bill Veghte have publicly stated that users were stymied by the incompatibility issues they encountered with Vista. They intend to avoid those problems with Windows 7.

Fortunately for Microsoft, many factors work in Windows 7's favor. They include:

  • Pent-up demand. To date, only 10% of the 700 respondents to ITIC's 2009 Global IT and Technology Trends Global Deployment Survey have deployed Vista as their company's primary desktop operating system; Windows XP remains the primary desktop OS for 89% of respondents. Nearly half (45%) of the survey respondents indicated they would skip Vista and migrate directly from XP to Windows 7. The main reasons were cost constraints associated with the bearish economy and reluctance to undertake a complex OS upgrade amid manpower constraints. The prevailing sentiment among businesses is that they can afford to wait because Windows XP adequately met their business and technology needs over the last two years. ITIC believes this bodes well for Windows 7 deployments in the short and intermediate term. If 20% of the installed base of legacy Windows XP users migrate, or indicate their intention to upgrade, to Windows 7 within the first three or four months of shipment, Microsoft will be well-positioned. There is a reasonable likelihood of this, provided Windows 7 delivers the goods, and the advance word from customers interviewed by ITIC is generally positive.
  • New feature set. Windows 7 will come in six versions, but to minimize the confusion that accompanied the Vista launch, only the Home Premium and Professional editions will be widely sold in retail outlets. Versions designed for enterprise use or developing nations will be aggressively marketed to those specific accounts and geographic regions, taking the guesswork out of purchasing. Most importantly, Microsoft says every version will include all of the capabilities and features of the edition below it, which will help minimize upgrade woes. Corporations and consumers that want a more feature-rich version of Windows 7 can use Windows Anytime Upgrade to purchase the upgrade online and unlock the features of those editions from their desktops. ITIC interviewed several dozen Windows 7 beta users over the last several months, and an overwhelming nine out of 10 expressed satisfaction with improvements in many of Windows 7's core capabilities compared to both Windows XP and Vista: a faster boot sequence, better reliability, improved security, a much faster and more comprehensive search engine, and more flexible configuration options. Additionally, Microsoft bolstered the inherent security of Windows 7 with the DirectAccess and BitLocker To Go features. DirectAccess is designed to give remote, traveling and telecommuting workers the same secure connectivity as local users by establishing a VPN "tunnel" to their corporate networks. BitLocker To Go extends the data encryption introduced in Vista to removable storage devices such as USB thumb drives. Users can employ a password or a smart card with a digital certificate to unlock and access their data, and the devices can be used on any other Windows 7-based machine with the correct password. Users can also read, but not modify, data on older Windows XP and Vista systems.
  • Economical and flexible licensing contracts. Finally, the terms and conditions of Windows 7 licensing contracts promise to make upgrades easier on corporate IT budgets. In February, Microsoft said it would provide a license that allows customers to move directly from Windows XP to Windows 7. There is a caveat, though: users will have to wipe their hard drives and perform a clean install, so technically it's not an upgrade. Microsoft has not yet released pricing details for Windows 7, but ITIC believes the upgrade license will most likely cost 20% to 40% less than a new license. Corporations that purchased Microsoft's Software Assurance maintenance and upgrade plan as a standalone product, or received it as part of their Enterprise Agreement (EA) licenses, are entitled to free upgrades to Windows 7 since it is an incremental release. Additionally, to make life easier for users (and to engender goodwill), Microsoft is letting the Release Candidate (RC) free trial license for Windows 7 last a full year, until June 2010. And users looking for a discounted version of Windows 7 to run on low-cost minis or netbooks should take note: Microsoft and Intel have agreed that for a device to be considered a netbook, the screen must not exceed 10.2 inches. Previously, Microsoft allowed customers to get the Windows XP or Vista discount for devices with screens as large as 12.1 inches.

In summary, all indications are that Microsoft has learned from its Vista mistakes. As a result, businesses and consumers stand ready to reap significant benefits in compatibility, features, pricing and licensing with Windows 7.


Application Availability, Reliability and Downtime: Ignorance is NOT Bliss

Two out of five businesses (40%) report that their major business applications require higher availability rates than they did two or three years ago. However, an overwhelming 81% are unable to quantify the cost of downtime, and only a small 5% minority of businesses are willing to spend whatever it takes to guarantee the highest levels of application availability (99.99% and above). Those are the results of the latest ITIC survey, which polled C-level executives and IT managers at 300 corporations worldwide.

ITIC partnered with Stratus Technologies of Maynard, Mass., a vendor that specializes in high-availability and fault-tolerant hardware and software solutions, to compose the Web-based survey. ITIC conducted this blind, non-vendor- and non-product-specific survey, which polled businesses on their application availability requirements, virtualization deployments and the compliance rate of their service level agreements (SLAs). None of the respondents received any remuneration. The Web-based survey consisted of multiple choice and essay questions, and ITIC analysts also conducted two dozen first-person customer interviews to obtain detailed anecdotal data.

Respondents ranged from SMBs with 100 users to very large enterprises with over 100,000 end users. Industries represented included academic, advertising, aerospace, banking, communications, consumer products, defense, energy, finance, government, healthcare, insurance, IT services, legal, manufacturing, media and entertainment, telecommunications, transportation, and utilities. The respondents hailed from 15 countries; 85% were based in North America.

Survey Highlights

The survey results uncovered many "disconnects" between the levels of application reliability that corporate enterprises profess to need and the availability rates their systems and applications actually deliver. Additionally, a significant portion of the survey respondents had difficulty defining what constitutes high application availability, did not specifically track downtime, and could not quantify or qualify the cost of downtime and its impact on their network operations and business.

Among the other survey highlights:

  • A 54% majority of IT managers and executives surveyed said more than two-thirds of their companies' applications require the highest level of availability – 99.99%, or four nines of uptime.
  • Over half – 52% – of survey respondents said that virtualization technology increases application uptime and availability; only 4% said availability decreased as a result of virtualization deployments.
  • In response to the question, "Which aspect of application availability is most important to the business?", 59% of those polled cited the prevention of unplanned downtime as most crucial; 40% said disaster recovery and business continuity; 38% said minimizing planned downtime to apply patches and upgrades; 16% said the ability to meet SLAs; and 40% of the survey respondents said all of the choices were equally crucial to their business needs.
  • Some 41% said they would be satisfied with conventional 99% to 99.9% (two or three nines) availability for their most critical applications. Neither 99% nor 99.9% qualifies as a high-availability or continuous-availability level.
  • An overwhelming 81% of survey respondents said the number of applications that demand high availability has increased in the past two-to-three years.
  • Of those who said they have been unable to meet service level agreements (SLAs), 72% can’t or don’t keep track of the cost and productivity losses created by downtime.
  • Budgetary constraints are a gating factor prohibiting many organizations from installing software solutions that would improve application availability. Overall, 70% of the survey respondents either lacked the funds to purchase value-added availability solutions (40%) or were unsure how much, or whether, their companies would spend to guarantee application availability (30%).
  • Of the 30% of businesses that quantified how much their firms would spend on availability solutions, 3% indicated they would spend $2,000 to $4,000; 8% said $4,000 to $5,000; another 3% said $5,000 to $10,000; 11%, mainly large enterprises, indicated they were willing to allocate $10,000 to $15,000 to ensure application availability; and 5% said they would spend "whatever it takes."
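For context, each "nine" of availability maps to a fixed annual downtime budget; the arithmetic behind the figures above can be sketched as follows (the helper function name is ours, for illustration):

```python
# Translate an availability percentage into its annual downtime budget.
MINUTES_PER_YEAR = 365 * 24 * 60  # 525,600 minutes

def downtime_minutes_per_year(availability_pct):
    """Minutes of downtime per year permitted by a given availability level."""
    return MINUTES_PER_YEAR * (100.0 - availability_pct) / 100.0

for pct in (99.0, 99.9, 99.99):
    print(f"{pct}%: {downtime_minutes_per_year(pct):,.1f} minutes/year")
```

By this math, the four nines (99.99%) that a 54% majority say they need allows roughly 53 minutes of downtime a year, while the 99% some respondents would accept permits more than three and a half days.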

According to the survey findings, just under half of all businesses – 49% – lack the budget for high availability technology, and 40% of the respondents reported they don't understand what qualifies as high availability. An overwhelming eight out of 10 IT managers – 80% – are unable to quantify the cost of downtime to their C-level executives.

To reiterate, the ITIC survey polled users on the various aspects and impact of application availability and downtime but it did not specify any products or vendors.

The survey results, supplemented by ITIC's first-person interviews with IT managers and C-level executives, clearly show that, on a visceral level, businesses are aware that the need for increased application availability has grown. This is particularly true in light of new technologies like application and desktop virtualization, cloud computing and Service Oriented Architecture (SOA). The fast-growing population of remote, mobile and telecommuting end users who rely on unified communications and collaboration applications and utilities is also spurring the need for greater application availability and reliability.

High Application Availability Not a Reality for 80% of Businesses

The survey results clearly show that network uptime isn’t keeping pace with the need for application availability. At the same time, IT managers and C-level executives interviewed by ITIC did comprehend the business risks associated with downtime, even though most are unable to quantify the cost of downtime or qualify the impact to the corporation, its customers, suppliers and business partners when unplanned application and network outages occur.

“We are continually being asked to do more with less,” said an IT manager at a large enterprise in the Northeast. “We are now at a point, where the number of complex systems requiring expert knowledge has exceeded the headcount needed to maintain them … I am dreading vacation season,” he added.

Another executive, at an application service provider, acknowledged that even though his firm's SLA guarantees to customers are a modest 98%, it has, on occasion, been unable to meet those goals. The executive said his firm compensated clients for a significant outage incident. "We had a half day outage a couple of years ago which cost us in excess of $40,000 in goodwill payouts to a handful of our clients, despite the fact that it was the first outage in five years," he said.

Another user said a lack of funds prevented his firm from allocating capital expenditure monies to purchase solutions that would guarantee 99.99% application availability. “Our biggest concern is keeping what we have running and available. Change usually costs money, and at the moment our budgets are simply in survival mode,” he said.

Another VP of IT at a New Jersey-based business said that ignorance is not bliss. "If people knew the actual dollar value their applications and customers represent, they'd already have the necessary software availability solutions in place to safeguard applications," he said. "Yes, it does cost money to purchase application availability solutions, but we'd rather pay now than wait for something to fail and pay more later," the VP of IT said.

Overall, the survey results show that most organizations lack valid metrics and cost formulas to track and quantify what uptime means to them; their tracking efforts are woefully inadequate, and many corporations are courting disaster.

ITIC advises businesses to track downtime and its actual cost to the organization, and to take the necessary steps to qualify the impact of downtime, including lost data and potential liability risks (e.g., lost business, lost customers, potential lawsuits and damage to the company's reputation). Once a company can quantify the amount of downtime associated with its main line-of-business applications, its impact and the risk to the business, it can make an accurate assessment of whether its current IT infrastructure adequately supports the degree of application availability the corporation needs to maintain its SLAs.
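As a starting point for the quantification ITIC recommends, downtime cost can be approximated with simple arithmetic; the sketch below is a minimal model (the function name and all the sample figures are hypothetical illustrations, not survey data):

```python
def downtime_cost(outage_hours, revenue_per_hour, affected_employees, loaded_hourly_rate):
    """Rough cost of an outage: lost revenue plus idle-labor cost.
    Intangibles (lost customers, reputation, liability) would be added on top."""
    lost_revenue = outage_hours * revenue_per_hour
    idle_labor = outage_hours * affected_employees * loaded_hourly_rate
    return lost_revenue + idle_labor

# Hypothetical example: a 4-hour outage at a firm doing $10,000/hour in revenue,
# idling 50 employees at a $60/hour loaded rate.
print(downtime_cost(4, 10_000, 50, 60))  # → 52000
```

Even this crude model gives IT managers a defensible number to put in front of C-level executives, which is precisely what 80% of them told ITIC they cannot do today.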


IBM Charts Green, Energy Efficient Course with Dynamic Infrastructure Initiatives

These days just about every high technology vendor is "keen to be green." However, few vendors can match IBM's pioneering efforts and long-term commitment to energy-efficient solutions that are both good for the planet and good for recession-racked enterprises.

This week, IBM took another giant step in its green data center efforts, officially launching its Dynamic Infrastructure for Energy Efficiency initiative: a comprehensive, compelling set of new hardware, software and services offerings designed to help customers build, manage and maintain more energy-efficient infrastructures.

IBM's Managing Dynamic Infrastructure for Energy Efficiency initiative serves as a blueprint for vendors and corporate customers to emulate in their respective efforts to reduce power consumption, utility costs and carbon footprints in pursuit of greater system, application and network equipment economies of scale.

Declaring that "Environmental sustainability is an imperative for 21st Century business," Rich Lechner, IBM's VP of Energy & Environment, outlined IBM's ambitious plan. Lechner and Chris O'Connor, VP of Tivoli Strategy, Product Management and SWG Green, said that Big Blue worked with some 3,200 customers over the past two years to construct and validate metrics on energy usage and costs. Among the key findings from these efforts:

  • IT energy expenses are expected to increase 35% between 2009 and 2013
  • An overwhelming 80% of CEOs expect climate change regulations in five years
  • Buildings account for 40% of worldwide energy consumption

The company's new products and services are the product of years of primary research and extensive research and development (R&D), in which the company has spared no effort or expense in its quest to "go green" and assist its customers. The initiative addresses the full spectrum of Green IT issues, including conservation, pollution prevention, consolidation and regulatory compliance for physical devices and facilities, as well as the use of renewable energy sources.

Managing Dynamic Infrastructure for Energy Efficiency

IBM's Managing Dynamic Infrastructure for Energy Efficiency calls for corporations to build Green Infrastructures, Sustainable Solutions and Intelligent Systems. The plan is backed by a wide array of product offerings such as Tivoli Monitoring for Energy Management and enhancements to the existing Tivoli Business Service Manager. IBM is offering customers a free trial of Tivoli Monitoring for Energy Management.

The Tivoli Energy Management solution is supported by IBM hardware and IBM Global Services. The latter includes chargeback and accounting services and the ability to demonstrate to customers how to optimize assets (plant and facilities) and improve energy usage.

On the hardware front, IBM is embedding new energy-management capabilities in its x86 servers; through consolidation, customers can achieve an astounding 95% reduction in power consumption compared to servers built three or four years ago.

IBM also offers a Green Infrastructure ROI analysis tool, an interactive Web-based assessment tool that provides businesses with benchmarks on green/energy-efficiency performance along with specific recommendations to reduce energy consumption.

IBM also has a full set of services offerings to assist corporations in reviewing their current consumption and infrastructure and constructing customized plans for Green IT. IBM also has agreements in place with a number of technology partners – including Novell and Thunderhead – to deliver solutions that are certified to reduce environmental impact.

Going Green is Good Business

According to Lechner and O'Connor, Green IT initiatives will yield tangible benefits, though actual dollar savings will vary according to the business and its specific cost-cutting efforts. IBM customer Care2, for instance, cut energy consumption by 70% and reduced energy usage by 340 megawatt-hours through proactive management. Another enterprise customer, Nationwide Insurance, anticipates it will save $15 million over the next three years, including an 85% to 90% reduction in server utilization rates via virtualization and an 80% decrease in its environmental costs.

Not surprisingly, Lechner and O’Connor said that IBM practices what it preaches: IBM’s Austin facility achieved a 150% capacity increase while simultaneously cutting energy consumption by 25%. Those figures were good enough for the EPA to rank IBM’s Austin facility number 31 on its list of Greenest hardware vendors.

“Four years ago when we worked with clients [regarding energy efficiency] the discussion was academic,” Lechner said. “Now they want IBM to help them with Proof of Concept (POC) initiatives. The ROI for Green IT is two years or less,” he added.

Analysis

IBM's Managing Dynamic Infrastructure for Energy Efficiency is the real deal. It is the result of years of dedication and commitment, and it shows. As one of the founding developers of the Electronic Industry Code of Conduct (EICC) in 2004, IBM has always backed its words with action. The EICC is a code of best practices adopted and implemented by some of the world's major electronics brands and their suppliers; its goal is to improve conditions in the electronics supply chain.

It is well known and well documented that demand for Green desktop and server hardware and services will increase significantly over the next one to five years. Governments, states, municipalities and utility firms now offer consumers and businesses a mixture of incentives and mandates to reduce costs and power consumption, and to produce hardware whose material components won't poison the planet when it comes time to discard or recycle it.

Green IT initiatives are rising sharply, and it's easy to see why. The energy used to process and route server requests and transactions will exceed 100 billion kilowatt-hours (kWh) at an annual cost of $7.4 billion by 2011, according to the Environmental Protection Agency (EPA). PCs and servers are currently the biggest energy hogs, consuming 60% of peak power even when idle, double the energy servers used in 2006.
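The EPA projection above implies an average rate of about $0.074 per kWh ($7.4 billion divided by 100 billion kWh); a back-of-the-envelope sketch using that implied rate (the 500 W server draw is an assumed figure for illustration, not an EPA number):

```python
# Implied electricity rate from the EPA projection cited above.
IMPLIED_RATE = 7.4e9 / 100e9  # dollars per kWh, roughly 0.074

def annual_energy_cost(avg_draw_watts, rate_per_kwh=IMPLIED_RATE):
    """Yearly electricity cost of a device running 24x7 at a given average draw."""
    kwh_per_year = avg_draw_watts / 1000.0 * 24 * 365
    return kwh_per_year * rate_per_kwh

# A server averaging an assumed 500 W around the clock:
print(f"${annual_energy_cost(500):,.2f} per year")  # prints "$324.12 per year"
```

Multiplied across the hundreds or thousands of servers in a typical data center, figures like these make the consolidation savings IBM cites easy to believe.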

Corporations have a choice: go green voluntarily or be compelled to do so by a slew of new regulations now being written into law. For example, one of the mandates of the Green Building Act of 2006 requires that commercial buildings in Washington, D.C. larger than 50,000 sq. ft. meet or exceed New Construction standards by 2012. Other measures are voluntary, like the Energy Policy Act of 2005, which allows building owners to realize a tax savings of $1.80 per sq. ft. for new commercial buildings that reduce regulated energy use by 50%.

ITIC’s own survey data indicates that 74% of corporate data centers face limitations and constraints on space, power consumption and the rising costs associated with energy and physical plant leasing/rentals. The obvious solution is to cut energy consumption and utility costs, which in turn, reduce carbon emissions and cut the greenhouse gases.

IBM’s Managing Dynamic Infrastructure for Energy Efficiency initiative is a well-conceived and powerful set of products and services. It solidifies IBM’s reputation and position as an energy efficiency pioneer. Few vendors can match IBM in this area. IBM is well positioned to help corporations achieve their goals of cutting costs, consolidating server hardware and physical plant space and ultimately becoming carbon neutral. Corporations are urged to examine IBM’s products and services and test them for themselves.


Apple Gets More Entrenched in the Enterprise

Apple Macintosh Enterprise Usage Continues to Grow

Apple Mac and OS X 10.x continue to make inroads in the enterprise.

ITIC’s 2009 Global IT and Technology Trends Survey shows that corporate enterprises continue to embrace the Apple Mac and OS X 10.x server operating system in numbers not seen since the late 1980s. ITIC polled IT managers and C-level executives at 700 corporations worldwide. Among the survey highlights:

  • Over two-thirds of the 700 survey respondents – 68% — indicated they are likely to allow their end users to deploy Macs as their corporate enterprise desktops in the next 12 months.
  • Almost one-quarter (23%) have a significant number of Macintoshes (more than 50) present in their organizations. Apple Macs have long been a favorite of company executives, but the survey responses clearly indicate that Mac usage has filtered down to rank-and-file knowledge workers across the enterprise.
  • Half of all the survey respondents – 50% – said they plan to increase integration with existing Apple consumer products such as the iPhone to allow users to access corporate Email and other applications. This augurs well for the iPhone's prospects of establishing itself as a viable alternative to Research In Motion's (RIM) BlackBerry as a mobile device for accessing Email and other collaboration applications.

In summary, the ITIC/Sunbelt survey responses show that businesses will find themselves challenged to do more with fewer resources. The respondents also exhibited practicality and resourcefulness in extending the lifespan of still-useful technologies like Windows XP. However, those with the need and the budget will get an able assist from emerging technologies like virtualization and, for those that correctly configure and deploy them, Vista and the Mac with OS X 10.x.


ITIC Survey Indicates 35% of Companies Will Delay Network Upgrades for Lack of Money

Server hardware, network infrastructure and storage upgrades are hardest hit; 97% of security upgrades are on course; nearly 40% of companies report their migrations will proceed on schedule.

BOSTON, MA (February 2, 2009) — Information Technology Intelligence Corporation (ITIC), a high-tech research and consulting firm, today announced that the global economic downturn will force 35% of corporations to delay or abandon crucial network upgrades during 2009.

The latest joint survey conducted by ITIC and Sunbelt Software polled over 700 C-level executives and IT managers at 700 corporations worldwide. The results showed that budgetary constraints and IT staffing issues topped users’ list of most daunting business challenges in the year ahead. The corporate respondents indicated they are understandably cautious about spending their precious capital expenditure monies and are only committing to crucial upgrades on an “as needed” basis.

Among the key survey findings:

  • Over one-third of the corporate respondents — 35% — said that the ongoing economic downturn had caused their companies to delay or abandon planned software, hardware and network infrastructure upgrades. However, an additional 26% of those polled — over one-quarter of companies — indicated they may yet be forced to shelve crucial migration plans due to lack of funds and a dearth of trained IT staff.
  • Of the 35% of companies that indicated they will delay or abandon certain planned upgrades, the projects most affected are server hardware (21%), network infrastructure products such as routers (19%), and storage devices (15%).
  • Security remains the sole market segment that appears to be immune to the global economic downturn. An overwhelming 97% majority of the survey respondents said their security upgrades will proceed as planned, with only a very small 3% minority indicating they will defer security upgrades.
  • Overall, 39% of the survey respondents — nearly two out of five businesses — reported that their network migration and upgrade plans will proceed as planned in calendar 2009.
  • Some 27% of companies — or about three out of 10 businesses — reported their 2009 IT budgets will decrease; another 32% said their budgets will remain the same as 2008. Only 16% of the survey respondents reported their IT budgets will increase during the next 12 months.
  • Of the 16% of corporations that said budgets will increase, the largest portion, 23%, said the budget increases would be modest, ranging from 5% to 15%, while 8% reported their IT budgets would rise minimally, by 1% to 5%. Large budget increases will be a rarity in 2009: only 1% of companies will see budgets go up by 20% to 30%, and 3% will see IT budgets increase by more than 30%.

Survey Methodology and Background

The Web-based survey included multiple choice and essay responses. In addition, ITIC and Sunbelt conducted two dozen first-person customer interviews to validate the survey responses. ITIC and Sunbelt received no vendor sponsorship for this research, and none of the survey respondents received any remuneration for their participation. Approximately 85% of the respondents came from North America; the remaining 15% came from 20 countries across Europe, Asia, Australia, New Zealand and South America.

About Information Technology Intelligence Corporation (ITIC)

ITIC, founded in 2002, is a research and consulting firm based in suburban Boston. It provides primary research on a wide variety of technology topics for vendors and enterprises. ITIC's mission is to help its clients make sense of technology and business events and to provide tactical, practical and actionable advice. For more information visit ITIC's website at https://itic-corp.com.

About Sunbelt Software

Sunbelt Software was founded in 1994 and is a leading provider of Windows security and management software with product solutions in the areas of antispam and antivirus, antispyware, and vulnerability assessment. Leading products include the CounterSpy and VIPRE product lines. For more information, visit the company’s website at http://sunbeltsoftware.com.


Microsoft Pulls Out all the Stops for SQL Server 2008

Microsoft is pulling out all the stops to support SQL Server 2008 and keep the momentum going for its latest enhanced database offering. On September 29, the company will launch the SQL Server 2008 Experience, a year-long series of in-person events designed to introduce “350,000+ customers, partners and community members” to the new features and benefits of its database offering.

Additionally, Microsoft is touting the merits of SQL Server 2008 on a new Web site: http://www.moresqlserver.com. It has also just released the results of new Transaction Processing Performance Council (TPC) performance benchmark tests for Microsoft SQL Server 2008. The TPC ranked Microsoft SQL Server 2008 #1 in price/performance on servers using Intel's new Dunnington x64 processors, and as the top performance leader using IBM's new System x3950 M2 server.

There’s no doubt that SQL Server 2008 boasts greatly improved features, functions, scalability, security, management and reliability compared to the 2005 version, and a more powerful, robust and manageable SQL Server 2008 is a must for Microsoft. The company is going head to head with industry powerhouses including IBM’s DB2 and Oracle’s 11g database running on Linux. So 2009 is shaping up to be an extremely competitive and crucial year for database vendors and their respective customers.

At this point, Microsoft is a strong number three behind Oracle and IBM in the database arena, according to both Gartner Group and IDC. The latest statistics show Oracle with approximately 42% market share; IBM is second with about 21%; and Microsoft holds an estimated 19% of the database market. The financial stakes are also high: Oracle’s database revenue is well over $7 billion; IBM realizes close to $3.5 billion from database sales; and Microsoft SQL Server generates close to $3 billion in annual sales.

In order to retain its existing installed base and increase its presence, particularly among SMBs and large enterprises, Microsoft must hit the ground running with SQL Server 2008. There is no margin for error from either a technical or a marketing standpoint. Hence, Microsoft is marshalling all its forces.

SQL Server 2008 incorporates a slew of new management capabilities, including policy management, configuration servers, a data collector/management data warehouse, and multiple-server queries. Such features are crucial for database administrators, particularly those in large enterprises who are charged with overseeing complex and geographically dispersed database environments that may include hundreds or thousands of physical and virtual servers encompassing tens of thousands of databases.

The SQL Server 2008 Policy Management feature enables database administrators to create and execute configuration policies against one or more servers while the Data Collector facility obviates the need for managers to create custom solutions to cull data from their database server environments.
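As a rough illustration of how administrators work with these policies, the sketch below queries the Policy-Based Management metadata that SQL Server 2008 stores in the msdb database; it assumes the `syspolicy_policies` and `syspolicy_conditions` catalog views shipped with that release, and is a minimal read-only example rather than a full policy definition:

```sql
-- Hedged sketch: list the policies defined on a SQL Server 2008 instance,
-- together with the condition each policy evaluates.
-- Assumes the msdb Policy-Based Management catalog views are present.
USE msdb;

SELECT p.name       AS policy_name,
       c.name       AS condition_name,
       p.is_enabled
FROM dbo.syspolicy_policies   AS p
JOIN dbo.syspolicy_conditions AS c
  ON p.condition_id = c.condition_id
ORDER BY p.name;
```

In practice, administrators typically author and evaluate these policies through SQL Server Management Studio rather than raw T-SQL, but the underlying metadata remains queryable as shown.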

Data Collector lets administrators utilize the SQL Server Agent and SQL Server Integration Services (SSIS) to create a framework that collects and stores data while delivering a detailed history of error handling, auditing, and collection.
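The Data Collector workflow described above can be sketched in T-SQL against the msdb database. This is a hedged example, not a complete setup: it assumes a SQL Server 2008 instance where the management data warehouse has already been configured, and the `@collection_set_id` value shown is illustrative, since ids vary per installation:

```sql
-- Hedged sketch: inspect the collection sets defined on this instance,
-- then start one of them. Assumes msdb's Data Collector objects exist
-- and the management data warehouse has been configured.
USE msdb;

-- See which collection sets exist and whether they are running
SELECT collection_set_id, name, is_running
FROM dbo.syscollector_collection_sets;

-- Start a collection set (the id 1 is illustrative only)
EXEC dbo.sp_syscollector_start_collection_set @collection_set_id = 1;
```

Once a collection set is running, its gathered data accumulates in the management data warehouse, where the standard Data Collector reports can be run against it.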

Just as important as SQL Server 2008’s new management functions are the accompanying documentation and training that Microsoft is making available for the database platform via its Website, TechNet and its Software Assurance maintenance and upgrade program. Vendor rivalries aside, the chief impediments to users upgrading to any new software platform are the cost and complexity of the migration. These factors are even more crucial when weighed against the cost constraints of the current economic downturn. Microsoft’s TechNet provides SQL Server 2008 customers with ample, “at-your-fingertips” documentation and troubleshooting tips as they prepare to upgrade.

In addition, customers who have purchased Microsoft’s Software Assurance will be able to get significant discounts on training as well as access to e-learning tools. The combination of TechNet and Software Assurance can save IT departments and their corporations thousands to millions of dollars in capital and operational expenditures and cut upgrade time by 25% to 65%, depending on the size and scope of the deployment. And in the event that any significant bugs or performance glitches arise, Microsoft must move quickly and decisively to publicly address the problems and issue the necessary patches without dissembling or temporizing.

Overall, Microsoft has assembled all of the necessary technology and business components to make SQL Server 2008 a winner. The latest Microsoft database has the performance, scalability and management to make the upgrade path easy. The excellent documentation and technical support offered by TechNet is also a plus. Companies worried about budgetary constraints (and who isn’t?) will also find monetary relief from the inherent value of the myriad Software Assurance benefits.
