
Virtualization Deployments Soar, But Companies Prefer Terra Firma to Cloud for now

The ongoing buzz surrounding cloud computing – particularly public clouds – is far outpacing actual deployments by mainstream users. To date only 14% of companies have deployed or plan to deploy a private cloud infrastructure within the next two calendar quarters.
Instead, as businesses slowly recover from the ongoing economic downturn, their most immediate priorities are to upgrade legacy desktop and server hardware, replace outmoded applications and expand their virtualization deployments. Those are the results of the latest ITIC 2010 Virtualization and High Availability survey, which polled C-level executives and IT managers at 400 organizations worldwide.
ITIC partnered with Stratus Technologies and Sunbelt Software to conduct the Web-based survey of multiple choice questions and essay comments. ITIC also conducted first-person interviews with over two dozen end users to obtain anecdotal responses on the primary accelerators or impediments to virtualization, high availability and reliability, and cloud computing. The survey also queried customers on whether or not their current network infrastructure and mission critical applications were adequate to handle new technologies and the increasing demands of the business.
The survey showed that for now, at least, although many midsized and large enterprises are contemplating a move to the cloud – especially a private cloud infrastructure – the technology and business model are still not essential for most businesses. Some 48% of survey participants said they have no plans to migrate to a private cloud architecture within the next 12 months, while another 33% said their companies are studying the issue but have no firm plans to deploy.

The study also indicates that private cloud deployments are outpacing public cloud deployments by a 2 to 1 margin. However, before businesses can begin to consider a private cloud deployment they must first upgrade the “building block” components of their existing environments, e.g., server and desktop hardware, WAN infrastructure, storage, security and applications. Only 11% of businesses described their server and desktop hardware as leading edge or state-of-the-art. And just 8% of respondents characterized their desktop and application environment as leading edge.

The largest proportion of the survey participants – 52% – described their desktop and server hardware as working well, while 48% said their applications were up-to-date. However, 34% acknowledged that some of their server hardware needed to be updated. A higher percentage of users – 41% – admitted that their mission critical software applications were due to be refreshed. And a small 3% minority said that a significant portion of both their hardware and mission critical applications were outmoded and adversely impacting the performance and reliability of their networks.

Based on the survey data and customer interviews, ITIC anticipates that from now until October, companies’ primary focus will be on infrastructure improvements.

Reliability and Uptime Lag

The biggest surprise in this survey, compared with the 2009 High Availability and Fault Tolerant survey that ITIC and Stratus conducted nearly one year ago, was the decline in the number of survey participants who said their organizations required 99.99% uptime and reliability. In this latest survey, the largest portion of respondents – 38%, or nearly 4 out of 10 businesses – said that 99.9% uptime, the equivalent of 8.76 hours of downtime per server, per annum, was the minimum acceptable amount for their mission critical line of business (LOB) applications. This is more than three times the 12% of respondents who said that 99.9% uptime was acceptable in the prior 2009 survey. Overall, 62%, or nearly two-thirds of survey participants, indicated their organizations are willing to live with higher levels of downtime than were considered acceptable in previous years.
Some 39% of survey respondents – almost 4 out of 10 – indicated that their organizations demand high availability, which ITIC defines as four nines of uptime or greater. Specifically, 27% said their organizations require 99.99% uptime; another 6% need 99.999% uptime and a 3% minority require the highest 99.9999% level of availability.
The customer interviews found that the ongoing economic downturn, aged/aging network infrastructures (server and desktop hardware and older applications), layoffs, hiring freezes and the new standard operating procedure (SOP) of “do more with less” have made 99.9% uptime more palatable than in previous years.
Those firms that do not keep track of the number and severity of their outages have no way of gauging the financial and data losses to the business. Even a cursory comparison indicates substantial cost disparities between 99% uptime and 99.9% uptime. The monetary costs, business impact and risks associated with downtime will vary by company, as will the duration and severity of individual outage incidents. However, a small or midsize business, for example, which estimates the hourly cost of downtime to be a very conservative $10,000 per hour, would potentially incur losses of $876,000 per year at a data center with 99% application availability (87.6 hours of annual downtime). By contrast, a company whose data center operations deliver 99.9% uptime would incur losses of $87,600, or one-tenth that of a firm with conventional 99% availability.
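The downtime arithmetic follows directly from the 8,760 hours in a year. As a quick sanity check, a short calculation (using the article's illustrative $10,000-per-hour figure) reproduces the losses at each availability level:

```python
HOURS_PER_YEAR = 24 * 365  # 8,760 hours

def annual_downtime_cost(availability: float, hourly_cost: float = 10_000) -> float:
    """Estimated annual downtime cost for a given availability fraction."""
    downtime_hours = HOURS_PER_YEAR * (1 - availability)
    return downtime_hours * hourly_cost

for level in (0.99, 0.999, 0.9999):
    print(f"{level:.2%} uptime -> ${annual_downtime_cost(level):,.0f} per year")
# 99.00% uptime -> $876,000 per year
# 99.90% uptime -> $87,600 per year
# 99.99% uptime -> $8,760 per year
```

Each additional "nine" of availability cuts the expected annual loss by a factor of ten, which is why the gap between 99% and 99.9% is so stark.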
Ironically, the need for rock-solid network reliability has never been greater. The rise of Web-based applications and new technologies like virtualization and Service Oriented Architecture (SOA), as well as the emergence of public or shared cloud computing models are designed to maximize productivity. But without the proper safeguards these new datacenter paradigms may raise the risk of downtime. The Association for Computer Operations Management/ Data Center Institute (AFCOM) forecasts that one-in-four data centers will experience a serious business disruption over the next five years.
At the same time, customer interviews revealed that over half of all businesses – 56% – lack the budget for high availability technology. Another ongoing challenge is that 78% of survey participants acknowledged that their companies either lack the skills or simply do not attempt to quantify the monetary and business costs associated with hourly downtime. The reasons for this are well documented. Some organizations don’t routinely do this, and those that attempt to calculate costs and damages run into difficulties collecting data because the data resides with many individuals across the enterprise. Inter-departmental communication, cooperation and collaboration are sorely lacking at many firms. Only 22% of survey respondents were able to assign a specific cost to one hour of downtime, and most of them gave conservative estimates of $1,000 to $25,000 for a one hour network outage. Only 13% of the 22% of survey participants who were able to quantify the cost of downtime indicated that their hourly losses would top $175,000.

Users Confident and Committed to Virtualization Technology
The news was more upbeat with respect to virtualization – especially server virtualization deployments. Organizations are both confident and comfortable with virtualization technology.
Some 72% of respondents indicated that the number of desktop and server-based applications demanding high availability has increased over the past two years. The survey also found that a 77% majority of participants run business critical applications on virtual machines. Not surprisingly, the survey data showed that virtualization usage will continue to expand over the next 12 months. A 79% majority – approximately eight out of 10 respondents – said the number of business critical applications running on virtual machines and virtual desktops will increase significantly over the next year. Server virtualization is very much a mainstream and accepted technology. The responses to this question indicate increased adoption as well as confidence. Nearly one-quarter of the respondents – 24% – say that more than 75% of their production servers are VMs. Overall, 44% of respondents say that over 50% of their servers are VMs. However, none of the survey participants indicate that 100% of their servers are virtualized. Additionally, only 6% of survey resp


Networks Without Borders Raise Security, Management Issues

“Networks without Borders” are rapidly becoming the rule rather than the exception.
The demand for all access all the time, along with the rapid rise in remote, telecommuting, part time and transient workers, has rendered network borders obsolete and made networks extremely porous. Today’s 21st Century networks more closely resemble sieves than citadels.
Gone are the days when employees and data resided safely behind the secure confines of the firewall, clocked in promptly at 9:00 a.m., sat stationary in front of their computers, never accessed the Internet, and logged off at 6:00 p.m. and were offline until the next workday.
Today’s workers are extremely mobile, always connected and demand 24×7 access to the corporate network, applications and data via a variety of device types, from desktops to smart phones, irrespective of location. ITIC survey data indicates that workers at 67% of all businesses worldwide travel, telecommute and log in remotely at least several days a month. At present, one out of eight employees use their personal computers, notebooks and smart phones to access corporate data.
From an internal perspective, the ongoing economic downturn has resulted in layoffs, hiring freezes, budget cuts and less money and time available for IT training and certification. At the same time, the corporate enterprise network and applications have become more complex. IT departments face increasing pressure to provide more services with fewer resources. Another recent ITIC survey of 400 businesses found that almost 50% of all businesses have had budget cuts and 42% have had hiring freezes. An overwhelming 84% majority of IT departments just pick up the slack and work longer hours!
External pressures also abound. Many businesses also have business partners, suppliers and customers who similarly require access. Additionally, many organizations employ outside consultants, temporary and transient workers who need access to the corporate network from beyond the secure confines of the firewall.
This type of on demand, dynamic access is distinctly at odds with traditional security models. The conventional approach to security takes a moat and drawbridge approach: to contain and lock down data behind the safety of the firewall. IT managers have been trained to limit access, rights and privileges particularly with respect to transient workers, outside consultants and remote and telecommuting workers. And who can blame them? The more network access that is allowed, the greater the risk of litigation, non-compliance and compromising the integrity of the corporate network and data.
Providing secure, ubiquitous access to an array of mobile and home-based employees, business partners, suppliers, customers and consultants who need permanent or temporary access to the network is a tedious and time consuming process. It necessitates constant vigilance on the part of the IT department to monitor and provision the correct access rights and privileges.
The conundrum for IT departments is to easily, quickly and cost effectively provision user account access while preserving security and maintaining licensing compliance. The emerging Virtual Desktop Infrastructure (VDI) technology, where users control a desktop running on a server remotely, can address some of these issues, but VDI doesn’t solve all the problems.
An intriguing alternative to VDI is a nascent software application from MokaFive, which is designed specifically to plug the holes in the so-called “Porous Enterprise.” MokaFive, based in Redwood City, California, was founded in 2005 by a group of Stanford University engineers specifically to enable IT departments to swiftly provision network access without the cost and complexity of VDI solutions. MokaFive is not the only vendor exploring this market; its competitors include VMware (via the Thinstall acquisition), Microsoft (via the Kidaro acquisition), LANDesk and Provision Networks. However, the MokaFive offering is, to date, the only “pure play” offering that enables organizations to provision a secure desktop environment on the fly to individual users rather than just to an entire group.
The MokaFive Suite is actually a set of Desktop-as-a-Service facilities that are operating system, hardware and application agnostic. MokaFive’s desktop management features enable IT administrators to centrally create, deliver, secure and update a fully-contained virtual environment, called a LivePC, for thousands of users. Contract workers can log on via Guest Access; there is no need for the IT department to specially provision them. The MokaFive Suite facilitates ubiquitous access to email, data and applications irrespective of location, device type (e.g., Windows and Macintosh) or the availability of a hard wired network connection.
I discussed the product with several IT executives and administrators who immediately and enthusiastically grasped the concept.
“This is a very cool idea,” says Andrew Baker, a 20-year veteran VP of IT and security who has held those positions at a variety of firms including Bear Stearns, Warner Media Group and The Princeton Review. “The most tedious aspect of configuring a worker’s experience is the desktop,” he says. Typically the IT manager must physically configure the machine, set up the access rights, privileges and security policies and deploy the correct applications. This is especially problematic and time consuming given the increasing number of mobile workers and transient workforces. The other issue is the constant need to re-provision the desktop configuration to keep it up to date, Baker says. The MokaFive Suite, he says, “saves precious time and it solves the issue of the disappearing network perimeter. I love the idea of being able to be secure, platform agnostic and being able to support multiple classes of workers from a central location.”
MokaFive’s LivePC images run locally, so end-users simply download their secure virtual desktop via a Web link, and run it on any computer (Macintosh or Windows). IT administrators apply updates and patches to a single golden image and MokaFive distributes the differentials to each LivePC. The entire process is completed in minutes by a single IT administrator. Once the MokaFive LivePC link is up and published, users are up and running regardless of whether it’s one person or 100 people. The traditional method of physically provisioning an asset can involve several IT managers and take anywhere from two days to a couple of weeks. It involves procurement, imaging, testing, certification and delivery of the device to remote workers. Baker estimates that MokaFive could cut administration and manpower time by 30% to 60% depending on the scope of the company’s network.
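MokaFive's actual differential-update mechanism is proprietary and not documented here. Purely as an illustrative sketch of the general golden-image idea, a block-level differential between a master image and a client copy might look like this (all function names and the block size are hypothetical):

```python
import hashlib

BLOCK_SIZE = 4096  # hypothetical fixed block size

def block_hashes(image: bytes) -> list[str]:
    """Hash each fixed-size block of an image so blocks can be compared cheaply."""
    return [hashlib.sha256(image[i:i + BLOCK_SIZE]).hexdigest()
            for i in range(0, len(image), BLOCK_SIZE)]

def compute_delta(golden: bytes, client: bytes) -> dict[int, bytes]:
    """Return only the golden-image blocks that differ from (or are missing on) the client."""
    client_hashes = block_hashes(client)
    delta = {}
    for idx, g_hash in enumerate(block_hashes(golden)):
        if idx >= len(client_hashes) or client_hashes[idx] != g_hash:
            delta[idx] = golden[idx * BLOCK_SIZE:(idx + 1) * BLOCK_SIZE]
    return delta

def apply_delta(client: bytes, delta: dict[int, bytes]) -> bytes:
    """Patch the client image in place with only the changed blocks."""
    blocks = [client[i:i + BLOCK_SIZE] for i in range(0, len(client), BLOCK_SIZE)]
    for idx, data in delta.items():
        if idx < len(blocks):
            blocks[idx] = data
        else:
            blocks.append(data)
    return b"".join(blocks)
```

The appeal of this pattern is that the administrator touches only the single golden image; each client then downloads a delta proportional to what changed, not the full image.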
MokaFive also requires less of a monetary investment than rival VDI solutions and doesn’t require IT administrators to learn a new skill set, claims MokaFive VP of marketing, Purnima Padmanabhan.
“VDI does enable companies to ramp up and quickly provision and de-provision virtual machines (VMs); however, the IT department is still required to build out fixed server capacity for its transient workforce,” Padmanabhan says. Oftentimes, the additional capacity ends up going to waste. “The whole point of contractors is to dial in, dial up and dial down expenses, and that’s what MokaFive does,” she adds.
Steve Sommer, president of SLS Consulting in Westchester, New York, agrees. Sommer spent 25 years simultaneously holding the positions of CIO and CTO at Hughes Hubbard & Reed, a New York City law firm with 1,200 end users – including 300 attorneys – in a dozen remote locations. Sommer observes that corporate politics frequently determine access policy at the expense of security. “A company’s knowledge workers – lawyers, doctors, software developers – who drive large portions of revenue will demand all-access, all the time and security be damned. In the past it was an either/or proposition,” Sommer says.
With the MokaFive desktop-as-a-service approach, all the data is encapsulated, encrypted and controlled. Organizations now have the option to manage the permanent workforce, as well as temporary contractors and consultants who use their own personal devices, quickly and easily. IT managers can provision a virtual machine (VM) on top of MokaFive or give the remote user or contract worker an HTML link which contains the MokaFive LivePC. The end user clicks on the link to get a completely encapsulated VM environment, which is controlled through policies using MokaFive. The environment can be fully encrypted with 256-bit AES encryption. The entire environment is managed, contained and kept updated with the latest passwords, connections, application versions and patches. When the user or contract worker leaves the company, the IT department issues a root kill signal and all the licenses are retrieved and called back, ensuring compliance.
“MokaFive is a boon for IT departments and end users alike; no more worrying about provisioning and versioning. I love the fact that it’s application, hardware and operating system agnostic,” Sommer says. “And it also has distinct time saving benefits for the end user, or transient workforce. They can take their work with them wherever they are and they don’t have to worry about borrowing a notebook or PDA and ensuring that it’s properly configured with the correct version.”
MokaFive already has several dozen customers and prospects and is gaining traction in a number of vertical markets including financial services, legal, healthcare, government and education. Given the burgeoning popularity and mainstream adoption of VDI, the MokaFive Suite represents a viable alternative for organizations that want a fast, cost effective and non-disruptive solution that lets IT departments provide efficient and secure network access. It’s definitely worth exploring, and MokaFive offers free trials for interested parties from its website.


VDI Vendor Wars Intensify

There’s no hotter market in high tech this year than Virtual Desktop Infrastructure (VDI) and you don’t need sales and unit shipment statistics to prove it. No, the best measurement of VDI’s hotness is the sudden flurry of vendor announcements accompanied by a concomitant rise in vitriol.
The main players in the VDI market are actually two sets of pairs. It’s Citrix and Microsoft lining up against VMware and EMC for Round 2 in the ongoing virtualization wars. On March 18, Citrix and Microsoft came out swinging, landing the first potent, preemptive punches right where they hope it will hurt VMware the most: in its pocketbook.
Citrix and Microsoft unveiled a series of VDI initiatives that include aggressive promotional pricing deals and simplified licensing models. To demonstrate just how solid and committed they are to their alliance – and to taking on and taking down VMware and EMC – the two firms even went so far as to combine their respective VDI graphics technologies.
At stake is the leadership position in the nascent but rapidly expanding global VDI market. The results of the ITIC 2010 Global Virtualization Deployment and Trends Survey, which polled 800+ businesses worldwide in the December/January timeframe, indicate that 31% of respondents plan to implement VDI in 2010; that’s more than double the 13% that said they would undertake a VDI deployment in 2009. Application virtualization is also on the rise. The same ITIC survey found that 37% of participants plan application virtualization upgrades this year, up from 15% who responded affirmatively to the same question in the 2009 survey.
The current installed base of VDI deployments is still relatively small; hence the statistics that show the number of deployments doubling year over year must be considered in that context. Nonetheless, double digit deployment figures are evidence of strengthening demand and a market that is robustly transitioning from niche to mainstream. The spate of announcements from Microsoft and Citrix were clearly intended to capitalize on the growth spurt in VDI. At the same time, the companies threw down the gauntlet with initiatives aimed at solidifying and expanding their base of current VDI customers while serving the dual purpose of luring VMware customers away from that company’s VDI platform. They include:
• “VDI Kick Start.” This wide-ranging sales promotion, which runs from March 18 through December 31, 2010, seeks to jump start VDI deployments by lowering the entry level pricing for customers purchasing Microsoft and Citrix technologies. As part of this deal, existing Microsoft client access licensing (CAL) customers will pay $28 per desktop, for up to 250 users, to purchase the Microsoft Virtual Desktop Infrastructure Suite, Standard edition, and Citrix’s XenDesktop VDI Edition for one year. That’s roughly a 50% discount off the list prices that corporations have paid up until now for their annual CALs. This is crucial for cost conscious businesses: client access licenses typically represent the lion’s share of their licensing deals, since desktops outnumber servers in mid-sized and large enterprises. The deal also merges Microsoft’s 3-D graphics technology for virtual desktops, called RemoteFX, with Citrix’s high-definition HDX technology.

• The Microsoft Virtual Desktop Access (VDA) License Plan. Organizations that use Thin Client devices which are not included or covered under Microsoft’s SA maintenance plan, can now purchase the VDA licenses at a retail price of $100 per device per annum. This targets end users who travel or telecommute and need to use personal devices or public networks to access their corporate data. Microsoft also made another move towards simplifying its virtualization licensing plan. Starting July 1, Microsoft SA customers will no longer be required to purchase a separate license to access Windows via a VDI.
• “Rescue for VMware VDI.” This promotion (the name says it all) is a direct attack on VMware. Like the VDI Kick Start program, it runs from March 18 through December 31, 2010. Under the terms of this deal, any Microsoft Software Assurance licensing/maintenance customer can replace their existing VMware View licenses for free. VMware View users who opt out of that platform in favor of the Citrix and Microsoft offerings will receive up to 500 XenDesktop VDI Edition device licenses and up to 500 Microsoft VDI Standard Suite device licenses free for an entire year once they trade in their VMware View licenses.
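The Kick Start pricing lends itself to a quick back-of-the-envelope model. The sketch below uses hypothetical function names and assumes, per the terms quoted above, $28 per desktop capped at 250 users against an implied list price of roughly $56 (the "roughly 50% discount"):

```python
PROMO_PRICE = 28.0   # promotional price per desktop per year
USER_CAP = 250       # promotion covers up to 250 users
LIST_PRICE = 56.0    # implied list price, assuming roughly a 50% discount

def kick_start_cost(desktops: int) -> float:
    """Annual cost for the covered desktops under the hypothetical promotion."""
    return min(desktops, USER_CAP) * PROMO_PRICE

def list_cost(desktops: int) -> float:
    """Annual cost for the same covered desktops at the implied list price."""
    return min(desktops, USER_CAP) * LIST_PRICE

print(kick_start_cost(250))                    # 7000.0
print(kick_start_cost(250) / list_cost(250))   # 0.5
```

In other words, a 250-seat shop would pay about $7,000 for the year instead of roughly $14,000; pricing beyond the 250-user cap is not specified in the promotion as described.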
Dai Vu, Microsoft’s director of virtualization marketing, said the announcements were all about delivering more value to desktop customers and simplifying and extending organizations’ licensing rights.
The Citrix/Microsoft announcements also cement the close working partnership and the “enemy of my enemy is my friend” relationship the firms have enjoyed for many years. By bundling their respective VDI offerings together, the two companies should also ensure integration and interoperability which are crucial components for each and every layer in a virtualized data center environment.
VMware and EMC: Not Standing Still
VMware and EMC executives have yet to publicly respond to the Microsoft/Citrix initiatives. However, it’s almost certain that VMware will have to offer its current and prospective VDI accounts incentives to counter the Microsoft/Citrix alliance. Cash strapped corporations and IT departments are all on the lookout for top notch products at bargain basement prices. And it doesn’t get much better for customers than the free Rescue for VMware VDI program.
VMware built up a commanding lead in the server virtualization arena over the last five years by virtue of being first to market and delivering leading edge features and performance in its signature ESX Server product. VMware’s competitors have spent the last several years playing catch up in server virtualization. This allowed VMware to charge a premium price for its premier offerings. Depending on the size and scope of the individual organization’s server virtualization deployment, customers paid on average 35% to as much as 75% higher for VMware server-based offerings. There were surprisingly few complaints.
The emerging VDI and application virtualization markets are a different story. Only about 5% to 8% of organizations worldwide have fully virtualized their desktop infrastructure. So it’s too soon to declare a clear market winner. It’s safe to say that Citrix, Microsoft and VMware are all market leaders in this segment. This time around though, Microsoft and Citrix are determined not to let VMware and EMC run away with the race by building an insurmountable lead.
Meanwhile, VMware and EMC have not been idle. Former Microsoft executive Paul Maritz succeeded VMware founder Diane Greene as the company’s president and chief executive officer following her 2008 departure. Since then he has made tangible moves to bolster VMware’s position in the VDI and application virtualization arenas. Maritz and EMC CEO Joe Tucci make a formidable combination, as do EMC and VMware. EMC purchased VMware in 2004 for $635 million and owns an 86% majority stake in the server virtualization market leader. In the past several years, VMware’s fortunes and revenues have risen faster than EMC’s. VMware’s year-over-year (YoY) quarterly revenue growth stands at 18.20%, compared with EMC’s modest 2.10% YoY quarterly sales growth. Another key indicator is net earnings, and in this regard VMware experienced negative YoY quarterly earnings growth of -49.40%. By contrast, its parent EMC recorded a very robust and positive 44.70% jump in YoY quarterly earnings. It is also worth noting that VMware’s annual revenues of $2.02 billion represent only about 14% of EMC’s annual sales of $14.03 billion. And to date, EMC’s solutions have been related only tangentially to VMware’s VDI products. For practical purposes, this may continue to be the case. From a PR standpoint, though, EMC and VMware are presenting themselves as a sort of virtualization “dynamic duo.”
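The revenue relationship between the two companies is easy to verify from the two annual figures cited above (in billions of dollars):

```python
vmware_revenue = 2.02   # VMware annual revenue, $B
emc_revenue = 14.03     # EMC annual revenue, $B

share = vmware_revenue / emc_revenue
print(f"VMware revenue is {share:.1%} of EMC's")  # VMware revenue is 14.4% of EMC's
```

So despite VMware's far faster top-line growth rate, it remains a small fraction of its parent's overall business.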
At an EMC Analyst event at the company’s Hopkinton, MA headquarters on March 11, Pat Gelsinger, president of EMC’s Information Infrastructure Products group described the combination of EMC and VMware – specifically with respect to storage virtualization, virtualization management and private cloud infrastructures — as the “Wild West” of the virtualization market, saying “we want to be disruptive and change the way people fundamentally think of IT.” Though Gelsinger mainly confined his comments to EMC’s core bailiwick in the storage arena, it is clear that EMC and VMware are pro-actively presenting a united front.
In February, the two firms moved to reposition some of their assets; EMC and VMware inked a deal for VMware to acquire certain software products and expertise from EMC’s Ionix IT management business in an all cash deal for $200 million. EMC does retain the Ionix brand and gets full reseller rights to continue to offer customers the products acquired by VMware. Maritz said VMware’s acquisition of the Ionix products and expertise promises to further establish VMware vCenter as the next generation management platform for private cloud infrastructures.
The agreement also calls for VMware to take control of all the technology and intellectual property of FastScale, which EMC acquired in 2009. The FastScale Composer Suite incorporates integrated software management tools to enable organizations to maintain peak performance in a virtualized environment.
Also, recently, VMware introduced ThinApp 4.5, a new version of its application virtualization package designed to simplify enterprises’ migration to Windows 7.
End Users are the Biggest Winners
What makes the latest competition for VDI market dominance noteworthy is the extreme actions the combatants are willing to take in order to retain and gain customers at their rivals’ expense. With last week’s joint announcements and deepening partnership, Citrix and Microsoft have signaled their intention to lead, but it’s still too early to call the race.
The joint Microsoft/Citrix initiatives to cut costs and simplify virtualization licensing plans remove two of the more significant barriers to VDI adoption. The largest looming challenge remains the willingness of corporations to embrace a new technology model as their organizations and IT departments continue to grapple with the lingering effects of the ongoing economic crunch. In this regard, all of the virtualization vendors in concert with OEM hardware vendors like Dell, Hewlett-Packard, IBM, Stratus Technologies and Wyse who partner with them must convince customers that transitioning to VDI will provide tangible Total Cost of Ownership (TCO) and Return on Investment (ROI) benefits. This entails providing organizations with the necessary guidance – including tools, training, documentation, Best Practices and solid technical service and support – to ensure that a conversion to VDI can be accomplished with minimal disruption. Admittedly, this is a tall order.
Hardware vendors like Dell, HP, IBM et al all have a stake in the future success of the VDI market. Organizations that migrate to VDI will seek to upgrade to newer, more powerful desktops (PCs, notebooks) and servers, which in turn, potentially boosts the hardware vendors’ individual and collective bottom lines. Additionally, both HP and IBM boast huge service and support organizations, which also stand to benefit from an uptick in VDI adoptions. So the hardware vendors have every reason to partner with Citrix, Microsoft and VMware to promote and expand the VDI market segment. Regardless of which vendor(s) prevails, the biggest winners will be the customers. When several big name vendors vie for the hearts, minds and wallets of customers, it usually means that feature-rich, reliable products get to market sooner at more competitive prices. Let’s hope the VDI race is a long one.


Database Competition Heats Up

The database market will see lots of activity during the 2010-2011 timeframe as nearly 60% of organizations move to upgrade or expand existing and legacy networks.
That statistic comes from new ITIC survey data, which polled 450 organizations worldwide. Not surprisingly the survey shows that longtime market leaders Oracle, IBM, Microsoft and Sybase will continue to dominate the DBMS market and solidify their positions.
Databases are among the most mature and crucial applications in the entire network infrastructure. Database information is the lifeblood of the business. Databases directly influence and impact every aspect of the organization’s daily operations including: relationships with customers, business partners, suppliers and the organization’s own internal end-users. All of these users must have the ability to locate and access data quickly, efficiently and securely. The corporate database must deliver optimal performance, reliability, security, business intelligence and ease of use. It must also incorporate flexible, advanced management capabilities to enable database administrators (DBAs) to construct and oversee a database management system (DBMS) that best suits the organization from both a technology and business perspective.
What will distinguish the DBMS market this year is that the always intense and vociferous vendor rivalries will heat up even more over the next 12 months.
There are several pragmatic reasons for this. Most notable is the fact that many organizations deferred all but the most pressing network upgrade projects during the severe downturn of the past two-and-a-half years. Many businesses are now in a position where they must upgrade their legacy database infrastructure because it is obsolete and is adversely impacting, or will shortly impact, the business. Anytime a company decides on a major upgrade, there’s always a chance that it may switch providers. The DBMS vendors know this and will do their level best to lure customers to their platform, or at the very least get a foot in the door.
Another factor that looms large in the 2010 DBMS market dynamics is Oracle’s purchase of Sun Microsystems. That acquisition finally got the green light from the European Commission last month. Speculation abounds as to the fate of MySQL, a popular and highly regarded open source DBMS. For the record, Oracle executives stated publicly within the last two weeks that the company will continue to support and develop MySQL and even provide integration with other Oracle offerings. But users are uneasy because MySQL does compete to some extent with some Oracle products. Expect rivals, particularly IBM and Microsoft, to aggressively capitalize on user confusion and fear to entice users to their respective platforms.
The DBMS Vendor Landscape
As nearly everyone knows, the four major DBMS vendors – Oracle, IBM, Microsoft and Sybase – account for 90% of the installed base, unit shipments and revenue.
Oracle’s 11g is the undisputed market leader. It offers a full slate of online transactional processing (OLTP) as well as specialized database applications. As such it is being assailed from all sides, and with relish, by rivals who take every opportunity to criticize its products and strategy. Oracle, headed by Larry Ellison, one of the most visible and outspoken high technology CEOs, happily reciprocates with its own vitriol.
IBM’s DB2 9.5 for Linux, Windows and UNIX remains firmly entrenched in high end enterprises owing to its rock solid reliability, performance, management, scalability and overall data and application integration capabilities. Users are also loyal to the DB2 platform because of IBM’s strong after-market technical service and support offerings. IBM also secures its position within very large enterprises by giving good deals and discounts on licensing renewals and training and support.
Microsoft’s SQL Server 2008 has shown tremendous improvement in scalability, security, ease of use, programmability and application development functionality and is gaining ground particularly among SMB and SME organizations. Microsoft hopes that the increased functionality of SQL Server 2008 will enable it to erode Oracle’s very entrenched presence among enterprises. A big plus for Microsoft is its legion of committed resellers and consultants who do an excellent job of promoting SQL Server 2008 among SMBs and SMEs.
Cost, Interoperability and Performance Top User DBMS Requirements
DBMS upgrades and new installations will be fought, won and lost according to three main factors: interoperability, cost and performance/features. The latest ITIC survey data found that nearly 90% of respondents rated interoperability with existing or planned infrastructure as the most important factor weighed when choosing a DBMS vendor; 80% chose cost as a main DBMS influencer and 78% cited performance as their main reason for choosing a specific DBMS vendor platform.
But any DBMS vendor that hopes to dislodge or supplant a rival in an existing account will have to work hard to do so. The ITIC survey data also shows that organizations – especially large enterprises – do not readily or often forsake their legacy platforms. According to the survey data, 76% of survey respondents indicated they have not migrated or switched any of their main line of business applications from one database platform to another within the past three years.
This statistic makes a lot of sense. Precisely because DBMS platforms are among the most mature server-based applications in the entire enterprise, it’s much more work to rip out one platform and start fresh. A wholesale switch from one platform to another requires significant capital expenditure. Additionally, the business must invest a lot of time and energy in converting to a new platform, testing new applications, rewriting scripts, and re-training DBAs and getting them certified on the new environment. For CIOs, CTOs and IT departments this prospect has roughly the same appeal as having a root canal without Novocain.
Nonetheless, one-in-five survey respondents – 20% – did migrate database platforms over the past three years. The most popular reason for switching DBMS platforms, according to the survey respondents, is a move to a custom-developed in-house application or a customized application developed by a partner. Just over half – 53% – of responding organizations that changed DBMS platforms came from midsized enterprises with 500 to 3,000 end users – a fact that favored Microsoft SQL Server 2008 deployments. Among the 20% of ITIC survey respondents that switched vendors, fully 50% of organizations swapped out Oracle in favor of SQL Server, while 17% migrated from Sybase to SQL Server. Overall, among the 20% of respondents that switched database platforms over the past three years, two-thirds or 67% opted to migrate to SQL Server. In this regard, Microsoft SQL Server converts outpaced rival Oracle by a 2-to-1 margin. Approximately 34% of the 20% of businesses that changed database platforms migrated away from DB2 or SQL Server in favor of Oracle.
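The nested percentages above are easy to misread, since the destination shares apply only to the 20% of respondents who switched. A quick back-of-the-envelope calculation (a sketch using the survey figures as reported; rounding explains why the switcher shares slightly exceed 100%) converts them to shares of the full survey base:

```python
# Share of all respondents who switched DBMS platforms in the past three years
switched = 0.20

# Destination shares among switchers, as reported in the survey
to_sql_server = 0.67  # two-thirds of switchers moved to SQL Server
to_oracle = 0.34      # roughly one-third moved to Oracle

# Express each destination as a share of the full survey base
sql_server_overall = switched * to_sql_server
oracle_overall = switched * to_oracle

print(f"Moved to SQL Server: {sql_server_overall:.1%} of all respondents")
print(f"Moved to Oracle: {oracle_overall:.1%} of all respondents")
```

Seen this way, the much-discussed migration activity amounts to roughly 13% of the full base moving to SQL Server and about 7% moving to Oracle, which squares with the survey’s broader finding that most organizations stay put.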
IBM DB2 users were among the most satisfied respondents; an overwhelming 96% stayed put.
Analysis: Customer Issues and Chief Challenges
Respondents cite challenges with their database strategies, but are also sanguine about the journey. For instance, one respondent said that the main challenges were “keeping up with changes to the SQL platform and getting our database administrators and appropriate IT managers trained and re-certified on new versions of the technology and then figuring out how it all works with new virtualization and cloud computing technologies. Cost and complexity are also big factors to consider in any upgrade. Networks are getting more complex but our budgets and training are not keeping pace.”
Respondents were particularly focused on the cost issue: “cost, both new licensing and annual maintenance”, “increasing cost of licensing”, “cost is the overriding factor” were just some of the responses.
As for future plans, a 56% majority of respondents report that switching database platforms in the coming months is very unlikely; 17% said switching is not an option and 15% said that switching is a possibility, depending on the circumstances.
Getting organizations to change DBMS platforms is difficult but not impossible. If a rival vendor can offer comparable performance and functionality, coupled with tangibly better pricing and licensing renewal options that lower Total Cost of Ownership (TCO) and speed Return on Investment (ROI), organizations may be induced to make the switch. The biggest DBMS battle is in the SMB and SME sectors and in greenfield accounts that are adding new databases.
DBMS vendors are anxious to keep the current customers and gain new ones. End users should make the vendors work to keep them as satisfied customers. Dissatisfied customers should voice their concerns and even satisfied customers should let their vendors know what they can do to make them even happier.


Tablets Take Off in 2010, Thanks to Apple’s iPad

Regardless of how well the newest class of Tablet computers fare in terms of sales and unit shipments, the evolution of these portable devices will be divided into two classifications: Before the Apple iPad and After the Apple iPad.
Apple’s iPad — admittedly a late entrant into this market — has already changed the game in the fledgling, niche Tablet market, even before the company has shipped its first device.

The frenzied efforts of industry watchers – from Apple aficionados and rival vendors to analysts and media – to ferret out the most minute detail of the Apple tablet in advance of its release served to rejuvenate what had been a stalled market segment.
The Tablet computer occupies a still nebulous market arena somewhere in between smaller NetBooks and smartphones and larger portable devices. Whether it can carve out a lasting, mainstream role remains an open question, but one thing is certain: Apple’s entrance into this crowded field has sparked renewed interest in this device category.
The long rumored iPad was shrouded in mystery for months before the official January 27 announcement. Apple stubbornly refused to confirm its existence, much less any details. Nonetheless, the anticipation was so great, that it sent several vendors scrambling to preview rival Tablet offerings at the Consumer Electronics Show (CES) in Las Vegas in advance of the iPad debut.
No one was shocked when Apple CEO Steve Jobs introduced the company’s latest “creation.” However, Apple did manage to stun the industry by hitting the $500 price barrier for the entry level device. This affordable tag makes the feature-laden iPad Tablet competitive with the wildly successful, low-cost NetBooks which were all the rage in 2009. Additionally, the Apple iPad’s list prices will almost certainly follow the normal discounted street pricing patterns and decline by 10% to 30% over the next six months. Apple’s aggressive pricing maneuver has also succeeded in causing consternation among competitors who must now re-evaluate their own price structures in order to follow Apple’s lead.
Still, even at $499, the Apple iPad is not the lowest priced Tablet device. That distinction currently belongs to Freescale Semiconductor, which introduced a touch screen Tablet that retails for $199. The Freescale tablet lacks many of the iPad’s high end features, such as advanced graphics, which accounts for the price differential. It runs on either Android or Linux and incorporates a battery that lasts for eight to 10 hours. Available in a selection of colors, the tablet includes Wi-Fi, Bluetooth and optional support for 3G. Users can also add an external keyboard and mount the tablet on it as its display, like a monitor. Freescale is marketing the device to OEMs who want to get to market quickly with a Tablet.
Tablet Market: Narrow Niche or Mainstream Appeal?
The real question now is: will the recent flurry of new Tablet releases translate into mainstream success or will Tablets remain a niche device in search of a market? Many industry observers have openly scoffed at the notion that these devices will ever achieve widespread adoption. In recent months the rising tide of speculation about the Apple iPad also engendered debate as to why anyone would need or want yet another portable device in a field that is already crowded with smart phones, a wide variety of portable notebooks and the very popular and inexpensive Netbooks.
These are all valid questions. Tablet devices have been available for the past five years. To say that they have met with only moderate success is an understatement. This is partially due to the economic downturn and also due in large measure to the fact that the marketing around these devices never identified a clear and compelling use for them outside a few narrow niches.
There was also confusion about what constituted a Tablet computer. There is no standard, one-size-fits-all device that addresses all market segments. In the 2006-2007 timeframe some vendors opted to sell larger Tablets that more closely resembled traditional notebooks or laptops. The higher end devices from vendors like Acer, HP and Toshiba often incorporated advanced features like handwriting recognition, inking capabilities in the Windows presentation subsystem and fingerprint security ID. Conversely, several suppliers marketed hybrid mini-Tablet/eBook readers with small (six inches or less) form factors.
And over the last two years, the Tablet segment was eclipsed by the burgeoning popularity of NetBooks, which have an average price range of $150 to approximately $400.
Nonetheless, nearly every major hardware vendor boasts at least one Tablet in their product portfolio. Acer, Asustek Computer, Dell, Fujitsu, Gateway, Hewlett-Packard (HP), Lenovo, Micro-Star International (MSI), Motion Computing, Toshiba, Viewsonic and Wacom are all betting that consumers and eventually businesses will embrace the Tablet form factor.
In recent months Asustek Computer, HP, Dell and MSI all debuted new tablet offerings to beat Apple to the punch. MSI launched its 10-inch Tablet at CES and HP is readying its offering, an Inventec-manufactured device set to debut in the spring. Asustek released its Eee PC T91 tablet and will launch a 10-inch model running Windows 7.
Bottom line: there is a wide range of form factors and features from which to choose. Models range from very small, lightweight devices like the 1.5 lb. Apple iPad, navigated by touch or stylus, to larger 5-6 lb. notebook-type form factors that swivel and have full or hidden mobile keyboards.
The Price is Right
One thing about Tablets that should help spur acceptance and adoption, and may even trump NetBooks, is cost. Tablet computer prices have dropped significantly from 2007, when pricing ranged from $599 to $2,700, with the median tag averaging $1,600. Thanks to the rise of NetBooks and Apple’s uncharacteristic move to be a price/performance leader, average selling prices (ASPs) for Tablets are now between $400 and $800. Special promotions abound and leasing and financing solutions are widely available from all the vendors. HP, for example, markets its HP/Compaq Mini 110, 210 and 311 Series of mobile laptops and mini NetBooks, which range in price from $269 to $399, have 10 to just under 12 inch screens and are outfitted with Intel’s 1.60 GHz Atom processor. Additionally, HP also sells the TouchSmart tm2t series of high-end customizable tablets, whose list pricing begins at $899 and ranges to about $1,300. The TouchSmart tm2t tablets have a 12.1 inch display screen. They allow users to swivel the screen, fold it over, write and draw on it using a digital pen, or alternatively employ touch screen fingertip navigation. They also have a full keyboard. The HP tablets are available with 64-bit Windows 7; either 2GB or 3GB of memory; a 250GB or 320GB hard drive and a choice of an Intel 1.3GHz Pentium processor or an Intel Core 2 Duo 1.60GHz processor. The HP TouchSmart tm2t series pricing is closer to traditional notebooks, though it incorporates the tablet features and functions. HP also regularly offers special sales and promotions on the TouchSmart tm2t tablets which can lower the price by 20% or more. Dell and Toshiba both have multiple Tablet models. Toshiba’s Portege M750 is a high end model that can convert from a notebook to a tablet and has digital pen and touch screen capabilities, with pricing starting at $1,279.
Apple CEO Steve Jobs has made no secret of his disdain for NetBooks and he now seems determined to at least bring the iPad’s entry level list prices within a couple of hundred dollars (US) of the low cost NetBooks in the hopes of luring users away. Credit Suisse financial analyst Bill Shope published a Research Note earlier this week based on his meetings with Apple executives. According to Shope, Apple is positioning the iPad to be the device of choice for Web browsing and all forms of mobile media and the company is willing to cut the price, if that’s what it takes to ensure success. Other vendors will be forced to follow suit.
Meanwhile, with features ranging from mobility, portability and widespread applications like gaming, videos, photos, E-book reader, Email, Web browsing, maps, weather forecasts as well as the ability to write notes and draw pictures, the appeal of Tablets is taking on a much sharper focus. Seen in this context Tablet devices would appeal to a wide range of consumers as well as commercial and business users in fields like:
• Legal
• Healthcare
• Manufacturing (factory floor)
• Construction
• Academic
• Consultants
• Press
• Defense
• Aerospace

With Tablet devices now sporting features, performance, applications and pricing to rival high end notebooks and low-cost E-book readers and NetBooks, it’s highly likely that their popularity and adoption will soar in the coming months. The competition will be intense and that spells good news for consumers and corporations that are looking for competitively priced devices for their mobile and remote workers.


Apple iPad Debuts and Surprise, Lives Up to the Hype

“It” is finally here. Apple CEO Steve Jobs unveiled the iPad tablet device at the Yerba Buena Center for the Arts in San Francisco to a packed house amidst thunderous applause.
After months of speculation, which reached a fever pitch over the last two weeks, it was absolutely imperative that Apple’s iPad live up to the hype. And it does. Jobs characterized the iPad as a third device category between a notebook and a smart phone; and given the features and the form factor that is a credible claim.
The biggest and most pleasant surprise was the very affordable price tag: iPad list pricing begins at $499 for the basic 16GB model and goes up to $829 for the most expensive 64GB model which includes Wi-Fi and 3G. While many industry watchers expected the iPad to sell for less than $1,000 (US), it’s safe to say that no one expected it to break the $500 barrier. This aggressive tag makes the iPad competitively priced compared with the smaller and wildly popular NetBooks, which is no doubt exactly what Steve Jobs intended.
The iPad incorporates all of the rumored features and elements that consumers have come to expect and demand from Apple and then some. It incorporates superior graphics, an elegant case, a slick user interface and a multi-touch virtual keyboard. In another nod to usability, the iPad can be angled or tilted in any direction while still allowing the user to view the screen. And at just half an inch thick and weighing only 1 ½ lbs. the iPad sports a sylph-like silhouette that would be the envy of every supermodel, not to mention potentially millions of consumers who will love the portability of the slim, lightweight form factor.
The iPad, which comes equipped with a 1GHz Apple A4 chip, is also available in a variety of configurations to fit various budgets. Customers can purchase the iPad with 16GB, 32GB or 64GB solid state drives. And in what will surely be a boon to consumer and corporate road warriors, the iPad has a battery life of 10 hours for mainstream applications. The iPad can also sit on standby for a month without requiring a charge, according to Jobs. All models come equipped with Wi-Fi and Bluetooth connectivity.
The iPad is also fully interoperable with Apple’s other top selling products: the iPhone, iPod and iTunes. Interoperability is a necessary and crucial component of the iPad’s future success. It also has the speed and power to run the latest games, TV and movies; an E-book reader; and content from multiple external sources.
Broad Appeal
The iPad seemingly has something for everyone: enough speed and power to attract the gaming crowd; E-book reader capabilities; Google Maps; the ability to watch TV, movies and video – YouTube can be viewed in high definition (HD). It also features broad application support, which is the lifeblood and a necessary element for the success of any hardware device. It already supports popular applications such as calendaring, Google Maps, Facebook and even Major League Baseball. The iPad will also appeal to scrapbooking and photography buffs. It has a photo scrubber bar on the bottom of the screen with multiple settings that lets the user flip through photo albums, run slideshows and listen to music. And while it may not be the [Amazon] Kindle Killer as some have dubbed it, at the very least the iPad will give the Kindle some tough competition. Apple has already lined up five publishing powerhouses: Harper Collins, Macmillan, Simon and Schuster, Hachette and Penguin Books. More such partnerships will likely be announced in the coming months.
Analysis
The iPad has two missions to fulfill. The first is that it must equal or exceed the very high bar that Apple has set for itself. This is no mean feat. Apple aficionados and critics alike have been spoiled by the dizzying array of devices Apple has released over the past several years. These range from new innovative Mac Books like the MacBook Air to the market changing iPhone and iPod and the ubiquitous iTunes for music downloads.
Apple now finds itself in the enviable or unenviable position of having to top itself in the quest to deliver “the next big thing” and secure its spot on the top of the hardware mountain.
Secondarily, the iPad is Apple’s attempt to fell multiple competitors – from Amazon to Google to the NetBook vendors – with a single arrow.
So how does the iPad stack up? From a feature/function standpoint it lives up to the hype and it exceeds expectations from a pricing standpoint. Steve Jobs may very well have introduced a third device category. The iPad appeals to a broad user constituency that includes gamers, E-book readers, music and photography lovers, Web surfers and mobile and remote users (and probably some corporate knowledge workers as well) as well as casual consumers who just want to get the latest and greatest consumer offering that won’t break their budgets.
Undoubtedly, there will be some users who will simply shrug their shoulders and say, “I already have a notebook or NetBook, why do I need the iPad?” And that’s fine.
And while it may not kill Amazon’s Kindle or the rival NetBooks, it will force those competitors to respond with more advanced features and aggressive price points in the near and intermediate term. There is no doubt that other vendors fear Apple, as witnessed by the many new tablet devices that were introduced at the Consumer Electronics Show earlier this month. Everyone wanted to beat Apple’s iPad to market.
No, the iPad is not Moses coming down from the mountain with tablets containing The 10 Commandments, but then again Moses didn’t have such a large audience, the benefit of sending his message out via the Web or the advantage of Apple’s marketing machine.
When all is said and done, the sales to end users – consumer and corporate alike – will be the final arbiters of the iPad’s success. The first sales figures, including pre-orders should be available within the next few months. Meanwhile, Apple has done its part by imbuing the iPad with the features, functions and broad application and industry support that are necessary to make it a success. Barring any unforeseen or show stopping bugs, the iPad looks like a winner.


Want to Save IT $$$: Review, Renegotiate Licensing Contracts

If your business is strapped for cash and wondering how it’s going to find the money to pay for much needed hardware, software and network upgrades in 2010, it’s time to revisit your existing licensing contracts.
The specific terms and conditions of your licensing contracts could literally translate into money in the company’s coffers. C-level executives and IT departments may be pleasantly surprised to find that there’s gold in those contracts that may potentially net your organization much needed licenses and other already negotiated extras. While there are no guarantees, the chances are good that the organization’s existing licensing contracts could net you a windfall similar to unclaimed funds or finding treasure in Grandma’s attic. These overlooked items – which may include unused and available desktop, server and software application licenses, and discounted or free training and technical service and support – could be worth thousands or even millions of dollars, depending on the size and scope of the company’s licensing agreements. ITIC primary research indicates that eight out of 10 businesses will undertake a major product or application upgrade and network migration within the next 12 to 15 months, and with budgets still tight, upper management is demanding tangible TCO and ROI.
Natural skepticism may prompt many of you reading this to question how organizations could fail to notice licenses and tools that they’ve already paid for, which are so crucial to the bottom line.
Very easily and it happens all the time. As an analyst at Giga Information Group, my colleague, Julie Giera and I put together a series of licensing boot camps or user seminars throughout the U.S., Canada and Europe. We were stunned to realize that the majority of organizations don’t know what licenses they’ve bought, what they’re using or not using and they frequently don’t take advantage of extras and freebies that are written into their contracts.
I’m not accusing users of being ignorant or lazy. But the fact is, licensing agreements are most often negotiated by persons within the organization who are only tasked with getting the deal done. Once the contract is signed, the negotiator hands it off to the appropriate executive or accounting person, who promptly files the document and forgets about it. Lax communication amongst departments means that IT departments may not see the actual contracts. Thus, they may be unaware that they are entitled to myriad “extras” like expanded technical service and support; access to days or weeks of free training on specific products and access to free online inventory and asset management tools that can assist the organization in tracking license usage and remaining compliant.
Compounding the problem is the fact that the majority of licensing contracts are negotiated once every two, three or even four years. ITIC research indicates that 60% of the time a different person will negotiate the licensing contract once it comes due for renewal. And since organizations oftentimes don’t keep good records, the new contract negotiator may be unaware of specific terms and conditions and whether or not the organization or the vendor fulfilled their responsibilities.
The result: organizations – from academic institutions and non-profits to the largest commercial enterprises – can unwittingly cheat themselves out of licenses and benefits that are rightfully theirs, leaving tens of thousands or even millions of dollars on the table. Not everyone does this, of course. Approximately 10% of organizations aggressively negotiate their contracts and keep tabs on their T&Cs with the passionate obsession of Les Miserables’ Inspector Javert pursuing Jean Valjean through Paris.
Here’s a scary statistic: recent ITIC survey data indicates only 7% of organizations polled said they had attempted to renegotiate their licensing contracts in the past 12 to 18 months!
In this instance, the ongoing economic downturn can work in your organization’s favor. Vendors and resellers are aware that most businesses are either strapped for cash or that their IT budgets don’t allow for extras. Your vendors and resellers are all anxious to retain your business and get you to re-sign your contracts once the licenses expire. Even if you just signed a new contract six months or a year ago, you can still contact the vendor or reseller and initiate interim negotiations. But you won’t get anything if you don’t at least make the attempt to renegotiate.
Negotiating to Save
First things first: assemble a team that includes the appropriate members of the organization such as the CIO, CTO, VP of IT and the appropriate network administrators (e.g. database, server, messaging, security, storage, etc.) to review the T&Cs of your various licensing contracts. It’s also a good idea to involve the corporate attorneys. If your firm doesn’t have in-house counsel, engage the services of an outside firm. Legal counsel will help unravel the confusing and nebulous terms.
Next, conduct a thorough cost analysis and assessment of your current environment and tally up your licenses: are you using everything you paid for, and are you paying for all the seats you’re using? Compliance is crucial. You won’t be able to negotiate a better deal if your organization has not paid for all its licenses – even if it was an honest mistake. There are lots of free software inventory and asset management tools available to assist your organization in this task. You may discover that your current licensing agreement entitles your organization to an online asset management tool. This tool will act as a discovery mechanism to uncover unused or available licenses for key products and applications. This is “found money” because your business has already paid for these product licenses.
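To illustrate the kind of reconciliation such an inventory tool performs, here is a minimal sketch; the product names and counts are hypothetical, and a real audit would pull deployed seat counts from a discovery tool rather than a hand-built table:

```python
# Hypothetical inventory: product -> (licenses purchased, seats actually deployed)
inventory = {
    "desktop_os": (500, 430),
    "dbms_server": (24, 24),
    "crm_suite": (200, 260),  # over-deployed: a compliance gap to fix first
}

for product, (purchased, deployed) in inventory.items():
    delta = purchased - deployed
    if delta > 0:
        # Unused licenses are "found money" -- already paid for
        print(f"{product}: {delta} unused licenses available")
    elif delta < 0:
        # True up before approaching the vendor to negotiate
        print(f"{product}: {-delta} seats out of compliance")
    else:
        print(f"{product}: fully utilized and compliant")
```

The point of the exercise is the two-way check: surpluses become negotiating leverage, while shortfalls must be trued up before any renegotiation begins.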
Organizations that have recently been involved in mergers, acquisitions or divestitures should pay especially close attention to the T&Cs of the licensing contracts for all of the acquired or discarded business units. Some licenses will carry over but some may not and M&A activity will affect planned purchasing decisions.
Next, the team should collaborate and define the business needs and goals. Set priorities. There are many ways to improve TCO and ROI.
Where the Money Is
The team also must determine whether or not the organization purchased a maintenance and upgrade plan. These plans can be a real treasure trove, including everything from free or discounted upgrades to access to online training, learning and assessment tools. Additionally, they may also entitle the organization to many free services such as 24×7 phone support; free training vouchers for specific products; and access to onsite technical training and support. Customers who purchased Microsoft’s Software Assurance maintenance and upgrade plan, for example, have the ability to swap or convert their Software Assurance tech support incidents for Microsoft Premier Problem Resolution incidents. The latter provides a much more detailed and hands-on level of support service. Microsoft’s SA agreements also allow customers to purchase extended Hot Fix support to resolve code issues on products that are no longer sold or supported, and complimentary “cold backup” server licenses for the purpose of disaster recovery.
If you’re not in compliance, take steps to return to compliance in advance of any product negotiations. Next, do a cost analysis of your projected environment for at least two years and preferably three years. This should include estimates on staff increases or decreases, which will affect future purchasing levels and licensing agreements. Don’t over-estimate. It’s better to buy at a lower level and upgrade than to commit to a higher discount level and be forced to downgrade and give back a percentage discount to your vendor or reseller in the event your company’s fortunes wane.
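The “buy low and upgrade” advice can be sanity-checked with a simple projection. The sketch below (all seat counts, prices and discount tiers are hypothetical) compares committing to a higher discount tier based on projected growth against buying at a lower tier and paying only for the seats actually used if growth stalls:

```python
def license_spend(seats_per_year, unit_price, discount):
    """Total license spend across the projection at a flat negotiated discount."""
    return sum(seats * unit_price * (1 - discount) for seats in seats_per_year)

unit_price = 300.0            # hypothetical per-seat list price
projected = [500, 550, 600]   # three-year headcount projection

# Commit to the higher tier up front: better discount, but you pay for
# the projected seats whether or not the growth materializes
committed = license_spend(projected, unit_price, discount=0.25)

# Buy at the lower tier: smaller discount, but if headcount stays flat
# you pay only for the seats you actually use (and can upgrade later)
flat = [500, 500, 500]
conservative = license_spend(flat, unit_price, discount=0.20)

print(f"High-tier commitment on projected growth: ${committed:,.0f}")
print(f"Lower tier on actual flat headcount: ${conservative:,.0f}")
```

With these illustrative numbers, the conservative buyer spends less ($360,000 versus $371,250) if growth stalls, and still has the option to move up a tier if headcount does grow; the reverse can hold when growth is certain, which is exactly why the projection is worth doing before signing.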
Before approaching your vendor/reseller, investigate what types of deals your peers are getting on their licensing contracts. Compare notes to determine that the T&Cs of your contracts are competitively priced. User groups are a great source of information. When it comes to negotiating for better terms, knowledge really is power.
Approach your vendor or reseller with several “wish list” items. Be as specific as possible. “I’d like a 10% discount on 50 licenses for XYZ product,” will yield better results than an open-ended request like, “How much of a discount can you give me?” or “What can you do for us?”
And above all be reasonable. The economic recession has had an adverse impact on vendors as well as end users so don’t ask for the Sun, the Moon and the stars.
If you have a good relationship with your vendor or reseller sales representative, there’s a good chance they’ll be receptive to negotiating things like fixed annual payments or extended payment plans and even negotiating down the percentage of the True-Up payment if your organization has experienced a reversal of fortune over the past year or two. Here’s a list of things your organization may want to negotiate:
• Carry-over of unused licenses when you re-sign a new contract
• Price caps on product and licensing increases
• Price protection for the duration of your licensing contract
• Contract buy-outs
• Reduced or waived licensing transfer fees
• Penalty waivers if you’re non-compliant
• Flexibility in signing upgrade and maintenance agreements
• Discounted or free training
• Discounted or free technical service and support incidents
• Free training vouchers

Again, this is all saved money that will trim your organization’s capital and operational expenditure budgets. Don’t get discouraged if your vendor or reseller initially balks; that’s part of the negotiating process. Be persistent, and remember that your vendor wants to keep you as a customer. Be prepared with a counter-offer. Remember: you have nothing to lose and everything to gain.


Deal You Can’t Refuse: Stratus’ Zero Downtime or $50K back to customers

The most incredible deal of this holiday season – and one that customers will be hard-pressed to refuse – is Stratus Technologies’ pledge of zero downtime for customers or $50,000 cash back.
Here’s how it works: organizations that purchase any standard configuration of Stratus Technologies’ current ftServer 6300 enterprise-class x86 fault-tolerant server, equipped with Microsoft Windows Server 2008 and the required service contract, are eligible for $50,000 in cash or product credit if a failure of the server hardware, Stratus system software or operating system causes unplanned downtime in a production environment within the guarantee period, which lasts up to six months following server deployment. Stratus executives vow that there are no hidden clauses or trap doors in the guarantee.
Stratus Technologies, headquartered in Maynard, Mass., has built its reputation on delivering rock-solid reliability of 99.999% uptime. That works out to roughly five minutes of downtime per server in a year, an admirable achievement by any standard.
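The downtime implied by a given uptime percentage is easy to check with a few lines of arithmetic:

```python
# Annual downtime implied by an uptime ("nines") percentage.

MINUTES_PER_YEAR = 365.25 * 24 * 60  # ~525,960 minutes

def annual_downtime_minutes(uptime_pct):
    """Expected minutes of downtime per year at a given uptime percentage."""
    return (1 - uptime_pct / 100) * MINUTES_PER_YEAR

for nines in (99.9, 99.99, 99.999):
    print(f"{nines}% uptime -> {annual_downtime_minutes(nines):.1f} min/year")
```

Each additional "nine" cuts the allowable downtime by a factor of ten, which is why the jump from conventional high-availability clusters to fault-tolerant hardware matters for transaction-critical workloads.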
Powered by 2.93 GHz Intel Xeon™ X5570 quad-core processors, the ftServer 6300 is optimized for large data center multi-tasking applications with high transaction rates, such as credit card authorization processing and high-speed ATM networks, and as a powerful engine for database applications and virtualization environments. A typical ftServer 6300 configuration can actually cost less than the value of the payout. The offer is open to customers worldwide, and the program ends Feb. 26, 2010.
Specifically, customers can choose from a custom version of the ftServer 6300 or one of two pre-configured bundled configurations. The ftServer 6300 Power Bundles #1 and #2 are robust, high-end configurations that consist of Microsoft Windows Server operating system, disk drives and supporting peripherals, with a significant package discount compared to individually priced system components. Other server models in the ftServer line are not included in this program.
Stratus Technologies’ decision to quite literally put its money where its mouth is is a bold move, and one that the overwhelming majority of vendors would never consider. In fact, ITIC can’t recall any high tech hardware vendor in recent memory offering the same terms. Roy Sanford, Stratus chief marketing officer, said the deal underscores Stratus Technologies’ confidence in its ability to deliver the highest levels of uptime, 99.999% or greater. “The Zero Downtime program is a show of confidence that our products consistently perform at the highest levels of availability. Our guarantee is right out there for all to see, customers and competitors alike.”
Corporate enterprises that are risk averse, those that demand the highest levels of uptime, or those in a betting mood are well advised to check out the terms and conditions of Stratus Technologies’ offer. You’ve literally got nothing to lose. Stratus Technologies: http://www.stratus.com


IBM Launches Systems Software Biz Unit; Emphasizes Virtualization, Cloud & Management

The decision by IBM’s Systems and Technology Group (STG) to launch a new Systems Software Business Unit (BU) was one of the more significant announcements in what was inarguably a jam-packed Analyst Summit. Helene Armitage, who will serve as General Manager (GM) of System Software, noted that the new unit aligns perfectly with IBM’s broader strategy in hardware, services and networking, stating that “Systems software is the integrating force in the data center. Virtualization is the foundation of the data center and management is the backbone [of the data center]. The Systems Software Business Unit is a key STG growth engine and it will enable us to deliver value across all IBM hardware plans.”
The Systems Software Business Unit will provide the integration framework for STG and act as the glue that enables seamless end-to-end virtualization, platform management and other capabilities. Systems Software covers some 160 products, including management, energy, security, availability, operating systems (OS) and virtualization. According to Armitage, IBM recently conducted a study with over 200 of its corporate clients on virtualization and management and found that clients are strategically investing in their IT infrastructure to drive business value.
IBM’s findings track closely with the results of ITIC’s 2009-2010 Virtualization Deployment Trends Survey, conducted in August, and the 2010 IT & Technology Trends Survey, which polled 500 businesses worldwide in December 2009. Both surveys revealed that upgrading server hardware, deploying server virtualization software and deploying new applications in support of business objectives were among the top three IT spending priorities for 2010 for nearly 50% of respondents.
Additionally, the 2009-2010 Virtualization Deployment Trends Survey revealed that almost 30% of businesses will undertake a private or public cloud computing initiative over the next 12 months. This makes virtualization management and fast, efficient, reliable service and support imperative. The results of both ITIC surveys emphasize that C-level executives and IT departments strongly base their purchasing decisions on doing business with vendors who have a track record of superior technology, service and support.
From this standpoint, Armitage said IBM is perfectly poised, via its comprehensive System Software product portfolio, to address the shift from purely physical management to the integration of physical and virtual systems, storage and network resources. IBM, she said is adapting as the business needs of its corporate customers similarly adjust “to optimize energy usage, maximize resource utilization and keep the corporate data assets secure.”
Armitage acknowledged that IBM has not been in the “industry conversation on virtualization,” but said that Big Blue aims to become more visible in the coming months. To accomplish this, IBM will focus on a number of key areas, including physical consolidation, virtual system pools, integrated service management and cloud computing.
Armitage noted that the 200 corporate clients that participated in the aforementioned IBM study are using cloud computing as an access model. The goal of STG and the new Systems Software group is to help corporate customers unlock more value in virtualization than they are currently realizing. To accomplish this, the company will deliver products, tools and services that will assist customers in automating and optimizing their environments, Armitage said.
IBM’s just-released Systems Director version 6.1 is one of the linchpins of the company’s strategy and is designed to run as a standalone product. Though the Systems Software BU and IBM’s Tivoli group operate independently of one another, they share a joint design and architecture team and have agreed upon APIs. “It’s not quite a joint development team,” Armitage said, “but there is a strong collaborative effort between System Software and the Tivoli team.”
IBM Systems Director Software v 6.1 provides businesses with a single point of control to manage all aspects of their data center operations, and integrates best-of-breed IBM virtualization capabilities to provide a faster, more efficient means of managing physical and virtual platform resources. Systems Director 6.1 incorporates a single user interface (UI) for common tasks and delivers a consistent, unified view of the entire IT environment, including servers, storage and network assets. Corporations can use Systems Director as a standalone tool or in conjunction with IBM’s Tivoli to reduce data center management tasks and expense.
Armitage said that IBM will ship Systems Director v 6.1 with every server. Initially however, the Systems Software Business Unit’s revenues will not appear as a separate line item but will be incorporated into IBM STG’s overall sales figures.
Analysis
IBM’s decision to launch the new Systems Software BU within STG has both short term tactical and long term strategic implications for IBM and its hardware customers. Most immediately, it will enable IBM to more comprehensively and cogently address the business and technology needs of its tens of thousands of enterprise customers who are deploying, or plan to deploy, virtualization and cloud computing environments. Given that virtualization and cloud computing are two of the hottest emerging technologies, IBM’s move is an excellent one for the immediate, intermediate and long term.
Additionally, networks are growing in size, scope and complexity even as the economic downturn keeps budgets and resources tight. Organizations are more than ever seeking guidance from their vendors. And those vendors that deliver on promises and provide such guidance will reap the rewards of continuing and expanding opportunities. IBM has a proven track record of delivering leading edge technology and superior technical service and support. The latest ITIC survey data found that 77% of organizations rated IBM service and support “Excellent” or “Very Good.”
In order to fully realize the potential of this unit and deliver the hoped-for value to customers, Armitage and her team will have to work hard to carve out an identity for Systems Software. IBM is certainly providing its installed base and potential customers with added value by shipping Systems Director v 6.1 loaded onto every server. However, the product must be accompanied by a strong marketing plan as well as the appropriate documentation and training materials to assist cash-strapped and resource-constrained IT departments in unlocking and maximizing the potential of this software tool. It is crucial for STG and the Systems Software BU to rise to this challenge and distinguish the new unit within the next six-to-nine months as organizations begin to earmark their 2010 corporate expenditures.


Women in IT Need to Network to Break Out of the “Pink Ghetto”

“Make your employers understand that you are in their service as workers, not as women.” – Susan B. Anthony, in an article from the October 8, 1868 edition of The Revolution, a women’s suffrage newspaper.

Note to working women: if you want to break out of the “Pink Ghetto,” tear a page out of your male co-workers’ playbooks, start a Good Old Girls group and get serious about networking.
The Pink Ghetto is a largely invisible, often unspoken and unacknowledged place that impedes women’s upward mobility in the workplace, from achieving equal pay for equal work, to being offered the same opportunities as male co-workers, to getting promoted as quickly as men, or getting promoted at all.
There are no magic formulas or quick fixes for such ingrained inequities, but networking and mentoring initiatives offer immediate tactical as well as long term strategic ways to help women break down gender-based barriers. There are compelling reasons why women in high technology, and in all professions, should make networking an integral part of their daily routines, formalize their efforts and set specific goals.
The ongoing recession of the last two years has made the Pink Ghetto more palpable than ever. The competition to retain jobs, win promotions and secure new positions is intense, and the economic crisis has spared no one. With the unemployment rate hitting 10.2% in October – the highest level in over 25 years – everyone is feeling the pressure. Consider these statistics:
• Women now constitute roughly 50% of the workforce, but on average, they make just over three-fourths of the salary of their male counterparts.
• The most recent Bureau of Labor statistics show that the disparity between men’s and women’s wages widened slightly during 2008. On average, women now earn $.77 for every $1 a man earns, down from $.78 in 2007, for an annual median salary of just over $36,000.
• The National Research Council reported that women leave high technology, computer, science and engineering careers twice as frequently as men and women’s salaries in those professions still lag behind those of males by 12% to 15%.
• The number of women CEOs also declined slightly in the past two years. Currently, women hold the top spots at only a dozen Fortune 500 companies, while 24 Fortune 1000 companies are run by women, according to Fortune Magazine.
According to the latest statistics released by the Bureau of Labor Statistics on November 6, men bore the brunt of the layoffs representing 72% of the 7.3 million jobs lost since the recession began in December 2007. The disproportionately higher job losses incurred by men are attributable to the fact that over 50% of the jobs lost have been in male dominated fields such as automotive, construction and manufacturing.
With so many men losing their jobs, many women now find themselves the family breadwinner, so the pressure is on to make up the salary shortfall and move up the corporate ladder.
The average disparity of 23 cents between a man and a woman’s wages may sound negligible, but over the course of a working lifetime those pennies add up. The wage gap costs the average American full-time woman worker between $700,000 and $2 million over the course of her lifetime, according to economist Evelyn Murphy, president of the Women Are Getting Even (WAGE) Project, a non-profit, grass roots organization formed in 2006 to close the salary gap.
In the high technology, engineering and scientific sectors, the macro-economic disparities between men and women do not obviously “show up,” noted Caroline Simard, vice president of research and executive programs for the Anita Borg Institute for Women and Technology in Palo Alto, California. Simard’s research indicates that women are more vulnerable precisely because they are less networked: they are more susceptible to losing a job and face more challenges when seeking new employment opportunities.
“It’s hugely important for women to network; it’s not enough to just work hard. Networking is one of the most powerful predictors of advancement and salaries,” Simard said.
Anecdotally, men are very supportive of other men and typically lobby on each other’s behalf for swifter promotions, bigger raises and better performance reviews. One woman who spent over 20 years performing admirably at her consulting firm in the Northeast, traveling the globe and ranking among its top revenue generators, was consistently passed over for promotion to vice president. Male counterparts with a fraction of her experience came in at lower grade and salary levels but quickly passed her in the ranks, achieving the coveted VP title within two or three years. Another woman in the same organization was assigned to report to a younger, less experienced male colleague who was pegged as an up-and-comer and put on the fast track for promotion. When it came time for performance reviews and merit raises, the more experienced woman got a minuscule salary increase and was bypassed for promotion because her younger boss deemed that her writing lacked the necessary analytic ability. Ironically, the woman in question had garnered numerous writing awards and was in great demand among the consulting firm’s clients.
While women in high technology will often chat and engage in social activities during the regular office day, they have not heretofore made a concerted effort at networking.
The traditional tried and proven male methods of networking like golf outings or bonding over drinks after work at a local watering hole do not come easily or naturally to women. More often than not, a woman engineer, IT manager, software developer or C-level executive will be a very small minority or perhaps the only female in her immediate group. This can be an isolating and daunting experience. While not specifically excluded from accompanying her male peers to sporting events as a participant or spectator or going with them for drinks after work, many women feel uncomfortable. And many women, who are also wives and mothers, simply don’t have the luxury of going to bars after hours for networking over peanuts and beers.
“Women must network laterally and upwardly – including with supportive men. Women need the connections up to help open the doors to upward mobility,” Simard said, observing that “if you’re the only woman in your group it will be harder to network.”
Women are well advised to get on internal corporate as well as industry committees and task forces, and to join their industry associations, in order to gain external recognition, which they can then use as leverage within their own organizations.
“Working harder does not make you more visible; it can make you invisible,” Simard observed. “Women need to view networking as being a part of their daily work,” she added.
The Anita Borg Institute runs negotiation programs to teach women specific networking and negotiation tactics. Women who don’t negotiate for better pay and benefits at the outset of their careers are negatively impacted over the long term and will almost certainly get paid less over the course of their careers, Simard said.
Theory and practice are frequently at loggerheads. The growing bodies of research on gender-based workplace disparities are clear that women must become more assertive in order to be heard, especially in male dominated fields. The conundrum facing women is that if they’re too assertive they will be viewed negatively and classified as intimidating or worse.
“Women must learn to navigate that high wire act,” Simard said, noting that even women will view an assertive woman negatively. To correctly assess the tone of their organization, women should seek out a mentor who will help them read and clarify work related issues and advise them on the best courses of action for dealing with specific situations and different personality types.
Another way to burst out of the Pink Ghetto is to address the innate gender bias that exists in many organizations’ hiring, recruiting and retention practices. The Anita Borg Institute’s initiatives center on helping companies to realize that they need and want diversity in their corporate culture and communications styles. “The upcoming generation is the most diverse this country has ever seen. Good managers are those that can adequately deal with diversity,” Simard said.
Women in high technology who want to wend their way through the organization and reach the upper echelons in salary and job titles should avail themselves of the growing number of women’s conferences. Online social networking sites like Facebook and LinkedIn are also great sources for networking, reconnecting with former colleagues and supervisors and meeting potential mentors. Don’t hesitate to ask Facebook and LinkedIn connections to write references and recommendations for you. And above all, cultivate these relationships, seek out mentors and be a mentor.

