News

IBM STG Group Posts Positive Gains, Offers Strong Strategy & Growth Roadmap

Vendor-sponsored analyst conferences are often long on self-congratulatory hyperbole and short on substance. That wasn’t the case with IBM’s Systems and Technology Group Analyst conference held last week in Rye Brook, NY.

The STG conference, led by Rod Adkins, Senior Vice President of the STG Group, showcased the division’s solid accomplishments over the last several years and detailed the current and future product roadmap and investment strategy. Investments focused on three major areas: systems, growth markets and strategic acquisitions. Adkins could easily have added a fourth category: patents. The U.S. Patent Office granted IBM’s STG division 2,680 patents in 2010, and it could exceed that number in 2011. One only has to scan the headlines and peruse the ongoing patent purchasing frenzy and the plethora of lawsuits involving all of the major vendors to realize the pivotal role patents play as both an offensive and a defensive weapon. IBM, in its centenary year, holds more patents than any other U.S. technology vendor.

STG 2011 Milestones

Noting that STG is aligned with IBM’s overall growth strategy, Adkins detailed the division’s milestones through the first three quarters of 2011. They included: …


Spring 2011: Hackers Had a Bonanza

Hackers have had a bonanza in April, May and June (so far). Nary a day has gone by without news of yet another major attack. Here’s a partial list of some of the most publicized hacks of the last 10 weeks:

RSA Security: On April 1, in a move akin to raiding Fort Knox, hackers breached RSA’s SecurID technology (one of the industry’s gold standards in security software). RSA executives described the hack as “very sophisticated” and characterized it as an advanced persistent threat (APT)-type targeted attack. It used a routine tactic – a phishing Email containing an infected attachment that was triggered when opened.

Epsilon: This Irving, TX-based company handles customer email messaging for over 150 firms, including large banks and retailers such as Best Buy, JPMorgan Chase, Citigroup and L.L.Bean. In April, millions of consumers learned that Epsilon’s networks had been breached when they received Emails from their banks and credit card companies informing them that the hack might have exposed their names and Email addresses to the hackers. Epsilon released a statement assuring consumers that only Email addresses and names were compromised and that no sensitive data was disclosed. …


Security Wars: Time to Use Continuous Monitoring Tools to Thwart Hackers

It’s time for corporations to wise up and use the latest, most effective weapons to safeguard and secure their data.

High tech devices, software applications, Emails, user accounts, social media and networks – even those presumed safe — are being hacked with alarming alacrity and ease.

Security tools, encryption and updating your networks with the latest patches are certainly necessary, but they are not enough. Corporations must arm themselves with the latest security tools and devices in order to effectively combat the new breed of malware, malicious code and ever more proficient hackers. I’m referring to the new breed of continuous monitoring tools that identify, detect and shut down vulnerabilities before hackers can find and exploit them. …


2011 YTD in High Tech: Bold Aggressive Actions

It’s hard to believe but the first quarter of 2011 is now a memory and we’re well into spring. The tone for the year in high technology was set in early January: fast, bold, aggressive action and sweeping management changes.

In the first four months of the year, high tech vendors moved quickly and decisively to seize opportunities in established sectors (smart phones, virtualization, back-up and disaster recovery) and emerging markets (cloud computing, tablet devices and unified storage management). As 2011 unfolds, it’s apparent that high technology vendors are willing to shift strategies and shed executives in order to stay one step ahead of, or keep pace with, competitors. The competition is cutthroat and unrelenting. No vendor, no matter how dominant its market share, how pristine its balance sheet or how deep its order backlog and book-to-bill ratio, dares relax or rest on its laurels for even a nanosecond.

Recaps of some of the year’s highlights thus far are very revealing. …


As Ellison Rips Rivals, Oracle Services Slip, Support Prices Soar

Memo to Larry Ellison: The Roman Coliseum halted gladiator combats around 435 A.D. SAP has thrown in the towel and has no interest in continuing a court battle. Hewlett-Packard executives are refusing to accept service on your lawsuits and HP’s newly named chief executive Leo Apotheker is laying low, presumably dodging your increasingly vituperative verbal assaults. You’ve got no takers for the bloody, bare knuckles brawl you crave. What does that tell you?

It should signal an end to the Circus Maximus sideshow but it won’t.

No one desires this much attention or sticks their chin out spoiling for a fight like Ellison. And in an industry like high tech that’s overflowing with giant egos, that’s saying something. It’s true that Ellison’s antics always make for reams and reams of good copy. Reporters calling for comments on the latest developments don’t even bother to suppress their mirth. Enough is enough, though. The Larry Ellison Show would be more amusing if corporate customers weren’t getting caught in the crossfire. …


Oracle & HP Appear to Have Made Up But They’re Gearing up for Battle

“When two elephants fight, it is the grass that gets trampled.”

— African proverb

Hewlett-Packard Co. and Oracle Corp.’s decision to settle the lawsuit over Oracle’s hiring of Mark Hurd as co-President, after weeks of public wrangling, is welcome news to everyone but the corporate attorneys.

But don’t expect the two vendors to simply pick up and resume their former close partnership. It got very ugly, very fast. And the reverberations, from Hurd’s hiring to HP’s recent appointment of Leo Apotheker as the new CEO effective November 1, will be felt for a long time. HP’s decision to hire the German-born Apotheker, who is also the former CEO of SAP, is, to put it politely, a big “take that, Oracle!” Forget the surface smiles; behind the scenes, Oracle and HP have their ears pinned back, teeth bared and swords sharpened as they gird for battle.

This was not the typical cross-competitive carping that vendors routinely spew to denigrate their rivals’ products and strategies. The issues between HP and Oracle are very personal and very deep. The verbal volleys Oracle CEO Larry Ellison lobbed at HP in recent weeks exposed the changing nature of this decades-old alliance. It is morphing from a close, mutually beneficial collaboration into a head-on collision in several key product areas. Ellison’s words did more than just wound HP: they also opened up fissures in the relationship as big as the San Andreas Fault. …


SQL Server Most Secure Database; Oracle Least Secure Database Since 2002

Ask any 10 qualified people to guess which of the major database platforms is the most secure, and chances are at least half would say Oracle. They would be wrong.

The correct answer is Microsoft’s SQL Server. In fact, the Oracle database has recorded the largest number of security vulnerabilities of any of the major database platforms over the last eight years.

This is not a subjective statement. The data comes directly from the National Institute of Standards and Technology.

Since 2002, Microsoft’s SQL Server has compiled an enviable record. It is the most secure of any of the major database platforms. SQL Server has recorded the fewest reported vulnerabilities — just 49 from 2002 through June 2010 — of any database. These statistics were compiled independently by the National Institute of Standards and Technology (NIST), the government agency that monitors security vulnerabilities by technology, vendor and product (see Exhibit 1). So far in 2010, through June, SQL Server has a perfect record — no security bugs have been recorded by the NIST CVE database.

And SQL Server was the most secure database by a wide margin: its closest competitor, MySQL (which was owned by Sun Microsystems until Sun’s January 2010 acquisition by Oracle), recorded 98 security flaws, twice as many as SQL Server.

By contrast, during the same eight-and-a-half year period spanning 2002 through June 2010, the NIST CVE recorded 321 security vulnerabilities associated with the Oracle database platform, the highest total of any major vendor. Oracle had more than six times as many reported security flaws as SQL Server during the same time span. NIST CVE statistics recorded 121 security-related issues for the IBM DB2 platform during the past eight-and-a-half years.
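For readers who want to run this kind of tally themselves, the sketch below shows one way it could be done. It assumes a hypothetical CSV export of CVE records with year and product columns; the file name, column names and keyword lists are illustrative only and are not NIST’s actual feed format.

```python
# Hedged sketch: tally reported vulnerabilities per database platform from an
# exported list of CVE records. The file name, column names and keyword lists
# below are hypothetical; NIST's real data feeds use their own schemas.
import csv
from collections import Counter

PLATFORM_KEYWORDS = {
    "SQL Server": ["sql server", "sql_server"],
    "MySQL": ["mysql"],
    "Oracle Database": ["oracle database"],
    "IBM DB2": ["db2"],
}

def tally_vulnerabilities(path="cve_export.csv", start=2002, end=2010):
    """Count CVE entries per platform between two publication years (inclusive)."""
    counts = Counter()
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):  # assumed columns: cve_id, year, product
            year = int(row["year"])
            if not (start <= year <= end):
                continue
            product = row["product"].lower()
            for platform, keywords in PLATFORM_KEYWORDS.items():
                if any(keyword in product for keyword in keywords):
                    counts[platform] += 1
                    break
    return counts

if __name__ == "__main__":
    for platform, total in tally_vulnerabilities().most_common():
        print(f"{platform}: {total} reported vulnerabilities")
```

The keyword matching is deliberately crude; anyone reproducing the figures cited here would want to match on the exact vendor and product fields in the source data rather than on free-text product names.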

Solid security is an essential element for many mainstream line-of-business (LOB) applications, and a crucial cornerstone in the foundation of every organization’s network infrastructure. Databases are the information repositories for many organizations; they contain much of the sensitive corporate data and intellectual property. If database security is compromised, the entire business is potentially at risk.

SQL Server’s unmatched security record is no fluke. It is the direct result of significant Microsoft investment in its Trustworthy Computing Initiative, which the company launched in 2002. In January of that year, Microsoft took the step of halting all new code development for several months across its product lines to scrub the code base and make its products more secure.

The strategy is working. In the 21 months since January 2009, Microsoft has issued only eight SQL Server security-related alerts. To date in 2010 (January through June), there have been no SQL Server vulnerabilities recorded by Microsoft or NIST. Microsoft is the only database vendor with a spotless security record for the first six months of 2010.

ITIC conducted an independent Web-based survey on SQL Server security that polled 400 companies worldwide during May and June 2010. The results of the ITIC 2010 SQL Server Security survey support the NIST CVE findings. Among the survey highlights:
• An 83% majority rated SQL Server security “excellent” or “very good” (see Exhibit 2, below).
• None of the 400 survey respondents gave SQL Server security a “poor” or “unsatisfactory” rating.
• A 97% majority of survey participants said they experienced no inherent security issues with SQL Server.
• Anecdotal data obtained during first-person customer interviews also revealed a very high level of satisfaction with the embedded security functions and capabilities of SQL Server 7, SQL Server 2000, SQL Server 2005, SQL Server 2008, and the newest SQL Server 2008 R2 release. In fact, database administrators, CIOs and CTOs interviewed by ITIC expressed their approbation of Microsoft’s ongoing initiatives to improve SQL Server’s overall security and functionality over the last decade, starting with SQL Server 2000.

Strong security is a must for every organization irrespective of size or vertical industry. Databases are among the most crucial applications in the entire network infrastructure. Information in databases is the organization’s intellectual property and life blood.

Databases are essentially a company’s electronic filing system. The information contained in the database directly influences and impacts every aspect of the organization’s daily operations including relationships with customers, business partners, suppliers and its own internal end users. All of these users must have the ability to quickly, efficiently and securely locate and access data. The database platform must be secure. An insecure, porous database platform will almost certainly compromise business operations and by association, any firm that does business with it. Any lapses in database security, including deliberate internal and external hacks, inadvertent misconfiguration, or user errors can mean lost or damaged data, lost revenue, and damage to the company’s reputation, raising the potential for litigation and loss of business.

It’s also true that organizations bear at least 50 percent of the responsibility for keeping their databases and their entire network infrastructures secure. As the old proverb goes, “The chain is only as secure as its weakest link.” Even the strongest security can be undone or bypassed by user error, misconfiguration or weak computer security practices. No database or network is 100 percent hack-proof or impregnable. Organizations should consult with their vendors regarding any questions and concerns they may have about the security of ANY of their database platforms. They should also stay current with the latest patches and install the necessary updates. Above all, bolster the inherent security of your databases with the appropriate third-party security tools and applications, and make sure your organization strictly adheres to computer security best practices. At the end of the day, only you can defend your data.

Registered ITIC site users can Email me at: ldidio@itic-corp.com for a copy of the full report.


Cloud Computing: Pros and Cons

Cloud computing, like any emerging technology, has both advantages and disadvantages. Before beginning any infrastructure upgrade or migration, organizations are well advised to first perform a thorough inventory and review of their existing legacy infrastructure and make the necessary upgrades, revisions and modifications. Next, the organization should determine its business goals for the next three to five years to decide when, if and what type of cloud infrastructure to adopt. It should also construct an operational and capital expenditure budget and a timeframe that includes research, planning, testing, evaluation and final rollout.
Public Clouds: Advantages and Disadvantages
The biggest allure of a public cloud infrastructure over traditional premises-based network infrastructures is the ability to offload tedious and time-consuming management chores to a third party. This in turn can help businesses:
• Shave precious capital expenditure monies because they avoid the expensive investment in new equipment including hardware, software, and applications as well as the attendant configuration planning and provisioning that accompanies any new technology rollout.
• Accelerate the deployment timetable. Having an experienced third-party cloud services provider do the work speeds up rollout and most likely means less time spent on trial and error.
• Construct a flexible, scalable cloud infrastructure that is tailored to their business needs. A company that has performed its due diligence and is working with an experienced cloud provider can architect a cloud infrastructure that will scale up or down according to the organization’s business and technical needs and budget.
The potential downside of a public cloud is that the business is essentially renting common space with other customers. As such, depending on the resources of the particular cloud model, there is the potential for performance, latency and security issues, as well as questions about acceptable response times and the quality of service and support from the cloud provider.
Risk is another potential pitfall associated with outsourcing any of your firm’s resources and services to a third party. To mitigate risk and lower it to an acceptable level, it’s essential that organizations choose a reputable, experienced third party cloud services provider very carefully. Ask for customer references; check their financial viability. Don’t sign up with a service provider whose finances are tenuous and who might not be in business two or three years from now.
The cloud services provider must work closely and transparently with the corporation to build a cloud infrastructure that best suits the business’ budget, technology and business goals.
To ensure that the expectations of both parties are met, organizations should create a checklist of the items and issues that are of crucial importance to their business and incorporate them into Service Level Agreements (SLAs). Be as specific as possible. These should include, but are not limited to, the following questions (a sample scorecard sketch follows the list):

• What types of equipment do they use?
• How old is the server hardware? Is the configuration powerful enough?
• How often is the data center equipment/infrastructure upgraded?
• How much bandwidth does the provider have?
• Does the service provider use open standards or is it a proprietary datacenter?
• How many customers will you be sharing data and resources with?
• Where is the cloud services provider’s datacenter physically located?
• What specific guarantees, if any, will it provide for securing sensitive data?
• What level of guaranteed response time will it provide for service and support?
• What is the minimum acceptable latency/response time for its cloud services?
• Will it provide multiple access points to and from the cloud infrastructure?
• What specific provisions will apply to Service Level Agreements (SLAs)?
• How will financial remuneration for SLA violations be determined?
• What are the capacity ceilings for the service infrastructure?
• What provisions will there be for service failures and disruptions?
• How are upgrade and maintenance provisions defined?
• What are the costs over the term of the contract agreement?
• How much will the costs rise over the term of the contract?
• Does the cloud service provider use the Secure Sockets Layer (SSL) to transmit data?
• Does the cloud services provider encrypt data at rest to restrict access?
• How often does the cloud services provider perform audits?
• What mechanisms will it use to quickly shut down a hack and can it track a hacker?
• If your cloud services provider is located outside your country of origin, what are the privacy and security rules of that country and what impact will that have on your firm’s privacy and security issues?
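One practical way to work through this due-diligence list is to capture each provider’s answers in a simple scorecard so competing offers can be compared side by side. The sketch below is one way to do that; the class names, field names, provider name and sample answers are hypothetical illustrations, not a standard schema.

```python
# Hedged sketch: turn the due-diligence questions above into a structured
# scorecard so each provider's written answers can be recorded and compared.
from dataclasses import dataclass, field
from typing import List

@dataclass
class ChecklistItem:
    question: str
    answer: str = ""               # provider's written response
    meets_requirement: bool = False

@dataclass
class ProviderScorecard:
    provider: str
    items: List[ChecklistItem] = field(default_factory=list)

    def score(self) -> float:
        """Fraction of checklist items the provider satisfies."""
        if not self.items:
            return 0.0
        return sum(item.meets_requirement for item in self.items) / len(self.items)

# Illustrative usage with made-up answers
scorecard = ProviderScorecard("ExampleCloudCo", [
    ChecklistItem("Does the provider encrypt data at rest?", "Yes, AES-256", True),
    ChecklistItem("What is the guaranteed support response time?", "4 business hours", False),
    ChecklistItem("How is financial remuneration for SLA violations determined?"),
])
print(f"{scorecard.provider}: {scorecard.score():.0%} of requirements met")
```

Recording the answers in one place also makes it easier to fold the agreed terms directly into the SLA language rather than relying on verbal assurances.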
Finally, the corporation should appoint a liaison, and that person should meet regularly with a representative from the cloud services provider to ensure that the company attains its immediate goals and keeps working toward its future technology and business goals. Outsourcing all or any part of your infrastructure to a public cloud does not mean forgetting about it or abandoning oversight.
Private Clouds: Advantages and Disadvantages
The biggest advantage of a private cloud infrastructure is that your organization keeps control of its corporate assets and can safeguard and preserve its privacy and security. Your organization is in command of its own destiny. That can be a double-edged sword.
Before committing to build a private cloud model, the organization must do a thorough assessment of its current infrastructure, its budget and the expertise and preparedness of its IT department. Is your firm ready to assume responsibility for such a large burden from both a technical and an ongoing operational standpoint? Only you can answer that. Remember that the private cloud should be highly reliable and highly available – at least 99.999% uptime with built-in redundancy and failover capabilities. Many organizations currently struggle to maintain 99.9% uptime and reliability, which is the equivalent of 8.76 hours of per-server, per-annum downtime. When your private cloud is down for any length of time, your end users (and anyone else who has access to the cloud) will be unable to access resources.
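For reference, the downtime figures above follow directly from the availability percentage. The short sketch below is a minimal worked example that converts an uptime percentage into allowed downtime per server per year and reproduces the 8.76-hour figure for 99.9% uptime.

```python
# Minimal sketch: convert an availability percentage into allowed downtime
# per server per year; 99.9% uptime works out to the 8.76 hours cited above.
HOURS_PER_YEAR = 24 * 365  # 8,760 hours

def annual_downtime_hours(availability_pct: float) -> float:
    """Hours of downtime per year implied by a given availability percentage."""
    return HOURS_PER_YEAR * (1 - availability_pct / 100)

for pct in (99.0, 99.9, 99.99, 99.999):
    print(f"{pct}% uptime -> {annual_downtime_hours(pct):.2f} hours of downtime per year")
# 99.9% -> 8.76 hours; 99.999% -> roughly 5.3 minutes
```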
Realistically, in order for an organization to successfully implement and maintain a private cloud, it needs the following:
• Robust equipment that can handle the workloads efficiently during peak usage times
• An experienced, trained IT staff that is familiar with all aspects of virtualization, virtualization management, grid, utility and chargeback computing models
• An adequate capital expenditure and operational expenditure budget
• The right set of private cloud product offerings and service agreements
• Appropriate third party virtualization and management tools to support the private cloud
• Specific SLA agreements with vendors, suppliers and business partners
• Operational level agreements (OLAs) to ensure that each person within the organization is responsible for specific routine tasks and for defined duties in the event of an outage
• A disaster recovery and backup strategy
• Strong security products and policies
• Efficient chargeback utilities, policies and procedures
Other potential private cloud pitfalls include deciding which applications to virtualize, vendor lock-in, and integration and interoperability issues. Businesses grapple with these same issues today in their existing environments. At present, however, the product choices from vendors and third-party providers are more limited for virtualized private cloud offerings. Additionally, since the technology is still relatively new, it will be difficult, from both a financial and a technical standpoint, to switch horses in midstream from one cloud provider to another if you encounter difficulties.
There is no doubt that adoption of virtualized public and private cloud infrastructures will grow significantly in the next 12 to 18 months. In order to capitalize on their benefits, lower your total cost of ownership (TCO), accelerate return on investment (ROI) and mitigate risk, your organization should take its time and do it right.


Apple, Google Grapple for Top Spot in Mobile Web

Since January, the high technology industry has witnessed a dizzying spate of dueling vendor product announcements.
So what else is new? It’s standard operating procedure for vendors to regularly issue hyperbolic proclamations about their latest/greatest offering, even (or especially) when the announcements are as devoid of content as cotton candy is of nutritional value. Maybe it’s just an outgrowth of the digital information age. We live and breathe instant information that circumnavigates the globe faster than you can say Magellan; the copy monster must be fed constantly. Or maybe it’s the protracted economic downturn which is making vendors hungrier than ever for consumer and corporate dollars.
Whatever the reason, there’s no doubt that high technology vendors – led by Google and Apple – are engaged in a near constant game of one-upmanship.
Apple indirectly started this trend in early January, when word began leaking out that Apple would finally announce the long-rumored iPad tablet in late January. The race was on among other tablet vendors to announce their products at the Consumer Electronics Show (CES) in Las Vegas in mid-January to beat Apple to the punch. A half-dozen vendors, including ASUSTeK Computer (ASUS), Dell, Hewlett-Packard, Lenovo, Taiwanese manufacturer Micro Star International (MSI) and Toshiba, all raced to showcase their forthcoming wares in advance of Apple. It made good marketing sense: all of these vendors knew that once Apple released the iPad, their chances of getting PR coverage would be sorely diminished.
I have no problem with smaller vendors or even large vendors like Dell and HP, who rightfully reckon that they have to make their announcements in advance of a powerhouse like Apple to ensure that their products don’t get overlooked.
Apple vs. Google: Battle of the Mobile Web Titans
But when the current industry giants and media darlings like Apple and Google start slugging it out online, in print and at various conferences, it’s overwhelming.
Apple and Google are just the latest in a long line of high technology rivalries. In the 1970s it was IBM vs. HP; in the 1980s, the rise of networking created several notable rivalries: IBM vs. Digital Equipment Corp. (DEC); IBM vs. Microsoft; Oracle vs. IBM; Novell vs. 3Com; Novell vs. Microsoft; Cabletron vs. SynOptics; and Cisco vs. all the internetworking vendors. By the 1990s it was Microsoft vs. Netscape and Microsoft vs. pretty much everyone else.
The Apple vs. Google rivalry differs from earlier technology contests in that the relationship between the two firms began as a friendly one and, to date, there has been no malice. Until August 2009, Google CEO Eric Schmidt was on Apple’s board of directors. And while the competition between these two industry giants is noticeably devoid of the rancor that characterized past high tech rivalries, it’s safe to say that the two are respectfully wary of each other. Apple and Google are both determined not to let the other get the upper hand, something they fear will happen if there is even the slightest pause in the endless stream of headlines.
Google and Apple started out in different markets – Google in the online search engine and advertising arena, and Apple as a manufacturer of consumer hardware devices and software applications. Their respective successes – Apple’s with its Mac hardware and Google’s with its eponymous search engine – have led them to this point: a head-to-head rivalry for supremacy of the mobile Web arena.
On paper, they appear to be two equally matched gladiators. Both companies have huge amounts of cash. Apple has $23 billion in the bank and now boasts the highest valuation of any high technology company, with a current market cap of $236.3 billion, surpassing Microsoft for the top spot. Google has $26.5 billion in cash and a valuation of $158.6 billion. Both firms have two of the strongest management and engineering teams in Silicon Valley. Apple has the iconic Steve Jobs, who since his return has revitalized the company. Google is helmed by co-founders and creative geniuses Larry Page and Sergey Brin, together with Eric Schmidt, the CEO who knows how to build computers and make the trains run on time.
Fueling this rivalry is Apple’s and Google’s stake in mobile devices and operating systems. In Apple’s case this means the wildly successful iPhone, iPod Touch and, most recently, the iPad and the Mac Mini. Google’s lineup consists of its Chrome OS and Android OS, which will power tablet devices like Dell’s newly announced Streak and Lenovo’s forthcoming U1 hybrid tablet/notebook, due out later this year. The rivalry between the two is quite literally getting down to the chip level. Intel, which has for so long been identified with the Microsoft Windows-based PC platform, is now expanding its support for Android – a move company executives have described as its “port of choice” gambit. Apple is no slouch in this area, either: its Macs, from the Mac Mini to the MacBook Pro, ship with Intel inside. Last week Nvidia CEO Jen-Hsun Huang weighed in on the Apple/Google rivalry on Google’s side, predicting that tablet designs will converge around Google’s operating system.
But a stroll through any airport, mall, consumer home or office would give a person cause to dispute Huang’s claim: iPads and iPhones are everywhere. Apple recently announced that it has sold over two million iPads since the device first shipped in April. During a business trip from Boston to New Orleans last week I found that Apple iPads were as much in evidence as hot dogs at a ballpark.
Ironically, Microsoft, a longtime traditional rival of both Apple and Google, is not mentioned nearly so often in the smart phone and tablet arenas. That’s because Microsoft’s Windows OS is still searching for a tablet to call its own. Longtime Microsoft partner HP abruptly switched course: after Microsoft CEO Steve Ballmer got on stage and demonstrated Windows 7 running on HP’s slate, HP bought Palm and earlier this week acquired the assets of Phoenix Technologies, which makes an operating system for tablets. That leaves Microsoft to promote its business-centric Windows Phone 7, which will run Xbox LIVE games, Zune music and the company’s Bing search engine. All is not lost for Microsoft: longtime “frenemy” Apple CEO Steve Jobs said recently that the new iPhone 4G will run Microsoft’s Bing, fueling speculation that Apple will drop support for Google’s search engine. Both Google and Apple are still competing with Microsoft in other markets, such as operating systems, games and application software, but that’s another story.
There are other competitors in the smart phone and tablet markets, but you’d hardly know it from the headlines. Research In Motion’s (RIM) BlackBerry is still a market leader. But Apple and Google continue to dominate the coverage. I guess high technology, just like sports, revels in a classic rivalry. And this one promises to be a hard-fought struggle.


Virtualization Deployments Soar, But Companies Prefer Terra Firma to Cloud for now

The ongoing buzz surrounding cloud computing – particularly public clouds – is far outpacing actual deployments by mainstream users. To date, only 14% of companies have deployed or plan to deploy a private cloud infrastructure within the next two calendar quarters.
Instead, as businesses slowly recover from the ongoing economic downturn, their most immediate priorities are to upgrade legacy desktop and server hardware and outmoded applications, and to expand their virtualization deployments. Those are the results of the latest ITIC 2010 Virtualization and High Availability survey, which polled C-level executives and IT managers at 400 organizations worldwide.
ITIC partnered with Stratus Technologies and Sunbelt Software to conduct the Web-based survey, which consisted of multiple choice questions and essay comments. ITIC also conducted first-person interviews with over two dozen end users to obtain anecdotal responses on the primary accelerators of, and impediments to, virtualization, high availability and reliability, and cloud computing. The survey also queried customers on whether or not their current network infrastructure and mission critical applications were adequate to handle new technologies and the increasing demands of the business.
The survey showed that, for now at least, although many midsized and large enterprises are contemplating a move to the cloud – especially a private cloud infrastructure – the technology and business model are still not essential for most businesses. Some 48% of survey participants said they have no plans to migrate to a private cloud architecture within the next 12 months, while another 33% said their companies are studying the issue but have no firm plans to deploy.

The study also indicates that private cloud deployments are outpacing public cloud deployments by a 2-to-1 margin. However, before businesses can begin to consider a private cloud deployment, they must first upgrade the “building block” components of their existing environments, e.g., server and desktop hardware, WAN infrastructure, storage, security and applications. Only 11% of businesses described their server and desktop hardware as leading edge or state-of-the-art. And just 8% of respondents characterized their desktop and application environment as leading edge.

The largest proportion of the survey participants – 52% – described their desktop and server hardware as working well, while 48% said their applications were up-to-date. However, 34% acknowledged that some of their server hardware needed to be updated. A higher percentage of users – 41% – admitted that their mission critical software applications were due to be refreshed. And a small 3% minority said that a significant portion of both their hardware and mission critical applications were outmoded and adversely impacting the performance and reliability of their networks.

Based on the survey data and customer interviews, ITIC anticipates that from now until October, companies’ primary focus will be on infrastructure improvements.

Reliability and Uptime Lag

The biggest surprise in this survey, compared with the 2009 High Availability and Fault Tolerant survey that ITIC and Stratus conducted nearly one year ago, was the decline in the number of survey participants who said their organizations required 99.99% uptime and reliability. In this latest survey, the largest portion of respondents (38%, or nearly 4 out of 10 businesses) said that 99.9% uptime, the equivalent of 8.76 hours of per-server, per-annum downtime, was the minimum acceptable level for their mission critical line of business (LOB) applications. This is more than three times the 12% of respondents who said that 99.9% uptime was acceptable in the prior 2009 survey. Overall, 62%, or nearly two-thirds of survey participants, indicated their organizations are willing to live with higher levels of downtime than were considered acceptable in previous years.
Some 39% of survey respondents (almost 4 out of 10) indicated that their organizations demand high availability, which ITIC defines as four nines (99.99%) of uptime or greater. Specifically, 27% said their organizations require 99.99% uptime; another 6% need 99.999% uptime; and a 3% minority require even higher levels of availability.
The customer interviews found that the ongoing economic downturn, aged/aging network infrastructures (server and desktop hardware and older applications), layoffs, hiring freezes and the new standard operating procedure (SOP) of “do more with less” have made 99.9% uptime more palatable than in previous years.
Those firms that do not keep track of the number and severity of their outages have no way of gauging the financial and data losses to the business. Even a cursory comparison indicates substantial cost disparities between 99% uptime and 99.99% uptime. The monetary costs, business impact and risks associated with downtime will vary by company, as will the duration and severity of individual outage incidents. However, a small or midsize business that estimates the hourly cost of downtime at a very conservative $10,000 per hour would potentially incur losses of roughly $876,000 per year at a data center with 99% application availability (about 87.6 hours of downtime). By contrast, a company whose data center operates at 99.99% uptime (under an hour of downtime per year) would incur losses of roughly $8,760, one-hundredth that of a firm with conventional 99% availability.
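The cost comparison above can be reproduced with a few lines of arithmetic. The sketch below is a minimal worked example that assumes the same conservative $10,000-per-hour figure used in the paragraph; substitute your own hourly cost estimate to model your business.

```python
# Minimal sketch of the cost comparison above: estimated annual downtime cost
# at different availability levels, assuming a conservative $10,000 per hour.
HOURS_PER_YEAR = 8760
HOURLY_COST = 10_000  # assumed, per the example above; substitute your own

def annual_downtime_cost(availability_pct: float, hourly_cost: float = HOURLY_COST) -> float:
    """Estimated annual cost of downtime implied by an availability percentage."""
    downtime_hours = HOURS_PER_YEAR * (1 - availability_pct / 100)
    return downtime_hours * hourly_cost

for pct in (99.0, 99.9, 99.99):
    print(f"{pct}% availability -> ${annual_downtime_cost(pct):,.0f} per year")
# 99%    -> $876,000
# 99.9%  -> $87,600
# 99.99% -> $8,760
```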
Ironically, the need for rock-solid network reliability has never been greater. Web-based applications and new technologies like virtualization and Service Oriented Architecture (SOA), as well as emerging public or shared cloud computing models, are designed to maximize productivity. But without the proper safeguards, these new datacenter paradigms may raise the risk of downtime. The Association for Computer Operations Management’s Data Center Institute (AFCOM) forecasts that one in four data centers will experience a serious business disruption over the next five years.
At the same time, customer interviews revealed that over half of all businesses (56%) lack the budget for high availability technology. Another ongoing challenge: 78% of survey participants acknowledged that their companies either lack the skills or simply do not attempt to quantify the monetary and business costs associated with hourly downtime. The reasons for this are well documented. Some organizations don’t routinely do this, and those that attempt to calculate costs and damages run into difficulties collecting data because the data resides with many individuals across the enterprise. Inter-departmental communication, cooperation and collaboration are sorely lacking at many firms. Only 22% of survey respondents were able to assign a specific cost to one hour of downtime, and most of them gave conservative estimates of $1,000 to $25,000 for a one-hour network outage. Only 13% of that 22% indicated that their hourly losses would reach $175,000 or more.

Users Confident and Committed to Virtualization Technology
The news was more upbeat with respect to virtualization – especially server virtualization deployments. Organizations are both confident and comfortable with virtualization technology.
72% of respondents indicated the number of desktop and server-based applications demanding high availability has increased over the past two years. The survey also found that a 77% majority of participants run business critical applications on virtual machines. Not surprisingly, the survey data showed that virtualization usage will continue to expand over the next 12 months. A 79% majority (approximately eight out of 10 respondents) said the number of business critical applications running on virtual machines and virtual desktops will increase significantly over the next year. Server virtualization is very much a mainstream and accepted technology. The responses to this question indicate increased adoption as well as confidence. Nearly one-quarter of the respondents (24%) say that more than 75% of their production servers are VMs. Overall, 44% of respondents say that over 50% of their servers are VMs. However, none of the survey participants indicated that 100% of their servers are virtualized. Additionally, only 6% of survey respondents …

