Facebook IPO Flops; Can it be Fixed?

The honeymoon is over for Facebook and Mark Zuckerberg. In fact, it ended before it began.

Facebook’s long-awaited and much-hyped IPO is less than a week old, and the blame game is already on: the company has lost nearly 20% of its value since the initial offering.

After three days of trading, Wall Street’s take on Facebook has gone from jubilant to jaundiced.

The stock ended its first full day of trading at $38.23 – essentially flat from its $38 offering price – though it did manage to set an IPO record for sheer trading volume: 567 million shares on opening day last Friday. Investors hoped for a turnaround; it never materialized. On Monday, a selloff sent the shares down nearly 11% to close at $34.03. The news worsened Tuesday, when the stock sank another eight percent, trading down in the $31 – $32 range. …

National Advertising Division Tells Oracle to Discontinue Misleading IBM Ads

The always-heated rivalry between Oracle and IBM just got more contentious with the recent news that the National Advertising Division (NAD) has called out Oracle for publishing misleading ads in The Wall Street Journal and The Economist claiming that Oracle’s T4-4 server is 2x faster and 66% cheaper than IBM’s comparable P795 server.

The NAD, a division of the Council of Better Business Bureaus based in New York City, recommended that Oracle discontinue “certain comparative performance and pricing claims” in the national newspaper ads and on the www.Oracle.com website. Specifically, the NAD took exception to the Oracle advertisements’ claim that Oracle’s SPARC SuperCluster T4-4 system retails for $1.2 million whereas IBM’s P795 high-end server costs $4.5 million – an improbable $3.3 million price discrepancy.

The NAD functions as an objective and impartial self-regulatory forum for the advertising industry. In its official determination, it took pains to remain evenhanded, noting that both the advertiser (Oracle) and the challenger (IBM) produce high-quality computer systems. …

Security Wars: Time to Use Continuous Monitoring Tools to Thwart Hackers

It’s time for corporations to wise up and use the latest, most effective weapons to safeguard and secure their data.

High-tech devices, software applications, email, user accounts, social media and networks – even those presumed safe – are being hacked with alarming alacrity and ease.

Security tools, encryption and updating your networks with the latest patches are certainly necessary, but they are not enough. Corporations must arm themselves with the most effective weapons available to combat the new breed of malware, malicious code and ever more proficient hackers. I’m referring to continuous monitoring tools that identify, detect and shut down vulnerabilities before hackers can find and exploit them. …

The Patent Game: Everybody’s Playing, You Snooze, You Lose

“Let the future tell the truth, and evaluate each one according to his work and accomplishments. The present is theirs; the future, for which I have really worked, is mine.”

— Nikola Tesla

Thomas Edison and Nikola Tesla have a lot in common with Apple, Google, HTC, Motorola and Research in Motion.

They were, and are, all warriors in the ongoing war to see who can amass the largest number of the most lucrative technology patents. Edison and Tesla waged their battle from the late 1860s through the 1920s, and the stakes were just as high then as they are now.

Nary a week has gone by without mention of the latest contretemps among the high-tech industry titans. There has been no cessation of hostilities during the holiday season. If anything, as 2010 comes to a close, top-tier companies have become even more aggressive about solidifying and extending their dominance in and beyond their core competencies. …

Happy 1st Birthday Windows 7; Now Can We Please Cancel Microsoft’s MidLife Crisis?

Windows 7 is now officially a year old. Since its release on October 22, 2009, Windows 7 has sold over 240 million copies – approximately seven copies per second – making it the fastest-selling operating system in Microsoft’s history, or any vendor’s. Some industry pundits estimate that Windows 7 sales will top 300 million within the next six to eight months.
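That per-second figure checks out with simple arithmetic; the snippet below is just my own back-of-the-envelope verification, not Microsoft’s math:

```python
# Sanity check on the "seven copies per second" claim:
# 240 million copies sold over the year since release.
copies_sold = 240_000_000
seconds_per_year = 365 * 24 * 60 * 60  # 31,536,000 seconds
print(f"{copies_sold / seconds_per_year:.1f} copies per second")  # ~7.6
```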

Microsoft has plenty of other reasons to celebrate Windows 7’s first birthday. Windows 7 has also been one of the most stable, reliable and secure releases in Microsoft’s history.

A nearly three-quarters majority – 73 percent – of the 400+ respondents to the latest joint ITIC/Sunbelt Software poll gave Windows 7 an “excellent,” “very good” or “good” rating. …

Application Availability, Reliability and Downtime: Ignorance is NOT Bliss

Two out of five businesses – 40% – report that their major business applications require higher availability rates than they did two or three years ago. However, an overwhelming 81% are unable to quantify the cost of downtime, and only a small minority – 5% – are willing to spend whatever it takes to guarantee the highest levels of application availability, 99.99% and above. Those are the results of the latest ITIC survey, which polled C-level executives and IT managers at 300 corporations worldwide.

ITIC partnered with Stratus Technologies of Maynard, Mass., a vendor that specializes in high-availability and fault-tolerant hardware and software solutions, to compose the Web-based survey. ITIC conducted the survey blind; it was neither vendor- nor product-specific, and it polled businesses on their application availability requirements, their use of virtualization and the compliance rate of their service level agreements (SLAs). The survey consisted of multiple choice and essay questions, and ITIC analysts also conducted two dozen first-person customer interviews to obtain detailed anecdotal data.

Respondents ranged from SMBs with 100 users to very large enterprises with over 100,000 end users. They hailed from 15 countries, with 85% based in North America, and none received any remuneration for their participation. Industries represented: academic, advertising, aerospace, banking, communications, consumer products, defense, energy, finance, government, healthcare, insurance, IT services, legal, manufacturing, media and entertainment, telecommunications, transportation, and utilities.

Survey Highlights

The survey results uncovered many “disconnects” between the levels of application reliability that corporate enterprises profess to need and the availability rates their systems and applications actually deliver. Additionally, a significant portion of the survey respondents had difficulty defining what constitutes high application availability, did not specifically track downtime, and could not quantify or qualify the cost of downtime and its impact on their network operations and business.

Among the other survey highlights:

  • A 54% majority of the IT managers and executives surveyed said more than two-thirds of their companies’ applications require the highest level of availability – 99.99%, or four nines of uptime.
  • Over half – 52% – of survey respondents said that virtualization technology increases application uptime and availability; only 4% said availability decreased as a result of virtualization deployments.
  • In response to the question of which aspect of application availability is most important to the business, 59% of those polled cited the prevention of unplanned downtime as most crucial; 40% cited disaster recovery and business continuity; 38% said minimizing planned downtime to apply patches and upgrades was their top priority; 16% said the ability to meet SLAs was most important; and 40% said all of the choices were equally crucial to their business needs. (Percentages exceed 100 because respondents could cite more than one aspect.)
  • Some 41% said they would be satisfied with conventional 99% to 99.9% availability (the equivalent of two or three nines) for their most critical applications – even though neither 99% nor 99.9% qualifies as a high-availability or continuous-availability solution (see the downtime arithmetic sketched after this list).
  • An overwhelming 81% of survey respondents said the number of applications that demand high availability has increased in the past two to three years.
  • Of those who said they have been unable to meet service level agreements (SLAs), 72% can’t or don’t keep track of the cost and productivity losses created by downtime.
  • Budgetary constraints are a gating factor prohibiting many organizations from installing software solutions that would improve application availability. Overall, 70% of the survey respondents either lacked the funds to purchase value-added availability solutions (40%) or were unsure whether, or how much, their companies would spend to guarantee application availability (30%).
  • Of the 30% of businesses that quantified how much their firms would spend on availability solutions, 3% indicated they would spend $2,000 to $4,000; 8% said $4,000 to $5,000; another 3% said $5,000 to $10,000; 11% – mainly large enterprises – indicated they were willing to allocate $10,000 to $15,000 to ensure application availability; and 5% said they would spend “whatever it takes.”
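To put those “nines” in perspective, the arithmetic below converts each availability level into the downtime it permits per year. This is generic back-of-the-envelope math, not part of the ITIC survey:

```python
# Convert availability "nines" into allowed downtime per year.
# Generic arithmetic, not ITIC survey data: downtime = (1 - availability) * year.
MINUTES_PER_YEAR = 365 * 24 * 60  # 525,600 minutes

for label, availability in [("two nines", 0.99),
                            ("three nines", 0.999),
                            ("four nines", 0.9999)]:
    downtime = MINUTES_PER_YEAR * (1 - availability)
    print(f"{availability:.2%} ({label}): ~{downtime:,.0f} minutes "
          f"(~{downtime / 60:.1f} hours) of downtime per year")
```

At 99% availability an application can be down roughly 87.6 hours a year; at 99.99%, less than an hour. That gap is why the 41% who would settle for two or three nines on their most critical applications represents such a striking disconnect.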

According to the survey findings, just under half of all businesses – 49% – lack the budget for high-availability technology, and 40% of the respondents reported they don’t understand what qualifies as high availability. An overwhelming eight out of ten IT managers – 80% – are unable to quantify the cost of downtime to their C-level executives.

To reiterate, the ITIC survey polled users on the various aspects and impacts of application availability and downtime, but it did not specify any products or vendors.

The survey results, supplemented by ITIC’s first-person interviews with IT managers and C-level executives, clearly show that on a visceral level businesses are aware that the need for increased application availability has grown. This is particularly true in light of the emergence of new technologies like application and desktop virtualization, cloud computing and Service Oriented Architecture (SOA). The fast-growing population of remote, mobile and telecommuting end users who rely on unified communications and collaboration applications and utilities is also spurring the need for greater application availability and reliability.

High Application Availability Not a Reality for 80% of Businesses

The survey results clearly show that network uptime isn’t keeping pace with the need for application availability. At the same time, the IT managers and C-level executives interviewed by ITIC did comprehend the business risks associated with downtime, even though most were unable to quantify the cost of downtime or qualify its impact on the corporation, its customers, suppliers and business partners when unplanned application and network outages occur.

“We are continually being asked to do more with less,” said an IT manager at a large enterprise in the Northeast. “We are now at a point where the number of complex systems requiring expert knowledge has exceeded the headcount needed to maintain them … I am dreading vacation season,” he added.

Another executive, at an application service provider (ASP), acknowledged that even though his firm’s SLA guarantees to customers are a modest 98%, it has, on occasion, been unable to meet those goals. The executive said his firm compensated clients for a significant outage incident: “We had a half-day outage a couple of years ago which cost us in excess of $40,000 in goodwill payouts to a handful of our clients, despite the fact that it was the first outage in five years,” he said.

Another user said a lack of funds prevented his firm from allocating capital expenditure monies to purchase solutions that would guarantee 99.99% application availability. “Our biggest concern is keeping what we have running and available. Change usually costs money, and at the moment our budgets are simply in survival mode,” he said.

Another VP of IT, at a New Jersey-based business, said that ignorance is not bliss. “If people knew the actual dollar value their applications and customers represent, they’d already have the necessary software availability solutions in place to safeguard applications,” he said. “Yes, it does cost money to purchase application availability solutions, but we’d rather pay now than wait for something to fail and pay more later.”

Overall, the survey results show that most organizations’ metrics and cost formulas for tracking and quantifying what uptime means to the business are woefully inadequate, and that many corporations are courting disaster as a result.

ITIC advises businesses to track downtime and its actual cost to the organization, and to take the necessary steps to qualify downtime’s broader impact, including lost data and potential liability risks, e.g., lost business, lost customers, potential lawsuits and damage to the company’s reputation. Once a company can quantify the amount of downtime associated with its main line-of-business applications, the impact of that downtime and the risk to the business, it can make an accurate assessment of whether or not its current IT infrastructure adequately supports the degree of application availability the corporation needs to maintain its SLAs.
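As a starting point for that kind of quantification, a cost model might look like the sketch below. The function name, inputs and sample figures are illustrative assumptions of mine, not ITIC formulas or survey data; only the $40,000 goodwill payout comes from the ASP anecdote above:

```python
# Hypothetical back-of-the-envelope model for the direct cost of an outage.
# All names and sample figures are illustrative, not ITIC methodology.

def downtime_cost(hours_down, revenue_per_hour, affected_users,
                  cost_per_user_hour, goodwill_and_penalties=0.0):
    """Estimate the direct cost of an outage.

    Intangibles -- reputation damage, lost customers, lawsuits --
    are real but must be qualified separately, as noted above.
    """
    lost_revenue = hours_down * revenue_per_hour             # unbooked sales
    lost_productivity = hours_down * affected_users * cost_per_user_hour
    return lost_revenue + lost_productivity + goodwill_and_penalties

# Example: a 4-hour outage at a firm booking $10,000/hour, idling 200
# users with a $60/hour loaded cost, plus a $40,000 goodwill payout
# like the one the ASP executive described above.
print(f"${downtime_cost(4, 10_000, 200, 60, 40_000):,.0f}")  # $128,000
```

Even a rough model like this gives IT managers what 80% of them currently lack: a dollar figure they can put in front of their C-level executives.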
