Laura DiDio

The Dog Days of Summer & High Tech Hijinks

In the mid-to-late 1980s, colleagues and friends were surprised when I transitioned from working as an on-camera investigative TV reporter to covering the then-fledgling high technology industry for specialized trade magazines.
After all, they reasoned, how could I be content covering semiconductors, memory boards, server hardware, software and computer networks after working as a mainstream journalist covering stories such as lurid political and law enforcement corruption scandals; drug trafficking; prostitution; the dumping of tainted substances on unsuspecting Third World nations; and cover-ups by big business when their planes, trains and automobiles malfunctioned? How could I trade "murder and mayhem" for the staid, sterile world of high technology?
They needn’t have worried.
Admittedly, mastering the technology was a challenge. For the first few weeks, every time I did a story on PALs and had to spell out the acronym, I wrote "Police Athletic League" instead of Programmable Array Logic. And then there was my first work-related trip to Las Vegas to cover the mammoth spectacle that was Comdex circa 1988. In the dark ages before wireless, laptops and decent broadband, it was nearly impossible to file stories from your hotel room because the trunk lines were overwhelmed. A colleague and I were forced to trek down to a bank of pay phones to transmit our news articles at 2:30 a.m. and were mistaken for hookers. The pay was arguably better than a journalist's salary, but we passed. Incidents like this made me feel close to my cops-and-crimes, murder-and-mayhem investigative TV roots.
I felt at home covering technology right away. Within a month, I was chronicling tales of high tech companies sending their top executives off to rehab for drug and alcohol addiction. There was a rash of top executives leaving established powerhouses and taking top engineers and sales executives with them, which in turn precipitated a slew of theft-of-trade-secrets and patent infringement lawsuits. Things really got interesting when Robert Morris, Jr. launched his now infamous Internet Worm. And there were myriad other tales of sex scandals involving corporate executives, board of director fights and coups, price fixing, hostile takeovers, corporate espionage and fiscal chicanery that entailed everything from embezzlement and theft to cooking the books.
Reality TV and the tabloids have nothing on high technology industry hijinks.
Fast forward to what's making headlines during these "Dog Days" of summer 2010. The ancient Greeks and Romans believed that the dog days of summer (named after the constellation Sirius, or Dog Star) lasted from late July to early September and that the hot weather foreshadowed evil doings. John Brady's "Clavis Calendarium" of 1813 describes it as "an evil time when the seas boiled, wine turned sour, dogs grew mad, and all creatures became languid, causing to man burning fevers, hysterics, and phrensies." The recent spate of high tech headlines seems to bear that out. Here's a sampling:
• The Hewlett-Packard board of directors abruptly fired CEO Mark Hurd after allegations of sexual harassment surfaced.
• Oracle CEO Larry Ellison publicly blasted the HP board for firing Mark Hurd.
• Oracle sued Google for alleged patent and copyright infringement involving the use of Java intellectual property in Google’s mobile Android operating system.
• Google's Street View mapping service prompted privacy lawsuits and raids in several countries, including South Korea.
• Google released version 6 of its Chrome web browser and vowed to issue a stable new release every six weeks.
The headlines provide an accurate assessment of both the current state and the direction of the high tech industry. Four words say it all: sex, money, power and posturing. Let’s examine some of the stories in more detail.
The HP board of directors' decision to fire CEO Mark Hurd after five years of stewardship remains cloaked in mystery. Hurd may or may not have been guilty of fudging expense reports and engaging in conduct not up to HP's standards with Jodie Fisher, a contract HP "adviser" and sometime actress. In addition to being an adviser, Fisher also received $5,000 to attend HP events acting as a "meet and greet" hostess. Fisher, who retained the services of celebrity lawyer Gloria Allred, may or may not have been a victim of harassment. We don't know for sure because all of the principals in this tableau are mum. Rumors are rife that the "real reason" HP's board may have shown Hurd the door is that: 1) he may have been more involved than was previously thought in the 2006 HP board of directors "pretexting" scandal, in which HP board members illegally spied on other board members to learn the source of news leaks; and 2) Hurd was exceedingly unpopular with rank and file HP employees.
By all monetary measures, Hurd's five-year stint at HP was a resounding success. And for that, Hurd will walk away with a $40 to $50 million severance package. No one knows how much Fisher received, because Hurd and Fisher settled whatever transpired between them privately. But it must have been a pretty good sum, because Fisher issued a very upbeat and conciliatory statement saying she did not intend for Hurd to lose his job and wishes Hurd, his family and HP all the best. Thankfully, I read this on an empty stomach!
What’s wrong with this picture? Plenty.
The real victims here are HP’s rank and file employees, the American worker and sexual harassment victims – both men and women – who lack the clout to hire a Gloria Allred to rattle her saber for another 15 minutes of fame and a quick, inglorious settlement.
The average Joe and Jane worker have seen their ranks decimated with each new acquisition and round of layoffs. HP currently ranks No. 9 on the Fortune 500 list. In the past several years it has acquired Compaq, EDS, 3Com and Palm. Those mergers and acquisitions helped HP become the first high tech company whose annual revenues exceed the $100 billion threshold. HP is also first in another category – albeit an unwelcome one: despite its stellar financial performance, over the last decade HP has cut more jobs (most of them here in the U.S.) than any other high tech firm. The tally of eliminated jobs stands at approximately 85,000.
So Mark Hurd gets $40 to $50 million and tens of thousands of HP’s American employees get shown the door.
Then there's Ms. Fisher. I know nothing about the woman. One must presume that if Hurd was willing to settle with her, her claim had some merit. However, as soon as I heard she was represented by Allred, I cringed. Allred has turned into a modern-day Carrie Nation for the tabloid TV generation. In an age of instant and continual information via the tabloids and the Web, publicity is the chief currency – the more salacious and lurid, the bigger the settlement. I phoned Allred's office to inquire how many pro bono and non-celebrity sexual harassment cases she handles. I haven't heard back yet, and I'm not too hopeful.
The Equal Employment Opportunity Commission (EEOC) received 12,696 complaints of sexual harassment in the workplace in fiscal 2009 – 16% of them filed by men. The EEOC says it recovered $51.5 million in monetary benefits for those nearly 13,000 workers. That's probably just about what Mark Hurd, Jodie Fisher and Gloria Allred pocketed among the three of them. Nice work if you can get it.
That brings me to another prominent headline of the past couple of weeks: Oracle chief Larry Ellison, in an interview with The New York Times, blasted the HP board for firing his longtime friend Mark Hurd. Ellison's comments have all the credence of a professional athlete convicted of using steroids writing an editorial extolling the virtues of doping. Oracle, which completed its acquisition of Sun Microsystems earlier this year, is gearing up to axe one-third to one-half of Sun's workforce of over 25,000. No one is sure exactly how many Oracle employees will be pink slipped, but estimates range from 5,000 to as high as 10,000. Oracle disclosed in a recent government filing that it will take an $825,000 write-off in restructuring charges.
The question is will Larry Ellison make room for Mark Hurd at Oracle? He might. Hurd has a proven record of cutting costs, cutting people and thus delivering value to shareholders.
The real measure of a company's success should not be how many jobs it cuts but how many jobs it creates for the American worker.
Oracle also made headlines and flexed its muscles last week with the announcement that it is suing Internet search engine giant Google for allegedly infringing on Java patents – intellectual property Oracle now owns as part of the Sun acquisition – that are used in Google's mobile Android operating system. This is all about Oracle making a preemptive strike to try to contain Google in what's shaping up to be a battle of high tech titans. Google's Android OS runs on many of the major mobile phone platforms, including those of Motorola and HTC Corp. The implications are enormous. Don't expect this one to ever get to court. Neither firm wants to spend millions or expend precious corporate resources in a protracted legal battle, which would be detrimental to both sides. Expect them to settle. But we can also expect the acrimony between these two rivals to rise commensurately with the stakes in the mobile market.
Google, meanwhile, engaged in some posturing of its own. The company released beta version 6 of its Chrome web browser and says it will issue a stable new release of the browser every six weeks. This move is clearly designed as a challenge to Microsoft Internet Explorer, Mozilla Firefox and Apple Safari. While I applaud Google's initiative and desire to retain its competitive edge, releasing a new version of its browser every six weeks is overkill. No matter how fast Google or any vendor makes its browser, the actual speeds are still determined by the user's broadband. And frankly, the constant upgrades to everyday packages like Adobe, WordPress and the various browsers are a major nuisance. One can barely log on to an application without being hounded to upgrade to the latest version.
But these days, companies feel compelled to make announcements at all costs just to keep their names in the headlines. There's never a dull moment in the high tech industry, especially during the dog days of summer. I can't wait to see what fall brings. If you have any ideas, email me at: ldidio@itic-corp.com.


Cloud Computing: Pros and Cons

Cloud computing, like any emerging technology, has both advantages and disadvantages. Before beginning any infrastructure upgrade or migration, organizations are well advised to first perform a thorough inventory and review of their existing legacy infrastructure and make the necessary upgrades, revisions and modifications. Next, the organization should map out its business goals for the next three to five years to determine whether, when and what type of cloud infrastructure to adopt. It should also construct an operational and capital expenditure budget and a timeframe that includes research, planning, testing, evaluation and final rollout.
Public Clouds: Advantages and Disadvantages
The biggest allure of a public cloud infrastructure over traditional premises-based network infrastructures is the ability to offload tedious and time-consuming management chores to a third party. This in turn can help businesses:
• Shave precious capital expenditure monies, because they avoid expensive investments in new equipment – hardware, software and applications – as well as the attendant configuration planning and provisioning that accompany any new technology rollout.
• Accelerate the deployment timetable. Having an experienced third-party cloud services provider do the work speeds deployment and most likely means less time spent on trial and error.
• Construct a flexible, scalable cloud infrastructure tailored to their business needs. A company that has performed its due diligence and is working with an experienced cloud provider can architect a cloud infrastructure that will scale up or down according to the organization's business and technical needs and budget.
The potential downside of a public cloud is that the business is essentially renting common space with other customers. As such, depending on the resources of the particular cloud model, there exists the potential for performance, latency and security issues, as well as questions about whether the provider's response times and service and support will be acceptable.
Risk is another potential pitfall associated with outsourcing any of your firm’s resources and services to a third party. To mitigate risk and lower it to an acceptable level, it’s essential that organizations choose a reputable, experienced third party cloud services provider very carefully. Ask for customer references; check their financial viability. Don’t sign up with a service provider whose finances are tenuous and who might not be in business two or three years from now.
The cloud services provider must work closely and transparently with the corporation to build a cloud infrastructure that best suits the business’ budget, technology and business goals.
To ensure that the expectations of both parties are met, organizations should create a checklist of the items and issues that are of crucial importance to their business and incorporate them into Service Level Agreements (SLAs). Be as specific as possible. These should include, but are not limited to, the questions below; a minimal sketch of how such a checklist might be tracked in code follows the list.

• What types of equipment do they use?
• How old is the server hardware? Is the configuration powerful enough?
• How often is the data center equipment/infrastructure upgraded?
• How much bandwidth does the provider have?
• Does the service provider use open standards or is it a proprietary datacenter?
• How many customers will you be sharing data and resources with?
• Where is the cloud services provider’s datacenter physically located?
• What specific guarantees, if any, will it provide for securing sensitive data?
• What level of guaranteed response time will it provide for service and support?
• What is the minimum acceptable latency/response time for its cloud services?
• Will it provide multiple access points to and from the cloud infrastructure?
• What specific provisions will apply to Service Level Agreements (SLAs)?
• How will financial remuneration for SLA violations be determined?
• What are the capacity ceilings for the service infrastructure?
• What provisions will there be for service failures and disruptions?
• How are upgrade and maintenance provisions defined?
• What are the costs over the term of the contract agreement?
• How much will the costs rise over the term of the contract?
• Does the cloud service provider use the Secure Sockets Layer (SSL) to transmit data?
• Does the cloud services provider encrypt data at rest to restrict access?
• How often does the cloud services provider perform audits?
• What mechanisms will it use to quickly shut down a hack and can it track a hacker?
• If your cloud services provider is located outside your country of origin, what are the privacy and security rules of that country and what impact will that have on your firm’s privacy and security issues?
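As promised above, here is a minimal sketch, in Python, of one way to track these checklist questions and the provider's answers so that unresolved items surface before anything is signed. The provider name, questions and field names are purely illustrative, not drawn from any real contract.

    # Illustrative sketch only: capture each SLA checklist question as data
    # so that provider answers are recorded and open gaps are easy to spot.
    # All names (provider, questions, fields) are hypothetical.
    from dataclasses import dataclass, field

    @dataclass
    class SLAItem:
        question: str             # checklist question put to the provider
        answer: str = ""          # the provider's written response
        acceptable: bool = False  # does the answer meet our minimum bar?

    @dataclass
    class ProviderChecklist:
        provider: str
        items: list = field(default_factory=list)

        def open_issues(self):
            # Every item the provider has not yet answered acceptably.
            return [i for i in self.items if not i.acceptable]

    checklist = ProviderChecklist(
        provider="ExampleCloudCo",  # hypothetical provider
        items=[
            SLAItem("Does the provider use SSL/TLS to transmit data?"),
            SLAItem("Is customer data encrypted at rest?"),
            SLAItem("What is the guaranteed support response time?"),
        ],
    )

    for item in checklist.open_issues():
        print("Unresolved before signing:", item.question)

The point is simply that every question should have a written, verifiable answer attached to it before the SLA is executed.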
Finally, the corporation should appoint a liaison, and that person should meet regularly with a representative from the cloud services provider to ensure that the company attains its immediate goals and stays aware of and working toward future technology and business goals. Outsourcing all or any part of your infrastructure to a public cloud does not mean forgetting and abandoning it.
Private Clouds: Advantages and Disadvantages
The biggest advantage of a private cloud infrastructure is that your organization keeps control of its corporate assets and can safeguard and preserve its privacy and security. Your organization is in command of its own destiny. That can be a double-edged sword.
Before committing to build a private cloud model, the organization must do a thorough assessment of its current infrastructure, its budget and the expertise and preparedness of its IT department. Is your firm ready to assume responsibility for such a large burden from both a technical and an ongoing operational standpoint? Only you can answer that. Remember that the private cloud should be highly reliable and highly available – at least 99.999% uptime with built-in redundancy and failover capabilities. Many organizations currently struggle to maintain 99.9% uptime and reliability, which is the equivalent of 8.76 hours of downtime per server, per annum. When your private cloud is down for any length of time, your end users (and anyone else who has access to the cloud) will be unable to access resources.
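For readers who want to check these figures, the conversion from an availability percentage to annual downtime is simple arithmetic. Here is a minimal Python sketch using the percentages discussed above.

    # Convert an availability percentage into hours of downtime
    # per server, per year (365 days x 24 hours = 8,760 hours).
    HOURS_PER_YEAR = 24 * 365  # 8,760

    def annual_downtime_hours(availability_pct):
        return HOURS_PER_YEAR * (1 - availability_pct / 100)

    for pct in (99.0, 99.9, 99.99, 99.999):
        print(f"{pct}% uptime -> {annual_downtime_hours(pct):.3f} hours of downtime/year")

    # 99.9% works out to 8.76 hours, the figure cited above;
    # "five nines" (99.999%) is roughly 5.3 minutes per year.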
Realistically, in order for an organization to successfully implement and maintain a private cloud, it needs the following:
• Robust equipment that can handle the workloads efficiently during peak usage times
• An experienced, trained IT staff that is familiar with all aspects of virtualization, virtualization management, grid, utility and chargeback computing models
• An adequate capital expenditure and operational expenditure budget
• The right set of private cloud product offerings and service agreements
• Appropriate third party virtualization and management tools to support the private cloud
• Specific SLAs with vendors, suppliers and business partners
• Operational level agreements (OLAs) to ensure that each person within the organization is responsible for specific routine tasks, as well as for defined duties in the event of an outage
• A disaster recovery and backup strategy
• Strong security products and policies
• Efficient chargeback utilities, policies and procedures
Other potential private cloud pitfalls include deciding which applications to virtualize, vendor lock-in, and integration and interoperability issues. Businesses grapple with these same issues today in their existing environments. At present, however, the product choices from vendors and third-party providers are more limited for virtualized private cloud offerings. Additionally, since the technology is still relatively new, it will be difficult from both a financial and a technical standpoint to switch horses in midstream from one cloud provider to another if you encounter difficulties.
There is no doubt that adoption of virtualized public and private cloud infrastructures will grow significantly in the next 12 to 18 months. In order to capitalize on their benefits, lower your total cost of ownership (TCO), accelerate return on investment (ROI) and mitigate risk, your organization should take its time and do it right.


Cloud Computing: De-Mystifying the Cloud

Every year or so the high technology industry gets a new buzzword or experiences a paradigm shift which is hyped as “the next big thing.”
For the last 12 months or so, cloud computing has had that distinction. Anyone reading all the vendor-generated cloud computing press releases and associated news articles and blogs would conclude that corporations are building and deploying both private and public clouds in record-breaking numbers. The reality is much more sobering. An independent ITIC Web-based survey, which polled IT managers and C-level professionals at 700 organizations worldwide in January 2010, found that spending on cloud adoption was not a priority for the majority of survey participants during calendar 2010. In fact, only 6 percent of participants said that private cloud spending was a priority this year, and an even smaller 3 percent minority said that public cloud spending is a priority this year.
Those findings are buttressed by the latest joint ITIC/Sunbelt Software survey data (which is still live); it indicates that just under 20 percent of organizations have implemented a public or a private cloud. When asked why, nearly two-thirds (65 percent) of the respondents said they felt no compelling business need. Translation: they feel safe inside the confines of their current datacenters here on Terra Firma.

While there is a great deal of interest in the cloud infrastructure model, the majority of midsized and enterprise organizations are not rushing to install and deploy private or public clouds in 2010.

However, that is not to say that organizations – especially mid-sized and large enterprises – are not considering cloud implementations. ITIC research indicates that many businesses are more focused on performing much needed upgrades to such essentials as disaster recovery, desktop and server hardware, operating systems, applications, bandwidth and storage before turning their attention to new technologies like cloud computing.
Despite the many articles written about public and private cloud infrastructures over the past 18 months, many businesses remain confused about cloud specifics such as characteristics, costs, operational requirements, and integration and interoperability with their existing environments – or even how to get started.
De-Mystifying the Cloud
But just what is cloud computing, exactly? Definitions vary. The simplest, most straightforward definition is that a cloud is a grid or utility style pay-as-you-go computing model that uses the Web to deliver applications and services in real-time.
Organizations can choose to deploy a private cloud infrastructure wherein they host their services on-premises behind the safety of the corporate firewall. The advantage here is that the IT department always knows what's going on with all aspects of the corporate data, from bandwidth and CPU utilization to all-important security issues. Alternatively, organizations can opt for a public cloud deployment in which a third party like Amazon Web Services (a division of Amazon.com) hosts the services at a remote location. This latter scenario saves businesses money and manpower hours by utilizing the host provider's equipment and management. All that is needed is a Web browser and a high-speed Internet connection to connect to the host to access applications, services and data. However, the public cloud infrastructure is also a shared model, in which corporate customers share bandwidth and space on the host's servers.
Organizations that are extremely concerned about security and privacy issues and those that desire more control over their data can opt for a private cloud infrastructure in which the hosted services are delivered to the corporation’s end users from behind the safe confines of an internal corporate firewall. However, a private cloud is more than just a hosted services model that exists behind the confines of a firewall. Any discussion of private and/or public cloud infrastructure must also include virtualization. While most virtualized desktop, server, storage and network environments are not yet part of a cloud infrastructure, just about every private and public cloud will feature a virtualized environment.
Organizations contemplating a private cloud also need to ensure that it features very high (near fault-tolerant) availability – at least "five nines" (99.999%) uptime or better. The private cloud should also be able to scale dynamically to accommodate the needs and demands of the users. And unlike most existing, traditional datacenters, the private cloud model should incorporate a high degree of user-based resource provisioning. Ideally, the IT department should also be able to track resource usage in the private cloud by user, department or groups of users working on specific projects, for chargeback purposes.
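To make the chargeback idea concrete, here is a minimal, hypothetical sketch in Python: usage is metered per department and billed at internal rates. The rates, departments and usage records are invented for illustration.

    # Hypothetical chargeback sketch: meter resource usage per department,
    # then bill each department at an assumed internal rate.
    from collections import defaultdict

    RATE_PER_CPU_HOUR = 0.12  # assumed internal compute rate, in dollars
    RATE_PER_GB_MONTH = 0.05  # assumed internal storage rate, in dollars

    usage_records = [
        # (department, cpu_hours, storage_gb_months) -- invented data
        ("engineering", 1200, 400),
        ("marketing", 150, 80),
        ("engineering", 300, 120),
    ]

    bill = defaultdict(float)
    for dept, cpu_hours, gb_months in usage_records:
        bill[dept] += cpu_hours * RATE_PER_CPU_HOUR + gb_months * RATE_PER_GB_MONTH

    for dept, amount in sorted(bill.items()):
        print(f"{dept}: ${amount:,.2f}")

Real chargeback tooling would pull these figures from the virtualization layer's metering facilities rather than from a hand-built list, but the accounting principle is the same.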
Private clouds will also make extensive use of business intelligence and business process automation to guarantee that resources are available to the users on demand.
Given the Spartan economic conditions of the last two years, all but the most cash-rich organizations (and there are very few of those) will almost certainly have to upgrade their network infrastructure in advance of migrating to a private cloud environment. Organizations considering outsourcing any of their datacenter needs to a public cloud will also have to perform due diligence to determine the bona fides of their potential cloud service providers.
There are three basic types of cloud computing, although the first two are the most prevalent. They are listed below; a rough sketch of how responsibility divides among the three models follows the list.
• Software as a Service (SaaS), which uses the Web to deliver software applications to the customer. Examples of this are Salesforce.com, whose CRM application is one of the earliest and most widely deployed cloud-based offerings, and Google Apps, which is experiencing solid growth. Google Apps comes in three editions – Standard, Education and Premier (the first two are free). It provides consumers and corporations with customizable versions of the company's applications like Google Mail, Google Docs and Calendar.
• Platform as a Service (PaaS) offerings; examples of this include the above-mentioned Amazon Web Services and Microsoft's nascent Windows Azure Platform. The Microsoft Azure cloud platform offering contains all the elements of a traditional application stack, from the operating system up to the applications and the development framework. It includes the Windows Azure Platform AppFabric (formerly .NET Services for Azure) as well as the SQL Azure Database service. Customers that build applications for Azure will host them in the cloud. However, it is not a multi-tenant architecture meant to host your entire infrastructure. With Azure, businesses will rent resources that reside in Microsoft datacenters. The costs are based on a per-usage model. This gives customers the flexibility to rent fewer or more resources depending on their business needs.
• Infrastructure as a Service (IaaS) is exactly what its name implies: the entire infrastructure becomes a multi-tiered hosted cloud model and delivery mechanism.
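As noted above, here is a rough sketch of the conventional way the three service models split responsibility between provider and customer. The layer names are a common simplification, not a formal taxonomy.

    # A common simplification of which stack layers the provider manages
    # under each service model; the remainder falls to the customer.
    STACK = ["networking", "storage", "servers", "virtualization",
             "operating system", "runtime", "applications"]

    PROVIDER_MANAGES = {
        "IaaS": STACK[:4],  # provider runs the infrastructure layers
        "PaaS": STACK[:6],  # provider also runs the OS and runtime
        "SaaS": STACK,      # provider runs the full stack
    }

    for model, managed in PROVIDER_MANAGES.items():
        remaining = ", ".join(l for l in STACK if l not in managed) or "nothing"
        print(f"{model}: customer manages {remaining}")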
Both public and private clouds should be flexible and agile: the resources should be available on demand and should be able to scale up or scale back as the businesses’ needs dictate.

Next, in Part 2: The Pros and Cons of the Cloud


ITIC/Sunbelt Survey Shows Apple Users Extremely Satisfied with Performance, Reliability and Ease of use

In a clear indication of Apple’s continuing strength with business customers, a new survey of enterprise technology managers shows accelerating interest in purchasing first-time or additional Mac OS computers and iPhones.
Satisfaction with the performance, reliability and security of Apple devices – particularly Mac hardware, OS X 10.x operating systems and the iPhone 3 and 4 (the antenna problems of the newest iPhone 4 notwithstanding) – was very high. On average, approximately two-thirds of the survey participants rated the performance and reliability of Apple devices as "Excellent" or "Very Good."
In addition, the survey responses validate the record-breaking iPad sales statistics. As of June 22, Apple said it had sold over three million iPads in the 80 days since its April release. The figure is presumably much higher today. The ITIC/Sunbelt survey also found that the iPad is off to a very strong start, with 23 percent – nearly one in four IT managers – stating they've already purchased or ordered the new Apple tablet. Another 18 percent said they plan to purchase an iPad within the next nine months, while just over half – 51 percent – said they have no definitive timetable. The remaining 8 percent said they plan to wait until Apple cuts iPad prices for the first time.
And 86% of the respondents who have already bought an iPad say they are using it for both personal and business functions.
The responses to the question, "How often do you or your business experience technical issues with Apple products/devices?" were very positive and encouraging. Some 12 percent said they never had any problems; 50 percent – half the respondents – said they "rarely" experienced problems; 20 percent said they "occasionally" encountered technical issues every few months; 5 percent said "once a month;" 6 percent said "two or three times per month;" 5 percent said "regularly or once a week;" and a very small 2 percent minority indicated they/their businesses encountered technical issues on a daily basis.

Among the other survey highlights:
• Nearly two-thirds of respondents — 63 percent — indicated they/their organizations use the various Apple devices for both personal and business functions.
• An overwhelming 82 percent majority of survey participants said they use their iPhones to access corporate Email and data.
• Of those who did not currently own an iPhone, 24 percent said they "have already decided" or are "very likely to switch," with an additional 35 percent saying "it's possible we'll switch when the current contract expires."
• Eight out of 10 organizations said they are “more likely to allow more users to deploy Macintoshes as their enterprise desktops” in 2010-2011, up from 68 percent in the 2009 survey.
• The number of organizations reporting large complements of Macs and OS X 10.x in their organizations continues to climb. Some 7 percent of respondents said they have more than 250 Macs in their enterprise. In the 2008 survey, only 2 percent had more than 250 Macs.
• The percentage of mobile/remote users using Apple devices is rising quickly and significantly.
• The line between Apple consumer and enterprise usage continues to blur: 79 percent of survey respondents said that their firms will increase integration with existing Apple consumer products such as the iPhone to allow users to access corporate e-mail and other applications in the 2010-2011 timeframe. This is an 11 percentage point increase from the 68 percent of respondents who answered that query in the ITIC/Sunbelt 2009 Apple Enterprise Usage survey.

Analysis
The growing popularity of Apple products in the personal lives of IT managers is having a continued spillover effect in the enterprise. The acceleration of interest compared to our previous surveys tells me this trend will continue unabated during the next 12 to 18 months.

This is the third Apple Consumer and Enterprise Survey conducted by ITIC and Sunbelt since 2008. Each successive survey has shown a steady increase in both the number of Macs and Apple devices being deployed by corporate enterprises. ITIC will release the results of additional survey questions on Apple product satisfaction, reliability, security and ease of adoption/integration in August 2010.
Particularly noteworthy is the survey participants' strong interest in and enthusiasm for the iPad, a product just a few months old. Plus, the already strong iPhone adoption will continue as old wireless contracts expire. One can only project that if the iPhone becomes available on Verizon in the U.S., the number of additional enterprise-based units could be staggering.
Thus far, consumer and corporate users appear largely unfazed by the iPhone 4's much publicized antenna problems, which have led to reports of dropped calls. Essay comments and first-person customer interviews on the topic have elicited little more than a shrug. One user said, "So what? All mobile phones and PDAs drop calls."
Still, Apple must respond decisively and quickly to address any performance, quality and reliability issues related to any and all of its products. Apple has a press conference scheduled for later today to address the issues.
At present however, these issues do not appear to be having an adverse impact on iPhone 4 sales.
With Apple's enterprise success, though, will come new challenges. IT managers who participated in the ITIC/Sunbelt survey extolled the features and functions of Apple Macs, OS X 10.x, the iPhone and the iPad for consumers. However, as more and more Apple devices make their way into the enterprise, the lack of enterprise-class third-party management and performance-enhancement tools and technical support is becoming a significant barrier to widespread enterprise adoption. It is not as problematic, though, for organizations that currently have just a few Macs or isolated pockets of Macs and OS X 10.x in specific departments such as graphics. Still, Apple will have to address these issues if it is to mount a serious challenge to Microsoft's dominance. So far, the company has been silent about its enterprise strategy.
A new consortium of five third-party vendors calling itself the Enterprise Desktop Alliance (EDA) has taken the lead to promote the management, integration and interoperability capabilities of the Mac in corporate environments. Apple is well advised to forge a closer relationship with the EDA and its member organizations to foster greater third party integration and interoperability between Apple devices and rival platforms.
Part 2 of the Apple survey results as they relate to security issues will appear in a subsequent blog.


Apple, Google Grapple for Top Spot in Mobile Web

Since January, the high technology industry has witnessed a dizzying spate of dueling, vendor product announcements.
So what else is new? It’s standard operating procedure for vendors to regularly issue hyperbolic proclamations about their latest/greatest offering, even (or especially) when the announcements are as devoid of content as cotton candy is of nutritional value. Maybe it’s just an outgrowth of the digital information age. We live and breathe instant information that circumnavigates the globe faster than you can say Magellan; the copy monster must be fed constantly. Or maybe it’s the protracted economic downturn which is making vendors hungrier than ever for consumer and corporate dollars.
Whatever the reason, there’s no doubt that high technology vendors – led by Google and Apple – are engaged in a near constant game of one-upmanship.
Apple indirectly started this trend in early January, when word began leaking out that Apple would finally announce the long-rumored iPad tablet in late January. The race was on among other tablet vendors to announce their products at the Consumer Electronics Show (CES) in Las Vegas in mid-January to beat Apple to the punch. A half-dozen vendors including ASUSTeK Computer (ASUS), Dell, Hewlett-Packard, Lenovo, Taiwanese manufacturer Micro Star International (MSI) and Toshiba raced to showcase their forthcoming wares in advance of Apple. It made good marketing sense: all of these vendors knew that once Apple released the iPad, their chances of getting PR would be sorely diminished.
I have no problem with smaller vendors or even large vendors like Dell and HP, who rightfully reckon that they have to make their announcements in advance of a powerhouse like Apple to ensure that their products don’t get overlooked.
Apple vs. Google Battle of the Mobile Web Titans
But when the current industry giants and media darlings like Apple and Google start slugging it out online, in print and at various conferences, it’s overwhelming.
Apple and Google are just the latest in a long line of high technology rivalries. In the 1970s it was IBM vs. HP; in the 1980s, the rise of networking created several notable rivalries: IBM vs. Digital Equipment Corp. (DEC); IBM vs. Microsoft; Oracle vs. IBM; Novell vs. 3Com; Novell vs. Microsoft; Cabletron vs. Synoptics and Cisco vs. all the internetworking vendors. By the 1990s it was Microsoft vs. Netscape and Microsoft vs. pretty much everyone else.
The Apple vs. Google rivalry differs from earlier technology contests in that the relationship between the two firms began as a friendly one and, to date, there has been no malice. Until August 2009, Google CEO Eric Schmidt was on Apple's board of directors. And while the competition between these two industry giants is noticeably devoid of the rancor that characterized past high tech rivalries, it's safe to say that the two are respectfully wary of each other. Apple and Google are both determined not to let the other get the upper hand, something they fear will happen if there is even the slightest pause in the endless stream of headlines.
Google and Apple started out in different markets – Google in the online search engine and advertising arena and Apple as a manufacturer of consumer hardware devices and software applications. Their respective successes – Apple’s with its Mac hardware and Google’s with its search engine of the same name have led them to this point: a head to head rivalry in the battle for supremacy of the mobile Web arena.
On paper, they appear to be two equally matched gladiators. Both companies have huge amounts of cash. Apple has $23 billion in the bank and now boasts the highest valuation of any high technology company, with a current market cap of $236.3 billion, surpassing Microsoft for the top spot. Google has $26.5 billion in cash and a valuation of $158.6 billion. Both firms have two of the strongest management and engineering teams in Silicon Valley. Apple has the iconic Steve Jobs, who since his return has revitalized the company. Google is helmed by co-founders and creative geniuses Larry Page and Sergey Brin and, since 2001, by Eric Schmidt, the CEO who knows how to build computers and make the trains run on time.
Fueling this rivalry is Apple's and Google's stake in mobile devices and operating systems. In Apple's case this means the wildly successful iPhone, iPod Touch and, most recently, the iPad and the Mac Mini. Google's lineup consists of its Chrome OS and Android OS, which will power tablet devices like Dell's newly announced Streak and Lenovo's forthcoming U1 hybrid tablet/notebook, due out later this year. The rivalry between the two is quite literally getting down to the chip level. Intel, which has for so long been identified with Microsoft's Windows-based PC platform, is now expanding its support for Android – a move company executives have described as its "port of choice" gambit. Apple is no slouch in this area, either: its Macs – from the Mac Mini to the MacBook Pro – ship with Intel inside. Last week Nvidia CEO Jen-Hsun Huang weighed in on the Apple/Google rivalry on Google's side, predicting that tablet designs will converge around Google's operating system.
But a stroll through any airport, mall, consumer home or office would give a person cause to dispute Huang’s claim: iPads and iPhones are everywhere. Apple recently announced that it has sold over two million iPads since the device first shipped in April. During a business trip from Boston to New Orleans last week I found that Apple iPads were as much in evidence as hot dogs at a ballpark.
Ironically, Microsoft, a longtime rival of both Apple and Google, is not mentioned nearly so often in the smart phone and tablet arenas. That's because Microsoft's Windows OS is still searching for a tablet to call its own. Longtime Microsoft partner HP abruptly switched course: after Microsoft CEO Steve Ballmer got on stage and demonstrated Windows 7 running on HP's slate, HP bought Palm and earlier this week acquired the assets of Phoenix Technologies, which makes an operating system for tablets. That leaves Microsoft to promote its business-centric Windows Phone 7, which will run Xbox LIVE games, Zune music and the company's Bing search engine. All is not lost for Microsoft: longtime "frenemy" Apple's CEO Steve Jobs said recently that the new iPhone 4 will run Microsoft's Bing, fueling speculation that Apple will drop support for Google's search engine. Both Google and Apple are still competing with Microsoft in other markets – operating systems, games and application software, to name a few – but that's another story.
There are other competitors in the smart phone and tablet markets, but you'd hardly know it from the headlines. Research In Motion's (RIM) BlackBerry is still a market leader. But Apple and Google continue to dominate the coverage. I guess high technology, just like sports, revels in a classic rivalry. And this one promises to be a hard-fought struggle.


Microsoft Azure Platform, BPOS Cloud Vision Must Address Licensing

Microsoft did a very credible job at its TechEd conference in New Orleans last week, laying out the technology roadmap and strategy for a smooth transition from premises-based networks/services to its emerging Azure cloud infrastructure and software + services model.

One of the biggest challenges facing Microsoft and its customers as it stands on the cusp of what Bob Muglia, president of Microsoft’s Server & Tools Business (STB) unit characterized as a “major transformation in the industry called cloud computing,” is how the Redmond, Wash. software giant will license its cloud offerings.

Licensing programs and plans—even those that involve seemingly straightforward and mature software, PC- and server-based product offerings—are challenging and complex in the best of circumstances. This is something Microsoft knows only too well from experience. Constructing an equitable, easy-to-understand licensing model for cloud-based services could prove to be one of the most daunting tasks on Microsoft’s Azure roadmap.

It is imperative that Microsoft proactively address the cloud licensing issues now, and Microsoft executives are well aware of this. During the Q&A portion of one cloud-related TechEd session, Robert Wahbe, corporate vice president, STB Marketing was asked, “What about licensing?” He took a sip from his water bottle and replied, “That’s a big question.”

That is an understatement.

Microsoft has continually grappled with simplifying and refining its licensing strategy since it made a major misstep with Licensing 6.0 in May 2001, when the initial offering was complex, convoluted and potentially very expensive. It immediately met with a huge, vocal outcry and backlash. The company was compelled to postpone the Licensing 6.0 launch while it re-tooled the program to make it more user-friendly from both a technical and a cost perspective.

Over the last nine years, Microsoft’s licensing program and strategy has become one of the best in the high-technology industry. It offers simplified terms and conditions (T&Cs); greater discounts for even the smallest micro SMBs and a variety of add-on tools (e.g. licensing compliance and assessment utilities), as well as access to freebies, such as online and onsite technical service and training for customers who purchase the company’s Software Assurance (SA) maintenance and upgrade agreement along with their Volume Licensing deals.

Licensing from Premises to the Cloud
Microsoft’s cloud strategy is a multi-pronged approach that incorporates a wide array of offerings, including Windows Azure, SQL Azure and Microsoft Online Services (MOS). MOS consists of hosted versions of Microsoft’s most popular and widely deployed server applications, such as Exchange Server, PowerPoint and SharePoint. Microsoft’s cloud strategy also encompasses consumer products like Windows Live, Xbox Live and MSN.

Microsoft is also delivering a hybrid cloud infrastructure that will enable organizations to combine premises-based with hosted cloud solutions. This will indisputably provide Microsoft customers with flexibility and choice as they transition from a fixed-premises computing model to a hosted cloud model. In addition, it will allow them to migrate to the cloud at their own pace as their budgets and business needs dictate. However, the very flexibility, breadth and depth of offerings that make Microsoft products so appealing to customers, ironically, are the very issues that increase the complexity and challenges of creating an easily accessible, straightforward licensing model.

Dueling Microsoft Clouds: Azure vs. BPOS
Complicating matters is that Microsoft has dueling cloud offerings: the Business Productivity Online Suite (BPOS) and the Windows Azure Platform. As a result, Microsoft must also develop, delineate and differentiate its strategy, pricing and provisions for Azure and BPOS. It's unclear (at least to this analyst) when and how a customer will choose one, or mix and match, BPOS and Azure offerings. Both are currently works in progress.

BPOS is a licensing suite and a set of collaborative end-user services that run on Windows Server, Exchange Server and SQL Server. Microsoft offers the BPOS Standard Suite, which incorporates Exchange Online, SharePoint Online, Office Live Meeting and Office Communications Online (OCS). The availability of the latter two offerings is a key differentiator that distinguishes Microsoft's BPOS from rival offerings from Google. Microsoft also sells the BPOS Business Productivity Online Deskless Worker Suite. It consists of Exchange Online Deskless Worker, SharePoint Online Deskless Worker and Outlook Web Access Light. This BPOS package is targeted at SMBs, small branch offices or companies that want basic, entry-level messaging and document collaboration functions.

By contrast, Azure is a cloud platform offering that contains all the elements of a traditional application stack from the operating system up to the applications and the development framework. It includes the Windows Azure Platform AppFabric (formerly .NET Services for Azure), as well as the SQL Azure Database service.

While BPOS is aimed squarely at end users and IT managers, Azure targets third-party ISVs and internal corporate developers. Customers that build applications for Azure will host them in the cloud. However, it is not a multi-tenant architecture meant to host your entire infrastructure. With Azure, businesses will rent resources that reside in Microsoft datacenters. The costs are based on a per-usage model. This gives customers the flexibility to rent fewer or more resources, depending on their business needs.

Cloud Licensing Questions
Any cloud licensing or hybrid cloud licensing program that Microsoft develops must include all of the elements of its current fixed premises and virtualization models. This includes:

1. Volume Licensing: As the technology advances from fixed-premises software and hardware offerings to private and public clouds, Microsoft must find ways to translate the elements of its current Open, Select and Enterprise agreements to address the broad spectrum of users, from small and midsized businesses (SMBs) to the largest enterprises, with the associated discounts for volume purchases.
2. Term Length: The majority of volume license agreements are based on a three-year product lifecycle. During the protracted economic downturn, however, many companies could not afford to upgrade. A hosted cloud model, though, will be based on usage and consumption, so the terms should and most likely will vary.
3. Software Assurance: Organizations will still need upgrade and maintenance plans regardless of where their data resides and whether or not they have traditional subscription licensing or the newer consumption/usage model.
4. Service and Support: Provisions for after-market technical services, support and maintenance will be crucial for Microsoft, its users, resellers and OEM channel partners. ITIC survey data indicates that the breadth and depth of after-market technical service and support is among the top four items that make or break a purchasing deal.
5. Defined areas of responsibility and indemnification: This will require careful planning on Microsoft’s part. Existing premises-based licensing models differ according to whether or not the customer purchases their products directly from Microsoft, a reseller or an OEM hardware manufacturer. Organizations that adopt a hybrid premises/cloud offering and those that opt for an entirely hosted cloud offering will be looking more than ever before to Microsoft for guidance. Microsoft must be explicit as to what it will cover and what will be covered by OEM partners and/or host providers.

Complicating the cloud licensing models even further is the nature of the cloud itself. There is no singular cloud model. There may be multiple clouds, and they may be a mixture of public and private clouds that also link to fixed premises and mobile networks.

Among the cloud licensing questions that Microsoft must address and specifically answer in the coming months are:

• What specific pricing models and tiers will apply to SMBs, midsized companies and enterprises for hybrid and full cloud infrastructures?
• What specific guarantees if any, will it provide for securing sensitive data?
• What level of guaranteed response time will it provide for service and support?
• What is the minimum acceptable latency/response time for its cloud services?
• Will it provide multiple access points to and from the cloud infrastructure?
• What specific provisions will apply to Service Level Agreements (SLAs)?
• How will financial remuneration for SLA violations be determined?
• What are the capacity ceilings for the service infrastructure?
• What provisions will there be for service failures and disruptions?
• How are upgrade and maintenance provisions defined?

From the keynote speeches and throughout the STB Summit and TechEd conference, Microsoft’s Muglia and Wahbe both emphasized and promoted the idea that there is no singular cloud. Instead, Microsoft’s vision is a world of multiple private, public and hybrid clouds that are built to individual organizations’ specific needs.

That’s all well and good. But in order for this strategy to succeed, Microsoft will have to take the lead on both the technology and the licensing fronts. The BPOS and Azure product managers and marketers should actively engage with the Worldwide Licensing Program (WWLP) managers and construct a simplified, straightforward licensing model. We recognize that this is much easier said than done. But customers need and will demand transparency in licensing pricing, models and T&Cs before committing to the Microsoft cloud.


Virtualization Deployments Soar, But Companies Prefer Terra Firma to Cloud for now

The ongoing buzz surrounding cloud computing – particularly public clouds – is far outpacing actual deployments by mainstream users. To date, only 14% of companies have deployed or plan to deploy a private cloud infrastructure within the next two calendar quarters.
Instead, as businesses slowly recover from the ongoing economic downturn, their most immediate priorities are to upgrade legacy desktop and server hardware and outmoded applications and to expand their virtualization deployments. Those are the results of the latest ITIC 2010 Virtualization and High Availability survey, which polled C-level executives and IT managers at 400 organizations worldwide.
ITIC partnered with Stratus Technologies and Sunbelt Software to conduct the Web-based survey of multiple choice questions and essay comments. ITIC also conducted first-person interviews with over two dozen end users to obtain anecdotal responses on the primary accelerators of, and impediments to, virtualization, high availability and reliability, and cloud computing. The survey also queried customers on whether their current network infrastructure and mission critical applications were adequate to handle new technologies and the increasing demands of the business.
The survey showed that, for now at least, although many midsized and large enterprises are contemplating a move to the cloud – especially a private cloud infrastructure – the technology and business model are still not essential for most businesses. Some 48% of survey participants said they have no plans to migrate to a private cloud architecture within the next 12 months, while another 33% said their companies are studying the issue but have no firm plans to deploy.

The study also indicates that private cloud deployments are outpacing public cloud infrastructure deployments by a 2-to-1 margin. However, before businesses can begin to consider a private cloud deployment, they must first upgrade the "building block" components of their existing environments – e.g., server and desktop hardware, WAN infrastructure, storage, security and applications. Only 11% of businesses described their server and desktop hardware as leading edge or state-of-the-art. And just 8% of respondents characterized their desktop and application environment as leading edge.

The largest proportion of the survey participants – 52% – described their desktop and server hardware as working well, while 48% said their applications were up-to-date. However, 34% acknowledged that some of their server hardware needed to be updated. A higher percentage of users – 41% – admitted that their mission critical software applications were due to be refreshed. And a small 3% minority said that a significant portion of both their hardware and mission critical applications were outmoded and adversely impacting the performance and reliability of their networks.

Based on the survey data and customer interviews, ITIC anticipates that from now until October, companies’ primary focus will be on infrastructure improvements.

Reliability and Uptime Lag

The biggest surprise in this survey, compared with the 2009 High Availability and Fault Tolerant survey that ITIC and Stratus conducted nearly one year ago, was the decline in the number of survey participants who said their organizations required 99.99% uptime and reliability. In this latest survey, the largest portion of respondents – 38%, or nearly 4 out of 10 businesses – said that 99.9% uptime, the equivalent of 8.76 hours of downtime per server, per annum, was the minimum acceptable amount for their mission critical line of business (LOB) applications. This is more than three times the 12% of respondents who said that 99.9% uptime was acceptable in the prior 2009 survey. Overall, 62% – nearly two-thirds of survey participants – indicated their organizations are willing to live with higher levels of downtime than were considered acceptable in previous years.
Some 39% of survey respondents – almost 4 out of 10 – indicated that their organizations demand high availability, which ITIC defines as four nines of uptime or greater. Specifically, 27% said their organizations require 99.99% uptime; another 6% need 99.999% uptime; and a 3% minority require the highest, 99.9999%, level of availability.
The customer interviews found that the ongoing economic downturn, aged/aging network infrastructures (server and desktop hardware and older applications), layoffs, hiring freezes and the new standard operating procedure (SOP) of "do more with less" have made 99.9% uptime more palatable than in previous years.
Those firms that do not keep track of the number and severity of their outages have no way of gauging the financial and data losses to the business. Even a cursory comparison indicates substantial cost disparities between 99% and 99.9% uptime. The monetary costs, business impact and risks associated with downtime will vary by company, as will the duration and severity of individual outage incidents. However, a small or midsize business that estimates the hourly cost of downtime to be a very conservative $10,000 per hour would potentially incur losses of $876,000 per year at a data center with 99% application availability (87.6 hours of downtime). By contrast, a company whose data center operations deliver 99.9% uptime would incur losses of $87,600, or one-tenth that of a firm with conventional 99% availability.
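The arithmetic behind those figures is easy to reproduce; here is a minimal Python sketch using the same conservative $10,000-per-hour estimate. The loss figures are illustrative, not survey data.

    # Reproduce the downtime-cost arithmetic above: annual downtime hours
    # at a given availability level, priced at a conservative hourly cost.
    HOURS_PER_YEAR = 8760
    COST_PER_HOUR = 10_000  # the conservative estimate used in the text

    for availability in (0.99, 0.999, 0.9999):
        downtime_hours = HOURS_PER_YEAR * (1 - availability)
        annual_loss = downtime_hours * COST_PER_HOUR
        print(f"{availability:.2%} uptime: {downtime_hours:,.2f} hours down, "
              f"~${annual_loss:,.0f}/year")

    # 99% -> 87.6 hours -> $876,000; 99.9% -> 8.76 hours -> $87,600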
Ironically, the need for rock-solid network reliability has never been greater. The rise of Web-based applications and new technologies like virtualization and Service Oriented Architecture (SOA), as well as the emergence of public or shared cloud computing models, are designed to maximize productivity. But without the proper safeguards, these new datacenter paradigms may raise the risk of downtime. The Association for Computer Operations Management (AFCOM) Data Center Institute forecasts that one in four data centers will experience a serious business disruption over the next five years.
At the same time, customer interviews revealed that over half of all businesses – 56% – lack the budget for high availability technology. Another ongoing challenge is that 78% of survey participants acknowledged that their companies either lack the skills or simply do not attempt to quantify the monetary and business costs associated with hourly downtime. The reasons for this are well documented. Some organizations don't routinely do this, and those that attempt to calculate costs and damages run into difficulties collecting data because the data resides with many individuals across the enterprise. Inter-departmental communication, cooperation and collaboration are sorely lacking at many firms. Only 22% of survey respondents were able to assign a specific cost to one hour of downtime, and most of them gave conservative estimates of $1,000 to $25,000 for a one-hour network outage. Only 13% of those 22% indicated that their hourly losses would top $175,000 or more.

Users Confident and Committed to Virtualization Technology
The news was more upbeat with respect to virtualization – especially server virtualization deployments. Organizations are both confident and comfortable with virtualization technology.
72% of respondents indicated the number of desktop and server-based applications demanding high availability has increased over the past two years. The survey also found that a 77% majority of participants run business critical applications on virtual machines. Not surprisingly, the survey data showed that virtualization usage will continue to expand over the next 12 months. A 79% majority, approximately eight out of 10 respondents, said the number of business critical applications running on virtual machines and virtual desktops will increase significantly over the next year. Server virtualization is very much a mainstream and accepted technology, and the responses to this question indicate increased adoption as well as confidence. Nearly one-quarter of the respondents (24%) say that more than 75% of their production servers are VMs. Overall, 44% of respondents say that over 50% of their servers are VMs. However, none of the survey participants indicated that 100% of their servers are virtualized. Additionally, only 6% of survey respondents…


Networks Without Borders Raise Security, Management Issues

“Networks without Borders” are rapidly becoming the rule rather than the exception.
The demand for all-access, all-the-time connectivity, along with the rapid rise in remote, telecommuting, part-time and transient workers, has rendered network borders obsolete and made networks extremely porous. Today’s 21st century networks more closely resemble sieves than citadels.
Gone are the days when employees and data resided safely behind the secure confines of the firewall; when workers clocked in promptly at 9:00 a.m., sat stationary in front of their computers, never accessed the Internet, logged off at 6:00 p.m. and stayed offline until the next workday.
Today’s workers are extremely mobile, always connected and demand 24×7 access to the corporate network, applications and data via a variety of device types, from desktops to smart phones, irrespective of location. ITIC survey data indicates that workers at 67% of all businesses worldwide travel, telecommute and log in remotely at least several days a month. At present, one out of eight employees uses personal computers, notebooks and smart phones to access corporate data.
From an internal perspective, the ongoing economic downturn has resulted in layoffs, hiring freezes, budget cuts and less money and time available for IT training and certification. At the same time, the corporate enterprise network and applications have become more complex. IT departments face increasing pressure to provide more services with fewer resources. Another recent ITIC survey of 400 businesses found that almost 50% of all businesses have had budget cuts and 42% have had hiring freezes. An overwhelming 84% majority of IT departments just pick up the slack and work longer hours!
External pressures abound as well. Many businesses have business partners, suppliers and customers who similarly require access. Additionally, many organizations employ outside consultants and temporary and transient workers who need access to the corporate network from beyond the secure confines of the firewall.
This type of on-demand, dynamic access is distinctly at odds with traditional security models. The conventional security model takes a moat-and-drawbridge approach: contain and lock down data behind the safety of the firewall. IT managers have been trained to limit access, rights and privileges, particularly with respect to transient workers, outside consultants and remote and telecommuting workers. And who can blame them? The more network access that is allowed, the greater the risk of litigation, non-compliance and compromising the integrity of the corporate network and data.
Providing secure, ubiquitous access to an array of mobile and home-based employees, business partners, suppliers, customers and consultants who need permanent or temporary access to the network is a tedious and time consuming process. It necessitates constant vigilance on the part of the IT department to monitor and provision the correct access rights and privileges.
The conundrum for IT departments is to easily, quickly and cost effectively provision user account access while preserving security and maintaining licensing compliance. The emerging Virtual Desktop Infrastructure (VDI) technology, where users control a desktop running on a server remotely, can address some of these issues, but VDI doesn’t solve all the problems.
An intriguing alternative to VDI is a nascent software application from MokaFive, designed specifically to plug the holes in the so-called “Porous Enterprise.” MokaFive, based in Redwood City, California, was founded in 2005 by a group of Stanford University engineers specifically to enable IT departments to swiftly provision network access without the cost and complexity of VDI solutions. MokaFive is not the only vendor exploring this market; its competitors include VMware (via the Thinstall acquisition), Microsoft (via the Kidaro acquisition), LANDesk and Provision Networks. However, the MokaFive offering is, to date, the only “pure play” offering that enables organizations to provision a secure desktop environment on the fly to individual users rather than just an entire group.
The MokaFive Suite is actually a set of Desktop-as-a-Service facilities that are operating system, hardware and application agnostic. MokaFive’s desktop management features enable IT administrators to centrally create, deliver, secure and update a fully contained virtual environment, called a LivePC, for thousands of users. Contract workers can log on via Guest Access; there is no need for the IT department to specially provision them. The MokaFive Suite facilitates ubiquitous access to email, data and applications irrespective of location, device type (e.g., Windows and Macintosh) or the availability of a hard-wired network connection.
I discussed the product with several IT executives and administrators who immediately and enthusiastically grasped the concept.
“This is a very cool idea,” says Andrew Baker, a 20-year veteran VP of IT and security who has held those positions at a variety of firms including Bear Stearns, Warner Media Group and The Princeton Review. “The most tedious aspect of configuring a worker’s experience is the desktop,” he says. Typically the IT manager must physically configure the machine, set up the access rights, privileges and security policies, and deploy the correct applications. This is especially problematic and time consuming given the increasing number of mobile workers and transient workforces. The other issue is the constant need to re-provision the desktop configuration to keep it up to date, Baker says. The MokaFive Suite, he says, “saves precious time and it solves the issue of the disappearing network perimeter. I love the idea of being able to be secure, platform agnostic and being able to support multiple classes of workers from a central location.”
MokaFive’s LivePC images run locally, so end-users simply download their secure virtual desktop via a Web link, and run it on any computer (Macintosh or Windows). IT administrators apply updates and patches to a single golden image and MokaFive distributes the differentials to each LivePC. The entire process is completed in minutes by a single IT administrator. Once the MokaFive LivePC link is up and published, users are up and running regardless of whether it’s one person or 100 people. The traditional method of physically provisioning an asset can involve several IT managers and take anywhere from two days to a couple of weeks. It involves procurement, imaging, testing, certification and delivery of the device to remote workers. Baker estimates that MokaFive could cut administration and manpower time by 30% to 60% depending on the scope of the company’s network.
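MokaFive has not published the mechanics of its differential distribution, so the following is only a rough, hypothetical Python sketch of the general golden-image idea described above: compare the old and new images block by block and ship just the blocks that changed. The function names and the fixed block size are illustrative assumptions, not MokaFive’s actual design.

```python
import hashlib

BLOCK_SIZE = 4096  # compare images in fixed-size blocks (an illustrative choice)

def read_blocks(path):
    """Yield (index, block) pairs for a disk-image file."""
    with open(path, "rb") as f:
        index = 0
        while True:
            block = f.read(BLOCK_SIZE)
            if not block:
                break
            yield index, block
            index += 1

def compute_delta(old_image, new_image):
    """Return {block_index: new_bytes} for blocks that differ between the
    old golden image and the updated one (growth handled, shrinkage not)."""
    old_hashes = {i: hashlib.sha256(b).digest() for i, b in read_blocks(old_image)}
    delta = {}
    for i, block in read_blocks(new_image):
        if old_hashes.get(i) != hashlib.sha256(block).digest():
            delta[i] = block  # only changed or newly added blocks are shipped
    return delta

def apply_delta(image_path, delta):
    """Patch a local LivePC-style image in place with the changed blocks."""
    with open(image_path, "r+b") as f:
        for index, block in sorted(delta.items()):
            f.seek(index * BLOCK_SIZE)
            f.write(block)
```

The appeal of this pattern is that the administrator touches only the golden image; every endpoint pulls a delta that is typically a small fraction of the full image size.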
MokaFive also requires less of a monetary investment than rival VDI solutions and doesn’t require IT administrators to learn a new skill set, claims MokaFive VP of marketing, Purnima Padmanabhan.
“VDI does enable companies to ramp up and quickly provision and de-provision virtual machines (VMs); however, the IT department is still required to build out fixed server capacity for its transient workforce,” Padmanabhan says. Oftentimes, the additional capacity ends up going to waste. “The whole point of contractors is to dial in, dial up and dial down expenses, and that’s what MokaFive does,” she adds.
Steve Sommer, president of SLS Consulting in Westchester, New York, agrees. Sommer spent 25 years simultaneously holding the positions of CIO and CTO at Hughes, Hubbard & Reed, a New York City law firm with 1,200 end users, including 300 attorneys, in a dozen remote locations. Sommer observes that corporate politics frequently determine access policy at the expense of security. “A company’s knowledge workers – lawyers, doctors, software developers – who drive large portions of revenue will demand all-access, all the time and security be damned. In the past it was an either/or proposition,” Sommer says.
With the MokaFive desktop-as-a-service approach, all the data is encapsulated, encrypted and controlled. Organizations now have the option to manage the permanent workforce, as well as temporary contractors and consultants who use their own personal devices, quickly and easily. IT managers can provision a virtual machine (VM) on top of MokaFive or give the remote user or contract worker an HTML link that contains the MokaFive LivePC. The end user clicks on the link to get a completely encapsulated VM environment, which is controlled through policies using MokaFive. It can be completely encrypted with 256-bit AES encryption. The entire environment is managed, contained and kept updated with the latest passwords, connections, application versions and patches. When the user or contract worker leaves the company, the IT department issues a root kill signal and all the licenses are retrieved and called back, ensuring compliance.
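The 256-bit AES encryption mentioned here is a standard primitive rather than anything MokaFive-specific. As a generic illustration only (this is not MokaFive’s implementation, and the file handling is an assumption), here is how a desktop image or data file can be encrypted and decrypted with AES-256-GCM in Python using the widely available cryptography package.

```python
# Requires the third-party "cryptography" package (pip install cryptography).
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def encrypt_file(path: str, key: bytes) -> None:
    """Encrypt a file with AES-256-GCM; the key must be 32 bytes (256 bits)."""
    nonce = os.urandom(12)  # 96-bit nonce, the standard size for GCM
    with open(path, "rb") as f:
        plaintext = f.read()
    ciphertext = AESGCM(key).encrypt(nonce, plaintext, None)
    with open(path + ".enc", "wb") as f:
        f.write(nonce + ciphertext)  # store the nonce alongside the ciphertext

def decrypt_file(enc_path: str, key: bytes) -> bytes:
    """Decrypt a file produced by encrypt_file, verifying its integrity."""
    with open(enc_path, "rb") as f:
        data = f.read()
    nonce, ciphertext = data[:12], data[12:]
    return AESGCM(key).decrypt(nonce, ciphertext, None)

# A 256-bit key; in practice it would come from a key manager, not ad hoc generation.
key = AESGCM.generate_key(bit_length=256)
```

GCM mode also authenticates the data, so a tampered image fails to decrypt rather than silently loading, which matches the compliance emphasis described above.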
“MokaFive is a boon for IT departments and end users alike; no more worrying about provisioning and versioning. I love the fact that it’s application, hardware and operating system agnostic,” Sommer says. “And it also has distinct time saving benefits for the end user, or transient workforce. They can take their work with them wherever they are and they don’t have to worry about borrowing a notebook or PDA and ensuring that it’s properly configured with the correct version.”
MokaFive already has several dozen customers and prospects and is gaining traction in a number of vertical markets including financial services, legal, healthcare, government and education. Given the burgeoning popularity and mainstream adoption of VDI, the MokaFive Suite represents a viable alternative for organizations that want a cost effective and non-disruptive solution that lets IT departments deliver fast, efficient and secure network access. It’s definitely worth exploring, and MokaFive offers free trials for interested parties from its website.


VDI Vendor Wars Intensify

There’s no hotter market in high tech this year than Virtual Desktop Infrastructure (VDI) and you don’t need sales and unit shipment statistics to prove it. No, the best measurement of VDI’s hotness is the sudden flurry of vendor announcements accompanied by a concomitant rise in vitriol.
The main players in the VDI market actually line up as two pairs: Citrix and Microsoft against VMware and EMC for Round 2 in the ongoing virtualization wars. On March 18, Citrix and Microsoft came out swinging, landing the first potent, preemptive punches right where they hope it will hurt VMware the most: in its pocketbook.
Citrix and Microsoft unveiled a series of VDI initiatives that include aggressive promotional pricing deals and simplified licensing models. To demonstrate just how committed they are to their alliance, and to taking on and taking down VMware and EMC, the two firms even went so far as to combine their respective VDI graphics technologies.
At stake is the leadership position in the nascent but rapidly expanding global VDI market. The results of the ITIC 2010 Global Virtualization Deployment and Trends Survey, which polled more than 800 businesses worldwide in the December/January timeframe, indicate that 31% of respondents plan to implement VDI in 2010; that’s more than double the 13% that said they would undertake a VDI deployment in 2009. Application virtualization is also on the rise. The same ITIC survey found that 37% of participants plan application virtualization upgrades this year, up from 15% who responded affirmatively to the same question in the 2009 survey.
The current installed base of VDI deployments is still relatively small; hence the statistics that show the number of deployments doubling year over year must be considered in that context. Nonetheless, double digit deployment figures are evidence of strengthening demand and a market that is robustly transitioning from niche to mainstream. The spate of announcements from Microsoft and Citrix was clearly intended to capitalize on the growth spurt in VDI. At the same time, the companies threw down the gauntlet with initiatives aimed at solidifying and expanding their base of current VDI customers while serving the dual purpose of luring VMware customers away from that company’s VDI platform. They include:
• “VDI Kick Start.” This wide-ranging sales promotion, which runs from March 18 through December 31, 2010, seeks to jump-start VDI deployments by lowering the entry-level pricing for customers purchasing Microsoft and Citrix technologies. As part of this deal, existing Microsoft client access license (CAL) customers will pay $28 per desktop, for up to 250 users, to purchase the Microsoft Virtual Desktop Infrastructure Suite, Standard edition, and Citrix’s XenDesktop VDI Edition for one year. That’s roughly a 50% discount off the list prices that corporations have paid until now for their annual CALs. This is crucial for cost-conscious businesses: client access licenses typically represent the lion’s share of their licensing deals, since desktops outnumber servers in midsized and large enterprises. The two companies are also merging Microsoft’s 3-D graphics technology for virtual desktops, called RemoteFX, with Citrix’s high-definition HDX technology. (A back-of-the-envelope cost sketch follows this list.)

• The Microsoft Virtual Desktop Access (VDA) License Plan. Organizations that use thin client devices not covered under Microsoft’s Software Assurance (SA) maintenance plan can now purchase VDA licenses at a retail price of $100 per device per annum. This targets end users who travel or telecommute and need to use personal devices or public networks to access their corporate data. Microsoft also made another move toward simplifying its virtualization licensing plan: starting July 1, Microsoft SA customers will no longer be required to purchase a separate license to access Windows via VDI.
• “Rescue for VMware VDI.” The name says it all: this promotion is a direct attack on VMware. Like the VDI Kick Start program, it runs from March 18 through December 31, 2010. Under the terms of this deal, any Microsoft Software Assurance licensing/maintenance customer can replace their existing VMware View licenses for free. VMware View users who opt out of that platform in favor of the Citrix and Microsoft offerings will receive up to 500 XenDesktop VDI Edition device licenses and up to 500 Microsoft VDI Standard Suite device licenses free for an entire year once they trade in their VMware View licenses.
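As promised above, here is a back-of-the-envelope Python sketch of the promotional pricing. The $56 list price is an assumption inferred from the “roughly 50% discount” figure, not a published number; the promo and VDA prices come from the announcements described above.

```python
PROMO_PRICE = 28   # $ per desktop per year under the VDI Kick Start promotion
LIST_PRICE = 56    # assumed list price, inferred from the "roughly 50% discount"
VDA_PRICE = 100    # $ per device per year for a Virtual Desktop Access license

def annual_license_cost(devices: int, price_per_device: float) -> float:
    """Total yearly licensing cost for a fleet of devices."""
    return devices * price_per_device

devices = 250  # the Kick Start pricing applies to deployments of up to 250 users
print(f"Kick Start promo: ${annual_license_cost(devices, PROMO_PRICE):,.0f} per year")
print(f"Approximate list: ${annual_license_cost(devices, LIST_PRICE):,.0f} per year")
print(f"VDA licensing:    ${annual_license_cost(devices, VDA_PRICE):,.0f} per year")
```

At the 250-user cap, the promotion works out to $7,000 a year versus roughly $14,000 at the assumed list price, which illustrates why the discount matters to cost-conscious shops.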
Dai Vu, Microsoft’s director of virtualization marketing, said the announcements were all about delivering more value to desktop customers and simplifying and extending organizations’ licensing rights.
The Citrix/Microsoft announcements also cement the close working partnership, and the “enemy of my enemy is my friend” relationship, the firms have enjoyed for many years. By bundling their respective VDI offerings together, the two companies should also ensure integration and interoperability, which are crucial components for each and every layer in a virtualized data center environment.
VMware and EMC: Not Standing Still
VMware and EMC executives have yet to publicly respond to the Microsoft/Citrix initiatives. However, it’s almost certain that VMware will have to offer its current and prospective VDI accounts incentives to counter the Microsoft/Citrix alliance. Cash strapped corporations and IT departments are all on the lookout for top notch products at bargain basement prices. And it doesn’t get much better for customers than the free Rescue for VMware VDI program.
VMware built up a commanding lead in the server virtualization arena over the last five years by virtue of being first to market and delivering leading edge features and performance in its signature ESX Server product. VMware’s competitors have spent the last several years playing catch-up in server virtualization. This allowed VMware to charge a premium for its premier offerings: depending on the size and scope of the individual organization’s server virtualization deployment, customers paid on average 35% to as much as 75% more for VMware server-based offerings. There were surprisingly few complaints.
The emerging VDI and application virtualization markets are a different story. Only about 5% to 8% of organizations worldwide have fully virtualized their desktop infrastructure. So it’s too soon to declare a clear market winner. It’s safe to say that Citrix, Microsoft and VMware are all market leaders in this segment. This time around though, Microsoft and Citrix are determined not to let VMware and EMC run away with the race by building an insurmountable lead.
Meanwhile, VMware and EMC have not been idle. Former Microsoft executive Paul Maritz succeeded VMware founder Diane Greene as the company’s president and chief executive officer following her 2008 departure. Since then he has made tangible moves to bolster VMware’s position in the VDI and application virtualization arenas. Maritz and EMC CEO Joe Tucci make a formidable combination, as do EMC and VMware. EMC purchased VMware in 2004 for $635 million and owns an 86% majority stake in the server virtualization market leader. In the past several years, VMware’s fortunes and revenues have risen faster than EMC’s. VMware’s year-over-year (YoY) quarterly revenue growth stands at 18.20%, compared with EMC’s modest 2.10%. Another key indicator is net earnings, and in this regard VMware experienced YoY quarterly earnings growth of -49.40%. By contrast, its parent EMC recorded a very robust 44.70% jump in YoY quarterly earnings. It is also worth noting that VMware’s annual revenues of $2.02 billion represent only about 14% of EMC’s annual sales of $14.03 billion. And to date, EMC’s solutions have been related only tangentially to VMware’s VDI products. For practical purposes, this may continue to be the case. From a PR standpoint, though, EMC and VMware are presenting themselves as a sort of virtualization “dynamic duo.”
At an EMC analyst event at the company’s Hopkinton, MA headquarters on March 11, Pat Gelsinger, president of EMC’s Information Infrastructure Products group, described the combination of EMC and VMware, specifically with respect to storage virtualization, virtualization management and private cloud infrastructures, as the “Wild West” of the virtualization market, saying “we want to be disruptive and change the way people fundamentally think of IT.” Though Gelsinger mainly confined his comments to EMC’s core bailiwick in the storage arena, it is clear that EMC and VMware are proactively presenting a united front.
In February, the two firms moved to reposition some of their assets: EMC and VMware inked an all-cash, $200 million deal for VMware to acquire certain software products and expertise from EMC’s Ionix IT management business. EMC retains the Ionix brand and gets full reseller rights to continue to offer customers the products acquired by VMware. Maritz said VMware’s acquisition of the Ionix products and expertise promises to further establish VMware vCenter as the next generation management platform for private cloud infrastructures.
The agreement also calls for VMware to take control of all the technology and intellectual property of FastScale, which EMC acquired in 2009. The FastScale Composer Suite incorporates integrated software management tools to enable organizations to maintain peak performance in a virtualized environment.
Also, recently, VMware introduced ThinApp 4.5, a new version of its application virtualization package designed to simplify enterprises’ migration to Windows 7.
End Users are the Biggest Winners
What makes the latest competition for VDI market dominance noteworthy are the extreme actions the combatants are willing to take in order to retain and gain customers at their rivals’ expense. With last week’s joint announcements and deepening partnership, Citrix and Microsoft have signaled their intention to lead, but it’s still too early to call the race.
The joint Microsoft/Citrix initiatives to cut costs and simplify virtualization licensing plans remove two of the more significant barriers to VDI adoption. The largest looming challenge remains the willingness of corporations to embrace a new technology model as their organizations and IT departments continue to grapple with the lingering effects of the ongoing economic crunch. In this regard, all of the virtualization vendors in concert with OEM hardware vendors like Dell, Hewlett-Packard, IBM, Stratus Technologies and Wyse who partner with them must convince customers that transitioning to VDI will provide tangible Total Cost of Ownership (TCO) and Return on Investment (ROI) benefits. This entails providing organizations with the necessary guidance – including tools, training, documentation, Best Practices and solid technical service and support – to ensure that a conversion to VDI can be accomplished with minimal disruption. Admittedly, this is a tall order.
Hardware vendors like Dell, HP and IBM all have a stake in the future success of the VDI market. Organizations that migrate to VDI will seek to upgrade to newer, more powerful desktops (PCs, notebooks) and servers, which in turn potentially boosts the hardware vendors’ individual and collective bottom lines. Additionally, both HP and IBM boast huge service and support organizations, which also stand to benefit from an uptick in VDI adoptions. So the hardware vendors have every reason to partner with Citrix, Microsoft and VMware to promote and expand the VDI market segment. Regardless of which vendor(s) prevail, the biggest winners will be the customers. When several big name vendors vie for the hearts, minds and wallets of customers, it usually means that feature-rich, reliable products get to market sooner at more competitive prices. Let’s hope the VDI race is a long one.


Database Competition Heats Up

The database market will see lots of activity during the 2010-2011 timeframe as nearly 60% of organizations move to upgrade or expand existing and legacy networks.
That statistic comes from a new ITIC survey of 450 organizations worldwide. Not surprisingly, the survey shows that longtime market leaders Oracle, IBM, Microsoft and Sybase will continue to dominate the DBMS market and solidify their positions.
Databases are among the most mature and crucial applications in the entire network infrastructure. Database information is the lifeblood of the business. Databases directly influence and impact every aspect of the organization’s daily operations including: relationships with customers, business partners, suppliers and the organization’s own internal end-users. All of these users must have the ability to locate and access data quickly, efficiently and securely. The corporate database must deliver optimal performance, reliability, security, business intelligence and ease of use. It must also incorporate flexible, advanced management capabilities to enable database administrators (DBAs) to construct and oversee a database management system (DBMS) that best suits the organization from both a technology and business perspective.
What will distinguish the DBMS market this year is that the always intense and vociferous vendor rivalries will heat up even more over the next 12 months.
There are several pragmatic reasons for this. Most notable is the fact that many organizations deferred all but the most pressing network upgrade projects during the severe downturn of the past two-and-a-half years. Many businesses are now in a position where they must upgrade their legacy database infrastructure because it is obsolete and is adversely impacting, or will shortly impact, the business. Anytime a company decides on a major upgrade, there’s always a chance that it may switch providers. The DBMS vendors know this and will do their level best to lure customers to their platforms, or at the very least get a foot in the door.
Another factor that looms large in the 2010 DBMS market dynamics is Oracle’s purchase of Sun Microsystems. That acquisition finally got the green light from the European Commission last month. Speculation abounds as to the fate of MySQL, a popular and highly regarded open source DBMS. For the record, Oracle executives stated publicly within the last two weeks that the company will continue to support and develop MySQL and even provide integration with other Oracle offerings. But users are uneasy because MySQL does compete to some extent with some Oracle products. Expect rivals, particularly IBM and Microsoft, to aggressively capitalize on user confusion and fear to entice users to their respective platforms.
The DBMS Vendor Landscape
As nearly everyone knows, the four major DBMS vendors (Oracle, IBM, Microsoft and Sybase) account for 90% of the installed base, unit shipments and revenue.
Oracle’s 11g is the undisputed market leader. It offers a full slate of online transaction processing (OLTP) as well as specialized database applications. As such, it is assailed from all sides, and with relish, by rivals who take every opportunity to criticize its products and strategy. Oracle, headed by Larry Ellison, one of the most visible and outspoken high technology CEOs, happily reciprocates with its own vitriol.
IBM’s DB2 9.5 for Linux, Windows and UNIX remains firmly entrenched in high end enterprises owing to its rock solid reliability, performance, management, scalability and overall data and application integration capabilities. Users are also loyal to the DB2 platform because of IBM’s strong after-market technical service and support offerings. IBM also secures its position within very large enterprises by giving good deals and discounts on licensing renewals and training and support.
Microsoft’s SQL Server 2008 has shown tremendous improvement in scalability, security, ease of use, programmability and application development functionality and is gaining ground particularly among SMB and SME organizations. Microsoft hopes that the increased functionality of SQL Server 2008 will enable it to erode Oracle’s very entrenched presence among enterprises. A big plus for Microsoft is its legion of committed resellers and consultants who do an excellent job of promoting SQL Server 2008 among SMBs and SMEs.
Cost, Interoperability and Performance Top User DBMS Requirements
DBMS upgrades and new installations will be fought, won and lost according to three main factors: interoperability, cost and performance/features. The latest ITIC survey data found that nearly 90% of respondents rated interoperability with existing or planned infrastructure as the most important factor weighed when choosing a DBMS vendor; 80% chose cost as a main DBMS influencer; and 78% cited performance as their main reason for choosing a specific DBMS vendor platform.
But any DBMS vendor that hopes to dislodge or supplant a rival in an existing account will have to work hard to do so. The ITIC survey data also shows that organizations – especially large enterprises – do not readily or often forsake their legacy platforms. According to the survey data, 76% of survey respondents indicated they have not migrated or switched any of their main line of business applications from one database platform to another within the past three years.
This statistic makes a lot of sense. Precisely because DBMS platforms are among the most mature server-based applications in the entire enterprise, it’s much more work to rip out one platform and start fresh. A wholesale switch from one platform to another requires significant capital expenditure. Additionally, the business must invest a lot of time and energy in converting to a new platform, testing new applications, rewriting scripts, and re-training DBAs and getting them certified on the new environment. For CIOs, CTOs and IT departments this prospect has roughly the same appeal as a root canal without Novocain.
Nonetheless, one in five survey respondents (20%) did migrate database platforms over the past three years. The most popular reason for switching DBMS platforms, according to the survey respondents, was a move to a custom-developed in-house application or a customized application developed by a partner. Just over half (53%) of responding organizations that changed DBMS platforms were midsized enterprises with 500 to 3,000 end users, a fact that favored Microsoft SQL Server 2008 deployments. Among the 20% of ITIC survey respondents that switched vendors, fully 50% of organizations swapped out Oracle in favor of SQL Server, while 17% migrated from Sybase to SQL Server. Overall, among the 20% of respondents that switched database platforms over the past three years, two-thirds (67%) opted to migrate to SQL Server. In this regard, Microsoft SQL Server converts outpaced rival Oracle by a 2-to-1 margin; approximately 34% of the businesses that changed database platforms migrated away from DB2 or SQL Server in favor of Oracle.
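Because several of these figures are percentages of percentages, a quick Python sketch converting them to approximate head counts may help. It assumes the 450-organization survey base cited above; the rounding is illustrative.

```python
respondents = 450  # the ITIC DBMS survey base cited earlier

switchers = round(respondents * 0.20)    # 20% changed DBMS platforms: ~90 firms
to_sql_server = round(switchers * 0.67)  # 67% of switchers moved to SQL Server: ~60
from_oracle = round(switchers * 0.50)    # half of all switchers left Oracle: ~45
to_oracle = round(switchers * 0.34)      # ~34% of switchers moved to Oracle: ~31

print(f"Switched platforms: {switchers}")
print(f"Moved to SQL Server: {to_sql_server}, of which {from_oracle} came from Oracle")
print(f"Moved to Oracle: {to_oracle}")
```

In round numbers, roughly 90 of the 450 organizations switched, about 60 of them landed on SQL Server and about 31 on Oracle, which is the 2-to-1 margin noted above.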
IBM DB2 users were among the most satisfied respondents; an overwhelming 96% stayed put.
Analysis: Customer Issues and Chief Challenges
Respondents cite challenges with their database strategies, but are also sanguine about the journey. For instance, one respondent said that the main challenges were “keeping up with changes to the SQL platform and getting our database administrators and appropriate IT managers trained and re-certified on new versions of the technology and then figuring out how it all works with new virtualization and cloud computing technologies. Cost and complexity are also big factors to consider in any upgrade. Networks are getting more complex but our budgets and training are not keeping pace.”
Respondents were particularly focused on the cost issue: “cost, both new licensing and annual maintenance”, “increasing cost of licensing”, “cost is the overriding factor” were just some of the responses.
As for future plans, a 56% majority of respondents report that switching database platforms in the coming months is very unlikely, while 17% said switching is not an option and 15% said that switching is a possibility, depending on the circumstances.
Getting organizations to change DBMS platforms is difficult but not impossible. If a rival vendor can offer comparable performance and functionality, coupled with tangibly better pricing and licensing renewal options that lower Total Cost of Ownership (TCO) and speed Return on Investment (ROI), organizations may be induced to make the switch. The biggest DBMS battles are in the SMB and SME sectors and in greenfield accounts that are adding new databases.
DBMS vendors are anxious to keep their current customers and gain new ones. End users should make the vendors work to keep them as satisfied customers. Dissatisfied customers should voice their concerns, and even satisfied customers should let their vendors know what they can do to make them even happier.

