2010

IBM Powers Up New PowerLinux Products, Strategy

IBM this week unveiled its latest generation of industry standard Linux-only servers optimized for its Power architecture along with a new strategy targeting specific x86 applications and workloads.

IBM has been a longtime Linux proponent, supporting industry standard distributions like Red Hat Enterprise Linux (RHEL) and SUSE Linux Enterprise on its Power Systems line for the last 12 years. This week’s announcement reaffirms Big Blue’s commitment to Linux and broadens its scope with offerings designed to drive more growth for the Power platform in the lucrative x86 arena. IBM will fuel this growth via its mantra, “Tuned to the task,” which emphasizes delivering higher quality and superior economics than rivals.

According to Scott Handy, vice president of IBM’s PowerLinux Strategy and Business Development, “This is an extension to our overall Power strategy to address the Linux x86 space and drive more growth for our Power Systems servers.” …


SQL Server Most Secure Database; Oracle Least Secure Database Since 2002

Ask any 10 qualified people to guess which of the major database platforms is the most secure, and chances are at least half would say Oracle. They would be wrong.

The correct answer is Microsoft’s SQL Server. In fact, the Oracle database has recorded the highest number of security vulnerabilities of any major database platform over the last eight years.

This is not a subjective statement. The data comes directly from the National Institute of Standards and Technology.

Since 2002, Microsoft’s SQL Server has compiled an enviable record: it is the most secure of the major database platforms. SQL Server has recorded the fewest reported vulnerabilities of any database — just 49 from 2002 through June 2010. These statistics were compiled independently by the National Institute of Standards and Technology (NIST), the government agency that monitors security vulnerabilities by technology, vendor, and product (see Exhibit 1). So far in 2010, through June, SQL Server has a perfect record — no security bugs have been recorded by the NIST CVE database.

And SQL Server was the most secure database by a wide margin: its closest competitor, MySQL (which was owned by Sun Microsystems until its January 2010 acquisition by Oracle), recorded 98 security flaws, twice as many as SQL Server.

By contrast, during the same eight-and-a-half year period spanning 2002 through June 2010, the NIST CVE recorded 321 security vulnerabilities associated with the Oracle database platform, the highest total of any major vendor. Oracle had more than six times as many reported security flaws as SQL Server during the same time span. NIST CVE statistics recorded 121 security-related issues for the IBM DB2 platform during the past eight-and-a-half years.

Solid security is an essential element for many mainstream line-of-business (LOB) applications, and a crucial cornerstone in the foundation of every organization’s network infrastructure. Databases are the information repositories for many organizations; they contain much of the sensitive corporate data and intellectual property. If database security is compromised, the entire business is potentially at risk.

SQL Server’s unmatched security record is no fluke. It is the direct result of significant Microsoft investment in its Trustworthy Computing Initiative, which the company launched in 2002. In January of that year, Microsoft took the step of halting all new code development for several months across its product lines to scrub the code base and make its products more secure.

The strategy is working. In the 21 months since January 2009, Microsoft has issued only eight SQL Server security-related alerts. To date in 2010 (January through June), no SQL Server vulnerabilities have been recorded by Microsoft or NIST. Microsoft is the only database vendor with a spotless security record for the first six months of 2010.

ITIC conducted an independent Web-based survey on SQL Server security that polled 400 companies worldwide during May and June 2010. The results of the ITIC 2010 SQL Server Security survey support the NIST CVE findings. Among the survey highlights:
• An 83% majority rated SQL Server security “excellent” or “very good” (see Exhibit 2, below).
• None of the 400 survey respondents gave SQL Server security a “poor” or “unsatisfactory” rating.
• A 97% majority of survey participants said they experienced no inherent security issues with SQL Server.
• Anecdotal data obtained during first-person customer interviews also revealed a very high level of satisfaction with the embedded security functions and capabilities of SQL Server 7, SQL Server 2000, SQL Server 2005, SQL Server 2008, and the newest SQL Server 2008 R2 release. In fact, database administrators, CIOs and CTOs interviewed by ITIC expressed approval of Microsoft’s ongoing initiatives to improve SQL Server’s overall security and functionality over the last decade, starting with SQL Server 2000.

Strong security is a must for every organization irrespective of size or vertical industry. Databases are among the most crucial applications in the entire network infrastructure. Information in databases is the organization’s intellectual property and life blood.

Databases are essentially a company’s electronic filing system. The information contained in the database directly influences and impacts every aspect of the organization’s daily operations including relationships with customers, business partners, suppliers and its own internal end users. All of these users must have the ability to quickly, efficiently and securely locate and access data. The database platform must be secure. An insecure, porous database platform will almost certainly compromise business operations and by association, any firm that does business with it. Any lapses in database security, including deliberate internal and external hacks, inadvertent misconfiguration, or user errors can mean lost or damaged data, lost revenue, and damage to the company’s reputation, raising the potential for litigation and loss of business.

It’s also true that organizations bear at least 50 percent of the responsibility for keeping their databases and their entire network infrastructures secure. As the old proverb goes, “A chain is only as strong as its weakest link.” Even the strongest security can be undone or bypassed by user error, misconfiguration or weak computer security practices. No database or network is 100 percent hack-proof or impregnable. Organizations should consult with their vendors regarding any questions and concerns they may have about the security of any of their database platforms. They should also stay current with the latest patches and install the necessary updates. Above all, bolster the inherent security of your databases with the appropriate third-party security tools and applications, and make sure your organization strictly adheres to computer security best practices. At the end of the day, only you can defend your data.

Registered ITIC site users can Email me at: ldidio@itic-corp.com for a copy of the full report.


Cloud Computing: Pros and Cons

Cloud computing, like any emerging technology, has both advantages and disadvantages. Before beginning any infrastructure upgrade or migration, organizations are well advised to first perform a thorough inventory and review of their existing legacy infrastructure and make the necessary upgrades, revisions and modifications. Next, the organization should define its business goals for the next three-to-five years to determine when, if and what type of cloud infrastructure to adopt. It should also construct an operational and capital expenditure budget and a timeframe that includes research, planning, testing, evaluation and final rollout.
Public Clouds: Advantages and disadvantages
The biggest allure of a public cloud infrastructure over traditional premises-based network infrastructures is the ability to offload the tedious and time consuming management chores to a third party. This in turn can help businesses:
• Shave precious capital expenditure monies because they avoid the expensive investment in new equipment including hardware, software, and applications as well as the attendant configuration planning and provisioning that accompanies any new technology rollout.
• Accelerate the deployment timetable. Having an experienced third-party cloud services provider do the work speeds up deployment and most likely means less time spent on trial and error.
• Construct a flexible, scalable cloud infrastructure that is tailored to their business needs. A company that has performed its due diligence and is working with an experienced cloud provider can architect a cloud infrastructure that will scale up or down according to the organization’s business and technical needs and budget.
The potential downside of a public cloud is that the business is essentially renting common space with other customers. As such, depending on the resources of the particular cloud model, there is the potential for performance, latency and security issues, as well as questions about acceptable response times and service and support from the cloud provider.
Risk is another potential pitfall associated with outsourcing any of your firm’s resources and services to a third party. To mitigate risk and lower it to an acceptable level, it’s essential that organizations choose a reputable, experienced third party cloud services provider very carefully. Ask for customer references; check their financial viability. Don’t sign up with a service provider whose finances are tenuous and who might not be in business two or three years from now.
The cloud services provider must work closely and transparently with the corporation to build a cloud infrastructure that best suits the business’ budget, technology and business goals.
To ensure that the expectations of both parties are met, organizations should create a checklist of the items and issues that are of crucial importance to their business and incorporate them into Service Level Agreements (SLAs). Be as specific as possible. These should include, but are not limited to:

• What types of equipment do they use?
• How old is the server hardware? Is the configuration powerful enough?
• How often is the data center equipment/infrastructure upgraded?
• How much bandwidth does the provider have?
• Does the service provider use open standards or is it a proprietary datacenter?
• How many customers will you be sharing data and resources with?
• Where is the cloud services provider’s datacenter physically located?
• What specific guarantees if any, will it provide for securing sensitive data?
• What level of guaranteed response time will it provide for service and support?
• What is the minimum acceptable latency/response time for its cloud services?
• Will it provide multiple access points to and from the cloud infrastructure?
• What specific provisions will apply to Service Level Agreements (SLAs)?
• How will financial remuneration for SLA violations be determined?
• What are the capacity ceilings for the service infrastructure?
• What provisions will there be for service failures and disruptions?
• How are upgrade and maintenance provisions defined?
• What are the costs over the term of the contract agreement?
• How much will the costs rise over the term of the contract?
• Does the cloud service provider use the Secure Sockets Layer (SSL) to transmit data?
• Does the cloud services provider encrypt data at rest to restrict unauthorized access?
• How often does the cloud services provider perform audits?
• What mechanisms will it use to quickly shut down a hack and can it track a hacker?
• If your cloud services provider is located outside your country of origin, what are the privacy and security rules of that country and what impact will that have on your firm’s privacy and security issues?
Finally, the corporation should appoint a liaison, and that person should meet regularly with a representative from the cloud services provider to ensure that the company attains its immediate goals and remains aware of, and working toward, future technology and business goals. Outsourcing all or part of your infrastructure to a public cloud does not mean forgetting or abandoning it.
Private Clouds: Advantages and Disadvantages
The biggest advantage of a private cloud infrastructure is that your organization keeps control of its corporate assets and can safeguard and preserve its privacy and security. Your organization is in command of its own destiny. That can be a double-edged sword.
Before committing to build a private cloud model, the organization must do a thorough assessment of its current infrastructure, its budget, and the expertise and preparedness of its IT department. Is your firm ready to assume responsibility for such a large burden, from both a technical and an ongoing operational standpoint? Only you can answer that. Remember that the private cloud should be highly reliable and highly available – at least 99.999% uptime, with built-in redundancy and failover capabilities. Many organizations currently struggle to maintain 99.9% uptime and reliability, which is the equivalent of 8.76 hours of downtime per server, per year. When your private cloud is down for any length of time, your end users (and anyone else who has access to the cloud) will be unable to access resources.
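The relationship between an availability percentage and annual downtime hours cited above can be sketched in a few lines of Python. This is an illustrative calculation only, assuming the conventional 8,760-hour (365-day) year that the article's uptime figures use; the function name is mine:

```python
# Convert an availability percentage into expected annual downtime hours.
# Assumes a 365-day, 8,760-hour year, the convention behind the article's
# "99.9% = 8.76 hours per server, per year" figure.

HOURS_PER_YEAR = 365 * 24  # 8,760

def annual_downtime_hours(availability_pct: float) -> float:
    """Expected downtime per server, per year, at a given availability %."""
    return HOURS_PER_YEAR * (1 - availability_pct / 100)

for pct in (99.0, 99.9, 99.99, 99.999):
    print(f"{pct}% uptime -> {annual_downtime_hours(pct):.2f} hours of downtime/year")
```

Running this shows why each additional "nine" matters: 99.9% allows 8.76 hours of downtime a year, while 99.999% allows only about five minutes.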
Realistically, in order for an organization to successfully implement and maintain a private cloud, it needs the following:
• Robust equipment that can handle the workloads efficiently during peak usage times
• An experienced, trained IT staff that is familiar with all aspects of virtualization, virtualization management, grid, utility and chargeback computing models
• An adequate capital expenditure and operational expenditure budget
• The right set of private cloud product offerings and service agreements
• Appropriate third party virtualization and management tools to support the private cloud
• Specific SLA agreements with vendors, suppliers and business partners
• Operational level agreements (OLAs) to ensure that each person within the organization is responsible for specific routine tasks, as well as for defined duties in the event of an outage
• A disaster recovery and backup strategy
• Strong security products and policies
• Efficient chargeback utilities, policies and procedures
Other potential private cloud pitfalls include deciding which applications to virtualize, vendor lock-in, and integration and interoperability issues. Businesses grapple with these same issues today in their existing environments. At present, however, the product choices from vendors and third-party providers are more limited for virtualized private cloud offerings. Additionally, since the technology is still relatively new, it will be difficult, from both a financial and a technical standpoint, to switch horses in midstream from one cloud provider to another if you encounter difficulties.
There is no doubt that adoption of virtualized public and private cloud infrastructures will grow significantly in the next 12 to 18 months. In order to capitalize on their benefits, lower your total cost of ownership (TCO), accelerate return on investment (ROI) and mitigate risk, your organization should take its time and do it right.


Apple, Google Grapple for Top Spot in Mobile Web

Since January, the high technology industry has witnessed a dizzying spate of dueling, vendor product announcements.
So what else is new? It’s standard operating procedure for vendors to regularly issue hyperbolic proclamations about their latest/greatest offering, even (or especially) when the announcements are as devoid of content as cotton candy is of nutritional value. Maybe it’s just an outgrowth of the digital information age. We live and breathe instant information that circumnavigates the globe faster than you can say Magellan; the copy monster must be fed constantly. Or maybe it’s the protracted economic downturn which is making vendors hungrier than ever for consumer and corporate dollars.
Whatever the reason, there’s no doubt that high technology vendors – led by Google and Apple – are engaged in a near constant game of one-upmanship.
Apple indirectly started this trend in early January, when word began leaking out that Apple would finally announce the long-rumored iPad tablet in late January. The race was on among other tablet vendors to announce their products at the Consumer Electronics Show (CES) in Las Vegas in mid-January to beat Apple to the punch. A half-dozen vendors including ASUSTeK Computer (ASUS), Dell, Hewlett-Packard, Lenovo, Taiwanese manufacturer Micro Star International (MSI) and Toshiba all raced to showcase their forthcoming wares in advance of Apple. It made good marketing sense: all of these vendors knew that once Apple released the iPad, their chances of getting PR would be sorely diminished.
I have no problem with smaller vendors or even large vendors like Dell and HP, who rightfully reckon that they have to make their announcements in advance of a powerhouse like Apple to ensure that their products don’t get overlooked.
Apple vs. Google Battle of the Mobile Web Titans
But when the current industry giants and media darlings like Apple and Google start slugging it out online, in print and at various conferences, it’s overwhelming.
Apple and Google are just the latest in a long line of high technology rivalries. In the 1970s it was IBM vs. HP; in the 1980s, the rise of networking created several notable rivalries: IBM vs. Digital Equipment Corp. (DEC); IBM vs. Microsoft; Oracle vs. IBM; Novell vs. 3Com; Novell vs. Microsoft; Cabletron vs. Synoptics and Cisco vs. all the internetworking vendors. By the 1990s it was Microsoft vs. Netscape and Microsoft vs. pretty much everyone else.
The Apple vs. Google rivalry differs from earlier technology contests in that the relationship between the two firms began as a friendly one and, to date, there has been no malice. Until August 2009, Google CEO Eric Schmidt sat on Apple’s board of directors. And while the competition between these two industry giants is noticeably devoid of the rancor that characterized past high tech rivalries, it’s safe to say that the two are respectfully wary of each other. Apple and Google are both determined not to let the other get the upper hand, something each fears will happen if there is even the slightest pause in the endless stream of headlines.
Google and Apple started out in different markets – Google in the online search engine and advertising arena, and Apple as a manufacturer of consumer hardware devices and software applications. Their respective successes – Apple’s with its Mac hardware and Google’s with its namesake search engine – have led them to this point: a head-to-head rivalry in the battle for supremacy of the mobile Web arena.
On paper, they appear to be two equally matched gladiators. Both companies have huge amounts of cash. Apple has $23 billion in the bank and now boasts the highest valuation of any high technology company, with a current market cap of $236.3 billion, surpassing Microsoft for the top spot. Google has $26.5 billion in cash and a valuation of $158.6 billion. Both firms have two of the strongest management and engineering teams in Silicon Valley. Apple has the iconic Steve Jobs, who has revitalized the company since his return. Google is helmed by co-founders and creative geniuses Larry Page and Sergey Brin, along with CEO Eric Schmidt, who knows how to build computers and make the trains run on time.
Fueling this rivalry is Apple’s and Google’s stake in mobile devices and operating systems. In Apple’s case this means the wildly successful iPhone, iPod Touch and, most recently, the iPad and the Mac Mini. Google’s lineup consists of its Chrome OS and Android OS, which will power tablet devices like Dell’s newly announced Streak and Lenovo’s forthcoming U1 hybrid tablet/notebook, due out later this year. The rivalry between the two is quite literally getting down to the chip level. Intel, which has for so long been identified with Microsoft’s Windows-based PC platform, is now expanding its support for Android – a move company executives have described as its “port of choice” gambit. Apple is no slouch in this area, either: its Macs – from the Mac Mini to the MacBook Pro – ship with Intel inside. Last week Nvidia CEO Jen-Hsun Huang weighed in on the Apple/Google rivalry on Google’s side, predicting that tablet designs will converge around Google’s operating system.
But a stroll through any airport, mall, consumer home or office would give a person cause to dispute Huang’s claim: iPads and iPhones are everywhere. Apple recently announced that it has sold over two million iPads since the device first shipped in April. During a business trip from Boston to New Orleans last week I found that Apple iPads were as much in evidence as hot dogs at a ballpark.
Ironically, Microsoft, a longtime rival of both Apple and Google, is not mentioned nearly so often in the smart phone and tablet arenas. That’s because Microsoft’s Windows OS is still searching for a tablet to call its own. Longtime Microsoft partner HP abruptly switched course: after Microsoft CEO Steve Ballmer got on stage and demonstrated Windows 7 running on HP’s slate, HP bought Palm and earlier this week acquired the assets of Phoenix Technologies, which makes an operating system for tablets. That leaves Microsoft to promote its business-centric Windows Phone 7, which will run Xbox LIVE games, Zune music and the company’s Bing search engine. All is not lost for Microsoft: longtime “frenemy” Apple CEO Steve Jobs said recently that the new iPhone 4G will run Microsoft’s Bing, fueling speculation that Apple will drop support for Google’s search engine. Both Google and Apple still compete with Microsoft in other markets – operating systems, games and application software, to name a few – but that’s another story.
There are other competitors in the smart phone and tablet markets, but you’d hardly know it from the headlines. Research In Motion’s (RIM) BlackBerry is still a market leader. But Apple and Google continue to dominate the coverage. I guess high technology, just like sports, revels in a classic rivalry. And this one promises to be a hard-fought struggle.


Virtualization Deployments Soar, But Companies Prefer Terra Firma to Cloud for now

The ongoing buzz surrounding cloud computing – particularly public clouds – is far outpacing actual deployments by mainstream users. To date only 14% of companies have deployed or plan to deploy a private cloud infrastructure within the next two calendar quarters.
Instead, as businesses slowly recover from the ongoing economic downturn, their most immediate priorities are to upgrade legacy desktop and server hardware and outmoded applications, and to expand their virtualization deployments. Those are the results of the latest ITIC 2010 Virtualization and High Availability survey, which polled C-level executives and IT managers at 400 organizations worldwide.
ITIC partnered with Stratus Technologies and Sunbelt Software to conduct the Web-based survey, which consisted of multiple choice questions and essay comments. ITIC also conducted first-person interviews with over two dozen end users to obtain anecdotal responses on the primary accelerators or impediments to virtualization, high availability and reliability, and cloud computing. The survey also queried customers on whether or not their current network infrastructure and mission critical applications were adequate to handle new technologies and the increasing demands of the business.
The survey showed that, for now at least, although many midsized and large enterprises are contemplating a move to the cloud – especially a private cloud infrastructure – the technology and business model are still not essential for most businesses. Some 48% of survey participants said they have no plans to migrate to a private cloud architecture within the next 12 months, while another 33% said their companies are studying the issue but have no firm plans to deploy.

The study also indicates that private cloud deployments are outpacing public cloud deployments by a 2 to 1 margin. However, before businesses can begin to consider a private cloud deployment, they must first upgrade the “building block” components of their existing environments, e.g., server and desktop hardware, WAN infrastructure, storage, security and applications. Only 11% of businesses described their server and desktop hardware as leading edge or state-of-the-art. And just 8% of respondents characterized their desktop and application environment as leading edge.

The largest proportion of the survey participants – 52% – described their desktop and server hardware as working well, while 48% said their applications were up-to-date. However, 34% acknowledged that some of their server hardware needed to be updated. A higher percentage of users – 41% – admitted that their mission critical software applications were due to be refreshed. And a small 3% minority said that a significant portion of both their hardware and mission critical applications were outmoded and adversely impacting the performance and reliability of their networks.

Based on the survey data and customer interviews, ITIC anticipates that from now until October, companies’ primary focus will be on infrastructure improvements.

Reliability and Uptime Lag

The biggest surprise in this survey, compared with the 2009 High Availability and Fault Tolerant survey that ITIC and Stratus conducted nearly one year ago, was the decline in the number of survey participants who said their organizations required 99.99% uptime and reliability. In this latest survey, the largest portion of respondents – 38%, or nearly 4 out of 10 businesses – said that 99.9% uptime, the equivalent of 8.76 hours of downtime per server, per year, was the minimum acceptable amount for their mission critical line of business (LOB) applications. This is more than three times the 12% of respondents who said that 99.9% uptime was acceptable in the prior 2009 survey. Overall, 62%, or nearly two-thirds of survey participants, indicated their organizations are willing to live with higher levels of downtime than were considered acceptable in previous years.
Some 39% of survey respondents – almost 4 out of 10 – indicated that their organizations demand high availability, which ITIC defines as four nines of uptime or greater. Specifically, 27% said their organizations require 99.99% uptime; another 6% need 99.999% uptime; and a 3% minority require the highest 99.9999% level of availability.
The customer interviews found that the ongoing economic downturn, aged and aging network infrastructures (server and desktop hardware and older applications), layoffs, hiring freezes and the new standard operating procedure (SOP) of “do more with less” have made 99.9% uptime more palatable than in previous years.
Those firms that do not keep track of the number and severity of their outages have no way of gauging the financial and data losses to the business. Even a cursory comparison indicates substantial cost disparities between 99% uptime and 99.9% uptime. The monetary costs, business impact and risks associated with downtime will vary by company, as will the duration and severity of individual outage incidents. However, a small or midsize business, for example, that estimates the hourly cost of downtime to be a very conservative $10,000 would potentially incur losses of $876,000 per year at a data center with 99% application availability (roughly 87.6 hours of downtime). By contrast, a company whose data center operations have 99.9% uptime would incur losses of $87,600, or one-tenth that of a firm with conventional 99% availability.
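The cost comparison above is straightforward to reproduce. The sketch below uses the article's own conservative $10,000-per-hour figure; the helper function and its name are mine, for illustration only:

```python
# Estimate annual downtime losses from an availability level and an
# hourly downtime cost, assuming a 365-day, 8,760-hour year.

HOURS_PER_YEAR = 365 * 24  # 8,760

def annual_downtime_cost(availability_pct: float, cost_per_hour: float) -> float:
    """Yearly downtime cost at a given availability % and hourly loss rate."""
    downtime_hours = HOURS_PER_YEAR * (1 - availability_pct / 100)
    return downtime_hours * cost_per_hour

# 99% uptime:   ~87.6 hours down per year -> roughly $876,000
# 99.9% uptime: ~8.76 hours down per year -> roughly $87,600 (one-tenth the loss)
print(annual_downtime_cost(99.0, 10_000))
print(annual_downtime_cost(99.9, 10_000))
```

Substituting your own hourly loss estimate makes the same tenfold gap visible for any business size.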
Ironically, the need for rock-solid network reliability has never been greater. The rise of Web-based applications and new technologies like virtualization and Service Oriented Architecture (SOA), as well as the emergence of public or shared cloud computing models, is designed to maximize productivity. But without the proper safeguards these new datacenter paradigms may raise the risk of downtime. The Association for Computer Operations Management/Data Center Institute (AFCOM) forecasts that one in four data centers will experience a serious business disruption over the next five years.
At the same time, customer interviews revealed that over half of all businesses – 56% – lack the budget for high availability technology. Another ongoing challenge is that 78% of survey participants acknowledged that their companies either lack the skills or simply do not attempt to quantify the monetary and business costs associated with hourly downtime. The reasons for this are well documented. Some organizations don’t routinely do this, and those that attempt to calculate costs and damages run into difficulties collecting data because the data resides with many individuals across the enterprise. Inter-departmental communication, cooperation and collaboration are sorely lacking at many firms. Only 22% of survey respondents were able to assign a specific cost to one hour of downtime, and most of them gave conservative estimates of $1,000 to $25,000 for a one-hour network outage. Only 13% of those who could quantify the cost of downtime indicated that their hourly losses would reach $175,000 or more.

Users Confident and Committed to Virtualization Technology
The news was more upbeat with respect to virtualization – especially server virtualization deployments. Organizations are both confident and comfortable with virtualization technology.
72% of respondents indicated that the number of desktop and server-based applications demanding high availability has increased over the past two years. The survey also found that a 77% majority of participants run business critical applications on virtual machines. Not surprisingly, the survey data showed that virtualization usage will continue to expand over the next 12 months. A 79% majority – approximately eight out of 10 respondents – said the number of business critical applications running on virtual machines and virtual desktops will increase significantly over the next year. Server virtualization is very much a mainstream and accepted technology. The responses to this question indicate increased adoption as well as confidence. Nearly one-quarter of the respondents – 24% – say that more than 75% of their production servers are VMs. Overall, 44% of respondents say that over 50% of their servers are VMs. However, none of the survey participants indicate that 100% of their servers are virtualized. …


Networks Without Borders Raise Security, Management Issues

“Networks without Borders” are rapidly becoming the rule rather than the exception.
The demand for all-access, all the time, along with the rapid rise in remote, telecommuting, part-time and transient workers, has rendered network borders obsolete and made networks extremely porous. Today’s 21st century networks more closely resemble sieves than citadels.
Gone are the days when employees and data resided safely behind the secure confines of the firewall; when workers clocked in promptly at 9:00 a.m., sat stationary in front of their computers, never accessed the Internet, logged off at 6:00 p.m. and stayed offline until the next workday.
Today’s workers are extremely mobile, always connected and demand 24×7 access to the corporate network, applications and data via a variety of device types, from desktops to smart phones, irrespective of location. ITIC survey data indicates that workers at 67% of all businesses worldwide travel, telecommute and log in remotely at least several days a month. At present, one out of eight employees uses their personal computers, notebooks and smart phones to access corporate data.
From an internal perspective, the ongoing economic downturn has resulted in layoffs, hiring freezes, budget cuts and less money and time available for IT training and certification. At the same time, the corporate enterprise network and applications have become more complex. IT departments face increasing pressure to provide more services with fewer resources. Another recent ITIC survey of 400 businesses found that almost 50% of all businesses have had budget cuts and 42% have had hiring freezes. An overwhelming 84% majority of IT departments just pick up the slack and work longer hours!
External pressures also abound. Many businesses also have business partners, suppliers and customers who similarly require access. Additionally, many organizations employ outside consultants, temporary and transient workers who need access to the corporate network from beyond the secure confines of the firewall.
This type of on demand, dynamic access is distinctly at odds with traditional security models. The conventional approach to security takes a moat and drawbridge approach: to contain and lock down data behind the safety of the firewall. IT managers have been trained to limit access, rights and privileges particularly with respect to transient workers, outside consultants and remote and telecommuting workers. And who can blame them? The more network access that is allowed, the greater the risk of litigation, non-compliance and compromising the integrity of the corporate network and data.
Providing secure, ubiquitous access to an array of mobile and home-based employees, business partners, suppliers, customers and consultants who need permanent or temporary access to the network is a tedious and time consuming process. It necessitates constant vigilance on the part of the IT department to monitor and provision the correct access rights and privileges.
The conundrum for IT departments is to easily, quickly and cost effectively provision user account access while preserving security and maintaining licensing compliance. The emerging Virtual Desktop Infrastructure (VDI) technology, where users control a desktop running on a server remotely, can address some of these issues, but VDI doesn’t solve all the problems.
An intriguing alternative to VDI is a nascent software application from MokaFive, which is designed specifically to plug the holes in the so-called “Porous Enterprise.” MokaFive, based in Redwood City, California, was founded in 2005 by a group of Stanford University engineers specifically to enable IT departments to swiftly provision network access without the cost and complexity of VDI solutions. MokaFive is not the only vendor exploring this market; its competitors include VMware (via the Thinstall acquisition), Microsoft (via the Kidaro acquisition), LANDesk and Provision Networks. However, the MokaFive offering is, to date, the only “pure play” offering that enables organizations to provision a secure desktop environment on the fly to individual users rather than just to an entire group.
The MokaFive Suite is actually a set of Desktop-as-a-Service facilities that are operating system, hardware and application agnostic. MokaFive’s desktop management features enable IT administrators to centrally create, deliver, secure and update a fully-contained virtual environment, called a LivePC, to thousands of users. Contract workers can log on via Guest Access; there is no need for the IT department to specially provision them. The MokaFive Suite facilitates ubiquitous access to email, data and applications irrespective of location, device type (e.g., Windows and Macintosh) or the availability of a hard wired network connection.
I discussed the product with several IT executives and administrators who immediately and enthusiastically grasped the concept.
“This is a very cool idea,” says Andrew Baker, a 20-year veteran VP of IT and security who has held those positions at a variety of firms including Bear Stearns, Warner Media Group and The Princeton Review. “The most tedious aspect of configuring a worker’s experience is the desktop,” he says. Typically the IT manager must physically configure the machine, set up the access rights, privileges and security policies and deploy the correct applications. This is especially problematic and time consuming given the increasing number of mobile workers and transient workforces. The other issue is the constant need to re-provision the desktop configuration to keep it up to date, Baker says. The MokaFive Suite, he says, “saves precious time and it solves the issue of the disappearing network perimeter. I love the idea of being able to be secure, platform agnostic and being able to support multiple classes of workers from a central location.”
MokaFive’s LivePC images run locally, so end-users simply download their secure virtual desktop via a Web link, and run it on any computer (Macintosh or Windows). IT administrators apply updates and patches to a single golden image and MokaFive distributes the differentials to each LivePC. The entire process is completed in minutes by a single IT administrator. Once the MokaFive LivePC link is up and published, users are up and running regardless of whether it’s one person or 100 people. The traditional method of physically provisioning an asset can involve several IT managers and take anywhere from two days to a couple of weeks. It involves procurement, imaging, testing, certification and delivery of the device to remote workers. Baker estimates that MokaFive could cut administration and manpower time by 30% to 60% depending on the scope of the company’s network.
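MokaFive has not published the internals of how it "distributes the differentials" described above, but the general technique, comparing a client's image to the golden image block by block and shipping only the blocks that changed, can be sketched as follows. The block size, hashing scheme, and function names here are illustrative assumptions, not MokaFive's actual implementation.

```python
import hashlib

BLOCK_SIZE = 8  # illustrative; real image-update systems use much larger blocks

def block_hashes(image: bytes):
    """Hash each fixed-size block of a disk image."""
    blocks = [image[i:i + BLOCK_SIZE] for i in range(0, len(image), BLOCK_SIZE)]
    return [hashlib.sha256(b).hexdigest() for b in blocks]

def compute_delta(old: bytes, new: bytes):
    """Return {block_index: new_bytes} for blocks that differ."""
    old_h = block_hashes(old)
    delta = {}
    for i, h in enumerate(block_hashes(new)):
        if i >= len(old_h) or old_h[i] != h:
            delta[i] = new[i * BLOCK_SIZE:(i + 1) * BLOCK_SIZE]
    return delta

def apply_delta(old: bytes, delta: dict, new_len: int) -> bytes:
    """Rebuild the updated image from the old copy plus the delta."""
    blocks = [old[i:i + BLOCK_SIZE] for i in range(0, len(old), BLOCK_SIZE)]
    for i, data in sorted(delta.items()):
        while len(blocks) <= i:
            blocks.append(b"")
        blocks[i] = data
    return b"".join(blocks)[:new_len]

golden_v1 = b"kernel01apps0001cfg00001"   # three 8-byte "blocks"
golden_v2 = b"kernel01apps0002cfg00001"   # only the middle block was patched
delta = compute_delta(golden_v1, golden_v2)
print(sorted(delta))                       # [1] -- one changed block is shipped
```

Because only changed blocks travel over the wire, patching a single golden image and fanning out the deltas scales to thousands of LivePCs far better than re-imaging each machine.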
MokaFive also requires less of a monetary investment than rival VDI solutions and doesn’t require IT administrators to learn a new skill set, claims MokaFive VP of marketing, Purnima Padmanabhan.
“VDI does enable companies to ramp up and quickly provision and de-provision virtual machines (VMs); however, the IT department is still required to build out fixed server capacity for its transient workforce,” Padmanabhan says. Oftentimes, the additional capacity ends up going to waste. “The whole point of contractors is to dial in, dial up and dial down expenses, and that’s what MokaFive does,” she adds.
Steve Sommer, president of SLS Consulting in Westchester, New York, agrees. Sommer spent 25 years simultaneously holding the positions of CIO and CTO at Hughes, Hubbard & Reed, a NYC law firm with 1,200 end users – including 300 attorneys – in a dozen remote locations. Sommer observes that corporate politics frequently determine access policy at the expense of security. “A company’s knowledge workers – lawyers, doctors, software developers – who drive large portions of revenue will demand all-access, all the time and security be damned. In the past it was an either/or proposition,” Sommer says.
With the MokaFive desktop-as-a-service approach, all the data is encapsulated, encrypted and controlled. Organizations now have the option to manage the permanent workforce, as well as temporary contractors and consultants who use their own personal devices, quickly and easily. IT managers can provision a virtual machine (VM) on top of MokaFive or give the remote user or contract worker an HTML link which contains the MokaFive LivePC. The end user clicks on the link to get a completely encapsulated VM environment, which is controlled through policies using MokaFive and can be fully encrypted with 256-bit AES. The entire environment is managed, contained and kept updated with the latest passwords, connections, application versions and patches. When the user or contract worker leaves the company, the IT department issues a root kill signal and all the licenses are retrieved and called back, ensuring compliance.
“MokaFive is a boon for IT departments and end users alike; no more worrying about provisioning and versioning. I love the fact that it’s application, hardware and operating system agnostic,” Sommer says. “And it also has distinct time saving benefits for the end user, or transient workforce. They can take their work with them wherever they are and they don’t have to worry about borrowing a notebook or PDA and ensuring that it’s properly configured with the correct version.”
MokaFive already has several dozen customers and prospects and is gaining traction in a number of vertical markets including financial services, legal, healthcare, government and education. Given the burgeoning popularity and mainstream adoption of VDI, the MokaFive Suite represents a viable alternative for organizations that want a fast, cost effective and non-disruptive way to give users efficient and secure network access. It’s definitely worth exploring, and MokaFive offers free trials for interested parties from its website.

Networks Without Borders Raise Security, Management Issues Read More »

VDI Vendor Wars Intensify

There’s no hotter market in high tech this year than Virtual Desktop Infrastructure (VDI) and you don’t need sales and unit shipment statistics to prove it. No, the best measurement of VDI’s hotness is the sudden flurry of vendor announcements accompanied by a concomitant rise in vitriol.
The main players in the VDI market are actually two sets of pairs. It’s Citrix and Microsoft lining up against VMware and EMC for Round 2 in the ongoing virtualization wars. On March 18, Citrix and Microsoft came out swinging, landing the first potent, preemptive punches right where they hope it will hurt VMware the most: in its pocketbook.
Citrix and Microsoft unveiled a series of VDI initiatives that include aggressive promotional pricing deals and more simplified licensing models. To demonstrate just how solid and committed they are to their alliance and taking on and taking down VMware and EMC, the two firms even went so far as to combine their respective VDI graphics technologies.
At stake is the leadership position in the nascent, but rapidly expanding global VDI market. The results of the ITIC 2010 Global Virtualization Deployment and Trends Survey, which polled 800+ businesses worldwide in the December/January timeframe, indicate that 31% of respondents plan to implement VDI in 2010; that’s more than double the 13% that said they would undertake a VDI deployment in 2009. Application virtualization is also on the rise. The same ITIC survey found that 37% of participants plan application virtualization upgrades this year, up from 15% who responded affirmatively to the same question in 2009.
The current installed base of VDI deployments is still relatively small; hence the statistics that show the number of deployments doubling year over year must be considered in that context. Nonetheless, double digit deployment figures are evidence of strengthening demand and a market that is robustly transitioning from niche to mainstream. The spate of announcements from Microsoft and Citrix were clearly intended to capitalize on the growth spurt in VDI. At the same time, the companies threw down the gauntlet with initiatives aimed at solidifying and expanding their base of current VDI customers while serving the dual purpose of luring VMware customers away from that company’s VDI platform. They include:
• “VDI Kick Start.” This wide ranging sales promotion, which runs from March 18 through December 31, 2010, seeks to jump start VDI deployments by lowering the entry level pricing for customers purchasing Microsoft and Citrix technologies. As part of this deal, existing Microsoft client access licensing (CAL) customers will pay $28 per desktop, for up to 250 users, to purchase the Microsoft Virtual Desktop Infrastructure Suite, Standard edition, and Citrix’s XenDesktop VDI Edition for one year. That’s roughly a 50% discount off the list prices that corporations have paid up until now for their annual CALs. This is crucial for cost conscious businesses: client access licenses typically represent the lion’s share of their licensing deals, since desktops outnumber servers in mid-sized and large enterprises. The two firms are also merging Microsoft’s 3-D graphics technology for virtual desktops, called RemoteFX, with Citrix’s high-definition HDX technology.

• The Microsoft Virtual Desktop Access (VDA) License Plan. Organizations that use Thin Client devices which are not included or covered under Microsoft’s SA maintenance plan, can now purchase the VDA licenses at a retail price of $100 per device per annum. This targets end users who travel or telecommute and need to use personal devices or public networks to access their corporate data. Microsoft also made another move towards simplifying its virtualization licensing plan. Starting July 1, Microsoft SA customers will no longer be required to purchase a separate license to access Windows via a VDI.
• “Rescue for VMware VDI.” The name says it all: this promotion is a direct attack on VMware. Like the VDI Kick Start program, it runs from March 18 through December 31, 2010. Under the terms of this deal, any Microsoft Software Assurance licensing/maintenance customer can replace their existing VMware View licenses for free. VMware View users who opt out of that platform in favor of the Citrix and Microsoft offerings will receive up to 500 XenDesktop VDI Edition device licenses and up to 500 Microsoft VDI Standard Suite device licenses free for an entire year once they trade in their VMware View licenses.
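The pricing in the Kick Start promotion lends itself to a quick back-of-envelope check. The $28-per-desktop figure, the 250-user cap and the "roughly 50% discount" are from the announcement; the implied list price is simply inferred from those figures and is my arithmetic, not a published Microsoft or Citrix number.

```python
def kick_start_cost(desktops, price_per_desktop=28.0, cap=250):
    """Annual cost under the promo, which covers up to `cap` desktops."""
    if desktops > cap:
        raise ValueError("promotion covers at most %d desktops" % cap)
    return desktops * price_per_desktop

promo_total = kick_start_cost(250)   # 250 desktops * $28 = $7,000 per year
implied_list = promo_total / 0.5     # "roughly a 50% discount" off list
print(promo_total, implied_list)     # 7000.0 14000.0
```

In other words, a shop that maxes out the promotion pays about $7,000 for the year versus an implied list of roughly $14,000, which is why the discount matters so much when desktops dominate the license count.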
Dai Vu, Microsoft’s director of virtualization marketing, said the announcements were all about delivering more value to desktop customers and simplifying and extending organizations’ licensing rights.
The Citrix/Microsoft announcements also cement the close working partnership and the “enemy of my enemy is my friend” relationship the firms have enjoyed for many years. By bundling their respective VDI offerings together, the two companies should also ensure integration and interoperability which are crucial components for each and every layer in a virtualized data center environment.
VMware and EMC: Not Standing Still
VMware and EMC executives have yet to publicly respond to the Microsoft/Citrix initiatives. However, it’s almost certain that VMware will have to offer its current and prospective VDI accounts incentives to counter the Microsoft/Citrix alliance. Cash strapped corporations and IT departments are all on the lookout for top notch products at bargain basement prices. And it doesn’t get much better for customers than the free Rescue for VMware VDI program.
VMware built up a commanding lead in the server virtualization arena over the last five years by virtue of being first to market and delivering leading edge features and performance in its signature ESX Server product. VMware’s competitors have spent the last several years playing catch up in server virtualization. This allowed VMware to charge a premium price for its premier offerings. Depending on the size and scope of the individual organization’s server virtualization deployment, customers paid on average 35% to as much as 75% more for VMware server-based offerings. There were surprisingly few complaints.
The emerging VDI and application virtualization markets are a different story. Only about 5% to 8% of organizations worldwide have fully virtualized their desktop infrastructure. So it’s too soon to declare a clear market winner. It’s safe to say that Citrix, Microsoft and VMware are all market leaders in this segment. This time around though, Microsoft and Citrix are determined not to let VMware and EMC run away with the race by building an insurmountable lead.
Meanwhile, VMware and EMC have not been idle. Former Microsoft executive Paul Maritz succeeded VMware founder Diane Greene following her 2008 departure as the company’s president and chief executive officer. Since then he has made tangible moves to bolster VMware’s position in the VDI and application virtualization arenas. Maritz and EMC CEO Joe Tucci make a formidable combination, as do EMC and VMware. EMC purchased VMware in 2004 for $635 million and it owns an 86% majority stake in the server virtualization market leader. In the past several years, VMware’s fortunes and revenues have risen faster than EMC’s. VMware’s year-over-year (YoY) quarterly revenue growth stands at 18.20%, compared with EMC’s modest 2.10% YoY quarterly sales growth. Another key indicator is net earnings, and in this regard VMware experienced negative YoY quarterly earnings growth of -49.40%. By contrast, its parent EMC recorded a very robust and positive 44.70% jump in YoY quarterly earnings. It is also worth noting that VMware’s annual revenues of $2.02 billion represent only 15% of EMC’s annual sales of $14.03 billion. And to date, EMC’s solutions have only been related tangentially to VMware’s VDI products. For practical purposes, this may continue to be the case. From a PR standpoint though, EMC and VMware are presenting themselves as a sort of virtualization “dynamic duo.”
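The revenue relationship between the two companies is easy to sanity-check from the figures cited above (both revenue numbers are the ones in the paragraph; the rounding is mine):

```python
vmware_rev = 2.02   # VMware annual revenue, in $ billions, as cited
emc_rev = 14.03     # EMC annual revenue, in $ billions, as cited

share = vmware_rev / emc_rev
print(round(share * 100, 1))   # 14.4 -- i.e., roughly the 15% cited
```

So VMware contributes about one-seventh of the combined company's top line, which is why EMC's 86% stake matters far more strategically than it does to EMC's revenue.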
At an EMC Analyst event at the company’s Hopkinton, MA headquarters on March 11, Pat Gelsinger, president of EMC’s Information Infrastructure Products group described the combination of EMC and VMware – specifically with respect to storage virtualization, virtualization management and private cloud infrastructures — as the “Wild West” of the virtualization market, saying “we want to be disruptive and change the way people fundamentally think of IT.” Though Gelsinger mainly confined his comments to EMC’s core bailiwick in the storage arena, it is clear that EMC and VMware are pro-actively presenting a united front.
In February, the two firms moved to reposition some of their assets; EMC and VMware inked a deal for VMware to acquire certain software products and expertise from EMC’s Ionix IT management business in an all cash deal for $200 million. EMC does retain the Ionix brand and gets full reseller rights to continue to offer customers the products acquired by VMware. Maritz said VMware’s acquisition of the Ionix products and expertise promises to further establish VMware vCenter as the next generation management platform for private cloud infrastructures.
The agreement also calls for VMware to take control of all the technology and intellectual property of FastScale, which EMC acquired in 2009. The FastScale Composer Suite incorporates integrated software management tools to enable organizations to maintain peak performance in a virtualized environment.
Also, recently, VMware introduced ThinApp 4.5, a new version of its application virtualization package designed to simplify enterprises’ migration to Windows 7.
End Users are the Biggest Winners
What makes the latest competition for VDI market dominance noteworthy is the extreme actions the combatants are willing to take in order to retain and gain customers at their rivals’ expense. With last week’s joint announcements and deepening partnership, Citrix and Microsoft have signaled their intention to lead, but it’s still too early to call the race.
The joint Microsoft/Citrix initiatives to cut costs and simplify virtualization licensing plans remove two of the more significant barriers to VDI adoption. The largest looming challenge remains the willingness of corporations to embrace a new technology model as their organizations and IT departments continue to grapple with the lingering effects of the ongoing economic crunch. In this regard, all of the virtualization vendors in concert with OEM hardware vendors like Dell, Hewlett-Packard, IBM, Stratus Technologies and Wyse who partner with them must convince customers that transitioning to VDI will provide tangible Total Cost of Ownership (TCO) and Return on Investment (ROI) benefits. This entails providing organizations with the necessary guidance – including tools, training, documentation, Best Practices and solid technical service and support – to ensure that a conversion to VDI can be accomplished with minimal disruption. Admittedly, this is a tall order.
Hardware vendors like Dell, HP, IBM et al all have a stake in the future success of the VDI market. Organizations that migrate to VDI will seek to upgrade to newer, more powerful desktops (PCs, notebooks) and servers, which in turn, potentially boosts the hardware vendors’ individual and collective bottom lines. Additionally, both HP and IBM boast huge service and support organizations, which also stand to benefit from an uptick in VDI adoptions. So the hardware vendors have every reason to partner with Citrix, Microsoft and VMware to promote and expand the VDI market segment. Regardless of which vendor(s) prevails, the biggest winners will be the customers. When several big name vendors vie for the hearts, minds and wallets of customers, it usually means that feature-rich, reliable products get to market sooner at more competitive prices. Let’s hope the VDI race is a long one.

VDI Vendor Wars Intensify Read More »

Database Competition Heats Up

The database market will see lots of activity during the 2010-2011 timeframe as nearly 60% of organizations move to upgrade or expand existing and legacy networks.
That statistic comes from new ITIC survey data, which polled 450 organizations worldwide. Not surprisingly the survey shows that longtime market leaders Oracle, IBM, Microsoft and Sybase will continue to dominate the DBMS market and solidify their positions.
Databases are among the most mature and crucial applications in the entire network infrastructure. Database information is the lifeblood of the business. Databases directly influence and impact every aspect of the organization’s daily operations including: relationships with customers, business partners, suppliers and the organization’s own internal end-users. All of these users must have the ability to locate and access data quickly, efficiently and securely. The corporate database must deliver optimal performance, reliability, security, business intelligence and ease of use. It must also incorporate flexible, advanced management capabilities to enable database administrators (DBAs) to construct and oversee a database management system (DBMS) that best suits the organization from both a technology and business perspective.
What will distinguish the DBMS market this year is that the always intense and vociferous vendor rivalries will heat up even more over the next 12 months.
There are several pragmatic reasons for this. Most notable is the fact that many organizations deferred all but the most pressing network upgrade projects during the severe downturn over the past two-and-a-half years. Many businesses are now in a position where they must upgrade their legacy database infrastructure because it’s obsolete and is adversely impacting, or will shortly impact, the business. Anytime a company decides on a major upgrade there’s always a chance that it may switch providers. The DBMS vendors know this and will do their level best to lure customers to their platform, or at the very least get a foot in the door.
Another factor that looms large in the 2010 DBMS market dynamics is Oracle’s purchase of Sun Microsystems. That acquisition finally got the green light from the European Commission last month. Speculation abounds as to the fate of MySQL, a popular and highly regarded open source DBMS. For the record, Oracle executives stated publicly within the last two weeks that the company will continue to support and develop MySQL and even provide integration with other Oracle offerings. But users are uneasy because MySQL does compete to some extent with some Oracle products. Expect rivals, particularly IBM and Microsoft, to aggressively capitalize on user confusion and fear to entice users to their respective platforms.
The DBMS Vendor Landscape
As nearly everyone knows, the four major DBMS vendors (Oracle, IBM, Microsoft and Sybase) account for 90% of the installed base, unit shipments and revenue.
Oracle’s 11g is the undisputed market leader. It offers a full slate of online transactional processing (OLTP) as well as specialized database applications. As such, it is being assailed from all sides, and with relish, by rivals who take every opportunity to criticize its products and strategy. Oracle, headed by Larry Ellison, one of the most visible and outspoken high technology CEOs, happily reciprocates with its own vitriol.
IBM’s DB2 9.5 for Linux, Windows and UNIX remains firmly entrenched in high end enterprises owing to its rock solid reliability, performance, management, scalability and overall data and application integration capabilities. Users are also loyal to the DB2 platform because of IBM’s strong after-market technical service and support offerings. IBM also secures its position within very large enterprises by giving good deals and discounts on licensing renewals and training and support.
Microsoft’s SQL Server 2008 has shown tremendous improvement in scalability, security, ease of use, programmability and application development functionality and is gaining ground particularly among SMB and SME organizations. Microsoft hopes that the increased functionality of SQL Server 2008 will enable it to erode Oracle’s very entrenched presence among enterprises. A big plus for Microsoft is its legion of committed resellers and consultants who do an excellent job of promoting SQL Server 2008 among SMBs and SMEs.
Cost, Interoperability and Performance Top User DBMS Requirements
DBMS upgrades and new installations will be fought, won and/or lost according to three main factors: interoperability, cost and performance/features. The latest ITIC survey data found that nearly 90% of respondents rated interoperability with existing or planned infrastructure as the most important factor weighed when choosing a server vendor; 80% chose cost as a main DBMS influencer and 78% cited performance as their main reason for choosing a specific DBMS vendor platform.
But any DBMS vendor that hopes to dislodge or supplant a rival in an existing account will have to work hard to do so. The ITIC survey data also shows that organizations – especially large enterprises – do not readily or often forsake their legacy platforms. According to the survey data, 76% of survey respondents indicated they have not migrated or switched any of their main line of business applications from one database platform to another within the past three years.
This statistic makes a lot of sense. Precisely because DBMS platforms are among the most mature server-based applications in the entire enterprise, it’s much more work to rip out one platform and start fresh. A wholesale switch from one platform to another requires significant capital expenditure monies. Additionally, the business must also invest a lot of time and energy in converting to a new platform, testing new applications, rewriting scripts and re-training DBAs and getting them certified on the new environment. For CIOs, CTOs and IT departments this prospect has roughly the same appeal as having root canal without Novocain.
Nonetheless, one-in-five survey respondents (20%) did migrate database platforms over the past three years. The most popular reason for switching DBMS platforms, according to the survey respondents, is a move to a custom-developed in-house application or a customized application developed by a partner. Just over half (53%) of responding organizations that changed DBMS platforms came from midsized enterprises with 500 to 3,000 end users, a fact that favored Microsoft SQL Server 2008 deployments. Among the 20% of ITIC survey respondents that switched vendors, fully 50% of organizations swapped out Oracle in favor of SQL Server, while 17% migrated from Sybase to SQL Server. Overall, among the 20% of respondents that switched database platforms over the past three years, two-thirds (67%) opted to migrate to SQL Server. In this regard, Microsoft SQL Server converts outpaced rival Oracle by a 2-to-1 margin. Approximately 34% of the 20% of businesses that changed database platforms migrated away from DB2 or SQL Server in favor of Oracle.
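The 2-to-1 margin cited above can be verified directly from the survey percentages. All of the input figures are the ones in the paragraph; combining them into shares of the whole respondent pool is my arithmetic, not ITIC's published breakdown.

```python
switched = 0.20              # share of all respondents that migrated at all
to_sql_server = 0.50 + 0.17  # Oracle-to-SQL Server plus Sybase-to-SQL Server
to_oracle = 0.34             # moved to Oracle from DB2 or SQL Server

# Express each destination as a share of ALL respondents.
overall_sql = switched * to_sql_server   # about 13% of everyone surveyed
overall_ora = switched * to_oracle       # about 7% of everyone surveyed
print(round(overall_sql / overall_ora, 2))   # 1.97 -- roughly 2-to-1
```

Note that 67% plus 34% slightly exceeds 100%, which suggests the published figures were independently rounded; the ratio still lands at roughly two SQL Server converts for every Oracle convert.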
IBM DB2 users were among the most satisfied respondents; an overwhelming 96% stayed put.
Analysis: Customer Issues and Chief Challenges
Respondents cite challenges with their database strategies, but are also sanguine about the journey. For instance, one respondent said that the main challenges were “keeping up with changes to the SQL platform and getting our database administrators and appropriate IT managers trained and re-certified on new versions of the technology and then figuring out how it all works with new virtualization and cloud computing technologies. Cost and complexity are also big factors to consider in any upgrade. Networks are getting more complex but our budgets and training are not keeping pace.”
Respondents were particularly focused on the cost issue: “cost, both new licensing and annual maintenance”, “increasing cost of licensing”, “cost is the overriding factor” were just some of the responses.
As for future plans, a 56% majority of respondents report that switching database platforms in the coming months is very unlikely; while 17% said it is not an option to switch and 15% said that switching is a possibility, depending on the circumstances.
Getting organizations to change DBMS platforms is difficult but not impossible. If a rival vendor can offer concomitant performance and functionality, coupled with tangibly better pricing and licensing renewal options which lower Total Cost of Ownership (TCO) and speed Return on Investment (ROI), organizations may be induced to make the switch. The biggest DBMS battle is in the SMB, SME sectors and green field accounts that are adding new databases.
DBMS vendors are anxious to keep the current customers and gain new ones. End users should make the vendors work to keep them as satisfied customers. Dissatisfied customers should voice their concerns and even satisfied customers should let their vendors know what they can do to make them even happier.

Database Competition Heats Up Read More »

Tablets Take Off in 2010, Thanks to Apple’s iPad

Regardless of how well the newest class of Tablet computers fare in terms of sales and unit shipments, the evolution of these portable devices will be divided into two classifications: Before the Apple iPad and After the Apple iPad.
Apple’s iPad — admittedly a late entrant into this market — has already changed the game in the fledgling, niche Tablet market, even before the company has shipped its first device.

The frenzied efforts of industry watchers, from Apple aficionados and rival vendors to analysts and media, to ferret out the most minute details of the Apple tablet in advance of its release served to rejuvenate what had been a stalled market segment.
The Tablet computer occupies a still nebulous market arena that puts it somewhere in between smaller NetBooks and smartphones on one side and larger portable devices on the other. Whether that middle ground can sustain a mainstream product is anyone’s guess, but one thing is certain: Apple’s entrance into this crowded field has sparked renewed interest in this device category.
The long rumored iPad was shrouded in mystery for months before the official January 27 announcement. Apple stubbornly refused to confirm its existence, much less any details. Nonetheless, the anticipation was so great, that it sent several vendors scrambling to preview rival Tablet offerings at the Consumer Electronics Show (CES) in Las Vegas in advance of the iPad debut.
No one was shocked when Apple CEO Steve Jobs introduced the company’s latest “creation.” However, Apple did manage to stun the industry by hitting the $500 price barrier for the entry level device. This affordable tag makes the feature laden iPad Tablet competitive with the wildly successful, low-cost NetBooks which were all the rage in 2009. Additionally, the Apple iPad’s list prices will almost certainly follow the normal discounted street pricing patterns and decline by 10% to 30% over the next six months. Apple’s aggressive pricing maneuver has also succeeded in causing consternation among competitors who must now re-evaluate their own price structures in order to follow Apple’s lead.
Still, even at $499, the Apple iPad is not the lowest priced Tablet device. That distinction currently belongs to Freescale Semiconductor, which introduced a touch screen Tablet that retails for $199. The Freescale tablet lacks many of the iPad’s high-end features, such as advanced graphics, which accounts for the price differential. It runs on either Android or Linux and incorporates a battery that lasts for eight to 10 hours. Available in a selection of colors, the tablet includes Wi-Fi, Bluetooth and optional support for 3G. Users can also add an external keyboard and mount the tablet on it as a display. Freescale is marketing the device to OEMs who want to get to market quickly with a Tablet.
Tablet Market: Narrow Niche or Mainstream Appeal?
The real question now is: will the recent flurry of new Tablet releases translate into mainstream success or will Tablets remain a niche device in search of a market? Many industry observers have openly scoffed at the notion that these devices will ever achieve widespread adoption. In recent months the rising tide of speculation about the Apple iPad also engendered debate as to why anyone would need or want yet another portable device in a field that is already crowded with smart phones, a wide variety of portable notebooks and the very popular and inexpensive Netbooks.
These are all valid questions. Tablet devices have been available for the past five years. To say that they have met with only moderate success is an understatement. This is partially due to the economic downturn and also due in large measure to the fact that the marketing around these devices never identified a clear and compelling use for them outside a few narrow niches.
There was also confusion about what constituted a Tablet computer. There is no standard, one-size-fits-all device that addresses all market segments. In the 2006-2007 timeframe some vendors opted to sell larger Tablets that more closely resembled traditional notebooks or laptops. The higher-end devices from vendors like Acer, HP and Toshiba often incorporated advanced features like handwriting recognition, inking capabilities in the Windows presentation subsystem and fingerprint security ID. Conversely, several suppliers marketed hybrid mini-Tablet/eBook readers with small (six inches or less) form factors.
And over the last two years, the Tablet segment was eclipsed by the burgeoning popularity of NetBooks, which have an average price range of $150 to approximately $400.
Nonetheless, nearly every major hardware vendor boasts at least one Tablet in its product portfolio. Acer, Asustek Computer, Dell, Fujitsu, Gateway, Hewlett-Packard (HP), Lenovo, Micro-Star International (MSI), Motion Computing, Toshiba, Viewsonic and Wacom are all betting that consumers and eventually businesses will embrace the Tablet form factor.
In recent months Asustek Computer, HP, Dell and MSI all debuted new tablet offerings to beat Apple to the punch. MSI launched its 10-inch Tablet at CES, and HP is readying its offering, an Inventec-manufactured device set to debut in the spring. Asustek released its Eee PC T91 tablet and will launch a 10-inch model running Windows 7.
Bottom line: There is a wide range of form factors and features from which to choose. Models range from very small, lightweight slates like the Apple iPad, which weighs 1.5 lbs., to larger 5 to 6 lb. notebook-type form factors that use a stylus, swivel, and have full or hidden mobile keyboards.
The Price is Right
One thing about Tablets that should help spur acceptance and adoption, and may even trump NetBooks, is cost. Tablet computer prices have dropped significantly from 2007, when pricing ranged from $599 to $2,700, with the median tag averaging $1,600. Thanks to the rise of NetBooks and Apple’s uncharacteristic move to be a price/performance leader, average selling prices (ASPs) for Tablets are now between $400 and $800. Special promotions abound, and leasing and financing solutions are widely available from all the vendors.

HP, for example, markets its HP/Compaq Mini 110, 210 and 311 Series of mobile laptops and mini NetBooks, which range in price from $269 to $399, have 10 to just-under-12 inch screens, and are outfitted with Intel’s 1.60 GHz Atom processor. Additionally, HP sells the TouchSmart tm2t series of high-end customizable tablets, whose list pricing begins at $899 and ranges to about $1,300. The TouchSmart tm2t tablets have a 12.1 inch display screen. They allow users to swivel the screen, fold it over, and write and draw on it using a digital pen, or alternatively employ touch screen fingertip navigation. They also have a full keyboard. The HP tablets are available with 64-bit Windows 7; either 2GB or 3GB of memory; a 250GB or 320GB hard drive; and a choice of an Intel 1.3GHz Pentium processor or an Intel Core 2 Duo 1.60GHz processor. The TouchSmart tm2t series pricing is closer to traditional notebooks, though it incorporates the tablet features and functions. HP also regularly offers special sales and promotions on the TouchSmart tm2t tablets, which can lower the price by 20% or more. Dell and Toshiba both have multiple Tablet models. Toshiba’s Portege M750 is a high-end model that can convert from a notebook to a tablet and has digital pen and touch screen capabilities, with pricing starting at $1,279.
Apple CEO Steve Jobs has made no secret of his disdain for NetBooks, and he now seems determined to at least bring the iPad’s entry-level list prices within a couple of hundred dollars (US) of the low-cost NetBooks in the hopes of luring users away. Credit Suisse financial analyst Bill Shope published a Research Note earlier this week based on his meetings with Apple executives. According to Shope, Apple is positioning the iPad to be the device of choice for Web browsing and all forms of mobile media, and the company is willing to cut the price if that’s what it takes to ensure success. Other vendors will be forced to follow suit.
Meanwhile, with features ranging from mobility and portability to widespread applications like gaming, videos, photos, E-book reading, Email, Web browsing, maps and weather forecasts, as well as the ability to write notes and draw pictures, the appeal of Tablets is taking on a much sharper focus. Seen in this context, Tablet devices would appeal to a wide range of consumers as well as commercial and business users in fields like:
• Legal
• Healthcare
• Manufacturing (factory floor)
• Construction
• Academic
• Consultants
• Press
• Defense
• Aerospace

With Tablet devices now sporting features, performance, applications and pricing to rival high end notebooks and low-cost E-book readers and NetBooks, it’s highly likely that their popularity and adoption will soar in the coming months. The competition will be intense and that spells good news for consumers and corporations that are looking for competitively priced devices for their mobile and remote workers.

Tablets Take Off in 2010, Thanks to Apple’s iPad Read More »

Apple iPad Debuts and Surprise, Lives Up to the Hype

“It” is finally here. Apple CEO Steve Jobs unveiled the iPad tablet device at the Yerba Buena Center for the Arts in San Francisco to a packed house amidst thunderous applause.
After months of speculation, which reached a fever pitch over the last two weeks, it was absolutely imperative that Apple’s iPad live up to the hype. And it does. Jobs characterized the iPad as a third device category between a notebook and a smart phone; given the features and the form factor, that is a credible claim.
The biggest and most pleasant surprise was the very affordable price tag: iPad list pricing begins at $499 for the basic 16GB model and goes up to $829 for the most expensive 64GB model, which includes Wi-Fi and 3G. While many industry watchers expected the iPad to sell for less than $1,000 (US), it’s safe to say that no one expected it to break the $500 barrier. This aggressive tag should enable the iPad to compete effectively, on price, with the smaller and wildly popular NetBooks, which is no doubt exactly what Steve Jobs intended.
The iPad incorporates all of the rumored features and elements that consumers have come to expect and demand from Apple and then some. It incorporates superior graphics, an elegant case, a slick user interface and a multi-touch virtual keyboard. In another nod to usability, the iPad can be angled or tilted in any direction while still allowing the user to view the screen. And at just half an inch thick and weighing only 1 ½ lbs. the iPad sports a sylph-like silhouette that would be the envy of every supermodel, not to mention potentially millions of consumers who will love the portability of the slim, lightweight form factor.
The iPad, which comes equipped with a 1GHz Apple A4 chip, is also available in a variety of configurations to fit various budgets. Customers can purchase the iPad with 16GB, 32GB, or 64GB solid state drives. And in what will surely be a boon to consumer and corporate road warriors, the iPad has a battery life of 10 hours for mainstream applications; according to Jobs, it can sit on standby for a month without requiring a charge. All models come equipped with Wi-Fi and Bluetooth connectivity.
The iPad is also fully interoperable with Apple’s other top selling products: the iPhone, iPod and iTunes. Interoperability is a necessary and crucial component of the iPad’s future success. The device also has the speed and power to run the latest games, TV and movies, an E-book reader, and content from multiple external sources.
Broad Appeal
The iPad seemingly has something for everyone: enough speed and power to attract the gaming crowd; E-book reader capabilities; Google Maps; and the ability to watch TV, movies and video — YouTube can be viewed in high definition (HD). It also features broad application support, which is the life blood and a necessary element for the success of any hardware device. It already supports popular applications such as calendaring, Google Maps, Facebook and even Major League Baseball. The iPad will also appeal to scrapbooking and photography buffs: a photo scrubber bar on the bottom of the screen has multiple settings that let the user flip through photo albums, run slideshows and listen to music. And while it may not be the [Amazon] Kindle Killer as some have dubbed it, at the very least the iPad will give the Kindle some tough competition. Apple has already lined up five publishing powerhouses: HarperCollins, Macmillan, Simon and Schuster, Hachette and Penguin Books. More such partnerships will likely be announced in the coming months.
Analysis
The iPad has two missions to fulfill. The first is that it must equal or exceed the very high bar that Apple has set for itself. This is no mean feat. Apple aficionados and critics alike have been spoiled by the dizzying array of devices Apple has released over the past several years. These range from innovative new MacBooks like the MacBook Air to the market-changing iPhone and iPod and the ubiquitous iTunes for music downloads.
Apple now finds itself in the enviable or unenviable position of having to top itself in the quest to deliver “the next big thing” and secure its spot on the top of the hardware mountain.
Secondarily, the iPad is Apple’s attempt to fell multiple competitors — from Amazon to Google to the NetBook vendors — with a single arrow.
So how does the iPad stack up? From a feature/function standpoint it lives up to the hype, and from a pricing standpoint it exceeds expectations. Steve Jobs may very well have introduced a third device category. The iPad appeals to a broad user constituency that includes gamers, E-book readers, music and photography lovers, Web surfers, and mobile and remote users (and probably some corporate knowledge workers as well), as well as casual consumers who just want the latest and greatest consumer offering that won’t break their budgets.
Undoubtedly, there will be some users who will simply shrug their shoulders and say, “I already have a notebook or NetBook, why do I need the iPad?” And that’s fine.
And while it may not kill Amazon’s Kindle or the rival NetBooks, it will force those competitors to respond with more advanced features and aggressive price points in the near and intermediate term. That other vendors fear Apple is evident from the many new tablet devices introduced at the Consumer Electronics Show earlier this month. Everyone wanted to beat Apple’s iPad to market.
No, the iPad is not Moses coming down from the mountain with tablets containing The 10 Commandments, but then again Moses didn’t have such a large audience, the benefit of sending his message out via the Web or the advantage of Apple’s marketing machine.
When all is said and done, sales to end users – consumer and corporate alike – will be the final arbiters of the iPad’s success. The first sales figures, including pre-orders, should be available within the next few months. Meanwhile, Apple has done its part by imbuing the iPad with the features, functions and broad application and industry support that are necessary to make it a success. Barring any unforeseen or show-stopping bugs, the iPad looks like a winner.

Apple iPad Debuts and Surprise, Lives Up to the Hype Read More »
