TrendWatch Guest Columnist
e-Performance Management: Services vs. Software
Your e-Business just went down and it can't get up. How do you know? Because you're measuring its performance from the "outside in," 24 hours a day, from the end-user's perspective, using the real-time services of an Internet performance management company, and you just received an alert that your transaction performance has degraded in multiple cities around the United States.
Using the in-depth and sophisticated diagnostic services of your Internet performance management company, your operations personnel are able to quickly pinpoint the root cause of the problem, repair it, and regain the stellar performance you enjoyed prior to the slowdown. The result? You've minimized, as quickly as humanly possible, the negative revenue impact on your e-Business.
Any company using the Web as a revenue generator, customer satisfaction enhancer, or partner portal -- which is quickly becoming virtually every business -- must measure, benchmark, test, diagnose, and manage the performance of its e-Business infrastructure. In a struggling economy, the successful IT executive is expected to increase operational efficiency with reduced resources, and to optimally leverage the large investments the company has already made in its e-Business infrastructure. However, you can't manage what you don't measure; IT operations must proactively and continuously measure, diagnose, and test the end-to-end performance of e-Business applications in order to manage the e-Business successfully.
Many people do not realize that Internet performance varies not only over time, but also over geography. To know what end-users are experiencing on the other end of a browser, you must regularly sample the same waters your Internet users are swimming in on a real-time basis and from many Internet points of location around the world.
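The idea of sampling from many locations and alerting on degradation can be sketched in a few lines. This is a minimal illustration with hypothetical, hard-coded per-city response times; a real monitoring service would collect these samples from probe agents distributed around the Internet.

```python
import statistics

# Hypothetical response-time samples (seconds) per probe city; in
# practice an "outside in" service would gather these in real time.
samples = {
    "New York":      [0.42, 0.45, 0.44],
    "San Francisco": [0.51, 0.48, 0.50],
    "Chicago":       [1.90, 2.10, 2.05],   # degraded in this example
}

THRESHOLD = 1.0  # alert when the median response time exceeds 1 second

def degraded_cities(samples, threshold=THRESHOLD):
    """Return the cities whose median response time breaches the threshold."""
    return sorted(city for city, times in samples.items()
                  if statistics.median(times) > threshold)

print(degraded_cities(samples))  # ['Chicago']
```

The point of the sketch is that the value lies less in the arithmetic than in the breadth of the sample: without probes in many cities, the Chicago slowdown above would simply never appear in your data.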
To assure customer satisfaction you need to know if your customers are getting consistent levels of acceptable performance. You need to know if your e-Business applications and infrastructure are well constructed, and if they have the capacity to deliver peak load performance when called upon to do so during high volume traffic events. When conducting such elemental testing, you want to be able to diagnose problems that can exist inside and outside the firewall.
The question is, should you use a services-based or a software-based solution to measure your performance? I believe that a services-based solution is in many cases more cost effective, helps grow online revenue, and maintains better customer loyalty than a software solution. In some cases you have a genuine choice between in-house software and an outsourced service, and the business case for outsourcing must be made on its merits, so let's consider those situations separately.
An example of one form of performance management that simply cannot be done with software is external benchmarking. If you want an "outside in" view of your Web site from 100 different cities around the globe, it's unlikely that you will actually set up and run 100 computers in as many locations -- that's not very cost effective. Buying from a vendor that has an existing infrastructure is the only practical way to go. A service company will usually have a large infrastructure that you can cost effectively leverage to obtain an accurate view of your Web site, as experienced by your end-users. Using an unbiased, outsourced service will also enable you to measure and benchmark your performance against competitors, and offers neutral third-party validation of your vendors' service level agreement performance.
As regards testing, some tasks are best accomplished with services, and some are better done with software. If you are an application developer and you wish to do functional testing day after day during the application development process, software is the right way to go. QA testing software has been around for decades and developers use it all the time. However, using software may not be the best way to do capacity testing on a Web site that's functionally complete, but needs an extra dose of "wind-tunnel" testing to ensure that it won't fail under the stresses of real user loads. In this case, an external service is more accurate and cost effective. Many Web sites also involve applications using data from multiple partners. In such situations, an external service will provide a far more accurate picture of how the application will work in real-life than a limited capacity test in a QA lab.
If you are interested in testing an intranet or Internet application inside the firewall, where multiple locations are not involved, you have the option of purchasing load testing software and hardware and doing it yourself. Typically, however, a business could spend hundreds of thousands, if not millions, of dollars to maintain a testing infrastructure just to run tests once every few months when changes are made. Therefore, keeping this fixed infrastructure of software, people, and hardware to do occasional, or even periodic, testing is not as cost-effective as using a highly accurate testing service from a third-party performance management company, where you pay only for the services you use, when you need them.
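Whether you buy the software or buy the service, the mechanics of a load test are the same: many concurrent virtual users issuing transactions while latency percentiles are recorded. The following is a minimal sketch of that loop; the `fake_request` function here just sleeps for a random interval, standing in for a real HTTP transaction against the deployed site.

```python
import random
import statistics
import threading
import time

random.seed(0)
latencies = []          # per-request elapsed times, in seconds
lock = threading.Lock()

def fake_request():
    # Stand-in for one HTTP transaction; a real load test would
    # drive the deployed site over the network instead.
    time.sleep(random.uniform(0.01, 0.05))

def worker(n_requests):
    for _ in range(n_requests):
        start = time.perf_counter()
        fake_request()
        elapsed = time.perf_counter() - start
        with lock:
            latencies.append(elapsed)

# Simulate 10 concurrent virtual users making 5 requests each.
threads = [threading.Thread(target=worker, args=(5,)) for _ in range(10)]
for t in threads:
    t.start()
for t in threads:
    t.join()

latencies.sort()
p95 = latencies[int(0.95 * len(latencies)) - 1]
print(f"requests: {len(latencies)}  "
      f"median: {statistics.median(latencies):.3f}s  p95: {p95:.3f}s")
```

The cost argument in the paragraph above is about everything around this loop, not the loop itself: provisioning enough load-generating machines, bandwidth, and staff to run it realistically a few times a year is what makes the in-house option expensive.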
With the advent of Web services, measuring and managing your widely deployed, distributed Web services-based applications raises a distinctly new set of issues and challenges. How do you pinpoint where problems lie when the Web services don't come from a single vendor, or even a single server? How do you test for problems, and how do you determine who should be accountable for poor performance? I believe that the bulk of performance management for Web services-based applications can only be done as a service, rather than with software.
For example, a transaction on Amazon.com may include an authentication process that takes place with Microsoft Passport, while the credit card processing occurs with First Data. Taking control of a Microsoft computer just to test your Amazon.com application is impossible. Frankly, the only way to do that kind of testing and simulate it correctly would be to have an "outside in" service that can drive these sites with the appropriate amount of test data, get information back, and then give you highly accurate statistics and results that help you pinpoint where the problems might be.
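A sketch of how such a service might break a multi-provider transaction into timed stages follows. The step names echo the example above, but the timings are simulated placeholders; a real "outside in" service would drive the live sites and record each stage.

```python
import time

def run_step(name, simulated_seconds):
    """Time one stage of the transaction; time.sleep stands in for the real call."""
    start = time.perf_counter()
    time.sleep(simulated_seconds)
    return name, time.perf_counter() - start

# Hypothetical stage timings for the transaction described above.
steps = [
    run_step("storefront (Amazon.com)",   0.02),
    run_step("authentication (Passport)", 0.01),
    run_step("payment (First Data)",      0.06),  # the slow dependency
]

slowest = max(steps, key=lambda s: s[1])
for name, elapsed in steps:
    print(f"{name:30s} {elapsed:.3f}s")
print("slowest step:", slowest[0])
```

Per-stage timings like these are what turn "the site is slow" into "the payment partner is slow" -- exactly the accountability question that multi-vendor Web services raise.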
To summarize, there are cases where it makes sense to use software for QA and testing. Those cases are typically in the middle of the application development cycle. On the post deployment operations side, when you wish to do performance measurement, benchmarking, capacity testing, or load testing, a services-based approach is more cost-effective and overall delivers far better results.
Umang Gupta is Chairman and Chief Executive Officer of Keynote Systems, Inc. and a well-known technology visionary in Silicon Valley. For more information on Keynote Systems and its Internet performance management services, visit www.keynote.com.
Copyright 2002 Hurwitz Group Inc. This article is excerpted from TrendWatch, a weekly publication of Hurwitz Group Inc. -- an analyst, research, and consulting firm.