Nearly all users of service-oriented architecture (SOA) will also be users of Business Process Execution Language (BPEL). As a tool for orchestrating coarse-grained business process flows, BPEL is virtually the industry standard, but it can create testing problems both in its own right and in connection with the Web services it orchestrates. Best practices for BPEL testing call for a top-down approach, combined with a testing suite and full integration of BPEL into application lifecycle management (ALM).
Getting started: Setting up a model
Good BPEL testing starts with a comprehensive model of the logical/permissible sequences of business processes. This model is essential in driving testing both in the BPEL dimension and in the component testing/ALM dimension. The model should define what sequences of processes are allowed and the conditions under which those sequences are valid or should be expected.
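Such a model can be captured in something as simple as a transition map. The following Python sketch is purely illustrative (the process names and the two-branch credit flow are assumptions, not from any real orchestration); it shows one way to encode which process sequences are permissible:

```python
# Hypothetical business-process model: each process names the
# processes that may legally follow it. All names are illustrative.
ALLOWED_NEXT = {
    "receive_order": {"check_credit"},
    "check_credit": {"reserve_inventory", "reject_order"},
    "reserve_inventory": {"ship_order"},
    "ship_order": set(),
    "reject_order": set(),
}

def is_valid_sequence(steps):
    """Return True if every adjacent pair of steps is a permitted transition."""
    return all(nxt in ALLOWED_NEXT.get(cur, set())
               for cur, nxt in zip(steps, steps[1:]))
```

A map like this, built from the business side rather than from the BPEL itself, becomes the reference that later test stages are checked against.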
Try to create a model from a business-logic perspective and avoid using current BPEL or documentation as the source, or you risk codifying your own mistakes in testing. The goal is to validate or develop the BPEL from the business side. If that can be done, technical conformance will assure business integration of IT processes.
Performing Business Process Execution Language testing
With business process alignments captured, it's time to move on to actual testing. Here, the normal practice is to separate BPEL testing from component/integration testing of live Web services, making the two processes essentially parallel unit tests that would later be combined into a single ALM-modeled integration test. For the BPEL portion, the goal is to quickly validate the business process sequencing and performance without complicating the setup by incorporating a lot of actual Web services.
This is particularly important where BPEL is guiding workflows that extend beyond a company's own IT. Integrated data interchange supporting wholesale product movement, or just-in-time delivery of manufacturing components, are examples of this kind of cross-company flow. Testing across company boundaries is difficult, and nearly everyone will want to debug their own processes internally to avoid friction with important partners.
To support independent BPEL testing, most companies will want to identify a BPEL test framework or suite that allows BPEL validation without running against a full set of live Web services. If most of your middleware is from a common source, then that SOA vendor's BPEL test framework is likely to be the de facto choice. In addition to open source tools like BPELUnit, vendors including IBM, Microsoft and Oracle offer tools that facilitate BPEL testing.
The process of testing BPEL with a framework involves the generation of a testbed that includes simulated Web services to be invoked and a set of test data that will exercise the BPEL orchestration process. In a good BPEL test framework, test data generation is automated based on constraints set by the user. The aforementioned initial business-process assessment is a good way to define what the valid process pathways are.
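The two ingredients of that testbed can be sketched in a few lines of Python. This is a hedged illustration, not any framework's actual API: the stub stands in for a live Web service, and the generator produces test data within a user-set constraint (here, an assumed order-amount range) so both branches of a hypothetical credit decision get exercised:

```python
import random

def simulated_credit_service(order):
    # Stub standing in for a live credit-check Web service: returns a
    # canned decision rather than calling a real endpoint. The 5000
    # approval threshold is illustrative.
    return {"approved": order["amount"] <= 5000}

def generate_orders(n, max_amount=10000, seed=0):
    # Constraint-driven test data: amounts are drawn from a user-set
    # range so orchestration paths on both sides of the threshold are hit.
    rng = random.Random(seed)  # fixed seed keeps test runs repeatable
    return [{"order_id": i, "amount": rng.randint(1, max_amount)}
            for i in range(n)]
```

Real frameworks automate both halves of this, but the principle is the same: stubs replace live services, and constraints shape the generated data.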
Test data should exercise valid pathways and test to ensure data can't force orchestration into an illogical path or create a flow that violates governance requirements. As is always the case with test data generation, the goal is to drive logic through all possible paths. It's valuable if the workflows are recorded by the test process so they can be validated against the business process flows previously identified.
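That recorded-versus-documented comparison can be sketched as a simple set check, assuming recorded workflows and documented business-process paths are both represented as sequences of process names (the names in the test are hypothetical):

```python
def check_recorded_flows(recorded, allowed_paths):
    """Return every recorded workflow that does not match one of the
    documented business-process paths (an illogical or non-compliant flow)."""
    allowed = {tuple(path) for path in allowed_paths}
    return [flow for flow in recorded if tuple(flow) not in allowed]
```

Any flow this check returns either exposes a data set that forced the orchestration down an illegal path, or a gap in the business-process model itself; both findings are valuable.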
BPEL and service/bus or workflow performance can also be an issue, and it's possible to get at least a general idea of how orchestration will impact performance at the BPEL testing level. Your business process flows should be associated with gross transaction volumes, so the time required to complete orchestration of a flow (neglecting Web service execution) can be viewed during BPEL testing.
Where complex workflows are identified (workflows linking many components), where high transaction volumes are expected (retail activity, for example) or both, a rough calculation of BPEL overhead per transaction can be used to determine the minimum time required for the workflow. That can then be checked against business response time requirements.
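The rough calculation amounts to simple arithmetic. A minimal sketch, assuming per-step orchestration overhead has been measured in milliseconds during BPEL testing (the numbers in the test are made up):

```python
def meets_response_target(step_count, overhead_ms_per_step, target_ms):
    """Check the orchestration-only floor of a workflow against a business
    response-time requirement, neglecting Web service execution time."""
    minimum_ms = step_count * overhead_ms_per_step
    return minimum_ms <= target_ms
```

If the orchestration floor alone exceeds the target, no amount of Web service tuning will save the workflow, which is exactly why this check is worth doing before integration testing begins.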
Integrating Business Process Execution Language testing into ALM
Since the purpose of BPEL testing with a framework is to isolate the testing from Web service testing of SOA components, it's not desirable to fully integrate BPEL testing into ALM in most cases. The practice is to test component flows to validate what could be called the technical interfaces (expected data to available data relationships, functionality, etc.) and then integrate BPEL flows at the end to validate gross functionality.
Test data generation in such a situation would then be a hybrid of BPEL-driven data generation (which focuses on orchestration validation) and Web service-driven data generation focusing on functional validation. It's helpful if BPEL test data generation can be used to produce both test data classes.
When integrating BPEL testing into ALM, most companies will want to isolate their partners' Web services from testing -- except where bilateral agreements have been reached. That will mean using some simulated Web services in the ALM structures to represent external partner activities.
If it's desirable to get performance data to validate response times during ALM/BPEL tests, be sure the simulated services generate process times representative of the real thing. It will be necessary to time actual response times for these Web services to get the right values.
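One simple way to build such a stub, as an illustration only: record the measured response time of each partner service, then have the simulated service sleep for that duration before answering. The service names and latency values below are assumptions:

```python
import time

# Observed response times, measured against the real partner services
# (values here are illustrative placeholders).
MEASURED_LATENCY_S = {"credit_check": 0.12, "inventory_lookup": 0.30}

def make_stub(service_name):
    def stub(request):
        # Reproduce the real service's observed latency so ALM/BPEL
        # performance runs see representative response times.
        time.sleep(MEASURED_LATENCY_S[service_name])
        return {"service": service_name, "status": "ok"}
    return stub
```

With latency-faithful stubs in place, end-to-end timings from ALM/BPEL test runs become meaningful proxies for production response times.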
The relationship between business processes, BPEL, SOA components, applications and ALM gets more elastic every day. Because BPEL testing can be anchored in carefully documented business process flows, it can help align traditional SOA testing and BPEL testing to a common reference. That alignment could reduce the need for full-blown testing at both the BPEL and component levels as a regular element of redeployment, while still sustaining application quality and utility.
About the author:
Tom Nolle is president of CIMI Corp., a strategic consulting firm specializing in telecommunications and data communications since 1982.