The rise of big data is creating new challenges for enterprise architects, who must incorporate timely insights into business processes. As a result, many companies are turning to complex event processing (CEP) architecture tools to facilitate this work.
CEP concepts have been around for 20 years, said Roy Schulte, vice president at Gartner. More recently, enterprises have been adopting complex event processing architecture and stream processing to improve business processes.
"Stream processing is essentially modern wine in new bottles," Schulte said. Today, people commonly use the terms streaming analytics or stream processing to describe the same concept as CEP. The vast majority of these systems process streams of events to create complex events. In essence, the system is taking in simple events, and then doing some math on them.
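The "simple events plus some math" idea can be shown in a few lines. The sketch below is illustrative, not any vendor's API: it assumes a hypothetical stream of trade prices and derives a "complex event" (a price spike) by comparing each new event against a sliding-window average.

```python
from collections import deque

# Hypothetical sketch: derive a "complex event" (price spike) from a
# stream of simple events by doing some math over a sliding window.
def detect_spikes(prices, window=5, threshold=1.5):
    """Yield (index, price) whenever a price exceeds the recent
    windowed average by the given multiplier."""
    recent = deque(maxlen=window)
    for i, price in enumerate(prices):
        if len(recent) == window and price > threshold * (sum(recent) / window):
            yield (i, price)  # complex event: spike detected
        recent.append(price)

stream = [10, 11, 10, 12, 11, 30, 10, 11]
print(list(detect_spikes(stream)))  # [(5, 30)]
```

Real stream processing engines apply the same pattern continuously and at scale; the windowing and aggregation logic here is the part CEP products provide out of the box.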
Enterprise architects need to consider how to chain together CEP components efficiently in order to stay ahead of competitors. CEP and stream processing can also improve the use of business process management (BPM) tools. A good starting point is to look at the evolution of BPM. There are also numerous tradeoffs around building and buying components. Enterprises also need to develop a strategy for building CEP-related skills.
The evolution of BPM
Over the past several years, BPM technologies and platforms have evolved and become more intelligent. Important phases in the evolution of intelligence that drives business processes include business rules -- e.g., constraints, decision trees and expressions -- and operationalized analytics, with predictive models and machine learning algorithms.
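The "business rules" phase is the simplest to picture. As a minimal sketch, assuming a hypothetical order-approval case with invented field names, constraints and expressions might look like this:

```python
# Hypothetical sketch of hand-written business rules: constraints and
# expressions evaluated against a case. Field names are illustrative.
def approve_order(order):
    """Return a routing decision for an order case."""
    if order["amount"] > 10_000:
        return "manual-review"      # constraint: large orders need a human
    if order["customer_tier"] == "gold" or order["amount"] < 100:
        return "auto-approve"       # expression combining two conditions
    return "standard-queue"

print(approve_order({"amount": 50, "customer_tier": "bronze"}))  # auto-approve
```

The later phases replace or augment these static conditions with predictive models and machine learning, but the decision point in the process stays the same.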
According to Setrag Khoshafian, chief evangelist at Pegasystems Inc., based in Cambridge, Mass., big data analytics are becoming important elements of BPM services, as the volume, velocity and variety of data increases.
"Complex event processing is the fourth part that drives BPM solutions," Khoshafian said. "The faster business solutions can act on detected event patterns, the larger and more impactful the business value."
In the context of BPM, events can be internal -- defined, generated or processed within the BPM tool itself. Examples include the instantiation or completion of a dynamic case, or a missed service level: when an assigned task is late, that is a temporal event that needs to be handled through escalation. The temporal aspect is extremely important in event processing, especially the occurrence and relationship of multiple events within a temporal window. Events can also be external, such as financial transaction events, device or machine-monitoring events and social media events.
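The late-task temporal event can be sketched concretely. This is a simplified illustration, not a BPM product's API; the task records, field names and four-hour SLA are all assumptions:

```python
from datetime import datetime, timedelta

# Hypothetical sketch of a temporal event: a task assigned more than
# `sla` ago with no matching completion event triggers an escalation.
def overdue_tasks(assignments, completions, now, sla=timedelta(hours=4)):
    """Return the IDs of tasks that have breached the SLA."""
    done = {c["task_id"] for c in completions}
    return [a["task_id"] for a in assignments
            if a["task_id"] not in done and now - a["assigned_at"] > sla]

now = datetime(2016, 5, 1, 12, 0)
assigned = [
    {"task_id": "T1", "assigned_at": datetime(2016, 5, 1, 6, 0)},
    {"task_id": "T2", "assigned_at": datetime(2016, 5, 1, 11, 0)},
]
print(overdue_tasks(assigned, [], now))  # T1 breached the 4-hour SLA
```

Note that the complex event here is defined by the *absence* of a completion event within a time window, which is exactly the kind of temporal correlation CEP engines are built for.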
"Core intelligent complex event capabilities are becoming part of the unified BPM platform that supports intelligence holistically," Khoshafian said.
The results provide much-needed robustness in executing and driving process tasks. Modern day BPM platforms must unify business rules and analytics, as well as complex event correlations involving events from internal, temporal and external sources.
Choosing to build or buy components
Many BPM vendors -- including IBM, Pegasystems, Red Hat, Software AG and TIBCO -- offer stream analytics or CEP platforms as optional components in their product suites. Schulte said these tools may be used separately or together. Some buyers acquire their BPM and stream analytics technology from different vendors and mix and match them. In many situations, that is fairly easy, because the handoff interfaces are clean and isolated.
Schulte said CEP latency is not an issue for 95% of all applications, because the stream analytics or CEP products are designed for low latency. However, architecture and technology can be challenging for the last 5% of applications, such as high-frequency trading, some telecommunication applications and a few other extreme applications with high throughput (50,000 to 1 million or more events per second) and low latency (a few milliseconds or sub-millisecond) requirements.
There can be benefits to streamlining the tool set used to create CEP applications. But organizations can run into challenges when trying to integrate legacy data types or implement specific CEP functionality. In the ideal world, enterprises would leverage a single back-end platform. But this can be challenging for many enterprises with legacy applications.
Develop CEP skills
Bringing complex event processing architecture into the enterprise can require new skill sets, Schulte said. "Most application developers are new to CEP, so the project team needs to have one or two members select, install and learn the stream analytics product, which can take several weeks. If a team member is already familiar with CEP, then the second and subsequent projects go more quickly."
Integration of CEP is virtually identical to any other kind of application integration. If the project team has experience with integration tools, or help from experts in an integration competency center, the integration is straightforward: a matter of adapters and, sometimes, transformation services from an API gateway or enterprise service bus (ESB).
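The adapter-and-transformation step usually amounts to field mapping. A minimal sketch, where the external payload shape and the target BPM field names are invented for illustration rather than taken from any real product:

```python
# Hypothetical adapter: map an external event payload into the shape a
# BPM engine expects. Field names on both sides are illustrative.
def transform_event(external):
    """Rename fields and normalize units for the downstream consumer."""
    return {
        "eventType": external["type"].upper(),
        "timestamp": external["ts"],
        "payload": {"amount_usd": external["amount_cents"] / 100},
    }

incoming = {"type": "payment", "ts": "2016-05-01T12:00:00Z", "amount_cents": 1999}
print(transform_event(incoming))
```

In practice, an API gateway or ESB hosts this kind of mapping so neither the event producer nor the BPM platform has to change.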
Managing CEP costs
CEP applications can grow to include a lot of moving parts spanning hybrid cloud platforms. The BPM platform may reside on one cloud, while various application components are deployed on others. Each cloud platform may have different pricing characteristics that favor one cloud or software as a service offering versus another.
Zack Kielich, director of technical marketing at CliQr Technologies, based in Santa Clara, Calif., said it's a good practice to accurately collect and aggregate information from application performance monitoring tools. With that data, it's possible to correlate actual application performance with cloud prices. When the rates exceed the value of the performance delivered, enterprises can use cloud-agnostic tools to automate the migration of specific high-cost, low-visibility applications, such as Hadoop clusters crunching research results, to a cloud offering better rates.
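The correlation Kielich describes can be reduced to a simple unit-cost comparison. The provider names, rates and request counts below are invented for illustration:

```python
# Hypothetical sketch: combine monitored throughput with cloud pricing
# to compare cost per million requests across providers.
def cost_per_million(requests_served, hourly_rate, hours):
    """Cost to serve one million requests at the observed throughput."""
    return hourly_rate * hours / (requests_served / 1_000_000)

providers = {
    "cloud-a": cost_per_million(requests_served=40_000_000, hourly_rate=0.50, hours=720),
    "cloud-b": cost_per_million(requests_served=40_000_000, hourly_rate=0.35, hours=720),
}
cheapest = min(providers, key=providers.get)
print(cheapest, round(providers[cheapest], 2))
```

The point is not the arithmetic but the feedback loop: monitoring data feeds the cost model, and the cost model drives migration decisions.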
Mark Palmer, senior vice president and general manager of engineering at TIBCO Software Inc., based in Palo Alto, Calif., said another good practice is to create a catalog of events that tracks who produces them, who consumes them, and their formats and transports. The catalog can then be handed to BPM developers to generate the proper business processes.
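Such a catalog can start as a very small data structure. The sketch below is an assumption about what one entry might record (producer, consumers, format, transport), not a description of any TIBCO product:

```python
from dataclasses import dataclass

# Hypothetical event-catalog entry recording who produces an event,
# who consumes it, and its format and transport.
@dataclass
class CatalogEntry:
    name: str
    producer: str
    consumers: list
    fmt: str        # e.g. "JSON", "Avro"
    transport: str  # e.g. "Kafka topic", "JMS queue"

catalog = {
    "order.created": CatalogEntry("order.created", "web-store",
                                  ["billing", "fulfillment"], "JSON", "Kafka topic"),
}

def consumers_of(event_name):
    """Look up which processes a BPM developer must wire to this event."""
    return catalog[event_name].consumers

print(consumers_of("order.created"))
```

Even a spreadsheet-grade catalog like this gives BPM developers the contract they need before generating business processes around an event.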
Take a bigger view
As enterprise architects consider implementing different CEP capabilities, it's a good idea to take a bigger view of the enterprise's CEP strategy. This can help to reduce complexity and make it easier to implement new CEP capabilities in line with business strategy. A good starting point is to plan in the context of clearly defined roles that will benefit from CEP capabilities. Shawne Robinson, director of product marketing for Pega Cloud, said this will make it easier to optimize the tasks being implemented in a way that adds the most value for the target users and to refine the types of data collected for analysis.
It's also important to start with clearly defined requirements. That means collaboration between the business or analyst side and the IT organization developing a service, so both are clear on what problem needs to be solved. With a large number of inputs available, it is important to focus on the set of data that accomplishes the desired task.
BPM platforms can help bring agility to implementing CEP in alignment with business goals by offering the flexibility of case management, real-time decision making, BPM and CRM that supports multiple points of access, including social, mobile and network.
"If you can find a packaged application that does the job, then buy it," Schulte said. "If you can't, then develop your application on a commercial or open source stream analytics platform, rather than writing the foundational stream analytics logic yourself. Pair the stream analytics platform with a BPM platform if the response to the situation involves human activity steps or a sequence of machine activity steps over a period of time -- or both."