
The importance of determining your CEP architecture bias

Making the most of CEP architecture requires figuring out which organizational goals you want the platform to help you achieve. Tom Nolle explains how to determine this.

Optimizing complex event processing platforms means application managers need to look at both sides of the CEP picture, determine the natural solution biases of their company, assess platforms that address that bias and ensure that they develop an approach that covers critical needs. Tom Nolle explains why and how.

Businesses react to events, so it's possible to characterize all business IT systems as event handlers. However, normal IT practices focus on associating each event with a process and executing that process when the event occurs. The relationships between previous and subsequent events within the same flow are considered unimportant unless they change the underlying databases -- like when you process a check and reduce the available balance.
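The simple-event model described above, where each event is bound to one process and no relationship between events is tracked, can be sketched as a dispatch table. All names and values here are illustrative, not drawn from any particular platform:

```python
# Minimal sketch of conventional simple-event handling: each event type
# maps to exactly one handler, and the only state kept is the database
# itself (here, an in-memory balance).

accounts = {"acct-1": 500.0}

def handle_check(event):
    # Processing a check reduces the available balance.
    accounts[event["account"]] -= event["amount"]

def handle_deposit(event):
    accounts[event["account"]] += event["amount"]

# One process per event type -- the core of conventional event handling.
HANDLERS = {"check": handle_check, "deposit": handle_deposit}

def process(event):
    HANDLERS[event["type"]](event)

process({"type": "check", "account": "acct-1", "amount": 120.0})
process({"type": "deposit", "account": "acct-1", "amount": 50.0})
print(accounts["acct-1"])  # 430.0
```

Note that nothing in this model relates the check event to the deposit that follows it; that gap is exactly what CEP addresses.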

Implementing CEP architecture is, in many ways, the next wave in analytics and big data, as it is aimed at bringing decision support closer to business events in time and place. It's also an evolution in transaction processing with those same targets, and this dualistic view of what a CEP platform should be carries into the choices available. 

Understanding complex events

CEP introduces the notion of a "complex" event, which steps back -- or "up" -- from the normal flow of events to detect higher-level conditions that underlie the individual events. It does so by correlating across a series of events in one or more event flows. The basic separation of a CEP platform is based on whether the CEP application and its goals seem to enhance basic transaction processing and workflow -- a business process management (BPM) bias, in short -- or whether they seem to expand and enhance analytics, taking it from "historical" data to real-time decision-making.
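The correlation idea can be made concrete with a small sketch: a series of simple events is examined inside a sliding time window, and when a pattern appears, a new higher-level event is emitted. The event names, window and threshold are assumptions chosen for illustration:

```python
# Hedged sketch of complex-event detection: correlating simple events
# within a time window to derive a higher-level ("complex") event.
from collections import deque

def detect_complex_events(events, window=60, threshold=3):
    """Emit a 'repeated_failure' complex event when `threshold` or more
    'login_failed' events for the same user fall within `window` seconds."""
    recent = {}            # user -> deque of recent failure timestamps
    complex_events = []
    for ts, etype, user in events:
        if etype != "login_failed":
            continue
        q = recent.setdefault(user, deque())
        q.append(ts)
        while q and ts - q[0] > window:
            q.popleft()    # expire events outside the window
        if len(q) >= threshold:
            complex_events.append((ts, "repeated_failure", user))
            q.clear()      # don't re-fire on the same burst
    return complex_events

stream = [(0, "login_failed", "bob"), (10, "login_ok", "alice"),
          (20, "login_failed", "bob"), (30, "login_failed", "bob"),
          (200, "login_failed", "bob")]
print(detect_complex_events(stream))  # [(30, 'repeated_failure', 'bob')]
```

The complex event at time 30 exists in no single input event; it is derived purely from the relationship among events, which is the defining step "up" that CEP takes.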

One easy way to define your own natural CEP architecture bias is to relate the complex events you're looking for to the processing of simple events you already do. In some cases, CEP is expected to alter the way that simple events are handled, or to generate new events that will be handled using the same process options as those used for simple event handling. This suggests that a CEP platform based on current BPM event-handling and workflows would be best. Where complex events are expected to drive processes largely independent of current workflows, it may be best to think of the CEP platform as extending analytics or big data.

Workflow processing or analytics?

Some users report that the best single question to ask to align your basic CEP architecture strategy is: "Am I trying to handle transactions better, or am I trying to create new business inputs?" If the former is true, then think of CEP as transactional or workflow processing. If it's the latter, think of CEP as analytics and big data.

Transactional CEP platforms are often based on workflow, and many of the workflow platform providers offer CEP support. Tibco's Fast Data Platform is the foundation technology into which its CEP elements are integrated, reflecting a CEP vision aligned with real-time processing. The examples Tibco offers illustrate how CEP extends basic transaction or event handling with more sophisticated event recognition. Red Hat offers a similar approach with JBoss, where CEP is provided through Drools Fusion.


Most users will probably pick a transactional CEP platform based in large part on their current BPM and workflow platforms. When there's more openness in choices, ask whether the CEP architecture is going to be more dependent on rules management or whether it will be workflow-driven. Something like JBoss is best for the former; something like Tibco is better for the latter.

For analytics-driven CEP, two primary models are developing as well. Some vendors view analytics CEP as a way to drive business processes closer to the decisions. This has two ingredients: the first is defining and handling event streams, and the second is event correlation and handling. IBM is one of the enterprise leaders in this space, in part because it provides the full range of tools needed for both stream management and event analytics.

Like other companies offering analytic-platform CEP, IBM focuses increasingly on "stream processing," which shifts work from macroflows to something more like pure events. To respond, companies must shift to the stream computing model at the process level. Then they must project some of the stateful or contextual analytics previously done on historical data forward into real time.
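Projecting a stateful analytic "forward into real time" typically means replacing a batch computation over stored history with an incremental one maintained per event. A minimal sketch of that shift, using a running mean as the analytic (the example is illustrative; real stream platforms add windowing and durable state stores):

```python
# Moving a "historical" analytic into the stream: the mean is maintained
# incrementally as each event arrives, rather than recomputed over
# stored data after the fact.

class RunningMean:
    def __init__(self):
        self.n = 0
        self.mean = 0.0

    def update(self, x):
        self.n += 1
        self.mean += (x - self.mean) / self.n   # incremental update
        return self.mean

values = [10.0, 20.0, 30.0, 40.0]

rm = RunningMean()
stream_result = [rm.update(v) for v in values][-1]  # per-event, real time
batch_result = sum(values) / len(values)            # historical version

print(stream_result, batch_result)  # 25.0 25.0
```

Both paths produce the same answer, but the streaming version has it available at every event, which is the point of stream computing at the process level.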

The other analytics-CEP approach is embodied in a union of Cassandra and Storm as extensions to basic Hadoop big data. In Storm, "spouts" emit event streams that are processed through "bolts," with the results written to Cassandra for storage and querying. A union of the two has emerged as a separate project, which some believe will form a model for a future CEP architecture.
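The spout/bolt pipeline shape can be illustrated with a toy sketch; this mimics the structure only and uses none of the actual Storm or Cassandra APIs, with a plain dictionary standing in for the Cassandra table:

```python
# Toy illustration of the Storm-style pipeline: a "spout" emits event
# tuples, a "bolt" transforms each one, and a sink persists results to
# a key-value store standing in for Cassandra. Names are illustrative.

def spout():
    # Source of the event stream.
    for i, amount in enumerate([100, 250, 75]):
        yield ("order-%d" % i, amount)

def tax_bolt(tuples, rate=0.1):
    # Per-tuple processing step.
    for key, amount in tuples:
        yield (key, round(amount * (1 + rate), 2))

store = {}                      # stand-in for a Cassandra table
for key, total in tax_bolt(spout()):
    store[key] = total          # sink: persist each processed tuple

print(store)  # {'order-0': 110.0, 'order-1': 275.0, 'order-2': 82.5}
```

In a real topology the spout, bolts and sink run as distributed components; the design choice being illustrated is that each stage is stateless per tuple, with durable state pushed to the store.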

Analytics-driven CEP is more likely to require a major change in either event handling or transaction processing, but the result, a stream computing model, may align better with long-term CEP goals. Most larger-scale CEP platforms are working to support both strategies -- Tibco bought the core of its StreamBase engine as a stream computing platform -- and thus it is likely that most companies can reach analytics-driven CEP without requiring a revolution in their current approach.

The risk here is an explosion in complexity. Analytics based on historical information is very different from analytics based on real-time information. It is possible to derive the same results from both, but the more complex the event stream and correlation, the more expensive it will be to sustain real-time analysis, and the more difficult it will be to adapt the stream computing environment to new requirements.

Don't reinvent the wheel in CEP architecture

CEP platforms should never force a company to retool its entire transaction processing structure, nor should they be expected to replace traditional big data analytics used for business analysis rather than business process management. They can help meld analytics with real-time transaction processing, and that's often more than enough to justify their consideration.

Next Steps

Access our expert handbook on CEP and BAM

Benefits of CEP and SOA

Tibco standardizes cloud analytics

How CEP can bring big data to BPM

This was last published in February 2016
