While aspects of operational BI and real-time analytics are new, the underlying drive to automate systems and optimize efficiencies is not. Operations research can be traced at least as far back as World War II, or even to Frederick Taylor and "The Principles of Scientific Management" in 1911.
In recent years, of course, we have seen stunning advances in computing and integration infrastructure (not the least of which is SOA) that have moved analytics and complex event processing to an entirely new level.
At Gartner AADI in November, we caught up with Gartner's Roy Schulte to get an angle on the overall course of new event processing trends. Together with Mani Chandy, he penned "Event Processing," a must-have text for the modern IT general.
For more than a decade, Schulte has been an important thought leader in what has been called the "zero-latency enterprise," the "real-time enterprise," or what have you. At the heart of all this effort is the quest for organizational speed. Maybe we can throw another hat in the ring and call what is going on "high-velocity enterprise systems."
We asked Schulte what the biggest change has been in ten years of watching this phenomenon. The answer: the tools have changed.
"The notion of using analytics to run your business is not new. The biggest change is that the development tools are much better. For example, an airline in the late '90s had a massive project called the Enterprise Nervous System, which was to collect data from hundreds of sources."
"They would use it to maintain a real-time picture of the state of the airline and use that to notify the many different business units and some application systems whenever an event of interest happened. If a plane ran late and had a new arrival time, you would need to reschedule the work of the baggage people, the gate assignments, caterers, fuel trucks, and so on."
"They did it 10 years ago, but it was a heroic effort. It was all custom development," he said.
Yes, they used a messaging system, but a tremendous amount of hand-coded customization was still required. Now, that has changed.
"Now," said Schulte, "there are appliances and low-latency message systems."
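The pattern Schulte describes, a central system collecting events from many sources and notifying the business units that care about them, is essentially publish/subscribe. A minimal sketch of the idea in Python follows; the event names and subscribers are hypothetical illustrations, not the airline's actual design:

```python
from collections import defaultdict

class EventBus:
    """Minimal publish/subscribe hub: sources publish events,
    and business units subscribe to the event types they care about."""
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, event_type, handler):
        self._subscribers[event_type].append(handler)

    def publish(self, event_type, payload):
        # Notify every interested party when an event of interest happens
        for handler in self._subscribers[event_type]:
            handler(payload)

bus = EventBus()
notified = []

# Hypothetical business units reacting to a late arrival
bus.subscribe("flight.delayed",
              lambda e: notified.append(f"baggage: re-plan for {e['flight']}"))
bus.subscribe("flight.delayed",
              lambda e: notified.append(f"gates: hold gate until {e['new_eta']}"))

bus.publish("flight.delayed", {"flight": "XX123", "new_eta": "18:45"})
```

A decade ago, wiring this dispatch logic across hundreds of sources and consumers was the custom development Schulte calls heroic; today it is a commodity feature of messaging middleware.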
"Fast forward," he says, "to 2010, more than ten years later: Heathrow Airport has a new real-time intelligence system, and it sounds like the same application we just discussed." The Heathrow system gets information from the airplane, from the gate, from the passenger reservation system and, says Schulte, "it was all done with a BPMS suite."
"They did this on a leading-edge BPMS suite, which has in it a vast amount of middleware support," he said. Support includes rules engines, event processing, adapters and more. There was a lot of work to do, no doubt. But, notes Schulte, it came up in six months.
Low-level, down-in-the-weeds programming is less of an issue now. Most of the work is done in high-level modelers, process modelers, and rules engines, said Schulte.
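The rule-driven event handling a BPMS rules engine provides can be sketched as condition/action pairs evaluated against each incoming event. Everything below is illustrative, assuming made-up event fields and thresholds rather than any vendor's API:

```python
# Each rule pairs a condition with an action; a tiny rule engine
# evaluates every rule against each incoming event and collects
# the actions whose conditions match.
rules = [
    (lambda e: e["type"] == "arrival_update" and e["delay_min"] > 15,
     lambda e: f"reschedule ground crew for {e['flight']}"),
    (lambda e: e["type"] == "arrival_update" and e["delay_min"] <= 15,
     lambda e: f"no action for {e['flight']}"),
]

def evaluate(event):
    return [action(event) for cond, action in rules if cond(event)]

actions = evaluate({"type": "arrival_update",
                    "flight": "XX123", "delay_min": 40})
```

The appeal of this style is that the business logic lives in declarative rules that analysts can change, rather than in hand-written dispatch code, which is much of why Schulte says such systems now come up in months instead of years.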
He advises that this is not literally a real-time system. "It is not stock market stuff," said the leading voice of complex event processing. But the point, that the road to real-time operations is now replete with available tools, is well taken.
See related Gartner AADI 2010 information page