Chris Martins


SOA Focus - Milliseconds Matter

In financial trading, if you're slow to act on an opportunity, it's gone, seized by a quicker competitor!

In financial trading, if you're slow to act on an opportunity, it's gone, seized by a quicker competitor. His profit is your loss. Electronic traders can easily miss a trading opportunity because their trading algorithms fail to detect the right conditions - or don't detect them quickly enough. So for securities trading operations dependent on automated algorithmic trading - where profit or loss is determined in less than a second - milliseconds do matter.

Event Stream Processing enables organizations to make rapid decisions in response to highly dynamic data - data that's continuously changing. Besides financial services, ESP applies to industries as diverse as telecommunications, manufacturing, distribution, retail, energy, and military. Most organizations can benefit from understanding the impact of events as soon as they occur.

Organizations must advance their computing and data architectures to address the real-time demands of today's environments. ESP, which has also been referred to as complex event processing and data stream processing, has the potential to deliver enormous benefits. Imagine the ability to monitor a city's power grid by tracking the load on generators in combination with a real-time feed of weather forecasts, and optimizing generator output on the fly in response to changing conditions.
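To make that scenario concrete, here is a minimal Python sketch of the grid example. The LoadReading and Forecast event types, the 35-degree and 90-percent thresholds, and the rebalancing rule are all illustrative assumptions, not a real grid API.

    from dataclasses import dataclass

    @dataclass
    class LoadReading:
        generator_id: str
        load_pct: float      # current load as a percentage of capacity

    @dataclass
    class Forecast:
        temp_c: float        # forecast temperature, degrees Celsius

    def monitor_grid(events):
        """Combine each load reading with the latest weather forecast."""
        latest_forecast = None
        for event in events:                  # events arrive interleaved, in time order
            if isinstance(event, Forecast):
                latest_forecast = event       # keep only the most recent forecast
            elif isinstance(event, LoadReading) and latest_forecast:
                # Illustrative rule: forecast heat plus a heavily loaded
                # generator triggers an immediate rebalancing action.
                if latest_forecast.temp_c > 35 and event.load_pct > 90:
                    print(f"rebalance output away from {event.generator_id}")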

In these situations - including the aforementioned electronic trading desk - increasingly sophisticated systems can now be deployed to actively monitor rapidly moving event streams and identify both discrete data points and sophisticated data patterns. The event streams can originate from sensors dispersed in a wireless network, satellite telemetry signals, RFID tags, or stock ticks from a market data feed. In each instance, the events coursing through these data streams can be sifted, sorted, and otherwise analyzed to derive operational value for the business. And in each instance, the goal of the new architecture is to understand quickly what's happening and respond accordingly.
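As a sketch of what that sifting looks like in practice, the Python fragment below scans market ticks for both kinds of condition: a discrete data point (a price crossing a threshold) and a pattern spanning several events (three rising prices in a row). The threshold value and the pattern itself are assumptions chosen for illustration.

    from collections import defaultdict, deque

    def detect(ticks, threshold=100.0):
        """Scan (symbol, price) ticks for a discrete condition and a pattern."""
        recent = defaultdict(lambda: deque(maxlen=3))   # per-symbol price window
        for symbol, price in ticks:
            if price > threshold:                       # discrete data point
                yield ("spike", symbol, price)
            window = recent[symbol]
            window.append(price)
            if len(window) == 3 and window[0] < window[1] < window[2]:
                yield ("rising", symbol, price)         # pattern across events

Note that the detector never stores the stream; it keeps only the small amount of state the pattern requires.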

Traditional architectures are designed around a core presumption of a stable, persistent data model with a relational database as caretaker. In these traditional architectures, changes to the data (inserts, deletes, or updates) must conform to strict rules that preserve the inviolable ACID properties of a transaction - its atomicity, consistency, isolation, and durability. Ensuring data accuracy is paramount: transactional integrity always takes priority over speed. Traditional database principles necessarily build latency into their operations to assure this data integrity. As data changes, the database must be updated and indexes maintained; only then can queries be executed. You can't query what you haven't indexed, and you can't index what you haven't written to the database.

As valuable as such constraints are for traditional computing, they impose too much latency on ESP applications. ESP applications must often monitor tens of thousands of new events generated every second against tens of thousands of event "conditions" at the same time. For such applications, traditional transactional models are equivalent to electronic bureaucracies whose transactional processes only impede the real work to be done. To handle the volume and speed of event processing, a new model is required.
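One way to picture the new model: instead of storing data and then querying it, the queries are registered up front as standing conditions, and every event is evaluated against them the moment it arrives. The sketch below is illustrative Python under that assumption, not the API of any particular ESP engine; the ACME symbol and the 100 threshold are hypothetical.

    conditions = []   # standing queries, registered once

    def register(predicate, action):
        conditions.append((predicate, action))

    def on_event(event):
        """Called once per incoming event; fires any matching standing query."""
        for predicate, action in conditions:
            if predicate(event):
                action(event)

    # Example: act the moment any tick for a hypothetical symbol crosses 100.
    register(lambda e: e["symbol"] == "ACME" and e["price"] > 100.0,
             lambda e: print("buy signal:", e))

    on_event({"symbol": "ACME", "price": 101.2})   # fires with no store-index-query delay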

ESP assumes that many events lack significance, seeks to filter out the noise, and derives meaning from the events that do have value. Often that meaning can only be understood in the context of other events that happen at the same time or in close proximity. So ESP applications treat time as key in determining event significance; when an event happens is often as important as what happens.
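Because timing carries meaning, events are typically correlated inside sliding time windows. The sketch below flags when two related events for the same symbol arrive within one second of each other; the event shape and the one-second window are assumptions for illustration.

    from collections import deque

    def correlate(events, window_s=1.0):
        """Flag a 'bid' and an 'ask' for the same symbol within window_s seconds."""
        recent = deque()                     # (timestamp, kind, symbol)
        for ts, kind, symbol in events:      # events carry their own timestamps
            while recent and ts - recent[0][0] > window_s:
                recent.popleft()             # evict events older than the window
            for _, other_kind, other_sym in recent:
                if other_sym == symbol and other_kind != kind:
                    yield ("match", symbol, ts)
            recent.append((ts, kind, symbol))

The same pair of events separated by an hour would mean nothing; arriving within the window, they form a pattern worth acting on.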

In all business areas, conventional data management technologies can't cope with real-time analysis and response. It's time for enterprises to rethink traditional data processing strategies. ESP systems must address the special attributes of event stream data: the volume of data, the speed at which it's delivered, the temporal characteristics of the data, the potentially high amount of "noise," and the need to act in milliseconds to capitalize on events as they happen.
