Business Analyst’s Tools – Data Flow Model

In this article, I’m going to talk about a simple yet powerful tool that every Business Analyst should have in their arsenal – Data Flow Modelling. Specifically, I’ll focus on producing a tangible asset that can be prodded and poked by the BA team, the development team and even business users in order to gain a clearer understanding of the Target Object Model. I’ve used this approach on numerous occasions, and it is particularly useful in trade flow projects.

Models can take lots of different forms. These range from the vapour model in your stakeholder’s mind to the stack of exquisite UML Use Cases and Sequence Diagrams in your design specification. But the gold standard is a working model that performs processes, uses reference data, makes decisions on a critical subset of data objects and shows a use case being processed through the Target Object Model. We produced one of these on a recent derivatives clearing messaging project, and I’ll use that as an example.

The steps we followed can be summarised as:

  • Producing a trade flow “play list”
  • Visualising the use cases in Visio
  • Bringing the flow alive with an Excel model
  • Moving to the Cloud with an asynchronous PHP/MySQL model

Producing the Trade Flow Play List

The first thing we did was generate a “Play List” of the possible paths our trades could follow through the trade flow. We employed highly sophisticated tools – some Magic Whiteboard and a stack of Post-it Notes. We drew our messaging domain on the Magic Whiteboard, and then marked out our inputs, our proposed messaging queues and our endpoints (effectively our middle office core applications). We used the Post-it Notes to represent our actors (trades coming from the exchange) and walked through the processing of each message, noting the waypoints it passed through, when it got there and what supporting data objects it required at each one.

We were particularly keen to understand how, where and under which circumstances a single actor being input to a waypoint resulted in two actors being output on different message queues. All this was documented on more Magic Whiteboard and used to drive the next process in our model development.
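
To make this concrete, here is roughly the shape a single play list entry took once written up. This is an illustrative sketch in PHP (the language we later used for the Cloud model); the scenario, field and queue names are invented for the example, not lifted from the project.

```php
<?php
// One hypothetical "play list" entry: an inbound trade message (our actor)
// and the waypoints it visits, with the supporting data needed at each stop.
$playListEntry = [
    'scenario'  => 'simple trade with give up',
    'actor'     => 'TradeMessage',
    'waypoints' => [
        ['queue' => 'inbound.exchange', 'needs' => ['product', 'account']],
        ['queue' => 'enrichment',       'needs' => ['clearing member', 'give-up agreement']],
        // The fan-out case: one actor in, two actors out on different queues.
        ['queue' => 'routing',          'fansOutTo' => ['middleoffice.books', 'middleoffice.giveups']],
    ],
];

print_r($playListEntry);
```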

Visualising Use Cases in Visio

At this point, we felt pretty pleased with ourselves. Despite having dry-wipe marker on our hands and dirty shirtsleeves, we had a set of scenarios and enough detail to start producing our use cases. We broke out Visio and translated our play lists into use cases, showing the processes our actors would flow through and the data and success, exception and constraint conditions they would be subjected to. This gave us our bible of use cases, built in such a way as to provide “snapshots” in the life of each actor. These made a useful tool in their own right (having a full set of use cases is important to all projects), and the process also highlighted commonality within some of the processes – allowing specific use cases to be developed for them in their own right.

Given the very complex Target Object Model, and the effort required so far in understanding the flow and how it would work in real-time, we wanted something more tangible – something we could run our use cases through and iron out issues, drive out data problems and “taste” the effects of our proposed asynchronous messaging design.

Bringing it alive with Excel Models

In order to get a living, breathing model, we decided to build it out in Excel. We produced a brief that required us to visualise the message queues and core applications as tabs in the spreadsheet, and our use case actors (trade messages) as individual rows on those sheets. The model would initially use additional tabs to hold the routing rules and reference data required to support the processing of our trade messages by VBA code. Messages would move from tab to tab, mirroring the processing flow in our use cases.

This proved to be a fast process and was handled solely within the BA team – without the need to disturb the developers. Whilst building it, we were forced to revisit our use cases several times to review the flow in various places, and we could see our reference data requirements maturing. To further enhance the model, we broke the tabs containing our reference and routing rule data out to MS Access and changed the Excel model to use that for its data requirements.
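
To illustrate what externalising the routing rules buys you, here is a minimal sketch of the kind of lookup the model performed once the rules lived outside the spreadsheet. The rule keys and queue names are assumptions for the example; our actual rules sat in Access tables.

```php
<?php
// Hypothetical routing-rule lookup: given a message type and the queue it
// currently sits on, return the queue(s) it should move to next.
function nextQueues(array $rules, string $messageType, string $currentQueue): array
{
    $key = $messageType . '@' . $currentQueue;
    return $rules[$key] ?? [];   // no matching rule: the message goes nowhere (a break)
}

$rules = [
    'Trade@inbound.exchange' => ['enrichment'],
    'Trade@enrichment'       => ['middleoffice.books'],
    'GiveUp@enrichment'      => ['middleoffice.books', 'middleoffice.giveups'], // fan-out
];

print_r(nextQueues($rules, 'GiveUp', 'enrichment'));   // two outbound queues
```

Keeping the rules in data rather than code meant the BA team could change routing behaviour without touching the model itself.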

All this effort gave us the ability to run our use cases through the model (simple trade; simple trade with reversal and rebook; simple trade with give-up, etc.). We could clearly see where we had issues around how we accessed and processed our reference data. It was a superb model: it allowed us to walk through scenarios with the business, and to bring the development team up to speed with what the Target Object Model actually looked like, how it processed messages, where data bolted in and what they needed to think about when designing their Adapter Framework.

There was only one thing missing from the model: the ability to process all messages on all queues asynchronously, at the same time – just as our proposed Target Object Model would require. What problems would we encounter when we started pushing our use cases through the system? Could we really afford to wait until the integration testing phase to discover them? We didn’t know, and couldn’t wait, so we moved to the Cloud.

To the Cloud – Asynchronous Trade Flow Model in PHP/MySQL

We built out a Cloud-based version of our model, making use of all the information and knowledge we had acquired in producing our play list, use cases and Excel models. For the technically minded amongst you, it was built using a PHP framework with a simple MySQL database backend. It was a relatively fast process, and we had something up and running within a couple of weeks.
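
We didn’t publish the code, so the sketch below is only indicative of the kind of processing step involved: a `messages` table stands in for the queues, and a worker moves pending messages on according to the routing rules. The table, its columns and the queue names are all assumptions for the example.

```php
<?php
// A minimal sketch of one queue-processing step in a PHP/MySQL model.
// Assumes a `messages` table (id, queue, type, payload, status) and
// routing rules keyed as 'type@queue', as in the earlier sketch.
$pdo = new PDO('mysql:host=localhost;dbname=tradeflow', 'user', 'pass');

$rules = [
    'Trade@inbound.exchange' => ['enrichment'],
    'Trade@enrichment'       => ['middleoffice.books'],
    'GiveUp@enrichment'      => ['middleoffice.books', 'middleoffice.giveups'],
];

function processQueue(PDO $pdo, string $queue, array $rules): void
{
    // A real worker would lock rows (SELECT ... FOR UPDATE) so two workers
    // can't grab the same message – exactly the sort of race the model
    // was built to expose.
    $stmt = $pdo->prepare(
        "SELECT id, type, payload FROM messages WHERE queue = ? AND status = 'pending'"
    );
    $stmt->execute([$queue]);

    foreach ($stmt->fetchAll(PDO::FETCH_ASSOC) as $msg) {
        foreach ($rules[$msg['type'] . '@' . $queue] ?? [] as $target) {
            // Fan-out: one inbound actor can produce several outbound ones.
            $pdo->prepare(
                "INSERT INTO messages (queue, type, payload, status) VALUES (?, ?, ?, 'pending')"
            )->execute([$target, $msg['type'], $msg['payload']]);
        }
        $pdo->prepare("UPDATE messages SET status = 'done' WHERE id = ?")
            ->execute([$msg['id']]);
    }
}

// One pass over every queue approximates the asynchronous flow.
foreach (['inbound.exchange', 'enrichment', 'routing'] as $queue) {
    processQueue($pdo, $queue, $rules);
}
```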

We were able to drop messages (actors) into the Cloud model, see our Adapter Framework engage with those use cases and watch them flow through the message queues and into the core applications in our Target Object Model.

The model quickly highlighted the race conditions that we would encounter in our proposed system. It showed that our use cases needed to be expanded to include a “trade state” use case, applying business rules to the order in which status updates should be applied in the Target Object Model. It allowed us to see the issues around breaks and how resubmission would impact the message queues. Finally, it provided a convenient hook for our test team to hang their test cases off, and prompted them to develop automated tools to support integration testing, UAT and issue management.
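
As an illustration of the kind of “trade state” rule the model drove out, the sketch below guards against out-of-order status updates arriving on an asynchronous flow. The state names and their ordering are invented for the example.

```php
<?php
// Hypothetical guard for asynchronous status updates: never let a late,
// stale update regress a trade's state. Ranks are illustrative only.
const STATE_RANK = ['new' => 0, 'matched' => 1, 'cleared' => 2, 'settled' => 3];

function applyStatusUpdate(array &$trade, string $incoming): bool
{
    if (STATE_RANK[$incoming] <= STATE_RANK[$trade['state']]) {
        return false;   // out-of-order arrival: park it rather than apply it
    }
    $trade['state'] = $incoming;
    return true;
}

$trade = ['id' => 42, 'state' => 'new'];
var_dump(applyStatusUpdate($trade, 'cleared'));   // true: applied
var_dump(applyStatusUpdate($trade, 'matched'));   // false: lost the race, arrived late
```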

Summary

All these steps delivered hard knowledge and assets to the project:

  • Requirements capture and use cases were correct
  • The reference data model was mature
  • Asynchronous processing would not go into “meltdown” at a business rule level
  • Appropriate test tools were built and made available for integration testing and UAT
  • The project had a tangible model of the Target Object Model for future projects in that domain

Not a bad result for a BA team to deliver to its colleagues!

Redhound are happy to advise on all aspects of business analysis modelling, including reference data, business process and trade flow.

Our live Cloud-based demo integrates a Eurex clearing feed into a trade flow with complex routing rules, and is a working example of our modelling techniques.

Get in touch!

Ben has already written extensively about Adapter Frameworks as part of his “10 Things You’ll Do On Every Message Project” series.

Give us a shout if you’d like us to help you create one of these, or to review your existing one.

We hope this post has been helpful.

If you’d like to find out more about our data modelling philosophy, how we do this and what it looks like, get in touch.
