MDM and SOA, a Strong Partnership


Editor’s Note: Today’s post was written by Joan Lawson, a great enterprise architect whom I’ve known since 2003.

For more information on Joan, please see her LinkedIn profile — Dan Power

Let’s not allow Master Data Management (MDM) to become just another silo of data! Together, MDM and Service-Oriented Architecture (SOA) create a strong partnership in your enterprise architecture.

1. Data Quality: MDM Adds Quality to SOA

SOA enables business functionality as a service. However, it does not guarantee the quality of the data on which it operates. That’s a serious gap, and it is filled by including MDM in a service-oriented architecture. True business value is realized as services start leveraging the high-quality data in the MDM hub and the services that surround it.
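As a rough illustration of the gap described above (all names and stores here are hypothetical, not any particular product’s API), a read service can route lookups through the MDM hub’s cross-reference so every consumer receives the cleansed “gold” record rather than whichever variant a source application happens to hold:

```python
# Hypothetical sketch: an SOA read service that resolves customer lookups
# against an MDM hub instead of the raw source data.

RAW_SOURCE = {  # duplicate, inconsistent entries as found in a source app
    "c-1": {"name": "ACME corp", "country": "US"},
    "c-2": {"name": "Acme Corporation", "country": "USA"},
}

MDM_HUB = {  # 'gold standard' records keyed by a mastered identifier
    "m-100": {"name": "Acme Corporation", "country": "US"},
}

CROSS_REFERENCE = {"c-1": "m-100", "c-2": "m-100"}  # source id -> master id

def get_customer(source_id: str) -> dict:
    """Service operation: return the gold record for any source-level id."""
    master_id = CROSS_REFERENCE[source_id]
    return MDM_HUB[master_id]
```

Either source-level id now resolves to the same cleansed record, which is the quality guarantee the service layer alone cannot make.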

2. Data Management Services Offered by the MDM Hub

MDM abstracts the governance of data by consolidating it into a central data model; conducting all data cleansing, augmentation, and standardization; and creating a ‘gold standard’ source. These data management functions are centralized in the data hub and hidden from the consumers of the cleansed data. Maximize the value of these services by consuming them from other applications that need to perform data quality processing outside the data hub.
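One way to picture consuming a hub data-management function externally (a minimal sketch; the routine and reference data are invented for illustration) is a standardization service that other applications can call on records that never land in the hub itself:

```python
# Hypothetical sketch: a standardization function centralized in the hub
# but callable by any application that needs data quality processing.

COUNTRY_CODES = {"usa": "US", "united states": "US", "u.s.": "US"}

def standardize_customer(record: dict) -> dict:
    """Return a standardized copy: collapsed whitespace, title-cased name,
    canonical ISO-style country code."""
    out = dict(record)
    out["name"] = " ".join(record.get("name", "").split()).title()
    country = record.get("country", "").strip().lower()
    out["country"] = COUNTRY_CODES.get(country, country.upper())
    return out
```

The caller never sees the reference tables or cleansing rules; it just submits a record and gets the standardized form back, exactly as it would from a hub-hosted service.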

3. Data Offered by the MDM Hub

Data services allow the consuming application to access and manipulate hub data from a service layer as a supported data source. Layering data services on the MDM hub hides the implementation of federated queries that gather the data requested by the consumer.
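A minimal sketch of such a data service, with hypothetical stores, might look like this: the consumer makes one call and never sees the federated query behind it.

```python
# Hypothetical sketch: a data service that hides federation across the hub
# and two source systems behind a single call.

HUB = {"m-100": {"name": "Acme Corporation"}}
BILLING = {"m-100": {"credit_limit": 50000}}
SUPPORT = {"m-100": {"open_tickets": 2}}

def customer_view(master_id: str) -> dict:
    """One service call; the federated gather across stores is hidden."""
    view = {}
    for store in (HUB, BILLING, SUPPORT):
        view.update(store.get(master_id, {}))
    return view
```

If the billing attributes later move into the hub, only the service implementation changes; every consumer keeps the same contract.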

4. SOA, MDM, and Middleware

SOA, integration middleware (an Enterprise Service Bus, or ESB), and MDM together can detect data changes in the source applications and propagate them from the sources to the MDM hub – or from the hub back to the consumers. With the addition of Business Process Execution Language (BPEL) and a business rules engine, a data change detected in a source can be captured, the data quality business rules executed on the changed data, and the cleansed result placed back on the ESB to be consumed.
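The capture-cleanse-republish flow in this section could be sketched in-process like this. This is a toy stand-in for an ESB and rules engine, not any specific product’s API; the topic names and rules are invented:

```python
# Toy sketch of the ESB flow: a change event from a source lands on a bus,
# data quality rules run on the payload, and the cleansed record is
# republished for downstream consumers.
from collections import defaultdict

class Bus:
    """In-process stand-in for an ESB's topic/subscriber model."""
    def __init__(self):
        self.subscribers = defaultdict(list)
    def subscribe(self, topic, handler):
        self.subscribers[topic].append(handler)
    def publish(self, topic, message):
        for handler in self.subscribers[topic]:
            handler(message)

def quality_rules(record):
    """Stand-in for the rules engine: trim the name, uppercase the country."""
    record = dict(record)
    record["name"] = record["name"].strip()
    record["country"] = record["country"].upper()
    return record

bus = Bus()
consumed = []

# Rules step: cleanse the changed record, then put it back on the bus.
bus.subscribe("source.changed",
              lambda r: bus.publish("hub.cleansed", quality_rules(r)))
# Downstream consumer picks up the cleansed record.
bus.subscribe("hub.cleansed", consumed.append)

# A change detected in a source application is published to the bus.
bus.publish("source.changed", {"name": " Acme ", "country": "us"})
```

In a real deployment the rules step would be a BPEL process or rules-engine invocation and the topics would be ESB endpoints, but the shape of the flow is the same.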

Are there other use cases for how MDM and SOA, together, add strength to the enterprise architecture? Please add your thoughts by commenting here or on the MDM Community.


15 Comments on “MDM and SOA, a Strong Partnership”

  1. Henrik Liliendahl Sørensen 01/21/2009 at 4:15 am #

    In the realm of data offered by the MDM hub, I think advanced search features are very important, and these may be implemented as SOA components on top of what is provided by the data container.

    This also means that these search features may be deployed on the hub as well as on the different data sources and destinations.

    Advanced search features may include, among other things:
    • Speed, which is a lean factor
    • Error tolerance, so you don’t have to spell exactly as in the database or puzzle with wildcards
    • Cross-field search – a typical party search is provided as “who and where”
    • After-search navigation, which fulfils the process in one straightforward operation.
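    The “error tolerance” point above could be prototyped as a hub-side search component. Here is a hedged sketch using Python’s standard-library difflib as a stand-in for a real fuzzy-matching engine (the party names are invented):

```python
# Hypothetical sketch: an error-tolerant party search over hub data,
# matching misspelled names without wildcards via stdlib fuzzy matching.
import difflib

PARTY_NAMES = ["Henrik Sorensen", "Joan Lawson", "Dan Power"]

def fuzzy_search(query: str, cutoff: float = 0.6) -> list:
    """Return hub party names similar to the query, best matches first."""
    return difflib.get_close_matches(query, PARTY_NAMES, n=3, cutoff=cutoff)
```

    A production hub would use a purpose-built matching engine, but the service contract – misspelled input in, ranked candidates out – looks the same from the consumer’s side.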

  2. Bob Barker 01/21/2009 at 3:43 pm #

    Great post. It’s worth pointing out, however, that there’s one class of problems we deal with that requires special handling vis-a-vis MDM hubs. In a problem domain where you’re searching for fraud, like airline passenger screening, insider trading, or workers compensation claims, the presence of “dirty” data is actually a plus. Fraudsters like to “game” systems to hide their identities, so being able to resolve and link multiple instances of an identity without merging them (or at least before merging them) is invaluable in catching bad guys.

  3. NM 01/22/2009 at 12:40 pm #

    Just discovered your blog via your LinkedIn post, great content! I look forward to future posts…

  4. Dan Power 01/22/2009 at 2:39 pm #

    @Henrik I think you’re right that offering advanced search as a service on top of the MDM hub would be very helpful to the rest of the enterprise. Thanks!

    @Bob Good point about the difficulties that fraud prevention introduces. Met with Matthew West today and really enjoyed the update on what Infoglide’s Identity Resolution Engine can do.

    @James Well written piece at http://jtlog.wordpress.com – keep digging these nuggets out of your “blog drafts” folder!

    @NM Thanks for the kind words – we’ll try to keep focusing on great MDM content. Please let us know if there are other topics you’re interested in.

    Joan Lawson may drop by to check out and respond to your comments as well.

  5. Joan Lawson 01/22/2009 at 3:13 pm #

    @Henrik – Yes, good examples of services that could be offered with an MDM that is implemented in a repository style.

    @Bob – A very interesting use case. It sounds like a registry-style MDM offering data matching services might be the best architecture for you.

    @James – I second Dan’s input. As I noted on your blog,
    I believe it depends on the level of architecture. In a stand-alone application, SOA is a design framework that allows the encapsulation of business functionality so it can be reused. As long as the services interoperate on a common data store, or pass static data for presentation, the need for MDM is minimal. However, as you add applications, data about a particular subject area becomes distributed. Whether services are then developed to support composite apps, the integration of data, or retrieval for presentation, an MDM strategy is critical.

    @NM – Your feedback is definitely appreciated.

    Joan

  6. jt 01/23/2009 at 4:22 am #

    @Joan useful post about how MDM fits into the enterprise – I quite often get challenged by people who think MDM is just another database, and I think this is a useful explanation. Thanks for the reply on my blog as well.

    @Dan thanks for the link- the rest of my drafts folder is pretty murky but I have a few ideas for future posts.

  7. Dan Power 01/23/2009 at 11:07 pm #

    I have to say, I love the dialog we’ve been having in the comments here. I’ll try to keep bringing solid content from myself, Joan and other people to spark the discussions! Thanks, everyone!

  8. Mike Young 01/27/2009 at 9:47 am #

    It continues to be a largely underestimated issue that companies utilize one ERP system or another as their “source of truth” for master data. Each of these systems offers its own view of different pieces of master data, mainly to support its own internal transaction processing. The 360-degree view of a customer is a concept many companies desire but rarely attain. MDM is an excellent solution, especially when coupled with powerful architectures and tools such as SOA and BPEL. Great article and posts!

  9. Joan Lawson 01/27/2009 at 10:07 am #

    Hey Mike, good to see you! Definitely – so often companies see their primary CRM application as the ‘source of truth’ for customer data, even though we know that customer data is spread across the CRM ecosystem and their ERP applications. We need to keep spreading the word about the value of architectures and platforms such as MDM, SOA, BPEL, and ESB to bridge these divides.
    Thanks,
    Joan

  10. Gautam Mekala 02/03/2009 at 6:55 pm #

    Very informative article. Added to that, when an SOA strategy and investment is considered, MDM should be part of it to solve the “semantic interoperability” issues; otherwise, the SOA strategy will be incomplete and yield fewer benefits for the organization.

  11. Joan Lawson 02/03/2009 at 8:32 pm #

    Would you also consider canonical data models as a possible solution to the ‘semantic interoperability’ issues?

  12. Gautam Mekala 02/04/2009 at 3:01 pm #

    Joan,
    Yes, I agree that canonical data models are a possible solution to the semantic interoperability issues.

  13. Shiva Prasad 03/17/2009 at 7:18 am #

    Hi Joan,

    I fully concur with the article, having worked at the intersection of SOA, BPM, and MDM for the last 4+ years building a Business Operations Platform (completely services-based).

    I want to share my opinion on the point you make about using BPEL (together with rules) to handle the “processes” around managing master data. I think you are talking about a general “process-based” approach, so I take it you would also include BPML as something that can be used for this. Any executable process instance (represented at runtime as either a BPEL or a BPML document) needs to be modeled at design time using the standard BPMN notation. I guess you agree on this aspect. Here is where I have a difference of opinion.

    In our analysis of the types of processes that characterize master data governance (data stewardship for handling duplicates, exceptions, data merge decisions, etc.) and data integration (events, change data capture, ETL, etc.), we found that conventional BPM-based models are not sufficient to capture the largely collaborative and knowledge-worker-driven nature of these MDM jobs. In our view, BPM models can be used very nicely to depict well-defined, sequence-based activities with a clear definition of steps and process participants, all known up-front. But BPMN, as the notation, is highly unwieldy at representing “case handling” process types – which is what most data governance tasks are: a high degree of collaboration between knowledge workers (data stewards, business SMEs, etc.), multiple iterations, events occurring randomly (not in a known sequence or at pre-defined times), and actions decided in the moment (on the fly, based on the case data context). None of these variations can be modeled in BPMN, so OMG is also pushing a new modeling notation to capture “case-like” processes.

    In our product, we advocate a State Management-based approach for handling Master Data Management activities – which means, users depict the complete lifecycle of the Hub master data object as a series of States and State Transitions (State Transition models) – transitions, that are triggered by relevant events (which can occur randomly). And, in the State of an object (either while entering or exiting the state or even when the object is in that state) or during state transitions, we can hook in any kind of activity (human worker-driven UI task, a synchronous web service call, invoking a regular BPM process, calling a java class that generates a sequence number, calling a business rule that governs data, anything short-lived, long-lived, human task, system work – almost anything). Thus, end-to-end lifecycle management can be modeled and executed (with a state engine).
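    The state-management approach described above can be sketched as a simple transition table. The states, events, and record type here are hypothetical, chosen only to show the shape of the model; transitions are the hook points where any activity (a human task, a web service call, a rule) could be invoked:

```python
# Hypothetical sketch of a state-based master data lifecycle: states and
# event-triggered transitions, with each transition a hook for activities.

TRANSITIONS = {
    ("draft", "submit"): "pending_review",
    ("pending_review", "approve"): "active",
    ("pending_review", "reject"): "draft",
    ("active", "retire"): "retired",
}

class MasterRecord:
    def __init__(self):
        self.state = "draft"
        self.audit = []  # record of fired transitions

    def fire(self, event: str) -> str:
        """Apply an event; reject events not valid in the current state."""
        key = (self.state, event)
        if key not in TRANSITIONS:
            raise ValueError(f"event {event!r} not allowed in state {self.state!r}")
        self.audit.append(key)        # an activity hook could run here
        self.state = TRANSITIONS[key]
        return self.state
```

    Because events can arrive in any order at any time, the table simply rejects those that are invalid for the current state – which is the flexibility a fixed BPMN sequence struggles to express.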

    However, I agree that if the data management process is fairly well-defined, with a few activities, not likely to change often, and without much concern for human involvement in these tasks, then you might be able to model it within BPMN and have it executed (after generating the BPEL or BPML) by an SOA-based services bus.

    Can you share your experiences using modeled processes for data management (governance, really) and what typically are the modeling approaches adopted by your customers? Have you collaborated with MDM vendors that advocate a state-based modeling approach for building model-driven trusted data hubs?

    Thanks

Trackbacks/Pingbacks

  1. Are SOA and MDM inseparable? « Notes from a small field - 01/21/2009

    […] 21, 2009 at 21:59 | In Ones and Zeros | Tags: CDI, esb, mdm, mdm-server, PIM, soa Reading “MDM and SOA, a Strong Partnership” on the Hub Solution Designs blog reminded me that it was about time I rescued this post from […]

  2. Thank You To Our Readers « Hub Designs Blog - 12/23/2010

    […] Elements of MDM and CDI remains one of our most popular articles, and Joan Lawson’s MDM and SOA, a Strong Partnership is in the “Top Ten” as […]
