Creating a steady state data flow map

A detailed set of guidelines for creating a steady state data flow map.




These guidelines are a reference for organisations planning to generate steady state data flow maps. They assume basic knowledge of the underlying steady state data flow model.

The process described in these guidelines is intentionally open-ended in some respects, to promote business relevance over compliance. This allows organisations the flexibility they need to implement it in a way that makes sense for them, and helps maximise benefits.

The key to a successful steady state data flow mapping process is that it remains practical and relevant throughout, from start to finish, and that it continues to offer genuine value once completed.


The steady state data flow map serves as a delivery mechanism for implementing the operational Data Governance Framework (oDGF). The oDGF is meant to make data governance a part of organisational workflows and pertinent for operational staff.

The development and implementation of a steady state data flow map delivers actionable data governance insights, along with practical data management knowledge, into the operational context.

The steady state data flow map is also meant to function as a key enabler for a holistic approach to data governance. This encourages a fully integrated treatment of governance for data assets at all levels of the enterprise: executive, management and operations.

The resulting map can also serve as an effective communication tool for promoting an operational perspective upwards to management and executive levels. This fosters a consideration of data-related realities that can positively inform the development of organisational policy and strategy, and thereby increase their chance of success.

This underlying drive for data governance relevance influences numerous aspects of the steady state data flow mapping process, including:

  • identifying stakeholders
  • collecting and using supporting documentation
  • the extent to which an existing process model is leveraged as a base map
  • the mechanisms organisations employ to capture and leverage ongoing steady state information into the future.

Along those lines, the steady state data flow map is meant to deliver increasing value as staff grow more familiar with it and more comfortable considering it a part of the basic toolkit for their job.

The initial version of the map, developed in the stakeholder workshop described below, represents an informed guess. Additional insights from the ongoing use of the map in operational contexts will tune it to maximise its business relevance.

In support of that, the data flow map is easily re-configured to reflect new knowledge or deliver business value in changing circumstances.

A 4-step process

The process to plan, develop and deliver a steady state data flow map can be described in four primary steps:

  1. planning and preparation
  2. mapping workshop
  3. follow-up editing
  4. implementation and maintenance.

These steps can sometimes happen out of sequence, and may need to. However, the process will produce the best results if it happens in the sequence we describe below.

Step 1: Planning and preparation

The planning and preparation stage involves:

  • establishing and agreeing relevant mapping parameters
  • gathering and reviewing all the necessary information to appropriately inform and influence the subsequent steady state data flow mapping process.

Before the mapping process can start, the organisation must agree on 3 core parameters:

  • the data entity (or entities) to be mapped
  • the extents of the mapping
  • which stakeholders to involve in the mapping.

Each of these parameters will likely overlap, and affect each other. The process of making these decisions may also uncover dependencies, further supporting the need for considering them holistically.

Each parameter is also important in its own right, and organisations must give each sufficient time and attention. If any one of the three is missing or incorrectly established, the success of the data flow mapping effort will be at risk.

1)   The data entity you will map

Dedicate time up front to carefully decide and agree the data entity you will map.

We have used the term 'data entity' because the definition, form and format of data will vary between organisations. The steady state data flow map model has been designed to operate for anything that might, in a particular context, be classified as a data asset.

The decision about which data entity to map might be made by the full stakeholder group or a project owner who then takes it to the stakeholder group.

This decision is highly strategic, because it drives many of the next decisions, and can significantly influence final deliverables.

There is no single pathway for choosing which data entity to map. For example, it might be structured to:

  • fill data knowledge gaps within a particular part of the organisation
  • demonstrate the strategic value of an organisational data asset
  • help improve processes that result in important outputs
  • provide momentum for developing and operationalising a data quality framework
  • facilitate better collaboration between business units
  • contribute to improved organisational data practice and data maturity.

Any one of these options establishes a clear line of sight from the data flow mapping work to wider strategic and business needs. The choice of data entity should always be linked clearly back to relevant business imperatives.

2)   The map extents

The map’s extents define its scope. In the map itself, the extents appear as its boundary lines.

Setting the map extents establishes the specific area covered by the mapping process. It also provides a mechanism for showing when the flow of data occurs within or outside of the business unit’s or organisation’s control.

This can inform thinking about organisational or business unit responsibilities and obligations, data access, and supplier and customer relationship management.  

Well-defined map boundaries also help to identify whether steady states are inside or outside of the organisation or business unit sphere of influence. This will inform the correct specification of steady state attributes (including data quality dimensions) and their associated measures. The boundaries are also necessary for the proper designation and labelling of steady state zero (SS0), a critical component of the steady state data flow model.

A useful way to consider the setting of map extents is to see it as a means of showing the relevant data landscape - the space in which relevant data flow paths are arranged and the overall context through which the data entity flows. 

A landscape view will also help create clarity about what parts of the data flow and what business processes need more investigation, where to put steady states, and how to choose them.

It will help show secondary data flows that intersect the primary data entity flow path, based on their level of influence. A landscape view of the data flow will also help decide when and where these secondary paths belong on the map, or whether they are external elements best shown on a separate map.

Single or multiple business units

An important decision is whether the map extents are within one business unit, or many.

Both approaches are valid. Business needs should drive the choice, based on the value potential of each of the approaches to respond to those needs. 

A map reflecting data flow associated with a single business unit is likely to be more manageable. Since it is associated with a limited and engaged collection of staff, in-depth and comprehensive knowledge of the data entity should be high, helping to produce an accurate flow map.

The associated flow path is likely to be relatively simple and straightforward to map. A smaller set of stakeholders will also simplify the effort required to run the mapping process and, for the same reason, more easily facilitate ongoing use and maintenance of the map.

If an organisation wants to create a number of steady state data flow maps, a single business unit example can be a good pilot to explain and promote the value of the data flow mapping process. 

A map involving multiple business units could reflect data associated with a functional area, or that supporting an organisational-level output. One example of an organisation-level output is the publication of an annual report.

It also has high potential benefit, because of its ability to demonstrate data use for enterprise business process improvement.

Often based on an important output, these maps effectively highlight the data sources used in the output and how to manage the data to ensure the output is produced consistently.

Extending across business unit boundaries, the multi-unit data flow map will provide a picture of how data moves (or doesn’t move) between different parts of the organisation. It will also show how business units interact (or don’t interact) to produce an important output or result.

It can offer valuable insights in support of organisational development, as it shows how business units function coherently to deliver the broader goals of the organisation.

Current or future state

Another potential decision about map extents is whether to map a current state or a future state data flow.

Many factors will influence this choice, including:

  • the availability and accuracy of any existing current state information on data flow
  • the capabilities of those available to participate in a mapping workshop
  • the underlying reason for developing a data flow map.

We recommend starting with a current state map, as this will be a baseline for any future state maps.

A current state view will likely include a future state perspective, noting possible improvements to the current state. The workshop facilitator should make note of them in a way that doesn’t interfere with creating an accurate current state map.

A proper future state map should be a separate mapping effort which uses the current state flow map to help build it more efficiently. Ideally, the data entity should stay the same, to allow for proper comparison.

The map extent and associated boundaries might change in a future state scenario, to reflect suggested changes to relevant processes or the resulting flow paths. It is good practice to reconsider the list of mapping workshop participants, as you may need a different group of stakeholders to provide necessary insights on a desired future state.

3)   Securing relevant stakeholders

It is critical to a successful outcome to identify and secure commitments from the stakeholders who will be part of the mapping process.

The decision will likely draw on a broad set of roles, based on the requirements of the specific map. The necessary capabilities that will influence this selection include:

  • providing direct input into the map and constructing the steady state attributes
  • reviewing outputs
  • offering general advice
  • providing project management
  • having the authority to sign off on the final deliverables.

It is likely that the previous two steps (choosing the data entity and map extents) will identify and involve many of the relevant stakeholders. It is important that this step confirms all the relevant stakeholders and confirms their availability and commitment to participate, based on the timeframes associated with the mapping project.

To increase the likelihood that the resulting map and steady states provide ongoing value, it is a good idea to have a mix of perspectives participating in the mapping workshop. You should consider two primary perspectives in every case: detailed and big picture.

The detailed perspective

The mapping workshop should include at least one person who is able to provide an on-the-ground view of the processes that shape the data flow you’re mapping. This will be someone with an operational view, who regularly works with the data entity or uses the associated processes.

This person can provide detailed information about data or the associated process model. If the motivation for the exercise is process improvement, this individual will also likely possess valuable insights on how to mitigate or repair issues with the current state.

While this perspective should not exclusively drive the workshop session, as it is likely to get bogged down in the details, it plays a critical role in producing an accurate current state map.

The big picture perspective

Each group should also include at least one person who has a broad view across the full map extent, as well as insights into or a working knowledge of the areas upstream and downstream of the map extents.

This is likely to be someone with a management view, who understands how the data entity and associated process model function in, and contribute to, the wider environment. If the motivation for the map is to improve processes, this individual will also have a clear sense of the general issues associated with the current state and ideas on how to improve them.

If the workshop is dominated by people with a big picture perspective, it will likely result in a map that overlooks important details you need to build an accurate map of data flow and processes.

If the data flow map covers a single business unit, these two stakeholder roles might be covered by one person each. In very small or simple data flows, one person might be able to cover both viewpoints. The key is that these two perspectives are appropriately represented, with no gap in knowledge, so that they can each contribute to a successful data flow mapping outcome.

In multi-unit maps, there should also be at least one person from each of the participating business units.

This makes sure that any relevant insights or perspectives from each business unit are included, and that any needs or concerns are considered. Together, these business unit participants can reflect detailed, big picture, or combined perspectives, depending on what will contribute best to the map outcome.

A stakeholder group consisting of representatives from the full suite of relevant business units also helps bring various parts of the organisation together in one place, with a common point of reference, working towards a common process or knowledge improvement goal.


Decisions about data entity and map extent should ideally be unanimous.

If they aren’t, some stakeholders might disengage from the process. While this might not shut the mapping process down completely, the resulting knowledge gap could mean a less useful map.

However, it may be the case that a well articulated, easily understood, and specific business need serves as the source of a decision to conduct a mapping exercise. In this case, the data entity and map extents will essentially be pre-determined.

Alternatively, there may be a desire to develop data flow mapping generally as part of a process improvement or organisational development initiative. In this case, the choice of data entity and map extents will support the goals of that initiative. In these instances, gaining consensus is relatively straightforward.

The likelihood of a successful data flow mapping process increases if the business value is universally agreed and represents the underlying motivation of everyone who is participating.

Documentation - collection and review

Early planning should also include identifying, collecting and reviewing the relevant documentation. This can be time consuming, but is important.

The documentation will generally fall under two categories: strategic and operational.


The collection and review of strategic documentation provides a comprehensive and high-level view of the important goals and business imperatives of the organisation. It might also provide insight into any new strategic goals or imperatives which aren’t yet widely understood.

If you consider your data high-value, strategic assets, flow patterns which show how and where they're used should reflect and support the strategic perspective.

This helps form a clear line of sight between the data flow map (current state operational contexts) and the organisational vision (executive contexts).

This also supports the use of future state maps, increasing their strategic value and value as a business tool.

You don’t need to analyse these strategic documents in detail. Instead, use them to develop a general understanding of the organisation’s priorities, and how data aligns (or doesn’t) with them.

The most useful strategic documents will likely reflect executives’ vision and thinking. They may also include specific functional area or business units’ priorities.

Examples include statements of intent, vision or mission statements, long-term plans, annual or quarterly reports, business or data strategies, strategic planning workshop notes and conference presentations.


Operational documentation will be associated specifically with those parts of the organisation reflected in the data flow map and, to the extent it is available, describe how those business areas manage and employ the data entity. It will be used to inform both the design of the flow map and the development of an attribution table for the steady states reflected in the map.

Depending on the data maturity of the organisation, this operational documentation can represent a wide range of formats, from previously archived data flow maps to business process diagrams that include no reference to data whatsoever.

If data maturity is generally low within the organisation, it is unlikely that the available documentation will include pre-existing, detailed data flow maps, or, if they are available, maps that are current. It is much more likely that this documentation will consist of some sort of business process model diagrams and associated information.

As it will be used to guide the data flow map construction, including in some cases serving as the base map itself, the quality and currency of business process diagrams that are available are of particular importance. That being said, and as will be described below in Step 2 of the mapping process, flow mapping can still proceed without any business process model documentation at hand, relying instead on the collective knowledge of the participants in the workshop.

There is also the need for documentation to support the development of the attribute table that will be associated with steady states in the map. Though not exclusively, the steady state data flow model is designed to support and reflect a data quality assessment, so a data quality framework that describes relevant quality dimensions is ideal. In lieu of that, documentation that provides a sense of the organisational view of data quality will suffice.

Step 2: Mapping workshop

The second step in the process represents the core work, where the steady state data flow map and attribution for the steady states in that map are developed in a draft form. This session will involve most (if not all) of the stakeholders identified previously, including a facilitator. It is typically held over a portion of one day in a dedicated meeting space.

The room used needs to be equipped to support display of presentation materials and facilitate what can be highly dynamic and spontaneous development of a draft data flow map and set of initial steady state attributes and measures. Whiteboards, or drawable wall surfaces used in conjunction with post-it notes should suffice for this purpose.

The duration of the workshop will vary, depending on such factors as:

  • the size and complexity of the data flow map
  • the number of stakeholders participating
  • the level of clarity about the map entity, map extents and steady state attributes prior to the meeting.

A half-day will generally be sufficient; regardless, you can confidently schedule your workshop to occur within the space of one day.

A data flow mapping workshop involves a group of people. Therefore, the way it proceeds will inevitably be dictated by the personalities and relationships of those in the room.

It might be the case with multiple-business unit mapping that the session represents the first time this particular collection of people has met, never mind been tasked to work with what could be new and unfamiliar concepts to agree on a common set of outputs. Accordingly, it doesn’t take much to shift things off topic to tangential discussions that do not support the production of an agreed map and attribute table.

For that reason, a workshop facilitator is critical. Ideally, this person would be from a part of the organisation other than the business unit(s) associated with the mapping extents. It can be helpful to have a facilitator from outside the organisation altogether, including an individual retained as a consultant specialising in meeting facilitation. 

The workshop facilitator must be able to manage the room, making sure things progress, while still giving space to potentially valuable contributions that can result from spontaneous discussion. This represents something of a balancing act, where the importance of having informed but potentially disparate perspectives in one room must be managed to avoid distraction and instead contribute to the development of an agreed and useful data flow map. The ability to suggest meaningful solutions on the fly, often involving compromise, will prove a useful capability.

While not required, it is also beneficial to have the workshop facilitator 'holding the pen' and drafting the data flow map and set of steady state attributes and measures. This amplifies the value of their role as an external presence, open to accepting suggestions on map and attribute elements and translating them without bias onto the workshop version of the draft map and attribute table.

It also helps keep the draft products relatively generalised and easily interpreted, which they need to be if the steady state model is to deliver to its potential. Any non-productive level of detail which might be introduced by the subject matter experts in the room will more likely be filtered by an external facilitator tasked with illustrating what is to them unfamiliar data and processes. They are essentially serving as proxy future users in this regard, who might very well sit outside of the constituent business unit(s) or the host organisation, or at the very least not possess a high level of subject matter expertise.

The nature and level of success of the workshop can also be influenced by the decision to use a pre-existing business process diagram, should it exist, as the base for the steady state data flow map. This can be a useful approach, for several reasons.

First, it is efficient. The steady state data flow model posits that steady states represent a set of data monitoring stations, situated on a flow network that is shaped by business processes. This promotes the inherent idea that data and process are intricately intertwined and of equal value to the success of the enterprise. If a business process diagram already exists and can be employed as a base map, then significant effort is saved, and the mapping component of the workshop just involves decisions about where to place steady states onto an existing base layer.

If this approach is to be employed, there must be a strong and consistent level of agreement amongst all of the stakeholders, either as part of the planning phase or during the early part of the workshop itself, that the business process diagram accurately portrays the set and sequence of processes within which data flow is to be developed.

In addition, there must be consensus that the business process diagram is meaningful. It needs to depict the set of processes in a way that is readily interpretable and makes sense to everyone in the room, including those without strong subject matter expertise or a technical background (those providing the big picture perspective). A highly detailed and complex process diagram may accurately reflect reality, but may not be meaningful or useful in that form.

Early pilots of data flow mapping workshops employed this approach. Prior to the workshops, effort was made to secure any and all existing business process models for the parts of the business where data flow would be depicted. These were then used directly or with minor alteration to produce a base map for the workshop, onto which steady states could be manually added.

While steady state data flow maps were successfully produced via this method, it became apparent that the efficiency gains from directly employing existing business process diagrams did not offset the challenges that arose from basing data flow on a depiction that was exclusively process-focussed, and often of a level of inherent detail and complexity that ultimately would not serve the production of a final data flow map. Inevitably, extra time was required to re-work the existing business process depiction, often simplifying it and re-developing it with a dual data-process reference frame, negating any time saved by starting with a completed process diagram.

As such, the current recommendation is to start with a blank slate (a clean whiteboard) and employ an existing business process model for reference purposes. Any existing process model can still be helpful as guidance in this regard, if for instance a challenging point is reached in the data flow map development, with uncertainty about how to proceed.

This approach also supports the use of an external facilitator to draw the data flow map. When working off a clean slate, but with a detailed process reference at hand, the facilitator at the whiteboard will be well positioned to receive and depict suggestions confidently and turn that input into an illustration that consistently maintains the proper balance of complexity and interpretability.

The map development in this case involves the individual holding the pen soliciting information from participants about what happens along the journey that the data entity takes within the agreed map extents, and translating those responses into a visual depiction of flow, with key processes properly included. Steady states will also be added to the depicted flow, either as a group at the end, after all processes are mapped, or individually alongside each process addition. It is part of the facilitator’s role to decide which method will work best for the group.

Providing advice to the facilitator about where best to insert steady states will be challenging for some participants in the workshop, especially those who are highly process-oriented or less data savvy. To assist with this process, three criteria have been established to guide placement of steady state icons onto the data flow map. A new steady state icon can be added onto a data flow map:

  1. In association with a process that generates an agreed change in the state of the data. For example, a quality control check that results in verified data.
  2. In association with a change of format. For example, from a phone log to a database file.
  3. In association with a physical move of the data, not necessarily involving a state-changing process. For example, uploading data from an ingestion server to a production server.
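The three placement criteria above can be sketched as a simple check. This is a minimal illustration only; the event field names (`state_change`, `format_change`, `physical_move`) are assumptions for the sketch, not part of the published model:

```python
# Illustrative sketch: deciding whether a steady state icon may be added
# at a given point on the flow map, per the three placement criteria.
# The field names below are assumed for illustration.

def qualifies_for_steady_state(event: dict) -> bool:
    """Return True if at least one placement criterion is satisfied."""
    return (
        event.get("state_change", False)      # 1. agreed change in data state
        or event.get("format_change", False)  # 2. change of format
        or event.get("physical_move", False)  # 3. physical move of the data
    )

# Example: a quality control check that results in verified data
qc_check = {"state_change": True, "format_change": False, "physical_move": False}
print(qualifies_for_steady_state(qc_check))  # True
```

In practice the facilitator applies this test conversationally, but encoding it makes clear that the criteria are disjunctive: any one of the three is sufficient.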

Once the facilitator has confirmed that at least one of these general criteria has been satisfied, they will need to solicit additional specific input from participants on where it would be valuable to position the steady state icon on the map. For example, this might be a point where it is useful to capture and monitor a set of data quality measures.

A final workshop draft map will consist of a physical representation of data flow paths within the mapping extents for the data entity, depicting all relevant processes and steady states that occur along those flow paths. Solid path lines will depict movement of the production data entity, used to develop the outputs shown at the end of the flow as final processes.

If it is of value, the map can also present alternative flow paths, such as supporting information flows, or supplementary data flows originating from outside the map extent. These will be illustrated using different line symbology to distinguish them from the primary data flow.

In association with the draft data flow map, the workshop participants will also generate a record of the attributes for each of the steady states in the map. These will be developed as a table and include a unique identification value for each steady state, as well as the decision criteria (data quality dimensions) applicable for each steady state, the measures (questions) used to test each of those criteria and the tolerance (qualitative or quantitative answers) used to characterise and assess the results of the measurements.

If it is of business value, more attributes can be captured at each steady state, representing additional dimensions of the decision criteria, the satisfaction of which determines if a new data state has been achieved. In a multiple business unit flow map for instance, it is advisable to record the business units that have an association with the data entity, at each steady state.
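One row of the attribute table described above can be sketched as a small record. The field names are illustrative assumptions; the model itself only requires a unique ID, decision criteria, measures and tolerances:

```python
# Minimal sketch of one steady state attribute record.
# Field names are assumed for illustration; the model requires a unique
# ID, decision criteria (data quality dimensions), measures (questions)
# and tolerances (qualitative or quantitative answers).
from dataclasses import dataclass, field

@dataclass
class SteadyStateAttributes:
    steady_state_id: str                            # unique ID, e.g. "SS3"
    criteria: list = field(default_factory=list)    # data quality dimensions
    measures: dict = field(default_factory=dict)    # criterion -> test question
    tolerances: dict = field(default_factory=dict)  # criterion -> acceptable answer

ss3 = SteadyStateAttributes(
    steady_state_id="SS3",
    criteria=["completeness"],
    measures={"completeness": "What percentage of records have all mandatory fields?"},
    tolerances={"completeness": ">= 98%"},
)
print(ss3.steady_state_id)  # SS3
```

In a workshop this would simply be a spreadsheet row; the structured form becomes useful later, when steady state measures are captured and monitored on an ongoing basis.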

The facilitator, with the participants in the workshop, will decide whether it is best to develop the steady state attribution at the same time as each of the steady states are added to the map, or after the draft map is composed as a separate exercise.

The advantage of a simultaneous development is associated with focus on a particular steady state, which could help uncover what quality dimensions and measures work best in that specific location on the flow path.

In contrast, the advantage of running a separate attribution exercise arises from the holistic perspective generated by a completed flow map, which could likewise positively influence the distribution of quality dimensions and measures across the set of mapped steady states.

These methods could also be combined, such that criteria and measures are assigned as each steady state is added, with the composition of the full set reviewed in a holistic manner after the flow map is completed. 

At the close of the workshop session it is good practice to photograph the whiteboard draft data flow map and attribute table, to create a raw digital record of the work completed. These records protect the effort expended during the workshop and serve as the basis for development of digital draft versions of both outputs. 

The unique set of characteristics associated with each data flow mapping process will determine responsibilities for creating the draft digital products and selection of the software solution used to develop each. There are, however, a few good practice considerations associated with this decision:

  • One option for completing the digital products would be to assign that task to the workshop facilitator. In this case they would be translating their own work and should operate in this capacity with the same level of consensus amongst the stakeholder participants (who will now provide a review of the draft products) that they enjoyed as workshop facilitator.
  • If there are existing business process diagrams maintained by the organisation, it is best to employ the same software solution where possible to develop the steady state data flow map. This creates an inherent link between data flow and the organisation’s view of business process and supports a level of accountability and buy-in for those who are currently maintaining business process diagrams.
  • The resulting workshop products should emphasise interpretability. It is this characteristic ultimately which will enhance their business value to the organisation, so it is advisable to stress this condition at the creation of the initial draft.

The deliverable for this step is a draft steady state data flow map and table of the attributes associated with the steady states included in the map.

Step 3: Follow-up Editing

Once digital versions of the draft map and attribute table have been created, they should be circulated amongst appropriate stakeholders for review and feedback. The workshop facilitator, or those with project management responsibility, should determine who is best positioned to provide the most useful review. The selection should be limited to a manageable subset of the full stakeholder list, but should at minimum include one representative each of the detailed and the big picture perspectives.

The files for the draft outputs should be made available in the native software solution format, so that those who are able to contribute at a detailed level can access the map and provide recommended edits directly within the source environment. This will aid consideration of feedback and improve the quality and speed with which agreed changes can be implemented in the map.

For other stakeholder reviewers, who will be providing more of the big picture perspective, it is recommended to provide a version of the flow map in an easily ingested format like PDF. As with any consultation effort, a viable version control mechanism should be used to properly track and archive the various and evolving iterations of the data flow map.

It is also advisable to create hardcopy paper outputs of the draft flow map, as some stakeholders will prefer this format for their review and mark-up. This also provides the opportunity to begin to consider effective presentation of the data flow map, including layout choices and decisions for visual elements like symbology, which will prove valuable as the maps are more widely circulated over time.

For the steady state attribute table, a simple spreadsheet solution should suffice for draft review, comment and edits.
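As a minimal sketch of such a spreadsheet-ready table (the column names, steady states and measures below are invented for illustration, not prescribed by these guidelines), the draft attribute table could be generated as a CSV file that opens directly in any spreadsheet application:

```python
import csv

# Hypothetical steady state attribute records; in practice the columns
# and values would come from the mapping workshop outputs.
attributes = [
    {"steady_state": "Raw survey returns", "quality_dimension": "Completeness",
     "measure": "% of expected records received", "owner": "Collections team"},
    {"steady_state": "Validated dataset", "quality_dimension": "Accuracy",
     "measure": "% of records passing edit rules", "owner": "Processing team"},
]

with open("steady_state_attributes.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=attributes[0].keys())
    writer.writeheader()          # column headings for reviewers
    writer.writerows(attributes)  # one row per steady state attribute
```

A plain CSV keeps the draft in an open format that every reviewer can comment on and edit, regardless of the spreadsheet software they use.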

Once feedback from all relevant stakeholders has been considered, agreed and implemented as changes to the data flow map and the steady state attribute table, these two outputs can be finalised. This would include the addition and manipulation of cartographic elements like legends and borders for the map, and interpretability enhancements like fonts and colours for the attribute table. These finalised deliverables are then made available to the appropriate parties for formal acceptance and sign-off.

The deliverable for this step in the process is a finalised steady state data flow map (digital and hardcopy) and inventory (digital and hardcopy table) of the attribution values associated with the steady states included in the map, all of which are agreed and approved for acceptance by the stakeholders possessing the properly delegated authority.

Step 4: Implementation and Maintenance

The final step in the data flow mapping process involves the incorporation of the final map and steady state attribution into the host organisation's operational environment, so that they become a part of normal workflow. In this way they serve as a mechanism for embedding data accountability and good practice into what are otherwise often process-dominated contexts.

The way this happens will vary with each organisation and is dependent on its ability to incorporate data flow mapping into existing workflow systems. The goal is to develop a data flow map that becomes a core and indispensable business tool for all parts of the enterprise, and one that is referenced regularly and as a matter of course whenever the relevant data entity is employed.

Ideally, the generation of information resulting from the use of the data flow map, including a set of values for the steady state criteria measures, is automated and consistently captured, with a record of those results made available for reference, analysis and auditing. This provides a valuable resource for process improvement efforts, and its value will only increase over time and as more results are collected.
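A minimal sketch of such a capture mechanism, assuming a hypothetical record structure (steady state name, criterion, measured value, timestamp) since the guidelines do not prescribe one, might append each result to an audit log that can later be filtered for analysis:

```python
from datetime import datetime, timezone

audit_log = []  # in practice this would be a database table or log file

def record_measure(steady_state, criterion, value):
    """Capture one steady state criterion measurement with a UTC timestamp."""
    audit_log.append({
        "steady_state": steady_state,
        "criterion": criterion,
        "value": value,
        "captured_at": datetime.now(timezone.utc).isoformat(),
    })

# Example captures at two different steady states
record_measure("Validated dataset", "completeness_pct", 98.2)
record_measure("Published tables", "timeliness_days", 3)

# Later: filter the record for auditing a single steady state
validated = [r for r in audit_log if r["steady_state"] == "Validated dataset"]
```

Even a simple append-only store like this supports the reference, analysis and auditing uses described above, and its value grows as results accumulate over time.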

The organisation will need to balance the value of information collected via a steady state data flow map and the resources required to maintain and administer that collection. The more transactional the level of steady state information, the more precisely it can be used to identify the current state of data quality or the effectiveness of associated business processes. But the overhead and technology required to maintain a transactional information store, particularly one where capture is automated, may be beyond the reach of some organisations.

In that case, a simpler and manual approach to steady state information collection and use could be employed. While relatively static in comparison to a transactional solution, the use of a manually maintained record of the steady state criteria and measures can still be leveraged for business value.

As another alternative, a more detailed (in effect transactional) level of collection can be run manually, but for a limited time. This might be enacted at a particularly critical point in the lead up to a product release or publication, for instance, when the added effort is more easily justified. At other times operations can return to a less intensive use of the data flow map, monitoring general data characteristics like quality as an ongoing maintenance exercise.

The key in any of these options for administering the information generated by a steady state data flow map is its proper positioning within existing and familiar workflow. If suitably implemented, the use of a steady state data flow map, despite being a new element for operational line staff, should constitute a relatively effortless addition to their normal activities. This reflects the basic premise behind the steady state data flow model, whereby data flow and the business process model are intricately linked. In this scenario, any manifestation of the business process model should inherently also reflect data flow.

Operational staff will already have some means of understanding their processing responsibilities, be that a sophisticated workflow solution on their desktops, or a set of verbal instructions communicated by their manager. In any case, the steady state data flow model, which pairs a visual reference to an easily understood and likely familiar process diagram with a checklist of criteria to work through, will slot easily and unobtrusively into those processing instructions. If the reference diagram/map and criteria checklist are accessed from a readily available library and delivered using basic office tools, that only further reduces the potential friction associated with adding a new element to existing workflow.
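As an illustration only (the criteria below are invented for the example, and would in practice come from the approved attribute table), such a checklist can be as lightweight as a short script staff run before moving to the next process step:

```python
# Hypothetical criteria checklist for one steady state; real criteria
# would be drawn from the approved steady state attribute table.
CHECKLIST = [
    "Record count matches the control total",
    "Mandatory fields are populated",
    "File is saved to the agreed network location",
]

def run_checklist(answers):
    """Return the criteria not confirmed (an empty list means all clear)."""
    return [item for item, ok in zip(CHECKLIST, answers) if not ok]

# Staff confirm each item in order (True = confirmed)
outstanding = run_checklist([True, True, False])
```

The same checklist could equally live in a shared spreadsheet or document template; the point is that it is short, familiar, and adds negligible effort to the existing process step.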

No matter what the implementation scenario, once established, the steady state data flow map is available for use as foundational data infrastructure. As infrastructure, it is positioned to deliver a high return on what is a relatively low investment. And because the steady state model is designed to be highly portable and flexible, once the map is established and available to staff, additional implementation options become available at a relatively minimal level of effort.

In the end, the nature of the implementation of the steady state data flow map will be a business decision. And that is exactly as it should be. As a critical asset, data, and the ways in which that asset is leveraged, should be subject to and driven by decisions made with the organisation's business imperatives in mind.

As time goes on and data flow maps are used more extensively, and as additional data flow maps are developed across the enterprise, the potential for improved business value associated with their use increases.

Like any cartographic product, a steady state data flow map offers a powerful means of conveying the landscape, in this case the use of data across the organisation, which can be used to bolster effective asset management and influence the future direction of the organisation. In that sense it represents a legitimately strategic investment.

Process Duration

The duration of the effort required for the data flow mapping process, or any one of its steps, is difficult to approximate in general terms, as it will inevitably be dependent upon factors unique to each organisation and to each mapping project.

Based on mapping exercises completed to date, however, an approximate timeframe can be estimated as follows:

  1. Planning and preparation: 1-3 days
  2. Mapping workshop: 0.5 to 1 day
  3. Follow-up: 0.5 days to 1 week
  4. Implementation and maintenance: 1 week to ongoing

Note: time estimates expressed above are in terms of expended hours, not overall duration. One day therefore represents 7.5 hours, though those hours may take place over several calendar days.

Contact us

If you'd like more information, have a question, or want to provide feedback, email

Content last reviewed 02 September 2021.