Analysis & Modeling · BABOK KA 5

Modeling — make the invisible visible

A model is a deliberate simplification that answers a named question for a named audience. This module covers modeling as a discipline — why we model, how to choose a notation, how to move between abstraction levels — with process modeling (BPMN) as the worked example throughout.

Foundations

Why we model

Make the invisible visible

Processes, rules, and data relationships exist whether they are documented or not. A model surfaces them so they can be inspected, debated, and changed deliberately.

Create a single source of truth

A shared model gives business, IT, ops, and audit one artefact to point at — replacing tribal knowledge and contradictory PowerPoints.

Reduce ambiguity before code is written

Defects found in a model cost a fraction of what they cost when found in production. Modeling is the cheapest place to argue.

Compare current and future states

As-is and to-be models, side by side, expose what is genuinely changing — and force conversations about transition requirements.

Support traceability

Each model element can be linked back to a need, a stakeholder requirement, a design, and a test — closing the loop BABOK calls Requirements Life Cycle Management.
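
Traceability links are just data. As a minimal sketch — the element names and IDs below are invented for illustration — a register mapping model elements to their justifying artefacts can mechanically expose where the loop is open:

```python
# A minimal traceability register: each model element links back to the
# need, requirement, design, and test that justify it. All IDs invented.
trace = {
    "task:issue-refund": {
        "need": "NEED-07",          # reduce refund cycle time
        "requirement": "REQ-112",   # refunds issued within 48h
        "design": "DES-31",         # refund service API
        "test": "TC-204",           # end-to-end refund test
    },
}

def untraced(elements, register):
    """Model elements with no requirement link — gaps in the loop."""
    return [e for e in elements if not register.get(e, {}).get("requirement")]

print(untraced(["task:issue-refund", "task:notify-customer"], trace))
# → ['task:notify-customer']
```

Even this toy version makes the Requirements Life Cycle Management point: an untraced element is either missing a requirement or shouldn't be in the model.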

How to think about depth

Levels of abstraction

Level 1 — Context / Enterprise

Sponsors, executives, enterprise architects.

What is the system, who interacts with it, and what value does it produce?

Examples

  • Context diagram of a claims platform showing claimant, broker, regulator, payments provider.
  • Value-stream map for 'order to cash' across five departments.
  • Capability map of a retail business with ~25 first-level capabilities.

Typical artefacts

  • Context diagram
  • Value stream
  • Capability map
  • Business model canvas

Level 2 — Business / Conceptual

Process owners, business stakeholders, BAs.

How does the work flow end-to-end, who does what, and where are the decisions?

Examples

  • BPMN swimlane of the refund process with five lanes and two gateways.
  • Conceptual data model for 'Customer, Account, Transaction'.
  • Customer journey map for first-time mortgage applicants.

Typical artefacts

  • BPMN at descriptive level
  • Conceptual ERD
  • Customer journey
  • Service blueprint

Level 3 — Logical / Analytical

BAs, designers, solution architects, QA leads.

Exactly which steps, rules, states, and data structures must exist for the solution to behave correctly?

Examples

  • BPMN at analytical level with message flows, exception paths, timer events.
  • Logical data model with attributes, keys, and cardinalities.
  • State diagram for a Loan Application moving from Draft through Approved or Declined.
  • Decision table for premium-discount eligibility.

Typical artefacts

  • Analytical BPMN
  • Logical ERD
  • State diagram
  • Decision table / DMN

Level 4 — Physical / Executable

Developers, integration engineers, DBAs, ops.

How is this expressed in the runtime — services, queues, tables, code?

Examples

  • Executable BPMN deployed in a process engine with service tasks bound to APIs.
  • Physical data model with tables, columns, indexes, and partitioning.
  • Sequence diagram showing the exact API calls between the front end, BFF, and core ledger.

Typical artefacts

  • Executable BPMN
  • Physical DDL
  • Sequence diagram
  • Deployment diagram

The toolbox

Notations every BA should recognise

Business Process Model and Notation (BPMN)

Process

Steward: Object Management Group (OMG)

The de-facto standard for end-to-end business processes — readable by business and executable by engines.

Best for
  • Cross-functional flows with hand-offs and decisions.
  • As-is vs to-be comparison.
  • Processes that may later be automated in a workflow engine.
Weak for
  • Knowledge-intensive, event-driven case work — use CMMN.
  • The decision logic itself — extract it to DMN.
  • Pure data structures — use an ERD.

BA use: Default choice for most BA process work. Stay at descriptive level (tasks, gateways, events, lanes) unless the audience needs analytical detail.

Decision Model and Notation (DMN)

Decision

Steward: Object Management Group (OMG)

Models business decisions and the rules behind them — separates 'how we decide' from 'how we flow'.

Best for
  • Eligibility, pricing, routing, scoring.
  • Rule sets that change more often than the surrounding process.
  • Compliance areas where each rule must be auditable.
Weak for
  • Single yes/no rules — a sentence is enough.
  • Continuous calculations better expressed as formulas.

BA use: Pull complex decisions out of BPMN tasks into a DMN decision table or DRD. The process stays simple and the rules become independently testable.
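
To make "independently testable" concrete, here is a sketch of a DMN-style decision table for the premium-discount example — the rules, thresholds, and input names are invented for illustration, and the table uses the FIRST hit policy (first matching row wins):

```python
# A decision table as data, not code: each row is one auditable rule.
# Hit policy FIRST — evaluate top-down, first matching row wins.
# All rules and thresholds are illustrative, not a real rate card.
RULES = [
    {"min_years": 5, "max_claims": 0, "discount": 0.15},
    {"min_years": 5, "max_claims": 1, "discount": 0.10},
    {"min_years": 2, "max_claims": 0, "discount": 0.05},
]

def discount(years_as_customer: int, claims_last_year: int) -> float:
    """Evaluate the table top-down; return 0.0 when no row matches."""
    for rule in RULES:
        if (years_as_customer >= rule["min_years"]
                and claims_last_year <= rule["max_claims"]):
            return rule["discount"]
    return 0.0

print(discount(6, 0))  # → 0.15
print(discount(3, 0))  # → 0.05
print(discount(1, 2))  # → 0.0
```

Because the rules are data, each row can be reviewed, versioned, and unit-tested without touching the process that invokes the decision.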

Case Management Model and Notation (CMMN)

Case

Steward: Object Management Group (OMG)

Models knowledge-work that is event-driven and discretionary, not a fixed sequence.

Best for
  • Investigations, claims with discretionary tasks, complex onboarding.
  • Work where the next step depends on what arrives, not a script.
Weak for
  • Highly repeatable transactional flows — use BPMN.

BA use: Use when stakeholders keep saying 'it depends'. CMMN lets you model the pool of available tasks without forcing a false sequence.

Unified Modeling Language (UML)

Behaviour

Steward: Object Management Group (OMG)

A family of 14 diagram types covering structure (class, component) and behaviour (use case, sequence, state, activity).

Best for
  • System-level analysis where actor / system interactions matter.
  • State, sequence, and class diagrams for software-intensive solutions.
Weak for
  • Pure end-to-end business flow — BPMN reads better for non-technical audiences.

BA use: Pick the smallest UML subset that answers the question — usually use case, sequence, state, or class. Avoid using UML as a checklist.

Entity Relationship Diagram (ERD)

Data

Steward: Peter Chen / industry convention

Models entities, attributes, and relationships — the canonical way to talk about data.

Best for
  • Reporting, integration, master data, regulatory data.
  • Establishing a shared vocabulary for the domain.
Weak for
  • Workflow or behaviour — pair with BPMN or state diagrams.

BA use: Start with a conceptual ERD (entities + relationships) before attributes. The relationships drive most of the design conversations.
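
The conceptual-before-attributes advice can be sketched in code. Entity names follow the 'Customer, Account, Transaction' example earlier in this module; the attributes and cardinalities are illustrative assumptions, not a committed design:

```python
from dataclasses import dataclass, field

# Conceptual level: entities and relationships only. The attributes shown
# are placeholders — the relationships carry the design conversation.
@dataclass
class Customer:
    customer_id: str
    accounts: list["Account"] = field(default_factory=list)  # Customer 1..* Account

@dataclass
class Account:
    account_id: str
    owner: "Customer"                                        # exactly one owner (an assumption to debate)
    transactions: list["Transaction"] = field(default_factory=list)  # Account 1..* Transaction

@dataclass
class Transaction:
    transaction_id: str
    account: "Account"                                       # each transaction belongs to one account
    amount: float

# The cardinalities above encode the questions that drive the workshop:
# can a Customer exist with no Account? Can an Account have joint owners?
alice = Customer("C-1")
acct = Account("A-1", owner=alice)
alice.accounts.append(acct)
acct.transactions.append(Transaction("T-1", account=acct, amount=-42.50))
print(acct.transactions[0].account.owner.customer_id)  # → C-1
```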

Data Flow Diagram (DFD)

Data

Steward: DeMarco / Yourdon (1970s)

Shows how data moves between processes, stores, and external entities — orthogonal to control flow.

Best for
  • Reporting and ETL pipelines, integration mapping.
  • Privacy and data-minimisation reviews.
Weak for
  • Decision logic and timing.

BA use: Use a context diagram (DFD level 0) for scoping; level-1 DFDs are useful when the conversation is about who sees which data, not who does what when.

State Diagram

Behaviour

Steward: UML

Models the legal states of an object and the events that cause transitions.

Best for
  • Lifecycle-heavy entities: orders, claims, applications, tickets.
  • Surfacing illegal transitions and missing terminal states.
Weak for
  • Cross-actor processes — that's BPMN.

BA use: Whenever stakeholders say 'in status X we can…', draw the state diagram first. It prevents whole classes of defect.
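
The Loan Application lifecycle from the Level 3 examples can be sketched as an explicit transition map — event names here are invented; the states come from the example above:

```python
# Legal states and transitions for a Loan Application. Anything not in
# this map is an illegal transition — exactly the defect class a state
# diagram surfaces. Event names are illustrative.
LEGAL = {
    "Draft":     {"submit": "Submitted"},
    "Submitted": {"approve": "Approved", "decline": "Declined"},
    "Approved":  {},   # terminal
    "Declined":  {},   # terminal
}

class LoanApplication:
    def __init__(self):
        self.state = "Draft"

    def fire(self, event: str) -> str:
        transitions = LEGAL[self.state]
        if event not in transitions:
            raise ValueError(f"illegal transition {event!r} in state {self.state!r}")
        self.state = transitions[event]
        return self.state

app = LoanApplication()
app.fire("submit")
print(app.fire("approve"))  # → Approved
# app.fire("decline") would now raise — Approved is terminal.
```

Writing the map forces the questions the diagram forces: is there a path back from Declined? What happens to an application abandoned in Draft?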

Sequence Diagram

Behaviour

Steward: UML

Time-ordered exchange of messages between actors and systems.

Best for
  • Interface design, API conversations, integration debugging.
  • Showing exactly where synchronous vs asynchronous boundaries lie.
Weak for
  • End-to-end business processes — too low-level for non-technical readers.

BA use: Reach for it when the question is about ordering, latency, or who calls whom — not about who does what.

Customer Journey Map

Experience

Steward: Service design / CX practice

End-to-end view of a customer's experience — steps, channels, emotions, and pain points.

Best for
  • Outside-in framing of a problem.
  • Aligning multiple internal teams around a single customer narrative.
Weak for
  • Internal hand-offs and system behaviour — pair with a service blueprint or BPMN.

BA use: Use a journey map to set context, then drop into BPMN at specific 'moments of truth' to show internal flow.

Service Blueprint

Experience

Steward: Service design

Extends a journey map with front-stage, back-stage, and supporting processes.

Best for
  • Service redesign where the visible change requires invisible changes behind the line of visibility.
Weak for
  • Pure technical interface work.

BA use: The blueprint is where customer experience meets operating model. It's a powerful BA artefact in service-heavy industries.

Process modeling — worked notation

BPMN element cheat-sheet

The descriptive subset of BPMN — flow objects, connectors, swimlanes, and artefacts — covers the vast majority of business analysis work. Each element below includes the failure mode that most often hides in poorly maintained models.

| Group | Element | Symbol | Meaning | Common pitfall |
| --- | --- | --- | --- | --- |
| Flow Object | Task | ▭ (rounded) | An atomic unit of work performed by a person or system. | Tasks named with nouns ('Invoice') instead of verb-object ('Issue invoice') hide the actual work. |
| Flow Object | Sub-process | ▭ ⊞ | A task that hides further detail; can be expanded into its own diagram. | Stuffing 20 steps into a flat diagram instead of nesting — readability collapses past ~10 elements per page. |
| Flow Object | Start event | ○ (thin) | Triggers the process. Can be message, timer, signal, or none. | Using 'none' starts where a message or timer is the actual trigger — readers can't tell what kicks the process off. |
| Flow Object | End event | ○ (thick) | Terminates a path. Multiple end events are allowed and often clearer than one. | A single 'end' that hides whether the process succeeded, failed, or was cancelled. |
| Flow Object | Intermediate event | ◎ (double) | Something happens during the flow — typically a timer, message receipt, or escalation. | Modelling waits as tasks ('Wait 24h') instead of timer events. |
| Flow Object | Exclusive gateway (XOR) | ◇ with × | Routes the token down exactly one outgoing path based on a condition. | Unlabelled gateways. Every outgoing arc must carry the condition that selects it. |
| Flow Object | Parallel gateway (AND) | ◇ with + | Splits flow into parallel paths and joins them when all complete. | Using XOR where parallel is meant — readers think the paths are alternatives, not concurrent. |
| Flow Object | Inclusive gateway (OR) | ◇ with ○ | Activates one or more outgoing paths based on conditions; the join waits only for the activated ones. | Reaching for an inclusive gateway when an exclusive one would do — extra notation, no extra clarity. |
| Connecting Object | Sequence flow | → (solid) | Order of execution within a single pool. | Drawing sequence flow across pools — that should be a message flow. |
| Connecting Object | Message flow | ⇢ (dashed) | Communication between two pools (separate participants). | Forgetting to model the responding pool, leaving the message hanging. |
| Swimlane | Pool | ▭ (large) | A participant — a whole organisation, system, or actor with its own process. | Using a pool for a department in the same company; that's a lane. |
| Swimlane | Lane | ▭ (sub-divided) | A role or department within a pool, used to assign responsibility for tasks. | Lanes named after individuals — the model breaks the day they leave. |
| Artefact | Data object | ▢ with corner fold | Information consumed or produced by a task. | Decorative use only — a data object should change the reader's understanding of the flow. |
| Artefact | Annotation | [ ] | A free-text note attached to part of the model. | Annotations carrying business rules that should sit in DMN or in the requirement itself. |
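
The exclusive-gateway rule — every outgoing arc must carry the condition that selects it — can even be checked mechanically. A sketch, with the refund scenario and threshold invented for illustration:

```python
# An XOR gateway as data: each outgoing arc carries a label, a guard, and
# a target. Exactly one guard must match any token — anything else is a
# modelling defect, not a runtime edge case.
def route_xor(gateway, token):
    matches = [target for label, guard, target in gateway if guard(token)]
    if len(matches) != 1:
        raise ValueError(f"XOR gateway matched {len(matches)} arcs, expected exactly 1")
    return matches[0]

REFUND_GATEWAY = [
    ("amount <= 100", lambda t: t["amount"] <= 100, "auto_approve"),
    ("amount > 100",  lambda t: t["amount"] > 100,  "manual_review"),
]

print(route_xor(REFUND_GATEWAY, {"amount": 80}))   # → auto_approve
print(route_xor(REFUND_GATEWAY, {"amount": 250}))  # → manual_review
```

The labels exist for the reader, exactly as on the diagram; the guards exist so overlapping or missing conditions fail loudly instead of silently.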

Method

As-is to to-be — a seven-step playbook

Process modeling is most valuable when it joins up the diagnosis of the current state with the design of a future state and the transition between them. This playbook is the spine; the techniques in each step are interchangeable.

  1. Frame the process

    Agree the trigger, end events, and process owner before drawing anything.

    Techniques

    • Interviews
    • Document analysis
    • Context diagram

    Output

    One-paragraph process description and a SIPOC (Suppliers, Inputs, Process, Outputs, Customers).

  2. Model the as-is

    Capture the process as it actually runs today — not the policy version, the real version.

    Techniques

    • Observation
    • Walkthroughs
    • BPMN at descriptive level

    Output

    As-is BPMN swimlane with cycle times and handoff counts annotated.

  3. Diagnose

    Identify waste, hand-offs, rework loops, manual workarounds, and rule-heavy steps that hide complexity.

    Techniques

    • Root cause analysis (5 Whys, fishbone)
    • Value-add analysis
    • Process metrics

    Output

    Annotated as-is model + prioritised list of pain points with evidence.

  4. Design the to-be

    Design the future-state process to eliminate or contain the diagnosed problems — automation is one option, not the only one.

    Techniques

    • Workshops
    • Process modeling
    • DMN for extracted rules
    • Service blueprint

    Output

    To-be BPMN with explicit decisions, events, and message flows; rules in DMN.

  5. Define transition requirements

    Capture what is needed only during the move from as-is to to-be.

    Techniques

    • Gap analysis
    • Stakeholder analysis
    • Risk analysis

    Output

    Transition requirements: data migration, training, dual-running, deprecation plan.

  6. Validate and baseline

    Walk the to-be model with the people who will run it; baseline once they can defend it.

    Techniques

    • Walkthroughs
    • Reviews
    • Acceptance criteria per task

    Output

    Signed-off to-be model with traceability into stories or use cases.

  7. Measure after delivery

    Compare actual cycle time, hand-offs, and defect rates against the to-be assumptions.

    Techniques

    • Solution Performance Measures
    • Process metrics
    • Retrospectives

    Output

    Evidence of value realised — feeds the next round of process improvement.

Decision aid

Pick the right model for the question

If you can write the question down in one sentence, this table tells you which artefact to reach for — and which to avoid.

| If the question is… | Reach for | Avoid |
| --- | --- | --- |
| How does the work flow end-to-end across people and systems? | BPMN swimlane (descriptive) | Sequence diagram (too low-level for non-technical readers). |
| What rules govern an outcome given several inputs? | Decision table or DMN | Hiding the rules inside a BPMN task. |
| What states can this thing be in, and how does it move between them? | State diagram (UML) | BPMN — it models flow, not lifecycle. |
| What are the business objects and how do they relate? | Conceptual ERD | Class diagram (premature implementation detail). |
| Where does data come from, where does it go, who sees it? | Data flow diagram | BPMN — control flow obscures the data view. |
| What does the customer experience and feel through this? | Customer journey or service blueprint | BPMN as the only artefact — it hides the outside-in view. |
| Exactly which messages are exchanged across systems and in what order? | Sequence diagram | BPMN — message flows shown across pools are coarser than sequence diagrams. |
| What does the work look like when there is no fixed sequence? | CMMN or a discretionary task list | Forcing BPMN — false structure becomes a defect source. |
| What does the system look like from outside the boundary? | Context diagram (DFD level 0) | Skipping straight to detailed flows. |

What to avoid

Modeling anti-patterns

The wallpaper diagram

Symptom: A single BPMN page with 60+ tasks, four font sizes, and no swimlanes.

Why it hurts: Nobody reads it; defects hide in the noise; updates are abandoned.

Fix: Decompose into a Level-2 overview with Level-3 sub-processes. Aim for ≤10 elements per page.

Modelling the policy, not the practice

Symptom: The as-is matches the procedure manual exactly — but operators do something different every day.

Why it hurts: The to-be solves a problem nobody has, while the real problem (the workaround) survives untouched.

Fix: Observe the work, not just the documentation. Annotate where practice diverges from policy.

Rules buried inside tasks

Symptom: A task labelled 'Calculate eligibility' hides 40 conditions in a developer's head.

Why it hurts: The model looks tidy; the actual logic is unaudited and untested.

Fix: Pull the rules into a decision table or DMN model and reference it from the task.

Notation soup

Symptom: Half BPMN, half flowchart, with UML actors thrown in.

Why it hurts: Readers can't tell what each shape means; tooling can't validate the model.

Fix: Pick one notation per audience and stick to its rules. Use a legend if you must mix.

Automating the mess

Symptom: The to-be model is the as-is model with 'system' lanes added and humans removed.

Why it hurts: You bake in the existing waste, hand-offs, and rework — at higher speed and lower visibility.

Fix: Diagnose and redesign first. Automation is the last 20% of process improvement, not the first.

One-shot modeling

Symptom: The model is built for sign-off and never opened again.

Why it hurts: Models drift, become wrong, and are mistrusted — feeding the cycle of starting over.

Fix: Treat the model as a living artefact with an owner, a review cadence, and a change log.

Modeling without a question

Symptom: The team starts BPMN-ing because 'we should have a process map'.

Why it hurts: Lots of effort, no decision improved.

Fix: Every model should answer a named question for a named audience. If you can't write that down, don't draw it.

BABOK

Where modeling sits in the BABOK

Strategy Analysis

Capability and value-stream models frame the change before requirements work begins.

Elicitation & Collaboration

Models are an elicitation device — drawing surfaces what stakeholders cannot say in prose.

Requirements Analysis & Design Definition

The KA where most modeling lives. Process, data, decision, state, and behaviour models all sit here.

Requirements Life Cycle Management

Models give traceability anchors — each element can map to a requirement, design, and test.

Solution Evaluation

Comparing measured performance against the to-be model is how value realisation is evidenced.

Business Analysis Planning & Monitoring

The BA approach decides which models will be produced, at which level, and for which audience.

Practitioner notes

Tooling discipline

Notation conformance

OMG specs (BPMN 2.0, DMN 1.x, CMMN 1.1, UML 2.x) are freely available. Most enterprise tools claim conformance; in practice, sticking to the descriptive subset keeps you portable.

Source of truth

Treat the model file (BPMN/DMN XML, .uml, JSON) as source — diagrams are derived. Store in version control alongside requirements artefacts.
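
A sketch of what treating the model file as source enables: Python's standard-library ElementTree reading BPMN 2.0 XML and listing its tasks for a diff-friendly review. The two-task refund process below is invented for illustration.

```python
import xml.etree.ElementTree as ET

# BPMN 2.0 model elements live in the OMG MODEL namespace.
BPMN_NS = {"bpmn": "http://www.omg.org/spec/BPMN/20100524/MODEL"}

# A minimal, illustrative model — in practice this string is the .bpmn
# file checked into version control.
MODEL = """<?xml version="1.0" encoding="UTF-8"?>
<bpmn:definitions xmlns:bpmn="http://www.omg.org/spec/BPMN/20100524/MODEL">
  <bpmn:process id="refund" isExecutable="false">
    <bpmn:task id="t1" name="Validate request"/>
    <bpmn:task id="t2" name="Issue refund"/>
  </bpmn:process>
</bpmn:definitions>"""

root = ET.fromstring(MODEL)
tasks = [t.get("name") for t in root.findall(".//bpmn:task", BPMN_NS)]
print(tasks)  # → ['Validate request', 'Issue refund']
```

The same idea scales to review checks — flag unnamed tasks, unlabelled gateway arcs, or elements with no traceability annotation — run in CI against the model file, not the picture.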

Round-tripping with engines

Executable BPMN tools can run the same XML you modeled, but only if you avoid notation that doesn't have execution semantics (e.g. ad-hoc sub-processes, free-text annotations carrying logic).

Accessibility

A model that only renders at A0 print size is inaccessible. Provide a structured-text walkthrough alongside any non-trivial diagram.

Deep dives

12-section deep modules

Modeling as a discipline plus dedicated deep modules for BPMN, DMN, CMMN, UML, and data modeling — each with worked examples, anti-patterns, and a practice quiz.

discipline · 12 sections + quiz

Deep learning module

A model is a deliberate simplification that answers a named question for a named audience. Modeling is the BA discipline of choosing what to leave out so the thing that matters becomes visible.

Problem solved: Stakeholders argue past each other when they describe complex systems in prose. Models pin the conversation to a shared visual artefact so disagreement can be resolved at the level of a specific arrow, decision, or entity instead of vibes.


bpmn · 12 sections + quiz

Deep learning module

Business Process Model and Notation (BPMN) is the OMG-maintained standard for visually modelling end-to-end business processes. Same symbols, same meaning, anywhere in the world.

Problem solved: Before BPMN, every team drew flowcharts with their own symbols. Reading another team's process model required learning a new dialect every time. BPMN gives BAs, architects, and process engineers a single shared vocabulary for activities, events, gateways, and swimlanes.


dmn · 12 sections + quiz

Deep learning module

Decision Model and Notation (DMN) is the OMG standard for modelling business decisions and the rules behind them — separately from the process that invokes them.

Problem solved: When complex rules live inside tasks, they're invisible, untestable, and hard to change. Pulling them into DMN makes them auditable, versionable, testable, and reusable across processes.


cmmn · 12 sections + quiz

Deep learning module

Case Management Model and Notation (CMMN) is the OMG standard for knowledge work where the next step is decided by a human in the moment, not by a fixed sequence. Each case is unique.

Problem solved: BPMN forces a defined sequence. Investigations, complex onboardings, claims with discretionary tasks, and care plans don't have one — handlers pick what to do next based on judgement. Forcing them into BPMN produces fake processes everyone ignores.


uml · 12 sections + quiz

Deep learning module

Unified Modeling Language (UML) is the OMG's suite for modelling software systems — what they're made of (structure) and what they do (behaviour). The diagrams BAs touch most are use case, activity, state, sequence, and class.

Problem solved: Process and data notations don't capture how a software system behaves over time, how objects collaborate, or how a single business object's lifecycle moves between states. UML diagrams give the BA + dev team a shared way to specify those.


data · 12 sections + quiz

Deep learning module

Data modelling captures the things a business cares about (entities), what we know about them (attributes), and how they relate. It runs from business-language concept models down to database tables.

Problem solved: Systems disagree about what 'customer' means. Reports double-count. Integrations break when one system stores something the other doesn't. A clear data model resolves vocabulary first, then structure, before code is written.
