12.5: Finding the facts

Capture lessons in real time, not just at project end. Learn how continuous reflection stops insight from disappearing and improves future delivery.

If your project doesn’t work, look for the part that you didn’t think was important.

Arthur Bloch, Author

Key terms

Micro-lessons: Small but meaningful observations that improve how the team works day-to-day.

Desktop review: A structured review of project documents to rebuild the story of what actually happened over time.

Recall bias: The way memory edits or forgets events, making later reviews less accurate.

Lessons log/register: A running record where insights are captured as they occur.

After-Action Review (AAR): A short, structured conversation held immediately after an activity to capture what worked and what didn’t.

Valid insight: Information that directly relates to the project and is based on real evidence, not opinion.

Reliable insight: Information that someone else could read later and clearly understand the same way.

Current insight: Lessons that reflect how the project actually operated, not just how it was originally planned.

Sufficient insight: Enough information gathered to confidently understand an issue and act on it.


12.5.1: Real-time lessons learned

Most organizations treat “lessons learned” as an end-of-project activity — a workshop or a form to complete once delivery is done.

But by that point, something subtle has already happened: memory has edited the story.

We tend to remember the big issues and the most recent events, but we lose sight of the dozens of small but important micro-lessons that shaped the project in real time.

This means that by the time we hold the review, much of the real learning has already disappeared.

The thing is, though, projects don’t just produce lessons at the end — they generate them constantly:

  • An approval stalled because roles weren’t clear.
  • A small phrasing change in a stakeholder briefing softened resistance.
  • A template confused a user, prompting a workaround.
  • A two-minute clarification in a stand-up prevented a three-day delay.

These are gold, but because they are small and quickly resolved, they rarely make it into official reviews. Without real-time capture, the lived intelligence of delivery leaks away.

Some senior project professionals keep a personal lessons log or journal throughout delivery — short notes capturing what surprised them, what unlocked progress, or what nearly caused failure.

This doesn’t need to be formal — even two bullet points a week can build powerful insight over time.
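If you want a little more structure than a notebook, the sketch below shows one way a personal lessons log entry could be kept. It is a minimal illustration in Python; the field names are assumptions made for this example, not part of the templates used in this course.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class LessonEntry:
    """One row in a running lessons log (field names are illustrative only)."""
    logged_on: date          # record the lesson while it is still fresh
    trigger: str             # the observable event that prompted the note
    insight: str             # what it taught you
    category: str = "other"  # e.g. governance, scope, stakeholder, process
    action_taken: str = ""   # any live course correction made at the time

# Example: a micro-lesson captured the week it occurred
log = [
    LessonEntry(
        logged_on=date(2024, 5, 14),
        trigger="Approval stalled because sign-off roles were unclear",
        insight="Name the approver explicitly in every change request",
        category="governance",
        action_taken="Added an approver line to the CR template",
    )
]
print(log[0].insight)
```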

However, even if you don’t maintain a journal, the status report and change request templates already provided in this course contain a “Lessons Learned” field. This is deliberate.

That single entry does three powerful things:

  • It embeds reflection into normal workflow, so capture becomes habit, not a separate activity.
  • It prevents recall bias by recording insight while it’s still clear.
  • It accumulates small lessons over time, so by the end, you have a rich foundation before any formal review begins.

Insight has a brief window. As soon as a moment passes, the mind normalizes it — “That’s just how the project goes.”

Real-time lessons matter because they prevent end-of-project reports from becoming sanitized and overly generic, where insights are watered down into statements like “communication could have been better.”

Capturing lessons as they happen preserves the human insight and emotional truth of delivery, not just the procedural summary.

It also enables live course correction, allowing teams to adjust behavior and governance while the project is still active, rather than only informing the next one.

Over time, this approach builds cumulative organizational intelligence, rather than a collection of retrospective observations that no one revisits once the project has closed.

And as a very practical bonus, it means that when it’s time to write the final review, you’re not staring at a blank sheet of paper trying to remember what actually happened — most of the intelligence is already documented, ready to be refined.


12.5.2: Desktop review

By this stage, you’ve already begun capturing real-time lessons through status reports, change request logs, and possibly your own reflection notes. Those early insights are not just helpful observations — they are the first layer of your desktop review. They show you when and where tension, workarounds, innovation, or governance friction actually occurred in the project.

Now the task shifts from moment-by-moment capture to systematic reconstruction of the project’s journey using the documentary record. This is where you cross-check what people felt and recorded in real time against what the formal project record shows over the life of delivery. The goal is not to find fault — it is to establish a factual truth base before interviews, reflection workshops, or stakeholder conversations begin.

Reconstruct the journey

A desktop review involves examining documents from across the life of the project, including:

Formal records such as:

  • Original and revised project plans — to track how intent evolved over time.
  • Status reports and milestone dashboards — to identify pattern shifts and moments of strain.
  • Change requests and variation approvals — which often reveal where the scope definition or governance model came under pressure.
  • Risk and issue logs — showing where tension repeated or where responses escalated slowly.
  • Governance or steering committee minutes — especially noting delays, deferrals, or escalated decision points.
  • Financial reports — showing when cost allocations shifted or hidden work was absorbed into other lines.

Informal records such as:

  • Notes from internal stand-up summaries or quick debrief emails.
  • Ad-hoc workaround instructions that never made it into a formal plan.
  • Drafts or earlier versions of documents — not just final approved copies.
  • Email trails where approval “in principle” was given without formal variation.
  • Instant message threads where decisions were made verbally and later formalized (or never were).

As you review, lay the documents out as a narrative rather than as isolated artifacts. You are not just checking for compliance — you are storytelling with evidence. Ask yourself:

  • Were the right stakeholders kept in the loop at the right times?
  • Where were the same risks raised repeatedly?
  • Was a key lesson visible long before it was formally acknowledged?
  • What changed in governance tone around the third or fourth steering meeting?
  • Were changes made without formal approval?
  • Did scope language tighten, loosen, or get repurposed over time to accommodate pressure?
  • Were some lessons flagged early but ignored until late delivery stress made them urgent?

It’s important to note, though, that looking at only the final version of each document hides the story.

When you track how each document evolved, you start to see that most challenges do not suddenly appear from nowhere — they whisper throughout the project, usually in meeting notes or minor variation commentary, long before they surface as formal issues.
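As a minimal sketch of that reconstruction, the snippet below merges dated notes from several document sources into one chronological timeline. It assumes each record has already been captured with a date, source, and summary; the structure and sample entries are invented for illustration.

```python
from datetime import date

# Invented records noted from different document sources during review
records = [
    {"on": date(2024, 3, 2), "source": "risk log",
     "note": "Supplier lead time flagged as a risk"},
    {"on": date(2024, 4, 18), "source": "steering minutes",
     "note": "Supplier decision deferred to next meeting"},
    {"on": date(2024, 6, 5), "source": "change request",
     "note": "CR-07 raised to extend delivery by three weeks"},
]

# Sort into one chronological narrative, tagged by source
for rec in sorted(records, key=lambda r: r["on"]):
    print(f"{rec['on']}  [{rec['source']}]  {rec['note']}")
```

Laid out this way, a risk-log whisper in March, a deferral in April, and a formal change request in June read as one story rather than three unrelated artifacts.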

Setting the scene

The purpose of the desktop review is to build a neutral factual anchor before speaking to stakeholders.

This matters, because once people are involved, memory bias, political influence, and reputation management begin to color the story.

By building a timeline of documented reality, you are able to approach stakeholder interviews already understanding what happened — you are simply seeking clarity on why it happened.

After all, stakeholder time is scarce, especially after project closure. You may only get one chance to speak to a sponsor, supplier lead, or governance chair.

If that time is spent asking them to recall basic facts, you waste the opportunity and reduce the depth of insight you receive.

A thorough desktop review allows you to enter interviews with high-precision prompts, such as:

  • “I noticed three consecutive change requests were delayed at the governance level — was that due to process or sponsor availability?”
  • “There’s a shift in tone in the weekly report around Month 5 — did stakeholder pressure increase at that time?”

When stakeholders sense that you understand the journey and are not asking them to re-explain the basics, they speak more openly and at a strategic level rather than defaulting to surface commentary.

As we shall see, desktop review builds your foundation, stakeholder interviews reveal the meaning, and truth is found in the alignment between the two.


12.5.3: Stakeholder interviews

Stakeholder interviews are where you begin to layer lived experience onto the recorded evidence.

This is where you uncover the why behind the what, and where you surface insights that no document will ever show you — the reactions, resistance, short-cuts, governance friction, and political realities that shaped delivery.

However, before you ask the first question, pause and check your own perspective.

Your desktop review will already have shaped a view of what happened — but remember, documents show what was recorded (and what people chose to record), not necessarily what was experienced.

If you enter interviews convinced you already know the answers, your questioning will unconsciously seek confirmation, not insight. The goal is not validation, but exploration.

To that end, stakeholder input does not always need to come from one-on-one interviews. In fact, different formats unlock different types of insight:

  • One-on-one interviews are ideal for senior stakeholders or those with high influence and political awareness.
  • Small lessons learned workshops or focus groups work well for operational teams and delivery staff, who build on each other’s observations.
  • Quick surveys or structured questionnaires can be useful for capturing input from lower power/interest stakeholders who hold frontline insight (users, support teams, mid-level contributors).

Pro tip: For structured questioning, refer back to the Project Review Template shared in the previous lesson, which includes example prompts that help move the conversation from generic commentary to actionable intelligence.

Remember: lessons learned are not only about finding what went wrong. Ask:

  • “Did anything work particularly well — and why?”
  • “Were there moments where you saw strong governance or smooth collaboration? What made that possible?”
  • “Did we succeed because of good luck, or good management?”

This last question is powerful — it distinguishes good outcomes produced by good systems from good outcomes that we probably didn’t deserve!

Also, if a stakeholder says, “No real lessons from my area,” challenge gently but clearly: “Is it truly that nothing at all could have been improved or intentionally repeated? No early warning signs, no surprises, no helpful behaviors worth capturing?”

There is always a lesson.

Even smooth delivery is a lesson — maybe your early stakeholder alignment strategy worked. Maybe your change control process finally clicked.

Positive lessons are as valuable as corrective ones, but they are often overlooked because we only go hunting for problems.

Ultimately, interviews are not about gathering quotes. They are about extracting systemic insight:

  • “What does this reveal about how we handle change?”
  • “What condition made that success possible?”
  • “What does this tell us about how our stakeholders engage under pressure?”

Other tips include:

  • Ask open-ended questions
  • Practice your active listening
  • Use silence
  • Respect ‘off-the-record’ confidences (although you should always try to independently verify them)
  • Explore side issues when they are relevant, but keep to the agenda
  • Let the interviewee ask questions of you, and
  • Reboot the interview, if necessary.

Stakeholder interviews, when done well, feel less like audits and more like high-value professional conversations about how delivery could be made easier, safer, or smarter next time.

I like to describe them as conversations, not interrogations.

And once stakeholders understand you’re looking for system improvements, not personal failings, they shift from defense to insight, which is the entire goal of this stage.


Don’t forget: staff and contractors often finish their work long before the project formally ends — sometimes weeks, months, or even years earlier.

By the time closure comes around, they may have moved to new roles, organizations, or even industries.

To the extent possible, capture their insights as they leave rather than hunting them down later.

Exit interviews, quick debriefs, or even a few bullet points in an email can preserve valuable lessons that would otherwise disappear with them.

Each departing team member carries a piece of the project’s story — don’t let those pieces scatter before the review formally begins.


12.5.4: Data quality

Before any insight becomes part of your organization’s lessons learned register, you should test the quality of the information you’ve gathered.

Not all data is good data, and as with all decision-making processes, GIGO — Garbage In, Garbage Out — applies here too.

The goal is not simply to collect comments but to ensure that what gets recorded as a lesson is accurate, representative, and useful to future projects.

To do this, apply four tests:

Valid

Valid data accurately reflects what actually occurred during the project.

It should relate directly to the specific project under review and sit within the scope of your review authority or terms of reference.

Insight drawn from unrelated incidents, or based on personal opinion without supporting evidence, weakens the integrity of the review.

Reliable

Reliable data is information that would be understood consistently by different reviewers, even if they interpret its implications differently.

For example, earned value data offers a reliable snapshot of performance — everyone understands what it means to be 10% behind schedule, even if opinions differ on why it happened.

We all might take different actions based on that information; however, the data itself is not in dispute.
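To see why earned value passes the reliability test, here is a worked example of the schedule performance index (SPI) behind a “10% behind schedule” statement; the figures are invented for illustration.

```python
# Invented figures for illustration only
planned_value = 100_000  # PV: budgeted cost of work scheduled to date
earned_value = 90_000    # EV: budgeted cost of work actually completed

# Schedule performance index: SPI = EV / PV (below 1.0 means behind schedule)
spi = earned_value / planned_value
print(f"SPI = {spi:.2f}")  # prints "SPI = 0.90", i.e. 10% behind schedule
```

Any reviewer reading SPI = 0.90 understands the same fact; the debate, rightly, is over the cause.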

Current

Data must reflect the most current and accurate state of the project.

Projects evolve, and so does understanding. Reviewing only the original project plan without considering later revisions would give an outdated view.

A project that failed to track ongoing changes may produce non-current insight, meaning your review is based on a version of reality that no longer matches how delivery actually happened.

Sufficient

Finally, sufficiency means that you should gather data until you have enough information to draw reliable conclusions and make high-quality recommendations.

You know you’ve reached “enough” when additional data stops changing your understanding of the issue and only confirms what you’ve already observed — this is the point where patterns are stable, not still emerging.
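If you want to make the four tests operational, the sketch below applies them as a simple gate before a candidate lesson enters the register. The field names and the two-source threshold for sufficiency are assumptions made for this example, not a formal standard.

```python
def failed_tests(lesson: dict) -> list[str]:
    """Return the quality tests a candidate lesson fails (criteria assumed)."""
    failures = []
    if not lesson.get("evidence"):                   # valid: evidence, not opinion
        failures.append("valid")
    if not lesson.get("plain_statement"):            # reliable: readable by others
        failures.append("reliable")
    if not lesson.get("reflects_delivery"):          # current: as delivered, not as planned
        failures.append("current")
    if lesson.get("corroborating_sources", 0) < 2:   # sufficient: assumed threshold
        failures.append("sufficient")
    return failures

candidate = {
    "plain_statement": "Late supplier sign-off delayed three milestones",
    "evidence": ["CR-07", "steering minutes, April"],
    "reflects_delivery": True,
    "corroborating_sources": 2,
}
print(failed_tests(candidate) or "Passes all four tests")
```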

Quite often, multiple data sets will throw up conflicting information.

For example, two stakeholders might have very different impressions of project success, while the project plan tells a different story again.

The skill of the reviewer lies in triangulating these multiple points of data to arrive at the ‘true’ picture of what went on in the project.
