Over the last few years, government has been quietly but deliberately raising expectations around how portfolios, programmes and projects are governed, reported and assured.
Two documents in particular codify these expectations: the Programme and Project Data Standard and the Government Functional Standard GovS 002 – Project Delivery.
Taken together, they offer more than guidance for the public sector alone. They set out a view of what good looks like for organisations managing complex programmes at scale.
The Programme and Project Data Standard is underpinned by six dimensions:
Completeness – ensuring that all required records and essential values are present. This standard provides guidance on the fundamental attributes of each data entity.
Uniqueness – reducing duplication of records. This standard requires organisations to use unique identifiers for data entities.
Consistency – avoiding contradictions between related data points. Attribute definitions and validation rules in this standard help maintain logical relationships between data points.
Timeliness – keeping data current and relevant. This standard sets minimum update frequencies for each attribute to support up-to-date data.
Validity – ensuring data is in the correct format and within expected ranges. This standard provides formats and category lists to guide data entry.
Accuracy – aligning data with reality. This standard helps reduce ambiguity and supports verification by promoting common definitions and formatting requirements.
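To make the six dimensions concrete, here is a minimal sketch of how they might translate into automated checks on project records. The field names, category list and update threshold are illustrative assumptions, not drawn from the standard itself.

```python
from datetime import date, timedelta

# Illustrative project records; field names are assumptions, not the standard's.
projects = [
    {"id": "PRJ-001", "name": "Runway Upgrade", "status": "In Flight",
     "cost_gbp": 1_200_000, "last_updated": date(2024, 5, 1)},
    {"id": "PRJ-001", "name": "Runway Upgrade (copy)", "status": "Unknown",
     "cost_gbp": -50, "last_updated": date(2023, 1, 1)},
]

REQUIRED_FIELDS = {"id", "name", "status", "cost_gbp", "last_updated"}  # completeness
VALID_STATUSES = {"Proposed", "In Flight", "Complete", "Closed"}        # validity
MAX_AGE = timedelta(days=90)                                            # timeliness

def check(records, today=date(2024, 6, 1)):
    issues = []
    seen_ids = set()
    for r in records:
        missing = REQUIRED_FIELDS - r.keys()
        if missing:                                   # completeness
            issues.append((r.get("id"), f"missing fields: {sorted(missing)}"))
            continue
        if r["id"] in seen_ids:                       # uniqueness
            issues.append((r["id"], "duplicate identifier"))
        seen_ids.add(r["id"])
        if r["status"] not in VALID_STATUSES:         # validity (category list)
            issues.append((r["id"], f"invalid status: {r['status']}"))
        if r["cost_gbp"] < 0:                         # validity (expected range)
            issues.append((r["id"], "negative cost"))
        if today - r["last_updated"] > MAX_AGE:       # timeliness
            issues.append((r["id"], "stale record"))
    return issues

for project_id, problem in check(projects):
    print(project_id, "->", problem)
```

Consistency and accuracy are harder to automate fully, since they depend on cross-referencing related records and the real world, but the same rule-based approach extends naturally to relationship checks between entities.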
As we move at pace towards a world where digitisation becomes non-negotiable and a uniform enterprise data structure underpins any AI initiative, both Functional Standards implicitly describe an operating environment that many organisations will now recognise as difficult to sustain: one of fragmented tools and legacy approaches dominated by manual workarounds and spreadsheets.
The Programme and Project Data Standard acknowledges the limitations of organisations’ programme and project data environments, which,
“may mean immediate compliance is impossible. When such limitations exist, organisations shall develop implementation plans to overcome them and to work towards full compliance.”
The standard implicitly makes a strong case for modernisation of legacy architectures and standardised project processes by:
Requiring highly structured, interrelated and frequently updated data
Highlighting that data quality, consistency, traceability and governance are non-negotiable
Acknowledging limitations of existing IT and data environments
Emphasising that decision-making, assurance and value for money are enabled by good data
So what does this mean?
At the heart of the Programme and Project Data Standard is a push for consistency, structure and comparability, with the goal of driving value from the vast volumes of data residing in organisations that manage complex programmes.
It defines a common set of data fields across projects, programmes and portfolios, covering areas such as costs, schedules, risks, benefits, dependencies and ownership. The intent is clear: leaders should be able to answer basic but critical questions with confidence.
What is in flight?
How is it performing?
What are our risks?
What value is expected, and what is actually being realised?
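A common set of fields makes those questions answerable by query rather than by chasing status reports. The sketch below assumes a simplified data model inspired by the areas the standard covers (costs, risks, ownership, status); the class and field names are illustrative, not the standard's actual attribute definitions.

```python
from dataclasses import dataclass, field

@dataclass
class Risk:
    risk_id: str
    description: str
    severity: str  # e.g. "Low" / "Medium" / "High" (illustrative categories)

@dataclass
class Project:
    project_id: str
    name: str
    owner: str
    status: str
    budget_gbp: float
    forecast_gbp: float
    risks: list = field(default_factory=list)

@dataclass
class Portfolio:
    portfolio_id: str
    projects: list = field(default_factory=list)

    def in_flight(self):
        """What is in flight?"""
        return [p for p in self.projects if p.status == "In Flight"]

    def cost_variance(self):
        """How is it performing? Budget minus forecast, summed across projects."""
        return sum(p.budget_gbp - p.forecast_gbp for p in self.projects)

    def high_risks(self):
        """What are our risks?"""
        return [r for p in self.projects for r in p.risks if r.severity == "High"]
```

The point is not the particular fields, but that once every project carries the same structure, portfolio-level answers become one-line aggregations instead of manual consolidation exercises.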
The standard explicitly links data quality to decision-making and assurance, emphasising completeness, timeliness and consistency as foundations for confidence in delivery. This represents a shift away from narrative-heavy reporting towards evidence-based oversight, where performance and value can be examined across an entire portfolio rather than in isolation.
The challenge of disconnected tools
One of the more telling aspects of the Programme and Project Data Standard is its acknowledgement that organisations may struggle to comply immediately due to limitations in their existing “programme and project data environments”. This is careful language, but the implication is familiar: data scattered across spreadsheets, in-house tools and point solutions is hard to reconcile, hard to trust and expensive to maintain.
The standard assumes that data can be validated, linked, version-controlled and reported at different levels with minimal manual effort. For organisations reliant on disconnected tooling, meeting those expectations often requires significant workarounds, manual consolidation and duplicated effort, all of which introduce risk at exactly the point where clarity is most needed.
Governance, accountability and traceability
The Programme and Project Data Standard places strong emphasis on accountability. It requires clear ownership of projects, risks, benefits and costs, along with audit metadata showing when information was created or changed. GovS 002 reinforces this by making governance roles and decision points explicit, from portfolio level through to individual projects.
A recurring theme is traceability: from policy intent to portfolio objectives, from objectives to programmes, and from programmes down to individual deliverables and benefits. This two-way traceability is presented not as administrative overhead, but as a prerequisite for effective change control, assurance and learning.
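Two-way traceability can be sketched as nothing more than a parent-child chain that can be walked in either direction. The item names below are invented for illustration and are not drawn from either standard.

```python
# Each delivery item records its parent; the chain runs deliverable ->
# programme -> objective -> policy. Names are illustrative only.
PARENT = {
    "deliverable:new-booking-service":  "programme:digital-channels",
    "programme:digital-channels":       "objective:improve-citizen-access",
    "objective:improve-citizen-access": "policy:digital-first",
}

def trace_up(item):
    """Walk from a deliverable back to the policy intent it serves."""
    chain = [item]
    while chain[-1] in PARENT:
        chain.append(PARENT[chain[-1]])
    return chain

def trace_down(parent):
    """Find everything that ultimately supports a given policy or objective."""
    children = [c for c, p in PARENT.items() if p == parent]
    return children + [g for c in children for g in trace_down(c)]

print(trace_up("deliverable:new-booking-service"))
print(trace_down("policy:digital-first"))
```

With links like these maintained in one place, change control becomes a graph query: removing a deliverable immediately shows which objective loses support, and a policy change immediately shows which programmes are affected.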
In practice, this level of traceability is difficult to achieve when information is managed across multiple unconnected systems, each with its own structures and assumptions.
A portfolio view, not just project control
GovS 002 is explicit that portfolio management is not optional.
It positions the portfolio as the primary mechanism for balancing investment, capacity, risk and long-term objectives and, ultimately, as a key driver of how an organisation tracks and achieves its objectives.
Decisions about starting, stopping or reshaping work are expected to be made in the context of the whole, not just the performance of individual initiatives.
The data standard complements this by defining what information needs to exist for that portfolio view to be meaningful. Without consistent data on costs, benefits, dependencies and status, portfolio-level decisions become speculative rather than informed.
A transatlantic comparison: echoes of US federal contracting practice
Although the UK Government’s approach is framed differently, there are clear parallels with how the US federal contracting sector has evolved over the past two decades, particularly in defence, aerospace and large-scale technology programmes.
In the US, disciplines such as Earned Value Management (EVM) have long been used to bring consistency and rigour to programme oversight. Mandated on major federal contracts and overseen by bodies such as the Defense Contract Management Agency, EVM was introduced for a similar reason to the UK’s data standards: to ensure decision-makers have objective, timely insight into cost, schedule and performance.
More recently, this has been reinforced through structured data requirements such as IPMDAR (Integrated Program Management Data and Analysis Report). IPMDAR standardises how contractors submit cost, schedule, risk and performance data, replacing narrative-heavy reporting with machine-readable, comparable datasets.
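The practical benefit of machine-readable submissions is that any consumer can compute the same performance measures directly from the data. The sketch below uses an invented JSON fragment in the spirit of structured EVM reporting; the field names are assumptions, not the actual IPMDAR schema.

```python
import json

# Illustrative structured submission; field names are assumptions, not IPMDAR's.
submission = json.loads("""
{
  "contract_id": "C-2024-017",
  "reporting_period": "2024-05",
  "cost": {"budgeted": 4200000, "earned": 3900000, "actual": 4400000}
}
""")

cost = submission["cost"]
# Standard earned value indices: CPI = earned value / actual cost,
# SPI = earned value / planned (budgeted) value. Because the structure,
# not a narrative, carries the data, every reviewer gets the same answer.
cpi = cost["earned"] / cost["actual"]
spi = cost["earned"] / cost["budgeted"]
print(f"{submission['contract_id']} {submission['reporting_period']}: "
      f"CPI={cpi:.2f}, SPI={spi:.2f}")
```

A CPI or SPI below 1.0 flags cost or schedule pressure without anyone having to interpret prose, which is precisely the shift from narrative-heavy to evidence-based reporting that both the US and UK approaches pursue.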
Similar drivers, different starting points
There are strong similarities between IPMDAR and the UK Programme and Project Data Standard:
Standardised data models rather than bespoke reporting
Regular, structured submissions instead of ad hoc updates
Emphasis on traceability between scope, cost, schedule and outcomes
Support for assurance and independent review, rather than reliance on self-reported status
Both approaches reflect a shared recognition: complex programmes cannot be governed effectively through narrative reporting and local spreadsheets alone.
Where the US has historically focused on contractor control and compliance, particularly in defence procurement, the UK standards take a broader view. They apply across portfolios, programmes and projects of many types, with a strong emphasis on benefits realisation, strategic alignment and value for money and value for the taxpayer, rather than delivery performance alone.
From compliance to capability
Another point of convergence is the gradual shift away from seeing these practices as purely compliance driven. In the US, EVM and IPMDAR are increasingly used not just to satisfy oversight bodies, but to support internal forecasting, scenario analysis and risk management. Similarly, GovS 002 positions portfolio, programme and project management as an integrated management capability rather than a reporting obligation.
In both contexts, the underlying challenge is the same: structured data at scale places significant demands on tooling, integration and governance. As US agencies have learned, manual or disconnected approaches quickly become costly and fragile when reporting frequency increases and scrutiny intensifies.
What does this mean for organisations today?
In summary, both standards describe a level of data maturity, integration and governance that is increasingly hard to sustain without a coherent PPM strategy underpinning it.
As organisations grow more complex and scrutiny increases, whether from boards, regulators or the public, the tolerance for inaccurate or out-of-date reporting continues to shrink.
The underlying message is subtle but consistent: effective delivery depends on reliable, connected information, clear accountability and the ability to see change as a system, not a collection of standalone projects.
For many organisations, aligning with these standards is less about compliance and more about capability, building the foundations needed to make better decisions, manage risk and change proactively and, ultimately, execute projects faster.
About the Author
David Joynes is an Account Director for large Aerospace and Defence enterprises at Cora Systems. He leads strategic relationships and drives growth with the world’s largest A&D companies.