Quick Summary

Bronson partnered with NRC to find, score, and rank AI use cases across Codes Canada, NMS, and CCMC.

Staff from business, research, tech, and operations teams took part through interviews and workshops.

A simple scoring system was built to rank each idea on value, feasibility, scale, data readiness, and fit with NRC’s strategy.

NRC’s tech setup was reviewed in full, including legacy systems, Azure cloud, and data stored in SQL, NoSQL, and XML.

Bronson delivered a Use Case Playbook, a ranked Backlog, and an Architecture Review with a clear plan for AI-ready tech.

Project Overview

NRC engaged Bronson as part of its Partnership Pathways initiative. The goal: use AI to modernize Canada’s construction standards.

NRC sets the rules for how buildings are designed and built across Canada. Its work covers fire safety, plumbing, energy use, and structural performance.

Three NRC programs lead this work: Codes Canada, the National Master Specification (NMS), and the Canadian Construction Materials Centre (CCMC). Together, these programs shape the technical and legal rules that guide Canada’s construction sector.

The amount of technical content NRC manages keeps growing. Rules keep changing. And demand for digital, easy-to-use standards keeps rising.

NRC saw that AI could help. But a clear plan was needed: where should AI go first, and was the tech ready to support it?

Bronson ran a focused, stakeholder-driven engagement. The work gave NRC a ranked AI roadmap, a clear view of its tech readiness, and a path forward for the next phase.

The Challenge

NRC had a long list of AI ideas. What was missing was a way to compare, rank, and sequence them.

It also wasn’t clear whether NRC’s current tech could support AI workloads.

Here are the main challenges Bronson tackled:

  • No clear way to rank AI ideas. NRC had ideas across many programs but no shared method to score them on value or feasibility.
  • A complex tech setup. NRC runs a mix of legacy systems, Azure cloud, SQL, NoSQL, and XML data stores. It was hard to see which ones could support AI.
  • Need to align AI with analytics. Tools like Power BI had to be part of the plan, not an afterthought.
  • Unclear data readiness. Data quality, access, labels, and governance varied across systems.
  • Risk of duplicated work. Without a full review of existing tools, teams could end up building the same thing twice.

NRC needed structure. NRC needed evidence. And NRC needed to know its tech could support the AI work that mattered most.

Our Solution and Impact

The work was split into two tracks running side by side:

  1. AI Use Case Scoping and Prioritization
  2. Target Architecture and Data Readiness Review

Bronson collaborated with stakeholders across Codes Canada, NMS, and CCMC, plus NRC’s IT, digital transformation, and AI architecture teams.

Workstream 1: AI Use Case Scoping and Prioritization

  • Bronson interviewed business leaders, tech leads, researchers, and operations managers across all three programs.
  • Every existing and proposed AI use case was catalogued and grouped by theme: efficiency, research value, compliance, and innovation.
  • A scoring framework was built and tested with NRC. Each idea was scored on value, feasibility, scale, data readiness, and strategic fit.
  • Workshops were run to rank the use cases together. This kept things transparent and made sure stakeholders agreed on the results.
  • The output became a backlog with short-, medium-, and long-term horizons. It flagged quick wins, dependencies, and risks.
  • One or two top use cases were selected for a deeper feasibility review.
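As a rough illustration of how a scoring framework like this can work in practice, here is a minimal sketch. The criteria names mirror those used in the engagement (value, feasibility, scale, data readiness, strategic fit), but the weights, cutoffs, use case names, and ratings below are hypothetical, not NRC's actual figures:

```python
# Minimal sketch of a weighted use-case scoring framework.
# Weights, horizon cutoffs, and sample ratings are hypothetical.

CRITERIA_WEIGHTS = {
    "value": 0.30,
    "feasibility": 0.25,
    "scale": 0.15,
    "data_readiness": 0.15,
    "strategic_fit": 0.15,
}

def score_use_case(ratings: dict) -> float:
    """Weighted sum of 1-5 ratings across all criteria, rounded to 2 places."""
    return round(sum(CRITERIA_WEIGHTS[c] * ratings[c] for c in CRITERIA_WEIGHTS), 2)

def horizon(score: float) -> str:
    """Bucket a score into a delivery horizon (cutoffs are illustrative)."""
    if score >= 4.0:
        return "short-term"
    if score >= 3.0:
        return "medium-term"
    return "long-term"

# Hypothetical use cases, each rated 1-5 on every criterion.
backlog = {
    "code-change tracking assistant": {"value": 5, "feasibility": 4, "scale": 4,
                                       "data_readiness": 4, "strategic_fit": 5},
    "specification search chatbot":   {"value": 4, "feasibility": 3, "scale": 3,
                                       "data_readiness": 2, "strategic_fit": 4},
}

# Rank highest score first and print each with its horizon.
for name, ratings in sorted(backlog.items(),
                            key=lambda kv: score_use_case(kv[1]), reverse=True):
    s = score_use_case(ratings)
    print(f"{name}: {s} ({horizon(s)})")
```

The value of a shared rubric like this is less in the arithmetic than in the workshops: once every idea is scored the same way, stakeholders can argue about weights and ratings in the open rather than about pet projects.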

Workstream 2: Architecture and Data Readiness Review

  • NRC’s tech setup was mapped end to end. This included legacy systems, commercial tools, Azure cloud, and data sources across SQL, NoSQL, and XML.
  • The review covered cloud and on-prem infrastructure, data pipelines, APIs, security, and governance.
  • Data readiness was checked against the top AI use cases. The review looked at access, quality, labels, and governance.
  • Existing internal tools were reviewed to find ones that could be reused. This helped avoid duplication.
  • NRC’s setup was compared to a standard AI-ready architecture. Gaps in compute, MLOps, observability, and reusability were flagged.
  • Findings were validated with NRC’s experts and architects. The roadmap was then presented to NRC leadership for sign-off.
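A data readiness review of the kind described above can be boiled down to a simple rubric: rate each source on each dimension and flag where it falls short. The sketch below assumes a 1-5 scale with 3 as the "ready" threshold; the source names, threshold, and ratings are hypothetical, not NRC's actual systems or results:

```python
# Hypothetical data-readiness rubric, one entry per data source.
# Dimensions mirror the review above: access, quality, labels, governance.
# Source names, ratings, and the threshold of 3 are illustrative only.

READINESS_DIMENSIONS = ("access", "quality", "labels", "governance")

def readiness_gaps(ratings: dict) -> list:
    """Return the dimensions where a source rates below 3 on a 1-5 scale."""
    return [d for d in READINESS_DIMENSIONS if ratings[d] < 3]

sources = {
    "codes_xml_repository": {"access": 4, "quality": 4, "labels": 2, "governance": 3},
    "legacy_sql_database":  {"access": 3, "quality": 2, "labels": 1, "governance": 2},
}

# Summarize each source as ready, or list where it needs work.
for name, ratings in sources.items():
    gaps = readiness_gaps(ratings)
    status = "ready" if not gaps else "gaps in: " + ", ".join(gaps)
    print(f"{name}: {status}")
```

Running the gaps per use case, rather than per source in isolation, is what ties the architecture review back to the prioritized backlog: a use case is only a quick win if the data it depends on clears the bar.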

Key Deliverables

AI Use Case Management Playbook: A reusable guide for how NRC takes in, sorts, scores, and ranks AI ideas. It gives NRC a repeatable process for the long term.

Prioritized AI Backlog Register: A full list of AI use cases. Each one has a score, a rank, dependencies, and risk notes. It supports both governance and day-to-day decisions.

Architecture Review Summary: A clear analysis of NRC’s tech setup. It includes practical recommendations, a step-by-step plan to upgrade infrastructure and analytics, and a list of internal tools NRC can reuse.

Let’s work together.

Don’t let data challenges hold back your operations. Explore how data, analytics, and AI can drive success in your business processes. Contact us today for a consultation and unlock the full potential of your data.