Evaluation at NRCan: Information for Program Managers

Evaluation 101

Strategic Evaluation Division
Science & Policy Integration

July 2012

Purpose

  • The purpose of this document is to provide program managers with an overview of the evaluation function at NRCan.
  • The TBS Policy on Evaluation (April 2009) requires that all direct spending, including all G&C programs, be evaluated every five years.
    • Most program managers will find themselves participating in an evaluation at some point.

What is Evaluation?

  • Evaluations are the systematic collection and analysis of evidence on the outcomes of programs to make judgments about:
    • their relevance;
    • their performance; and
    • alternative ways to deliver them or to achieve the same results.
  • Evaluations must be neutral and evidence-based.
  • An evaluation is not the same as an audit.
  • Evaluations:
    • focus on whether we are doing the right things, and the extent to which a program is achieving its expected outcomes in a cost-effective manner; and
    • make assessments of the relevance and performance of programs.
  • Internal audits:
    • look at financial management, processes, controls and risk; and
    • identify strengths and weaknesses in the management control framework.

Why do Evaluations?

  • The objective of evaluation is to create a comprehensive and reliable base of evidence to support:
    • policy and program improvement;
    • expenditure management;
    • Cabinet decision-making; and
    • public accountability.
  • Evaluations are often required to support TB submissions and Memoranda to Cabinet.
  • They are also a critical source of evidence for Strategic Reviews, to support resource reallocation (next review will occur in 2014).

Background: Evaluation Stakeholders

Figure 1: Evaluation Stakeholders (text version)

The graphic shows the diverse information needs of stakeholders.

At the top: Evaluation Reports (as the final product) leading to three streams:

First Stream: Policy and Program Improvement: meets the needs of NRCan DM; Sector ADMs; Program Managers

Second Stream: Expenditure Management / Cabinet Decision-making: meets the needs of Cabinet; Strategic Review; TBS

Third Stream: Accountability / Public Reporting: meets the needs of Parliament; Canadian Public.

At the bottom: The challenge is to meet the diverse information needs of many stakeholders.


The Evaluation Cycle

Figure 2: The Evaluation Cycle (text version)

The cycle starts with Evaluation Planning; then Evaluation Assessment (1-3 months); Contracting (1-2 months); Field Work/Analysis (6-8 months); Report and Recommendations (1-4 months); Management Responses; Approvals/Posting (2-3 months); and finally Implementing Change.


Evaluation Planning

  • NRCan must evaluate all direct program spending, including all ongoing grant and contribution programs, every five years.
  • NRCan has developed a five-year Evaluation Plan based on PAA units that is updated annually and approved by the Evaluation Committee.
  • In most cases, the evaluation of an individual program will be conducted within the scope of a broader evaluation of a PAA unit.
  • The current plan summary appears on the Strategic Evaluation Internet site at: http://nrcan.gc.ca/evaluation/plans-eng.php.

Phases of an Evaluation

  1. Evaluation Assessment (1-3 months)
    • research and planning to understand the programs
    • develop the Terms of Reference
    • obtain approval from the Evaluation Committee
  2. Contracting (1-2 months)
    • Consultants are often used to supplement in-house staff. Their roles will vary by project.
  3. Fieldwork or Data Collection/Analysis (6-8 months)
    • develop a detailed methodology report
    • methodologies: key informant interviews; focus groups; file/document/literature reviews; surveys; case studies; and data and economic analysis
    • analyse information collected from these multiple lines of evidence to develop conclusions
  4. Reporting & Development of Recommendations (2-4 months)
    • prepare preliminary findings and discuss with programs
    • draft report
    • address comments and revisions
    • develop recommendations
  5. Management Responses (1 month)
    • obtain ADM-approved management responses and action plans to the recommendations
  6. Approvals/Posting of report (2-3 months)
    • recommendation by the Evaluation Committee
    • approval by the DM
    • translation, ATIP review, media lines, and release on the Internet

Evaluation Questions and Issues

  • Evaluations address relevance and performance.
  • Relevance issues focus on:
    • continued need for program;
    • alignment with government priorities; and
    • alignment with federal roles and responsibilities.
  • Performance issues focus on effectiveness, efficiency and economy:
    • achievement of expected outcomes; and
    • demonstration of efficiency and economy.
  • Evaluators work with program managers to develop more detailed evaluation questions relevant to their program.

Roles and Responsibilities

  • Under the TBS Evaluation Policy, Deputy Ministers are responsible for the evaluation function.
  • NRCan’s Departmental Evaluation Committee – an ADM-level committee – is chaired by the DM.
  • NRCan’s Head of Evaluation – who is also the DG of Planning and Performance Management Reporting – reports to the Evaluation Committee.

The Role of the Evaluation Division

  • The Strategic Evaluation Division (SED) is responsible for:
    • Proposing a five-year departmental evaluation plan to the Evaluation Committee and updating it annually; and
    • Managing and conducting evaluation studies, including managing contracts and deliverables when consultants are used and issuing reports in a timely manner.
  • Additionally, SED will help program managers develop their performance measurement strategies, with the goal of ensuring that good data is collected to support future evaluations.
    • Evaluation will work with your team to develop objectives, a logic model, a performance measurement framework and evaluation requirements.
  • NRCan's Strategic Evaluation Division is also responsible for reviewing and providing advice on the accountability and performance provisions in Cabinet documents (Memoranda to Cabinet (MCs) and TB Submissions).

The Role of Program Managers

  • Program managers are key to conducting evaluations.
  • They are responsible for developing, implementing and monitoring ongoing performance measurement – the foundation of evaluation.
  • Additionally, during an evaluation, they must be actively involved in:
    • explaining how their programs work;
    • contributing to evaluation planning, including identifying more detailed evaluation questions;
    • providing performance measurement information on resources used, activities undertaken and results achieved;
    • providing detailed documentation (see next slide) and suggestions on potential interviewees, case studies etc.;
    • participating in working groups to review questionnaires, preliminary findings, draft evaluation reports, etc.;
    • developing management responses and action plans for their ADMs and implementing them after the evaluation.

Key Documents for An Evaluation

  • In preparation for an evaluation, program managers will be asked to provide key documents as early as possible, including:
    • Legislation, Regulations, MCs, TB Submissions
    • RMAFs, RBAFs or Performance Measurement Frameworks
    • references in budgets, SFTs, DPRs, RPPs
    • briefing notes and reports (including annual and project reports), studies, databases
    • websites and communications products
    • five years of financial expenditures for the PAA unit, including G&C expenditures, O&M and salaries

Questions and Assistance

  • If you have any questions about evaluation, or would like assistance in developing performance measurement information or Cabinet documents, please contact:

    The Director of Strategic Evaluation at (613) 996-9649

  • Electronic copies of this document, completed evaluation reports and the Terms of Reference for the Evaluation Committee are available at: https://www.nrcan.gc.ca/evaluation/reports/207