
Grants and Contributions Applicants Client Experience Research (Year 3)

Employment and Social Development Canada [ESDC]

October 25, 2023

POR# 099-22
CONTRACT AWARD DATE: 2022-12-19
CONTRACT #: CW2266044 (G9292-24-2550)
Contract value: $149,885.85 (tax included)

Ce rapport est aussi disponible en français.

For more information on this report, please contact nc-por-rop-gd@hrsdc-rhdcc.gc.ca

Grants and Contributions Applicants Client Experience Research (Year 3)

It is available upon request in multiple formats (large print, MP3, braille, e-text, DAISY), by contacting 1-800 O-Canada (1-800-622-6232).
By teletypewriter (TTY), call 1-800-926-9105.

© His Majesty the King in Right of Canada, as represented by the Minister of Families, Children and Social Development, 2023
https://publications.gc.ca/site/eng/services/copyPageTemplate.html
For information regarding reproduction rights: droitdauteur.copyright@HRSDC-RHDCC.gc.ca.

PDF
Cat. No.: Em20-148/2024E-PDF
ISBN: 978-0-660-67643-2

Recherche sur l’expérience client des subventions et contributions (Année 3)

Ce document est offert sur demande en médias substituts (gros caractères, MP3, braille, fichiers de texte, DAISY) auprès du 1-800 O-Canada (1-800-622-6232).
Si vous utilisez un téléscripteur (ATS), composez le 1-800-926-9105.

© Sa Majesté le Roi du chef du Canada, représenté par le ministre de la Famille, des Enfants et du Développement social, 2023
https://publications.gc.ca/site/fra/services/modeleDroitsAuteur.html
Pour des renseignements sur les droits de reproduction: droitdauteur.copyright@HRSDC-RHDCC.gc.ca.

PDF
Nº de cat. : Em20-148/2024F-PDF
ISBN : 978-0-660-67644-9

List of Acronyms

Acronyms
PROGRAM RELATED
AS Apprenticeship Service 
CSJ Canada Summer Jobs
EAF Enabling Accessibility Fund
FELIP Financial Empowerment of Low-Income People
NAAW National AccessAbility Week
NHSP New Horizons for Seniors Program
SIP Sectoral Initiatives Program
STAR Skilled Trades Awareness and Readiness Program
SDPP-C&F Social Development Partnerships Program – Children and Families
SDPP-D Social Development Partnerships Program – Disability Inclusion
SSLP Supports for Student Learning Program
WER Women’s Employment Readiness
WORBE Workplace Opportunities: Removing Barriers to Equity
MISCELLANEOUS
CX Client Experience
ESDC Employment and Social Development Canada
FY Fiscal year
GBA+ Gender-based Analysis Plus
GoC Government of Canada
Gs&Cs Grants and Contributions
GCOS Grants and Contributions Online Services
MP Member of Parliament
N/A Not applicable
PO Program Officer
POB Program Operations Branch
SC Service Canada

Political Neutrality Statement

I hereby certify as Senior Officer of Ipsos that the deliverables fully comply with the Government of Canada political neutrality requirements outlined in the Policy on Communications and Federal Identity and the Directive on the Management of Communications. Specifically, the deliverables do not include information on electoral voting intentions, political party preferences, standings with the electorate or ratings of the performance of a political party or its leaders.
Signature of Mike Colledge
Mike Colledge
President
Ipsos Public Affairs

Additional information

Supplier Name: Ipsos Limited Partnership
PSPC Contract Number: CW2266044 (G9292-24-2550)
Contract Award Date: 2022-12-19


Executive Summary

Grants & Contributions CX Survey – Results At a Glance (Year 3)

Overall Service Experience


Figure 1: Overall Service Experience

This horizontal bar chart shows responses to three questions about the overall service experience and presents results for Year 1, Year 2 and Year 3. Respondents were asked to provide ratings on a 5-pt scale and results were grouped by those who provided a rating of 4 or 5.  All 3041 respondents in Year 3 answered as follows:

Satisfaction with Service Channels


Figure 2: Satisfaction with Service Channels

This horizontal bar chart shows responses to a question about satisfaction with the overall quality of service provided by the service channels used during the application process and presents results for Year 1, Year 2 and Year 3. Respondents were asked to provide ratings on a 5-pt scale and results were grouped by those who provided a rating of 4 or 5. Sample sizes vary by service channel; only those who used each channel during their experience were asked about it.

Satisfaction with Client Experience by Program


Figure 3: Satisfaction with Client Experience by Program

This vertical bar chart shows responses to a question about satisfaction with overall service experience by program and presents results for Year 1, Year 2 and Year 3. Respondents were asked to provide ratings on a 5-pt scale and results were grouped by those who provided a rating of 4 or 5.  Sample sizes vary by program.

Funding approval

Figure 4: Funding approval


Figure description: Funding approval

This horizontal bar chart shows whether the applicants received funding approval or not and presents results for Year 1, Year 2 and Year 3 as follows:

Satisfaction by Approval Status

Approved: 73% (Year 3), 82% (Year 2), 74% (Year 1)
Denied: 49% (Year 3), 47% (Year 2), 41% (Year 1)

Strengths

Figure description: Strengths

Areas for Improvement

Figure 7: Areas for Improvement

* referred to as [program] web portal in Year 1

Note: Program types, intakes, and streams in grants and contributions vary widely, meaning that some year-to-year or program comparisons should be done with caution.

Chart legend: Year 3 / Year 2 / Year 1; Top 5 driver of satisfaction; Significantly higher / lower than total; Significantly higher / lower than Year 2

Key Findings

Overall Satisfaction and Applicants Experiencing an Issue

Overall satisfaction with the service experience among applicants to Grants and Contributions programs declined compared to Year 2, returning to levels observed in Year 1.

Applicants to CSJ and SDPP experienced more issues related to the timeliness of service and had more difficulty following up or getting assistance during the application process than in Year 2, which negatively impacted their satisfaction.

Satisfaction Drivers and Awareness of Service Standards

The timeliness of service had the largest impact on satisfaction with the service experience, followed by the ease of follow-up before receiving a decision and confidence in the issue resolution process. In Year 3, all of these aspects of service increased in importance as drivers of overall satisfaction, while ratings for each declined compared to Year 2.

Awareness of service standards remained relatively low, and fewer applicants than in Year 2 were aware of the standards for acknowledging submissions and issuing a funding decision. Applicants who were aware of each service standard continued to have a more positive experience. Notably, impressions weakened year over year across several aspects of service among those who were not aware.

Selected Applicant Profiles and the online experience

Those not approved for funding continued to be much less satisfied, and fewer applicants were approved for funding compared to Year 2, which contributed to the decline in overall results. Applicants who were not approved had much more difficulty getting help with their application, were less likely to feel the process was clear and the timeliness of service reasonable, and few reported having received a debrief on the outcome or being satisfied with the explanation provided.

Virtually all applicants reported submitting their application online and ratings for the ease and timeliness of the process remained strong and consistent with Year 2. Applicants to higher complexity programs continued to find all steps of the process more difficult.

Satisfaction with Service Channels

Satisfaction with the service provided through most service channels was largely consistent and remained highest for support provided by email from a program officer, followed by the online channels. Fewer were satisfied with the Government of Canada website compared to Year 2 due to lower ratings among SDPP applicants who also had weaker impressions of the service provided by email.

Learning about the program

Email outreach from Service Canada or the program, the Government of Canada website and program applicant guides were the primary ways applicants learned about the program they applied for. The vast majority who relied on the Government of Canada website continued to find it easy to navigate; however, applicants to higher complexity programs had more difficulty. Further, more could be done to improve the ease of determining how long each phase of the process is anticipated to take.

Populations served by funding and project close-out

Funding sought by applicant organizations continued to be targeted largely at supporting diverse communities, though slightly less so than in Year 2.

The vast majority of funding recipients found it easy to complete the tasks associated with funding agreement close-out. Recipients of EAF and higher complexity programs had more difficulty and fewer recipients of CSJ felt the tasks were easy compared to Year 2. 

Qualitative Research

Organizational Capacity to Complete the Application Process

Top-of-mind Associations with the Application Process

Future Improvements and the Ideal Experience
Applicants offered numerous improvement suggestions. Highlights include:

The Impact of Funding

Interest in future ESDC funding opportunities


Objectives and Methodology

Background: Gs&Cs Client Experience Research

The Program Operations Branch (POB) within Employment and Social Development Canada (ESDC) handles the operation and coordination of most Grants and Contributions (Gs&Cs) programs across the Department. The Branch actively works to improve the design, administration and delivery of Grants and Contributions programs. This notably includes making the process of applying for funding accessible, efficient and effective through quick and easy online services and standardized forms and agreements.
To comply with the Treasury Board Policy on Service and Digital and the Employment and Social Development Canada (ESDC) Service Strategy, POB requires the gathering of data on the client experience to assist in effectively managing service delivery. To meet these requirements, POB uses the Client Experience (CX) Performance Measurement Framework to guide the research on the Gs&Cs business line of client service delivery experience. The data collected with the framework, which includes qualitative and quantitative dimensions, will provide key insights and diagnostics on client experience to help:

This is the third year of POB’s Client Experience Research Program (FY 2022/23 into 2023/24). Year 3 builds on previous years of research to support the systematic and integrated approach to measuring and improving CX in Gs&Cs service delivery, which also allows the department to track progress on consistent and comparable CX indicators over time.
The detailed methodology and research instruments for all aspects of the research are available under a separate cover. 
Note: Program intakes in grants and contributions vary widely, meaning that some year-to-year or program comparisons should be done with caution. 

Research Objectives

The Client Experience Research Project is carried out in two phases: a quantitative phase and a qualitative phase.
The overarching objectives of the Year 3 quantitative research are to:

The research objectives for the quantitative research were to:

The qualitative research explored the lived experiences of Gs&Cs applicants through focus group discussions and individual interviews. Building on the quantitative research, the qualitative phase of this project was structured around the following:

Methodology – Quantitative Research

An online survey was conducted with 3,041 Service Canada applicants across 11 programs. The survey was fielded from April 19 to June 9, 2023, and took approximately 16 minutes on average to complete. At this sample size, the survey has a margin of error of ±1.75%.
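For reference, the sketch below shows the standard margin-of-error calculation (a minimal illustration only, assuming a simple random sample, p = 0.5 and a 95% confidence level; the applicant universe of 48,812 is taken from the weighting table later in this section). It yields values in line with the reported ±1.75%.

```python
# Illustrative margin-of-error calculation, assuming a simple random sample,
# the most conservative proportion (p = 0.5) and a 95% confidence level.
# The population size is taken from the weighting table in this section.
import math

n = 3041      # completed surveys
N = 48812     # total applicant universe (from the weighting table)
z = 1.96      # z-score for a 95% confidence level
p = 0.5       # most conservative proportion assumption

moe = z * math.sqrt(p * (1 - p) / n)        # roughly 0.0178 (about ±1.8%)
fpc = math.sqrt((N - n) / (N - 1))          # finite population correction
moe_fpc = moe * fpc                         # roughly 0.0172 (about ±1.7%)

print(f"Margin of error: ±{moe:.2%} (±{moe_fpc:.2%} with finite population correction)")
```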

Applicants were defined as organizations that applied for grants and contributions funding (whether funded or unfunded) within the last two intake years (FY 2020/21 and 2021/22). A random sample of organizations that applied to CSJ or NHSP was included, while all organizations that applied to the remaining programs were invited to complete the survey. ESDC distributed the survey links to participating organizations.

The exact intake periods referred to in this study are as follows:

Fiscal Year 2021-22:

Fiscal Year 2020-21:

*SIP has since been replaced by the Sectoral Workforce Solutions Program (SWSP), which builds on the SIP.

Three (3) of the programs included in the survey have different streams that applicants can apply for.
The relevant streams referred to in this study are as follows:

Of the 9,862 organizations that were invited to participate, a total of 3,041 organizations completed the survey. The response rate for the survey was 31%, which is considered strong compared to industry standards for a survey of this nature.

Survey disposition | Total
Invited to participate | 9,862
Click-throughs | 3,924
Partial completes | 883
Terminates | 0
Over quota | 0
Completed surveys | 3,041
Response rate | 31%
Abbreviation | Program | Invited | Completed | Response rate
CSJ | Canada Summer Jobs | 3,250 | 1,004 | 31%
EAF | Enabling Accessibility Fund | 1,063 | 300 | 28%
NHSP | New Horizons for Seniors Program | 3,250 | 1,296 | 40%
SDPP-C&F | Social Development Partnerships Program – Children and Families | 904 | 168 | 19%
SDPP-D | Social Development Partnerships Program – Disability Inclusion | 200 | 46 | 23%
AS | Apprenticeship Service | 36 | 11 | 31%
WORBE | Workplace Opportunities: Removing Barriers to Equity | 79 | 22 | 28%
SSLP | Supports for Student Learning Program | 80 | 24 | 30%
WER | Women’s Employment Readiness | 214 | 51 | 24%
STAR | Skilled Trades Awareness and Readiness Program | 23 | 3 | 13%
SIP | Sectoral Initiatives Program | 763 | 116 | 15%
Total | | 9,862 | 3,041 | 31%

Note: “n=” represents the number of respondents to a question; in statistical language, it is known as the sample size. Sample sizes below n=30 are considered small and those below n=10 very small. Results based on small and very small sample sizes should be interpreted with caution and findings viewed as directional in nature.

The quantitative survey also served as a recruitment tool for the qualitative research, by asking if organizations would be interested in voluntarily participating in focus groups or in-depth interviews at a later date.

Only those organizations with email contact information on file were invited to participate, so the invited group does not represent the total volume of applicants.

Calibration of the Data – Quantitative Approach

Weighting adjustments were made to bring the sample into proportion with the universe by program volume (based on the most recent intake for each program).

The final data were weighted so that respondents in each program are represented in proportion to the total number of applicants, as detailed below. The universe proportions used to develop the targets were based on figures provided by ESDC.

Program | # of Applicants | % of Total
Canada Summer Jobs | 41,463 | 84.94%
Enabling Accessibility Fund | 1,040 | 2.13%
New Horizons for Seniors Program | 4,176 | 8.56%
All programs but CSJ, EAF and NHSP | 1,252 | 2.56%
Social Development Partnerships Program – Children and Families | 881 | 1.80%
Social Development Partnerships Program – Disability Inclusion | 195 | 0.40%
Apprenticeship Service | 36 | 0.07%
Workplace Opportunities: Removing Barriers to Equity | 74 | 0.15%
Supports for Student Learning Program | 75 | 0.15%
Women’s Employment Readiness | 210 | 0.43%
Skilled Trades Awareness and Readiness Program | 23 | 0.05%
Sectoral Initiatives Program | 639 | 1.31%
Total | 48,812 |
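As an illustration of how proportional weights of this kind can be derived (a minimal sketch, not the study's actual weighting specification), the example below reuses the CSJ, EAF and NHSP counts from the tables above and groups all remaining programs as a single "Other" category obtained by subtraction from the totals.

```python
# Minimal sketch of proportional weighting by program (illustration only, not
# the study's actual weighting specification). Each respondent's weight is
# (universe share of their program) / (sample share of their program).
# CSJ, EAF and NHSP counts come from the tables above; remaining programs are
# grouped as "Other" by subtraction from the totals.

N_universe = 48812
n_sample = 3041

universe = {"CSJ": 41463, "EAF": 1040, "NHSP": 4176}
completes = {"CSJ": 1004, "EAF": 300, "NHSP": 1296}
universe["Other"] = N_universe - sum(universe.values())
completes["Other"] = n_sample - sum(completes.values())

weights = {
    prog: (universe[prog] / N_universe) / (completes[prog] / n_sample)
    for prog in universe
}

for prog, w in weights.items():
    print(f"{prog}: weight = {w:.2f}")
# CSJ respondents are weighted up (their universe share exceeds their sample
# share), while respondents to the other programs are weighted down.
```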

Note Regarding Program Complexity

For the purpose of this study, program complexity has been defined as low, moderate, or high, as outlined below. These service standard clusters are informed by departmental reporting in the Performance Measurement and Management Framework.

Note: Canada Summer Jobs does not fit into these distinct clusters and has been analyzed as a separate group.

Low delivery-complexity programs (grant programs in the 112-day / 16-week review period):
  • Enabling Accessibility Fund (grants)
  • New Horizons for Seniors Program (grants)
  • Social Development Partnerships Program (SDPP) – Disability (grants)
  • Social Development Partnerships Program (SDPP) – Children and Families (grants)
  • Workplace Opportunities: Removing Barriers to Equity (contribution)
Moderate delivery-complexity programs (contribution streams in the 126-day / 18-week review period):
  • Women's Employment Readiness (WER) Pilot Program (contribution)
  • Enabling Accessibility Fund (contributions)
  • Sectoral Initiatives Program (SIP)
  • Skilled Trades Awareness and Readiness Program (STAR)
  • Social Development Partnerships Program (SDPP) – Disability (contribution)
  • Social Development Partnerships Program (SDPP) – Children and Families (contribution)
  • Apprenticeship Service
  • Supports for Student Learning Program (SSLP)
High delivery-complexity programs (contribution streams in the 154-day / 22-week review period):
  • N/A

Note on Reporting Conventions – Quantitative Data

Throughout the report, subgroup results have been compared to the average of all applicants (i.e., the total), and statistically significant differences at the 95% confidence level are noted using green and red boxes.

Where subgroup results are statistically higher than the total a green box has been used and where results are statistically lower than the total a red box has been used.

Additionally, where results in Year 3 were statistically higher than Year 2, a green arrow has been used and where results in Year 3 were statistically lower than Year 2, a red arrow has been used.

Significantly higher / lower than total

Significantly higher / lower than Year 2
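The report does not describe the statistical test used to flag these differences. As a hedged illustration, one common approach for comparing two proportions at the 95% confidence level is a two-proportion z-test, sketched below with hypothetical figures.

```python
# Illustrative two-proportion z-test at the 95% confidence level, one common
# way of flagging whether a subgroup result differs significantly from another
# result (for example, Year 3 vs. Year 2). The figures below are hypothetical
# and are not drawn from the report.
import math

def two_prop_z_test(p1, n1, p2, n2, z_crit=1.96):
    """Return True if the difference between two proportions is significant
    at the 95% confidence level (two-tailed)."""
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    return abs(z) > z_crit

# Hypothetical example: 73% satisfied in a subgroup of 400 respondents vs.
# 68% in a comparison group of 3,041 respondents.
print(two_prop_z_test(0.73, 400, 0.68, 3041))
```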

For the purposes of legibility, values of less than 3% have not been labelled in charts throughout the report.

Bases marked with a * indicate a small sample size and with ** indicate very small sample size, so results should be interpreted with caution and findings viewed as directional in nature.

As part of the analysis, a key drivers analysis was conducted to identify the factors that have the greatest impact on overall satisfaction. Throughout the report, the top 5 drivers have been identified using a yellow box.

Top 5 driver of satisfaction
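The statistical method behind the key drivers analysis is not detailed in this report. The sketch below illustrates one common approach, regressing overall satisfaction on standardized attribute ratings and ranking the coefficients; the data and attribute names are hypothetical.

```python
# Hypothetical sketch of a key drivers analysis: regress overall satisfaction
# on standardized service-attribute ratings and rank attributes by the size of
# their coefficients. This illustrates the general technique only; the method
# actually used in the study is not specified in this report.
import numpy as np

rng = np.random.default_rng(0)
n = 500  # hypothetical respondents

# Hypothetical 1-5 ratings for three service attributes.
attributes = ["timeliness", "ease_of_follow_up", "confidence_in_resolution"]
X = rng.integers(1, 6, size=(n, len(attributes))).astype(float)

# Hypothetical overall satisfaction, influenced mostly by timeliness.
satisfaction = 0.6 * X[:, 0] + 0.3 * X[:, 1] + 0.1 * X[:, 2] + rng.normal(0, 0.5, n)

# Standardize predictors so coefficients are comparable, then fit OLS.
Xs = (X - X.mean(axis=0)) / X.std(axis=0)
Xs = np.column_stack([np.ones(n), Xs])            # add intercept
coefs, *_ = np.linalg.lstsq(Xs, satisfaction, rcond=None)

ranked = sorted(zip(attributes, coefs[1:]), key=lambda kv: -abs(kv[1]))
for name, beta in ranked:
    print(f"{name}: standardized coefficient = {beta:.2f}")
```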

Methodology – Qualitative Research

Respondents to the Gs&Cs client experience survey were asked whether they would be interested in taking part in follow-up qualitative research. After analyzing the opt-in sample to ensure a mix of programs and regions and the inclusion of applicants in both official languages, potential participants were contacted at random and asked if they would like to complete the screening questionnaire to confirm their eligibility for an in-depth interview or online focus group.


As shown in the tables below, 4 focus groups and 26 in-depth interviews were conducted.

Focus Groups | Composition | Date and Time
Group 1 | Unfunded applicants to any program, NATIONAL, ENGLISH (6 applicants) | July 26 at 10 AM ET
Group 2 | Funded applicants to any program, NATIONAL, ENGLISH (8 applicants) | July 26 at 3 PM ET
Group 3 | Unfunded applicants to any program, QUEBEC or Official Language Minority Communities (OLMC), FRENCH (7 applicants) | July 27 at 10 AM ET
Group 4 | Funded applicants to any program, QUEBEC or OLMC, FRENCH (7 applicants) | July 27 at 1 PM ET

In-depth Interviews | Composition | Date and Time
In-depth interviews | 19 English applicants and 7 French applicants; the following programs were prioritized: Apprenticeship Service (AS), Workplace Opportunities: Removing Barriers to Equity (WORBE), Social Development Partnerships Program – Disability (SDPP-D), Supports for Student Learning Program (SSLP), Sectoral Initiatives Program (SIP) | July 18 to August 10

Methodology – Qualitative Research: Data Collection and Analysis

Data Collection 

With applicants’ consent, all qualitative research sessions were audio and video recorded. Verbatim transcripts were created for every focus group and interview; however, names and personal identifying details were either not captured or were redacted by the moderator to protect applicants’ privacy.

Moderators also captured high-level findings and their own observations on each topic: the overall reaction, any nuances, and any non-verbal cues such as body language or tone. Because the transcripts are anonymous, it is not possible to comment on variations by group or audience unless participants were placed in separate groups; for example, moderators cannot provide a sense of differing opinions between older and younger applicants, or between males and females, depending on the topic.

Data Analysis

We identify the following elements in the qualitative analysis:

Note on Interpretation of Qualitative Findings

The value of qualitative research is in exploring the issues and experiences of research participants in depth, free from the constraints of a structured quantitative questionnaire. Qualitative evidence is rich and allows researchers to hear first-hand the underlying factors shaping experiences and opinions, as well as the interplay between factors.

Qualitative findings should not be extrapolated to the broader population, as they are not statistically projectable. Notable nuances that emerged in the interviews have been highlighted where relevant and these should be treated as strictly directional.

The qualitative findings should thus be viewed as complementary to the quantitative survey findings in terms of building a more complete understanding of the Gs&Cs client experience.