Grants and Contributions Applicants Client Experience Research (Year 2)

Employment and Social Development Canada [ESDC]

June 1, 2022

POR# 060-21
SUPPLIER: Ipsos Limited Partnership
CONTRACT AWARD DATE: 2021-12-08
CONTRACT #: G9292-229941/001/CY
Contract value: $140,330.26 (tax included)

Ce rapport est aussi disponible en français.

For more information on this report, please contact nc-por-rop-gd@hrsdc-rhdcc.gc.ca

Grants and Contributions Applicants Client Experience Research (Year 2)

This report is available upon request in multiple formats (large print, MP3, braille, e-text, DAISY) by contacting 1-800 O-Canada (1-800-622-6232).
By teletypewriter (TTY), call 1-800-926-9105.

© His Majesty the King in Right of Canada, as represented by the Minister of Families, Children and Social Development, 2022
https://publications.gc.ca/site/fra/services/modeleDroitsAuteur.html
For information regarding reproduction rights: droitdauteur.copyright@HRSDC-RHDCC.gc.ca.

PDF
Cat. No. : Em20-148/2022E-PDF
ISBN: 978-0-660-44219-8

Recherche sur l’expérience client des subventions et contributions (Année 2)

Ce document offert sur demande en médias substituts (gros caractères, MP3, braille, fichiers de texte, DAISY) auprès du 1-800 O-Canada (1-800-622-6232).
Si vous utilisez un téléscripteur (ATS), composez le 1-800-926-9105.

© Sa Majesté le Roi du Chef du Canada, représenté par le ministre de la Famille, des Enfants et du Développement social, 2022
https://publications.gc.ca/site/fra/services/modeleDroitsAuteur.html
Pour des renseignements sur les droits de reproduction: droitdauteur.copyright@HRSDC-RHDCC.gc.ca.

PDF
Nº de cat. : Em20-148/2022F-PDF
ISBN : 978-0-660-44220-4

Political Neutrality Statement

I hereby certify as Senior Officer of Ipsos that the deliverables fully comply with the Government of Canada political neutrality requirements outlined in the Policy on Communications and Federal Identity and the Directive on the Management of Communications. Specifically, the deliverables do not include information on electoral voting intentions, political party preferences, standings with the electorate or ratings of the performance of a political party or its leaders.
Mike Colledge
President
Ipsos Public Affairs

Additional information

Supplier Name: Ipsos Limited Partnership
PSPC Contract Number: G9292-229941/001/CY
Contract Award Date: 2021-12-08

Executive Summary

Grants & Contributions CX Survey – Results At a Glance

  • 1,942 SURVEYS CONDUCTED
  • METHODOLOGY: ONLINE SURVEY
  • FIELDWORK: February 16 to March 15, 2022

Overall Service Experience

Figure 1: Overall Service Experience. Text description follows this graphic.

Figure 1: Overall Service Experience

This horizontal bar chart shows responses to three questions about the overall service experience and presents results for Year 1 and Year 2. Respondents were asked to provide ratings on a 5-pt scale and results were grouped by those who provided a rating of 4 or 5. All 1,942 respondents in Year 2 answered as follows:

  • Overall Satisfaction: Year 2 77%. Year 1 70%.
  • Ease: Year 2 79%. Year 1 74%.
  • Effectiveness: Year 2 78%. Year 1 70%.

Satisfaction with Service Channels

Figure 2: Satisfaction with Service Channels. Text description follows this graphic.

Figure 2: Satisfaction with Service Channels

This horizontal bar chart shows responses to a question about satisfaction with the overall quality of service provided by the service channels used during the applicant process and presents results for Year 1 and Year 2. Respondents were asked to provide ratings on a 5-pt scale and results were grouped by those who provided a rating of 4 or 5. Sample sizes vary by service channel; only those who used each channel during their experience were asked about it.

  • Email support from Program Officer (627 respondents answered this question in Year 2): Year 2 79%. Year 1 80%.
  • GCOS Web Portal (623 respondents answered this question in Year 2): Year 2 76%. Year 1 67%.
  • Online (1365 respondents answered this question in Year 2): Year 2 71%. Year 1 66%.
  • Email support from SC (1580 respondents answered this question in Year 2): Year 2 70%. Year 1 65%.
  • In-Person (29 respondents answered this question in Year 2): Year 2 62%. Year 1 66%.
  • Mail (139 respondents answered this question in Year 2): Year 2 58%. Year 1 63%.
  • Phone support from SC (427 respondents answered this question in Year 2): Year 2 61%. Year 1 61%.
  • 1-800 O-Canada (72 respondents answered this question in Year 2): Year 2 48%. Year 1 49%.

Satisfaction with Client Experience by Program

Figure 3: Satisfaction with Client Experience by Program. Text description follows this graphic.

Figure 3: Satisfaction with Client Experience by Program

This vertical bar chart shows responses to a question about satisfaction with overall service experience by program and presents results for Year 1 and Year 2. Respondents were asked to provide ratings on a 5-pt scale and results were grouped by those who provided a rating of 4 or 5. Sample sizes vary by program.

  • EAF (207 respondents answered this question in Year 2): Year 2 78%. Year 1 77%.
  • NHSP (384 respondents answered this question in Year 2): Year 2 83%. Year 1 73%.
  • CSJ (865 respondents answered this question in Year 2): Year 2 79%. Year 1 69%.
  • YESS+ (152 respondents answered this question in Year 2): Year 2 62%. Year 1 60%.
  • UT&IP (32 respondents answered this question in Year 2): Year 2 78%. Year 1 73%.
  • EL&CCI (65 respondents answered this question in Year 2): Year 2 19%. Year 1 60%.
  • SDPP (153 respondents answered this question in Year 2): Year 2 72%. Year 1 53%.
  • FCRP (20 respondents answered this question in Year 2): Year 2 65%.
  • IELCC (8 respondents answered this question in Year 2): Year 2 75%.
  • IWILI (13 respondents answered this question in Year 2): Year 2 62%.
  • SWP (4 respondents answered this question in Year 2): Year 2 75%.
  • SDG (39 respondents answered this question in Year 2): Year 2 39%.

Funding approval

Figure 4: Funding approval. Text description follows this graphic.

Figure 5: Satisfaction by Approval Status

  • Year 2: Approved 82%. Denied 47%.
  • Year 1: Approved 74%. Denied 41%.

Figure 4: Funding approval

This horizontal bar chart shows responses to a question about whether the applicant received funding approval or was denied and presents results for Year 1 and Year 2. A total of 1,820 respondents in Year 2 answered as follows:

  • Year 2: Approved 93%. Denied 7%.
  • Year 1: Approved 90%. Denied 10%.

Strengths

Figure 6: Strengths. Text description follows this graphic.

Figure description: Strengths

  • Service in choice of official language 93%
  • Completing steps online made the process easier 88%
  • Ease of determining if organization is eligible for program funding 84%
  • Confident personal information protected 83%

Areas for Improvement

Figure 7: Areas for Improvement. Text description follows this graphic.

Figure description: Areas for Improvement

  • Determine the amount of time each phase is anticipated to take 58%
  • Client journey took reasonable time 66%
  • Needed to explain situation only once 67%
  • Ease of completing the budget document 67%

* referred to as [program] web portal in Year 1


Key Findings - Quantitative Findings

Overall, applicants to Grants and Contributions programs were more satisfied with the process in Year 2 compared to Year 1. 

  • More than three-quarters (77%) of applicants were satisfied with their overall experience, an increase of seven points from Year 1 (70%), while fewer were dissatisfied (7%, -5 points). The vast majority felt it was easy to apply (79%) and move smoothly through all steps (78%), with improvement on both measures compared to Year 1.
  • Satisfaction was highest for NHSP (83%), CSJ (79%), EAF (78%) and UT&IP (78%), followed by IELCC and SWP (both 75%), SDPP (72%), FCRP (65%), YESS (62%) and IWILI (62%), while ratings were lowest for SDG (39%) and EL&CCI (19%).

Satisfaction was driven by improvements in the overall ease, effectiveness and timeliness of the process among CSJ and NHSP applicants. First-time applicants, those applying to higher complexity programs and organizations more reliant on volunteers continued to experience more difficulties navigating the application process.  

  • The overall increase in satisfaction is due primarily to higher ratings among applicants to NHSP and CSJ, who represent the vast majority of Grants and Contributions applicants and experienced the most positive improvement in the various aspects of the service experience. Those who applied to EL&CCI were less satisfied than in Year 1.
  • Overall, applicants provided higher ratings for several aspects of ease and effectiveness and fewer reported experiencing problems or issues during the application process. The most notable improvements in the service experience have been in the timeliness of service, clarity of the application process, issue resolution and ease of getting assistance. 

The aspect of service which had the greatest impact on satisfaction is the helpfulness of Service Canada and 1-800 O-Canada phone representatives, followed by the amount of time it took from start to finish, the ease of getting help and clarity of process. These also represent the aspects of service where ratings were relatively lower than other areas.   

  • The helpfulness of phone representatives (Service Canada and 1-800 O-Canada) has taken on increased importance and become the primary driver of satisfaction in Year 2, while the ease of getting help when needed has also become more important. 
  • Working to further reduce the amount of time the application process takes, improving access to assistance when needed and the ability of Service Canada phone representatives to assist applicants represent the greatest opportunities to improve the service experience given their strong impact on satisfaction and relatively lower ratings. 
  • Awareness of service standards also had an impact on satisfaction, and fewer than half of applicants were aware of each standard in Year 2. Those who were aware of service standards had a more positive experience: they encountered fewer issues and were more satisfied with most service channels, the timeliness of service, clarity of the process (including issue resolution) and ease of getting assistance. Working to more clearly communicate service standards to applicants should help to improve impressions of the application process.

Those who were applying to the program for the first time, in particular for higher complexity programs, had greater difficulty with the process, which required more contact with Service Canada and more time on the part of the applicant. The result was lower overall satisfaction with their application experience.

  • First-time applicants were typically smaller, younger organizations, which relied more heavily on volunteers and required more contact with Service Canada and more effort to complete the application. They experienced more challenges, particularly with regard to the clarity of the steps and timelines for the application process, ease of getting assistance and overall timeliness of the client journey.
  • Notably, most applicants to CSJ reported they apply on the same basis, while applicants to most other programs and specifically those of higher complexity had less experience applying for the particular program. 
  • As noted in Year 1, satisfaction declined with the number of times the client contacted Service Canada. While applicants reported having fewer contacts than in Year 1, the highest proportion continued to have had 10 or more contacts during their experience, which was more prominent among applicants to higher complexity programs.

Applicants who were not approved for funding continued to have considerably lower satisfaction with their experience than those who were funded. Most were not provided an explanation for the decision, and few of those who were expressed satisfaction with the rationale. Providing organizations which were unsuccessful with a better understanding of the reasons for not receiving approval should help to improve their satisfaction with the process.

  • While fewer applicants were denied funding than in Year 1 (7% vs. 10%), only half (47%) of those were satisfied with the application process compared to more than eight in ten (82%) of those approved. 
  • Most of those who did not receive funding approval were not provided an explanation why and, of those who were, few were satisfied with the explanation provided. Applicants to programs other than EAF and CSJ were less likely to have been satisfied with the explanation provided.
  • Notably, applicants to CSJ were more likely to have received funding approval compared to Year 1 and those who applied to EL&CCI were considerably less likely. Given the impact of funding approval on satisfaction with the application process, the shifts observed in the proportion who received funding is likely a contributing factor to the increase in satisfaction seen for CSJ and decrease for EL&CCI compared to Year 1. 

Support provided through email from a program officer remained the highest rated service channel, while telephone channels continued to receive lower ratings. Satisfaction with online channels, including the GCOS web portal and Government of Canada website, has increased and a greater proportion of applicants felt completing steps online improved the ease of the process.   

  • Eight in ten were satisfied with the email support from a program officer (79%), followed by the GCOS web portal (76%), Government of Canada website (71%) and email support from a Service Canada office (70%).
  • Fewer were satisfied with in-person service at a Service Canada office (62%), telephone support from a Service Canada office (61%) and mail (58%), while satisfaction remained lowest for 1-800 O-Canada (48%).
  • The highest rated aspects of service remained the provision of service in choice of official language, that completing steps online made the process easier, confidence in the security of their personal information and ease of determining eligibility. Improvement has been made in the ease of completing steps online and determining eligibility. However, compared to Year 1, fewer were confident their information was protected or were provided service in their choice of official language.
  • Aspects of service with lower ratings included the ease of determining how long each phase of the process was anticipated to take, the overall amount of time it took, having to explain their situation only once and ease of completing the budget document. Applicants were more satisfied with the timeliness of service and only explaining themselves once than in Year 1, but they remain areas for further improvement. 

Applicants continued to rely most heavily on the Government of Canada website in the stages leading up to submitting their application, and the vast majority found it easy to find what they were looking for. There have been improvements in the clarity of information online and the ease of determining the steps to apply; however, ratings were lower for determining how long each phase of the process is anticipated to take.

  • When learning about the program, applicants were most likely to have received email communication directly from the GoC, ESDC, or the program they applied to (57%) or to visit the Government of Canada website for the program (48%), followed by the general Government of Canada website (25%), or talked to peers/community network (23%). A greater proportion received an email from the GoC, ESDC, or the program than in Year 1 (driven by an increase among CSJ and NHSP applicants), while fewer talked to peers/community network or their MP.
  • CSJ and NHSP applicants were more likely to have received an email from the GoC, ESDC, or program directly during the awareness stage, while those applying to all other programs relied more heavily on the Government of Canada website and experienced more difficulty finding the information they were looking for when doing so (EL&CCI, SDPP and SDG applicants in particular). 
  • Among those who used the GoC website, ratings were highest for the ease of determining if their organization was eligible for funding, when the application period takes place, and finding general information about the program. Determining the amount of time each phase of the application process is anticipated to take was considered the most difficult information to find.

The vast majority submitted their application online and found it easy to do so. Compared to Year 1, applicants found it easier to complete most aspects of the application and felt it took a reasonable amount of time to complete; however, impressions of the process remain much weaker among those who applied to higher complexity programs.

  • Most submitted their application using the online fillable form (51%), followed by the GCOS web portal (35%), while fewer downloaded the application and submitted it by email (10%) or mail (3%). YESS and CSJ applicants were more likely to have submitted using the GCOS web portal, while applicants to all programs except for CSJ were more likely to have downloaded the application and submitted it by email.
  • Improvements observed in the ease of completing the application were due primarily to higher ratings among CSJ applicants and, to a lesser extent, NHSP applicants.
  • Applicants to all other programs continued to experience more difficulty and were less likely to feel it took a reasonable amount of time to complete. Completing the budget document and narrative questions were the most challenging aspects of the application submission. 

Among those approved for funding, most found the tasks associated with funding agreement close-out easy to complete, although applicants to higher complexity programs continued to experience more challenges.   

  • Consistent with Year 1, a strong majority felt each aspect of the funding agreement close-out was easy to complete and ratings were very consistent between tasks with the exception of resolving any outstanding issues with funding which received lower ratings. 
  • Those who received funding through CSJ were more likely to find it easy to complete most aspects of the funding agreement close-out, while applicants of all other programs (except for NHSP) were generally less likely and ratings have declined for recipients of EAF and YESS funding across most components.

Almost all applicants supported diverse communities with their application.   

  • Virtually all applicants (97%) reported that the funding they applied for would support diverse (GBA+) communities. Nearly three-quarters (73%) of applicant organizations said the funding would support those who identify as youth, followed by women (63%), those belonging to a minority racial or ethnic background (62%), low socio-economic status (53%) and Black Canadians (52%).
  • Overall satisfaction is consistent among applicants who assist GBA+ communities and those who do not. In addition, overall satisfaction among those who assist GBA+ communities has increased compared to Year 1. 

Most organizations operate in Ontario, Quebec, or British Columbia and organizations operating in Quebec reported the highest satisfaction.   

  • Almost two in five (38%) of applicant organizations operate in Ontario, followed by one quarter in Quebec (25%), and just over one in ten (13%) in British Columbia. Newfoundland and Labrador (3%), Prince Edward Island (2%), Yukon, Northwest Territories, and Nunavut (1%) were regions where the fewest number of applicant organizations operate.
  • Applicant organizations operating in Quebec reported the highest level of satisfaction with their experience (83%) and were more likely to be satisfied compared to all clients. Eight in ten (81%) organizations in Atlantic Canada were satisfied, followed by three-quarters (76%) of those in Ontario, while closer to seven in ten (72%) of applicant organizations in the West or Territories were satisfied, which is lower compared to all clients.

Findings from the qualitative research aligned with quantitative findings.  

  • Those with less external support in the Grants and Contributions process, and those with less experience in applying for grants generally, were less satisfied with the application journey and process.  

Those who identified as belonging to an equity-seeking group experienced barriers to, or concerns about, applying.  

  • Some felt that ESDC itself could/should play a role in filling these gaps, particularly since many of these organizations serve equity-seeking groups.
  • It is challenging for ‘bare bones’ organizations that have general/overarching needs for funding, such as equipment upgrades, staffing and programming support.
  • These applicants felt that there was little opportunity to ‘tell their story’ in terms of the full context of themselves/the communities they serve.

The organizations that are more confident when applying are better resourced and funded, have higher levels of collaboration, or have experienced grant writers on staff or as advisors.

  • They described the application process as satisfactory and easier to undertake, and they feel more confident in the type of content and information they provide within their application.
  • Enablers to applying for federal funding included experience, diversity, support and collaboration.
  • Barriers included lack of resources, the need for partnerships, application requirements, lack of flexibility, perceptions of favouritism or bias, lack of soft metrics, lack of diversity, and vendor challenges.

Many participants expressed concern with a lack of user-friendliness when applying to a program through the GCOS web portal or the fillable PDF.

  • They struggled with various aspects of the portal including a general lack of user-friendliness, being unable to save progress or losing responses, confusing language/legalese, and a lack of online collaboration within their teams.
  • However, there were those who considered themselves technically savvy, or had a great deal of experience applying for grants and using other similar portals, who did not share the same concerns and considered applying online easy and straightforward. 

Feedback on Service Dimensions was mixed – very few felt that their application experiences aligned with all three that were tested in the research.

  • In general, participants are seeking a flexible relationship that recognizes the importance of the work that their organizations do in their communities – and treats them as valued partners who are filling gaps and providing services that otherwise are not available to those who need them the most. 
  • There was much positive feedback on Ease and Effectiveness, along with some concerns, but Emotions was the Service Dimension where many described negative emotions and experiences, frustrations and pain points.

Concerns and considerations about diversity, equity and inclusion were brought forward throughout the discussions.  

  • Participants felt that there needs to be more inclusive practices in place, such as ensuring that public-facing Service Canada staff are culturally diverse, providing additional support to organizations serving equity-seeking groups and communities, and providing flexibility for those who are executing projects on-the-ground to take their communities’ unique considerations into account. 

Many wished for transparency around evaluation criteria for successful applications.  

  • This included knowing the overall envelope, and what a reasonable amount their organization should apply for.
  • This information would determine whether or not applying for the fund is worth the considerable time and effort it takes to apply. That said, most intended to apply for funding in future, as it is an important source of funding, especially for those organizations that lack core funding or are not charities that can receive donations.

An ideal experience is one that is timely, where timelines are met, communication is proactive, and a two-way discussion about both funded and unfunded applications is facilitated.

  • Further, a clear understanding of decision criteria and what content is expected from applicants, were also important to participants/applicants. 
  • Making it easier for applicants to access educational information related to the grant or contribution, was consistently raised as contributing to an ideal experience.

Reasons for higher dissatisfaction with Early Learning and Child Care Innovation (ELCC) were uncovered through the qualitative research.

  • A clear picture emerged about the experience participants had, most of whom characterized it as being very poor. However, this was not in relation to the decision itself, but because of significant delays in receiving the decision, the manner and timing in which it was communicated, and the poor experience on the Emotions Service Dimension.
  • For Sustainable Development Goals (SDG), there was a desire for greater recognition of social enterprise within it and other programs. 

Objectives and Methodology

Background: Gs&Cs Client Experience Research

The Program Operations Branch (POB) within Employment and Social Development Canada (ESDC) handles the operation and coordination of the Grants and Contributions (Gs&Cs) programs across the Department. To comply with the Treasury Board Policy on Service and Digital (section 3.1 on digital transformation; section 3.2 on decision-making, service delivery, use of technology and data, and client-centric service; and section 4.2 on client-centric service design and delivery), ESDC requires the gathering of data on the client experience to assist in effectively managing service delivery.

To meet these requirements, POB utilized the Client Experience (CX) Measurement Framework to guide the research on the Gs&Cs business line of client service delivery experience. The data collected through the implementation of the framework provides key information to:

  • Better understand the needs and expectations of organizations;
  • Identify obstacles and challenges from the perspective of the organization;
  • Identify strengths and opportunities to improve CX, including opportunities to implement changes and test new approaches related to program design and delivery; 
  • Assess the extent to which clients’ expectations are being met; 
  • Identify and prioritize resources and opportunities tied to CX improvements; 
  • Assess the impact of improvements made to the CX over time; and 
  • Explore how ESDC’s leadership at all levels can play an important role in creating a positive CX.

This is the second year of POB’s Client Experience Research Program (FY 2021/22). Year Two builds on the first year of research by continuing to use a systematic approach to measuring CX in Gs&Cs service delivery, allowing the department to track progress on CX indicators over time.

The detailed methodology and research instruments for all aspects of the quantitative and qualitative research are available under a separate cover.

Note: Program intakes in grants and contributions vary widely, meaning that some year-to-year or program comparisons should be done with caution.

Research Objectives – Quantitative Research

The Client Experience Research Project is carried out in two phases, a quantitative phase and a qualitative phase. The primary objectives of Year 2 are to focus on monitoring selected programs that were previously studied in Year 1 and to capture new CX insights from programs that have not previously been studied.

The research objectives for the quantitative research were to:

  • Measure service satisfaction, ease, and effectiveness of the end-to-end client experience; 
  • Measure CX with service channels; 
  • Build a baseline of client experience across the spectrum of Gs&Cs programs by introducing new programs while also starting to assess change and consistency by including some of the same programs as Year One; 
  • Provide diagnostic insights regarding the opportunities for improvement; and 
  • Assess how potential changes in service delivery might affect the CX.

Research Objectives – Qualitative Research

Building on the quantitative results, the qualitative research explored the following through focus group discussions and individual interviews with Gs&Cs applicants who have applied for funding within the past two intake years (FY 2019-20 and 2020-21): 

  • Client needs and expectations: Validate and/or deepen insights regarding quantitative findings, explore the aspects that make it easy for clients as well as the obstacles/barriers clients face when going through the client experience, the impact of potential changes, and aspects that could transform the experience into a simpler and more responsive process.
  • Service dimensions: Assess which service dimensions hold greater or lesser value for clients with respect to accessing service, given the complexity of the services and clients’ capacity to effectively use online services. This would allow us to validate themes to be covered in the department’s survey.
  • Organizational characteristics: Investigate and understand organizational characteristics, qualities, and experiences to identify barriers and challenges faced by organizations. This may include organizational characteristics for those serving diverse populations, those organizations that were (un)successful in obtaining funding, and/or organizations that did (not) re-apply for funding.
  • New or unique quantitative findings: Explore the nuances and features of the quantitative findings by probing clients or a subset of the client population to explain and contextualize their recent experiences with the Gs&Cs application process.
    • Based on findings from the quantitative research where certain programs had lower levels of client experience satisfaction, it was determined that applicants from Early Learning and Child Care Innovation (ELCC) and Sustainable Development Goals (SDG) would be targeted for the in-depth interviews.
    • The focus groups shifted to target programs other than ELCC and SDG.
    • Additional questions based on quantitative findings included:
      • Understanding perceptions on the length and complexity of the application
      • Technical issues with the application process

The findings of the qualitative research will be used to:

  • Explore the client’s interactions with the department and the challenges clients may face;
  • Build a deeper awareness and understanding of the clients’ experiences to inform program design and service delivery improvements;
  • Identify opportunities for service-related changes, enhancements, and/or policy improvements; and
  • Support, challenge, and/or enrich the findings from the survey.

Methodology – Quantitative Research

An online survey was conducted with 1,942 Service Canada applicants across 12 programs. The survey was fielded from February 16, 2022 to March 15, 2022, and took on average approximately 16 minutes to complete. The survey sample size has a margin of error of +/-2.2%.
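
The reported margin of error is consistent with the standard formula for a simple random sample at the 95% confidence level, using the most conservative assumption of a 50/50 response split. A minimal sketch (the function name and parameters are illustrative, not part of the study's documentation):

```python
import math

def margin_of_error(n: int, z: float = 1.96, p: float = 0.5) -> float:
    """Maximum margin of error for a simple random sample of size n.

    z = 1.96 corresponds to a 95% confidence level; p = 0.5 gives the
    most conservative (largest) estimate for a proportion.
    """
    return z * math.sqrt(p * (1 - p) / n)

# For the 1,942 completed surveys:
moe = margin_of_error(1942)
print(f"+/-{moe * 100:.1f}%")  # prints "+/-2.2%"
```

Note this treats the sample as a simple random sample; as the report indicates, sampling differed by program (random sampling for CSJ, EAF and NHSP; a census invitation for the remaining programs), so program-level margins of error would be larger and vary by sub-sample size.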

Applicants were defined as organizations that applied for grants and contributions funding (whether funded or unfunded) within the last two intake years (2019/20 or 2020/21). A random sample of organizations that applied to CSJ, EAF or NHSP was included, while all organizations from the remaining programs were invited to complete the survey.

ESDC distributed the survey links to participating organizations. Fieldwork launch was executed using two approaches in order to better understand the impact on response rates. Half of the sampled organizations that applied to CSJ, EAF and NHSP were sent an information email in advance of receiving the survey invitation email containing the survey link, while the other half (and those who applied to all other programs) were only sent the survey invitation email.

The exact intake periods referred to in this study are as follows:

  • Canada Summer Jobs (CSJ): January to February 2020; December 2020 to February 2021
  • Early Learning and Child Care – Innovation (ELCCI): October 2020 to January 2021
  • Enabling Accessibility Fund (EAF): June 2020 to November 2020
  • Foreign Credential Recognition Program (FCRP): March to April 2019; February 2020 to June 2020
  • Indigenous Early Learning and Childcare (IELCC): February 2021 to April 2021
  • Innovative Work Integrated Learning Initiative (IWILI): September 2020 to November 2020
  • New Horizons for Seniors Program (NHSP): September 2020 to October 2020
  • Social Development Partnerships Program (SDPP): June 2020 to July 2020; March 2021 to April 2021; December 2020 to January 2021
  • Student Work Placement (SWP): November 2020 to December 2020
  • Sustainable Development Goals (SDG): Grants: May 2019 to November 2019; Contributions: June 2019 to September 2019
  • Union Training and Innovation Program (UTIP): July to August 2020
  • Youth Employment and Skills Strategy (YESS): June 2019 to July 2019; March 2021 to April 2021

Four (4) of the programs included in the survey have different streams that applicants can apply for.
The relevant streams referred to in this study and the exact intake periods are as follows:

  • Enabling Accessibility Fund (EAF):
    • Small Projects (June 2020 – July 2020)
    • Youth Innovation (June 2020 – November 2020)
  • New Horizons for Seniors Program (NHSP):
    • Small grant (up to $5,000) (September 2020 – October 2020)
    • Community-based projects (up to $25,000) (September 2020 – October 2020)
  • Social Development Partnerships Program (SDPP):
    • Supporting Black Canadian Communities (June 2020 – July 2020)
    • Supporting Black Canadian Communities – West Intermediaries (March 2021 – April 2021)
    • Disability – Community Inclusion Initiative (December 2020 – January 2021)
  • Union Training and Innovation Program (UTIP):
    • Investments in Training Equipment (July 2020 – August 2020)
    • Innovation and Apprenticeship (July 2020 – August 2020)

Of the 8,704 organizations invited to participate, a total of 1,942 completed the survey. The response rate for the survey was 22%, which is consistent with industry standards for a survey of this nature.

Total
Invited to participate: 8704
Click-throughs: 2941
Partial completes: 999
Terminates: 0
Over quota: 0
Completed surveys: 1942
Response rate: 22%
Abbreviation  Program                                          Invited  Completed  Response rate
CSJ           Canada Summer Jobs                               1625     865        53%
EAF           Enabling Accessibility Fund                      1625     207        13%
NHSP          New Horizons for Seniors Program                 1625     384        24%
FCRP          Foreign Credential Recognition Program           125      20         16%
ELCC          Early Learning and Child Care                    455      65         14%
IELCC         Indigenous Early Learning and Childcare          66       8          12%
IWIL          Innovative Work Integrated Learning Initiative   10       13         130%*
SWPP          Student Work Placement                           30       4          13%
SDG           Sustainable Development Goals                    688      39         6%
YESS          Youth Employment and Skills Strategy             936      152        16%
SDPP          Social Development Partnerships Program          1393     153        11%
UTIP          Union Training and Innovation Program            126      32         25%
Total                                                          8704     1942       22%
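The response rates above are simple ratios of completed surveys to invitations. A minimal sketch reproducing a few rows (counts copied from the table; only a subset of programs is shown for brevity):

```python
# Response rate = completed surveys / invitations, per program and overall.
# Counts are taken from the table above; IWIL exceeds 100%, as explained in
# the footnote on multiple applications and sampling procedures.
invited = {"CSJ": 1625, "EAF": 1625, "NHSP": 1625, "IWIL": 10}
completed = {"CSJ": 865, "EAF": 207, "NHSP": 384, "IWIL": 13}

rates = {p: completed[p] / invited[p] for p in invited}
overall = 1942 / 8704  # all 12 programs combined; about 22%
```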

Note: “n=” represents the number of respondents to a question; in statistical language, it is known as the size of the sample. Sample sizes below n=30 are considered small and those below n=10 very small. Results based on small and very small sample sizes should be interpreted with caution, and findings viewed as directional in nature.

The quantitative survey also served as a recruitment tool for the qualitative research, by asking if organizations would be interested in voluntarily participating in focus groups or in-depth interviews at a later date.

* Response rate exceeding 100% may be due to applicants applying to more than one program and/or sampling procedures. Only those organizations with email contact information on file were invited to participate, which does not represent the total volume of applicants. 

Calibration of the Data – Quantitative Approach

Weighting adjustments were made to bring the sample into proportion with the universe by program volume based on 2019 and 2020 figures (depending on the most recent intake for the particular program).

The final data was weighted by the number of respondents in each program in proportion to the total number of applicants, as detailed below. The universe proportions used to develop the targets were based on figures provided by ESDC.

Abbreviation  Program                                          # of Applicants  % of Total
CSJ           Canada Summer Jobs                               39202            74.13%
EAF           Enabling Accessibility Fund                      2173             4.11%
NHSP          New Horizons for Seniors Program                 7194             13.60%
FCRP          Foreign Credential Recognition Program           127              0.24%
ELCC          Early Learning and Child Care                    503              0.95%
IELCC         Indigenous Early Learning and Childcare          68               0.13%
IWIL          Innovative Work Integrated Learning Initiative   10               0.02%
SWPP          Student Work Placement                           30               0.06%
SDG           Sustainable Development Goals                    722              1.37%
YESS          Youth Employment and Skills Strategy             971              1.84%
SDPP          Social Development Partnerships Program          1755             3.32%
UTIP          Union Training and Innovation Program            126              0.24%
Total                                                          52881
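Proportional weighting of this kind can be sketched as each program's universe share divided by its sample share. The sketch below is illustrative only: it uses a subset of programs, with universe counts from the table above and completed counts from the response-rate table earlier in this section, and is not the report's exact weighting procedure.

```python
# Post-stratification weight per program: universe share / sample share.
# Illustrative subset of programs; universe counts from the weighting table,
# completed counts from the response-rate table (both in this section).
universe = {"CSJ": 39202, "EAF": 2173, "NHSP": 7194, "SDPP": 1755}
sample = {"CSJ": 865, "EAF": 207, "NHSP": 384, "SDPP": 153}

u_total = sum(universe.values())
s_total = sum(sample.values())
weights = {p: (universe[p] / u_total) / (sample[p] / s_total) for p in universe}

# CSJ is under-represented in the sample relative to the universe, so its
# weight is above 1; SDPP is over-represented, so its weight is below 1.
# Weighted completes still sum to the unweighted total, preserving base size.
```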

Methodology – Qualitative Research

Survey respondents were asked whether they would be interested in taking part in follow-up qualitative research. After analyzing the sample to ensure a mix of programs and regions, and to ensure inclusion of participants in both official languages, potential participants were contacted randomly and asked if they would like to be taken through the screening questionnaire to confirm their eligibility for an in-depth interview or focus group. The breakdown of participation is as follows:

  • 408 survey respondents agreed to be recontacted
  • 112 were contacted to be screened
  • 77 agreed to be screened
  • 51 participated in a focus group or in-depth interview

The chart below provides a detailed description of the fieldwork.

Group Composition Date and Time
Group 1:  Unfunded applicants to any program other than ELCC and SDG, or those who are unsure
NATIONAL - ENGLISH
May 27 at 10AM ET
5 Participants
Group 2: Funded applicants to any program other than ELCC and SDG
NATIONAL - ENGLISH
May 27 at 1PM ET
6 Participants
Group 3:  Unfunded applicants to any program other than ELCC and SDG, or those who are unsure
QUEBEC - FRENCH
May 25 at 10AM ET
5 Participants
Group 4:  Funded applicants to any program
QUEBEC - FRENCH
May 25 at 1PM ET
4 Participants
In-depth interviews focused on applicants to the ELCC and SDG programs and were conducted from May 16 to May 31
18 English Participants
8 French Participants

Methodology – Qualitative Research Data Collection and Analysis

Data Collection 

With participants’ consent, all qualitative research sessions were audio and video recorded. Verbatim transcripts were created for every focus group and interview; however, names and personally identifying details were either not captured or were removed or redacted by the moderator to protect participants’ privacy.

Moderators also captured high-level findings from their own observations on each topic: the overall reaction, any nuances, and non-verbal cues such as body language or tone. Because the transcripts are anonymous, moderators can also comment on variations by group or audience where participants have not been placed in separate groups; for example, they can provide a sense of differing opinions between older and younger participants, or between males and females, depending on the topic.

Data Analysis

We identify some basic elements of qualitative analysis:

  • Universal agreement where participants all agree, or there is agreement across different groups of stakeholders
  • Consensus perspectives that reflect the view of most participants; areas of wide agreement without much counter point (Many, most, several)
  • Conflicting or polarized perspectives where views are much more divided, or if there is a spectrum or variety of views (Some vs others)
  • Minority perspectives, often expressed by one or two participants as a counterpoint to a consensus viewpoint, or if they have an individual take or example/story (a few, a couple, mention)
  • Verbatim commentary, providing examples of what participants actually said during a discussion (direct unattributed quotes)
  • External context; for this project, the results of the quantitative research provided the foundation for the qualitative research and shaped the discussion questions posed.

Note Regarding Program Complexity

For the purpose of this study, program complexity has been defined as low, moderate, and high complexity as outlined in the following table. These service standard clusters are informed by departmental reporting in the Performance Measurement and Management Framework.

Note: Canada Summer Jobs does not fit into these distinct clusters and has been analyzed as a separate group.

Program Complexity Level Description Programs Included
Low-complexity programs: Grant programs in the 112-day/16-week review period
  • Enabling Accessibility Fund (Grants)
  • New Horizons for Seniors Program (Grants)
  • Indigenous Early Learning and Child Care
  • Innovative Work Integrated Learning
  • Student Work Placement Program
  • Sustainable Development Goals
  • Social Development Partnerships Program (SDPP) (Grants)
  • Union Training and Innovation Program (Grants)
Moderate delivery-complexity programs: Contribution streams in the 126-day/18-week review period
  • Foreign Credential Recognition Program
  • Youth Employment and Skills Strategy Program
  • Social Development Partnerships Program (SDPP) – Disability (Contributions)
  • Social Development Partnerships Program (SDPP) – Children and Families (Contributions)
  • Union Training and Innovation Program (UTIP) (Contributions)
High delivery-complexity programs: Contribution streams in the 154-day/22-week review period
  • Early Learning and Child Care

Note on Reporting Conventions – Quantitative Data

Throughout the report, subgroup results have been compared to the average of all applicants (i.e., the total), and statistically significant differences at the 95% confidence level are noted using green and red boxes.

Where subgroup results are statistically higher than the total, a green box has been used; where results are statistically lower than the total, a red box has been used.

Additionally, where results in Year 2 are statistically higher than Year 1, a green arrow has been used; where results in Year 2 are statistically lower than Year 1, a red arrow has been used.
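A subgroup-versus-total comparison of this kind is commonly implemented as a two-proportion z-test. The sketch below is illustrative, not the report's actual test, and treats the two samples as independent (in practice, a subgroup overlaps the total, which calls for an adjusted variance); all figures in the example are hypothetical.

```python
import math

def two_proportion_z(p1, n1, p2, n2):
    """z statistic for comparing two proportions with a pooled standard error.

    |z| > 1.96 indicates a statistically significant difference at the
    95% confidence level (two-tailed).
    """
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# Hypothetical example: 68% satisfied in a subgroup of n=207, versus
# 76% satisfied among the n=1,942 total.
z = two_proportion_z(0.68, 207, 0.76, 1942)
significant = abs(z) > 1.96  # True: the subgroup is significantly lower
```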

Significantly higher / lower than total

Significantly higher/lower than Year 1

For the purposes of legibility, values of less than 3% have not been labelled in charts throughout the report.