Executive Summary—The Advisory Panel on Government of Canada Quantitative Public Opinion Research Quality

Executive summary

Prepared for:
Public Services and Procurement Canada
Supplier:
Sage Research Corporation
Contract number:
EP363-182149/001/CY
Contract value:
$95,106.45
Award date:
December 7, 2017
Delivery date:
October 26, 2018
Registration number:
POR 058-17

About the report

This public opinion research report presents the results of a research panel conducted with knowledgeable, leading professionals from the private sector, Statistics Canada and academic institutions, between April 11, 2018 and August 22, 2018.

This publication may be reproduced for non-commercial purposes only. Prior written permission must be obtained from Public Services and Procurement Canada for all other uses. For more information on this report, please contact Public Services and Procurement Canada at questions@tpsgc-pwgsc.gc.ca or at:

Public Services and Procurement Canada
Portage III Tower A
16A1-11 Laurier Street
Gatineau QC   K1A 0S5

Cette publication est aussi disponible en français sous le titre : Comité consultatif sur la qualité de la recherche quantitative sur l’opinion publique au gouvernement du Canada

Catalogue Number: P103-14/2019E-PDF
International Standard Book Number (ISBN): 978-0-660-28290-9

Related publication (registration number: POR 058-17):
Catalogue Number: P103-14/2019F-PDF (Final Report, French)
International Standard Book Number (ISBN): 978-0-660-28291-6 (Final Report, French)

© Her Majesty the Queen in Right of Canada, as represented by the Minister of Public Works and Government Services, 2019

The Advisory Panel’s recommendations

Background

The Public Opinion Research Directorate (PORD) is a mandatory common service provider responsible for giving advice on legislation, policies, research methodology and accepted industry practices. Under the Policy on Communications and Federal Identity, PORD also has the responsibility for developing and maintaining Government of Canada standards. The standards for both telephone and online public opinion surveys were first developed and implemented in 2009, based on the work of 2 separate advisory panels, one on telephone research and the other on online research. These standards were later revised in 2013.

Given the ongoing changes in the public opinion research industry, PORD is undertaking a review of its quantitative standards for the conduct of Government of Canada public opinion research.

Purpose and objectives

The project involved convening an advisory panel composed of knowledgeable, leading professionals from the private sector, Statistics Canada and academic institutions to provide advice on potential standards and best practices for public opinion research (POR) surveys conducted using telephone and/or online quantitative methods.

The advisory panel addressed and provided guidance on standards for the topics covered in the recommendations below.

Intended use of the results

For these topics, the intent is to help PORD (a) to revise existing standards and guidelines and, as appropriate, create new standards and guidelines to guide the quality of survey research undertaken on behalf of the Government of Canada, and (b) to equip PORD with expert advice on dealing with evolving research methodologies.

Methodology

The advisory panel on Government of Canada quantitative public opinion research consisted of 10 members drawn from the private sector, from academia (researchers experienced with market research) and from Statistics Canada. Members of the panel were recruited by PORD, with assistance from Sage Research.

The advisory panel’s work took place between April 11, 2018 and August 22, 2018. The advisory panel process consisted of an initial web conference followed by 3 online discussion boards. Panel members then reviewed 4 working reports summarizing the results and proposing the guidance to put into the final report of the advisory panel.

The advisory panel’s recommended guidance for quantitative research is expressed as standards and guidelines, together with supporting commentary.

While reaching consensus was not part of the advisory panel's mandate, the panel did reach consensus on many aspects of standards and guidelines for quantitative research.

Qualitative research is designed to reveal a rich range of opinions and interpretations rather than to measure what percentage of the target population holds a given opinion. Advisory panel members gave their personal opinions and experiences on the issues discussed, and were not speaking on behalf of their organization or industry.

Contract Value: $95,106.45, including Harmonized Sales Tax (HST)

Political neutrality certification

I hereby certify as Senior Officer of Sage Research Corporation that the deliverables fully comply with the Government of Canada political neutrality requirements outlined in the Communications and Federal Identity Policy of the Government of Canada and Directive on the Management of Communications. Specifically, the deliverables do not include information on electoral voting intentions, political party preferences, and standings with the electorate or ratings of the performance of a political party or its leaders.

Signature of Anita Pollak

Anita Pollak
President
Sage Research Corporation

The advisory panel’s recommendations

Sampling

The panel provided input on a number of sampling topics and recommended changes to the sections of the standards described below.

Definitions of types of samples

Section 4 sampling procedures includes standards for probability sampling, non-probability sampling and a census but does not give definitions of these types of sampling procedures.

Section 4.1.2 (online)/4.1.1 (telephone) should be expanded (a) to include both definitions and examples of probability and non-probability sampling, and (b) to include a definition of a census.

Maximizing representativeness of non-probability surveys

The objective is to revise the standards to emphasize the importance of striving for representativeness in non-probability surveys, and to explain in the proposal how this will be done. The current standards (sections 4.3.2 sampling procedures and 1.2.2 proposal documentation) address this objective to some extent, but the intent is to make the requirement more explicit and detailed.

The majority of the panel members agreed with including text to emphasize the importance of taking steps to improve the representativeness of non-probability survey results (in both sections 4.3.2 and 1.2.2). However, no consensus was reached on whether this should be a standard or a guideline.

Online sample information to include in proposal documentation

Section 1.2.4 #3 proposal documentation lists the required information when an online sample provider is used. This section should be expanded to require separate and more specific proposal information for both probability and non-probability online samples.

There was agreement on the types of information that should be required in the proposal for online probability samples. For online non-probability samples, there were different points of view on some of the specific information disclosure requirements.

Statistical treatment of survey results

The panel provided input on the statistical treatment of survey results and recommended changes to the sections of the standards described below.

Statistical treatment of non-probability survey results

The panel was asked to clarify the use of statistical measures for non-probability surveys in light of recent developments in the application of alternative measures of statistical precision.

Section 4.3.3 of sampling procedures should include (a) revised wording to further clarify that margins of sampling error do not apply to non-probability survey data; (b) limitations on use of alternative measures of precision (e.g. Bayesian credible intervals) for non-probability surveys; (c) documentation requirements in both the proposal and the survey report when alternative measures of precision are used.
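To illustrate the kind of alternative measure at issue, the sketch below computes a Bayesian credible interval for a survey proportion by Monte Carlo sampling from a Beta posterior. The uniform Beta(1, 1) prior, the sample figures and the simulation settings are illustrative assumptions, not anything prescribed by the standards; the point is that such an interval rests on modelling choices a researcher would need to document.

```python
import random

def credible_interval(successes, n, level=0.95, draws=100_000, seed=1):
    """Monte Carlo credible interval for a proportion.

    Assumes a uniform Beta(1, 1) prior (an illustrative choice),
    so the posterior is Beta(successes + 1, n - successes + 1).
    """
    rng = random.Random(seed)
    a, b = successes + 1, n - successes + 1
    samples = sorted(rng.betavariate(a, b) for _ in range(draws))
    lo_idx = int((1 - level) / 2 * draws)
    hi_idx = int((1 + level) / 2 * draws) - 1
    return samples[lo_idx], samples[hi_idx]

# Hypothetical example: 420 of 1,000 non-probability panel
# respondents agree with a statement
low, high = credible_interval(420, 1000)
```

Because the interval depends on the prior and the model rather than on random selection, it is not interchangeable with a margin of sampling error, which is why the panel recommended disclosure in both the proposal and the survey report when such measures are used.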

Statistical treatment of probability survey results

The objective was to determine whether to expand on the current requirements for reporting level of precision for probability surveys (section 14.7.2/15.7.2).

There were several proposals for revised wording, but no consensus on the most appropriate wording.
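For comparison, the conventional margin of sampling error reported for probability surveys can be sketched as follows; the 95% confidence level (z = 1.96) and the conservative worst case p = 0.5 are common illustrative choices, not wording proposed by the panel.

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Margin of sampling error for a simple random sample of size n,
    at the 95% confidence level (z = 1.96), using the conservative
    worst case p = 0.5."""
    return z * math.sqrt(p * (1 - p) / n)

# A probability sample of 1,000 respondents:
moe = margin_of_error(1000)  # about +/- 3.1 percentage points
```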

Statistical treatment of census survey results

In section 4.6 in sampling procedures, there are 2 statements made about the statistical treatment of census survey results. The panel was asked to comment on whether the 2 statements were consistent.

It was agreed that part of section 4.6.3 should be deleted as it is inconsistent with the standard to not use inferential statistical tests in a census survey.

The panel was also asked to consider whether or not a survey ceases being a census if the response rate falls below a certain level.

There was agreement that a census with a response rate of less than 100% is still a census, albeit perhaps better described as an attempted census. Margin of sampling error does not apply but other sources of survey error can still be present, such as non-response bias.

Required questions in surveys

The panel provided input on the topics described below and recommended changes to section 2 questionnaire design (2.1.2; 2.1.3).

Introduction wording preceding required demographic questions

There is a general requirement (section 2.1.2) to inform respondents at the beginning of a survey of the confidentiality of their questionnaire responses, but the current standards do not state any specific wording for how to preface the block of demographic questions located near the end of the questionnaire.

Add a requirement for an introduction to the block of demographic questions that addresses confidentiality. There was agreement on most of the wording for this introduction, but no consensus was reached on use of the terms “confidential” and/or “anonymous” in this introduction.

Use of “prefer not to answer” in the required questions for online surveys

“Prefer not to answer” (and its related forms) should be removed as a required listed response category in online surveys, but retained as an optional response category.

Efficiency of reading a large number of response options for required questions in telephone surveys

Some of the required demographic questions have a relatively long list of response options. This includes the questions on age, education and household income.

Revise section 2.1.3 to permit modifying the wording of a demographic question so that the interviewer can instruct a respondent to stop at the category that applies to them.

Required demographic questions: gender

In the currently mandated question, gender information is collected very differently in telephone versus online surveys. The telephone version does not actually ask a question about gender, but rather relies on interviewer observation. The online version asks a question and so is based on respondent self-classification rather than interviewer classification. Neither the telephone nor online survey versions offer an “other” answer option.

Revise the mandated question for all POR surveys (telephone and online) (a) to include an “other” answer option, and (b) to require the gender question to be read to respondents in telephone surveys.

Required demographic questions: language

There are currently 2 mandated questions for language (mother tongue and language spoken most often at home), with discretion for the researcher to use one or both questions depending on the survey objectives.

Revise this to require only language spoken most often at home.

Required demographic questions: age

Revise 2 of the current age categories (35-49 and 50-54) to 35-44 and 45-54 to get a more even distribution of the age categories.

There was discussion but no consensus on whether the current age category 18-34 should be split into 2 categories (18-24 and 25-34) or left as is.

Required demographic questions: education

In order to better align the response options with the school systems in both Quebec and the rest of Canada, combine “grade 8 or less” and “some high school” into a single category, “less than high school diploma or equivalent.”

Required demographic questions: household income

Revise the question wording to specify a time frame of “last year” for household income.

Required demographic questions: addition of household phone status for telephone surveys

This information can sometimes be useful in quota controls or weighting. Add a requirement to section 2.1.3 to include a question on household phone status in all telephone surveys.

Use of mobile devices in online surveys

The current standards do not address the possibility and implications of an online survey being completed on a mobile device. The panel was asked to provide input on revising the standards in this area; recommended changes to the relevant sections of the online standards are described below.

Proposal documentation relating to use of mobile devices in online surveys

The default expectation should be that an online POR survey sample will include respondents using either a computer or a mobile device for the survey and that surveys have a mobile-friendly version of the questionnaire.

Additions to sections 1.2.2 and 1.2.5 in proposal documentation are recommended to make these expectations explicit.

Mobile-friendly online surveys and questionnaire design

The panel considered 3 potential revisions/additions to section 2 questionnaire design with respect to online surveys where mobile devices may be used.

Should there be a standard encouraging use of a common question design/layout across devices?

The consensus was that a standard is not appropriate: research on the best approach to question design/layout across devices is inconclusive, and the optimal design approach can vary across surveys and for different questions within a survey. However, a guideline highlighting the options available to researchers could be added in section 2 questionnaire design.

Should there be a different survey duration standard for mobile-friendly surveys?

The standard for online questionnaire duration is 20 minutes, but an average duration of 15 minutes or less is “strongly encouraged.” The panel considered whether the standard for survey duration should be left as is, or revised to specify a shorter duration for mobile-friendly surveys.

No change was recommended to the existing standard on survey duration.

Should there be guidelines on features of a mobile-friendly questionnaire?

Most agreed that a list of examples should be added to section 2 questionnaire design as a useful reminder to researchers of the elements that make a questionnaire more mobile-friendly.

Proposed revisions related to pre-testing in the online standards

The current standard specifies the total number of pre-test completions, but does not break this down by device type.

The panel considered potential revisions/additions to section 3 pre-testing with respect to online surveys where both mobile devices and computers may be used.

Should pre-testing standards specific to device type be added?

Section 3 pre-testing should be revised to include a requirement for pre-testing on both computers and mobile devices when a survey can be completed on both types of devices. Several alternative options for how to word the requirement were proposed.

Should standards on the number of pre-test interviews by device type be added?

The consensus was that section 3.1.5, which requires a minimum of 10 pre-test interviews in each language, should be left as is with the understanding that the pre-test would include a sample of different devices.

Possible revisions to online standards on data collection and quality controls addressing potential mode effects by device type or screen size

In a survey that allows completion on both mobile devices and computers, there is the potential for a “mode” effect. That is, the different designs/layouts for a given question could cause different response distributions.

The panel considered whether or not there should be any requirement to collect information on device type, and any requirement to conduct an analysis for mode effects by device type.

A standard should be added to section 7 data collection requiring collection of data on the type of device used by respondents to complete a survey.
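As a minimal sketch of how such a standard might be met in practice, a survey platform could classify the device from the browser's user-agent string. The categories and keyword matching below are hypothetical assumptions, not part of the standards, and a production list would need ongoing maintenance.

```python
def classify_device(user_agent: str) -> str:
    """Coarse device classification from a user-agent string.
    The keyword lists are illustrative only."""
    ua = user_agent.lower()
    if "tablet" in ua or "ipad" in ua:
        return "tablet"
    if "mobi" in ua or "android" in ua or "iphone" in ua:
        return "mobile"
    return "computer"

# Stored alongside each completed interview, enabling the aggregated
# "research on research" on mode effects described below:
record = {"respondent_id": 101, "device": classify_device(
    "Mozilla/5.0 (iPhone; CPU iPhone OS 15_0 like Mac OS X)")}
```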

The panel did not support adding a standard requiring an analysis of mode effects for each survey (section 14.6 quality controls). The view was that not enough is known about device-type mode effects at this time to specify analytic requirements for individual surveys; “research on research” needs to be done, using the aggregated device data collected across surveys, to determine what standard, if any, would be appropriate for analysis of potential mode effects.

Covering respondent costs for use of mobile devices

Users of mobile devices may incur costs to participate in a research survey. The current standards do not have any requirements as to how such costs should be handled.

The consensus was that there should not be a standard about covering respondent costs associated with using a mobile device: (a) respondents always have a choice whether or not to participate in a Government of Canada (GC) Public Opinion Research (POR) online or telephone survey; (b) the current standards require certain information be given about the survey (e.g. length), so respondents are able to make an informed choice about whether or not to participate; (c) unless compensation is set at an arbitrary fixed amount for all mobile users, the logistics of determining the amount to compensate each respondent and documenting this for billing purposes would be very complex and difficult, if not impossible.

Inclusion of cell phones and landline phones in telephone surveys

An important issue in sampling for telephone surveys is the inclusion of cell phone users and landline users. This can affect coverage of the survey population, the sampling frame(s) used for the survey, and possibly weighting. A telephone probability sample of the general Canadian adult population must include a cell phone sample. The panel was asked to consider revisions to the telephone standards related to proposal documentation and sampling procedures; recommended changes to the relevant sections are described below.

Proposal documentation relating to inclusion of cell and landline phones in telephone surveys

The panel considered 3 potential revisions/additions to section 1 proposal documentation in the telephone standards.

Response rate/participation rate

Consider revising the text in section 1.2.3 #1 to require stating an estimated response/participation rate for both cell phones and landlines in surveys where both device types can be used.

Description of data collection

The panel was asked to consider whether there should be any revisions to section 1.2.4 #7 in proposal documentation, which states that a rationale must be given when the sample includes interviews on cell phones. In the panel's view, the current language overly downplays the importance of including cell phone users in the sample.

The wording of section 1.2.4 #7 should be revised to acknowledge the importance of cell phone samples in telephone surveys. There were several alternative proposals on the approach to take.

Sampling procedures relating to inclusion of cell and landline phones in telephone surveys

The current standard in sampling procedures section 4.2.3c addresses disclosure of coverage issues in probability samples, and gives as an example a sample of cell phone only households.

Add landline-only samples as another example in section 4.2.3c: given the growing number of cell phone only households, a landline-only sample could have substantial coverage error.

Telephone survey call-back requirements

The telephone standards for call-backs in section 7 data collection (7.2) require a minimum of 8 call-backs to be made before a telephone number is retired. Some concern has been expressed that 8 call-backs is too many, and might be perceived as harassment. The current standard also (a) does not provide a definition of what constitutes a call-back, and (b) does not differentiate between call-backs to landlines and cell phones. The panel was asked to consider what should be the standard for number of call-backs, including whether there should be a different standard for respondents reached on a cell phone.

There were 2 main recommendations for revisions to section 7.2:

  1. change the terminology from “call-backs” to “call attempts” on the grounds that the meaning is more straightforward. Note that the number of “call attempts” equals 1 plus the number of “call-backs”
  2. a minimum of 8 call-backs (9 call attempts) is excessive. The majority of panelists recommended the standard be revised to require 6 call attempts, meaning the initial call and 5 call-backs.

The panel opted to apply the same call-back requirement to both cell phones and home phones.
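The recommended terminology and limit can be sketched in a few lines; the figure of 6 reflects the majority recommendation above, and the helper names are illustrative.

```python
MAX_CALL_ATTEMPTS = 6  # initial call + 5 call-backs, per the recommendation

def call_backs(call_attempts_made: int) -> int:
    """Call-backs are all attempts after the initial call:
    call attempts = 1 + call-backs."""
    return max(call_attempts_made - 1, 0)

def should_retire(call_attempts_made: int) -> bool:
    """A number is retired once the maximum call attempts are exhausted,
    regardless of whether it is a cell phone or a home phone."""
    return call_attempts_made >= MAX_CALL_ATTEMPTS

assert call_backs(6) == 5    # 6 attempts = initial call + 5 call-backs
assert not should_retire(5)  # a 6th attempt is still allowed
assert should_retire(6)
```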

Interactive Voice Response telephone surveys

The panel was asked to consider revisions to the telephone standards in section 5.3 use of interactive voice response (IVR), and to provide input on related standards. Recommended changes to the relevant sections are described below.

Use of Interactive Voice Response for Government of Canada Public Opinion Research surveys

Section 5.3.1 discourages, but does not forbid, use of IVR surveys for POR. It also suggests circumstances when IVR may be an appropriate methodology. The panel considered whether there should be any changes to this sub-section on the use of interactive voice response.

The majority suggested adding more examples of situations when IVR as a data collection method may be acceptable while maintaining the principle that IVR is not a preferred method for GC POR surveys.

Interactive Voice Response survey introduction

Section 5.3.2 states that the information disclosure requirements for IVR surveys are the same as for interviewer-administered surveys, and similarly requires that the information be provided in the survey introduction. Because IVR surveys are typically shorter than interviewer-administered surveys, the panel was asked to comment on (a) whether the required elements for telephone survey introductions should be revised or shortened for IVR surveys, and (b) the possibility of moving some of the information disclosures to the end of the survey.

Most panelists said the required information in the survey introduction should be the same for IVR surveys as for other surveys. There was no consensus on where in the questionnaire the various required disclosures should be made. However, if it is decided that certain types of information can be disclosed later in a survey, that option should be available to all surveys, not just to IVR surveys.

Interactive Voice Response survey duration

The standard for survey duration states surveys must be completed in 20 minutes, and strongly encourages a duration of 15 minutes or less.

A guideline should be added to questionnaire design section 2.1.1 to encourage an IVR survey duration of 5 to 7 minutes or less.

Should there be a different call-back standard for Interactive Voice Response surveys?

The call-back requirements in section 7.2 call-backs do not make any distinction between interviewer-administered surveys and IVR surveys. The panel considered whether there should be any changes to this section specific to IVR surveys.

There were several alternative proposals, ranging from a recommendation to exempt IVR surveys from the call-back requirements for interviewer-administered surveys, to requiring IVR surveys to have the same call-back requirements as interviewer-administered surveys.

Multi-mode surveys

The current standards already address multi-mode surveys to some extent. The panel was asked to provide input on possible revisions to the standards in several areas related to multi-mode surveys; recommended changes to the relevant sections are described below.

Proposal documentation for multi-mode surveys

The primary concern associated with multi-mode surveys is the potential for mode bias. The panel considered whether and how the proposal documentation requirements need to be elaborated to make it clearer in the proposal that the issue of potential mode bias is recognized and that steps will be taken to address it.

Additions to the proposal documentation requirements were recommended.

Sampling procedures for multi-mode surveys

Revise section 4.5 in sampling procedures to emphasize the value of using similar modes of data collection to minimize the risk of mode biases.

Questionnaire design for multi-mode surveys

There is no current standard for questionnaire design specific to multi-mode surveys.

Revise section 2.1 in questionnaire design (a) to encourage comparability across modes in question wording and presentation of response options, and (b) to highlight the value that including benchmark questions can have for enabling detection of mode biases.

Pre-testing for multi-mode surveys

The current section 3 pre-testing does not make any specific references to separate pre-tests by mode in a multi-mode survey. The panel was asked to consider whether there should be a requirement for a minimum number of pre-test interviews in English and French for each mode in a multi-mode survey.

There were several different points of view on this matter, and no consensus was reached.

Outcome rates for multi-mode surveys

Currently in section 8 outcome rates there is no standard for how to calculate outcome rates for a multi-mode survey.

A standard should be added outlining the general principles for calculating and reporting on outcome rates for multi-mode surveys. Research designs which do not allow calculation of either of the mandatory outcome rates (response rate or participation rate) should not be permitted for GC POR surveys.
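As a sketch of the general principle, a single-mode response rate can be computed from sample dispositions. The R / (U + IS + R) form below follows a common empirical approach; the exact disposition categories and formula in the standards are not reproduced here, so treat the details as assumptions.

```python
def response_rate(responding, in_scope_nonresponding, unresolved):
    """Empirical response rate: responding units as a share of all
    units that are, or may be, in scope.

    R / (U + IS + R), where
      R  = responding units (e.g. completed interviews)
      IS = in-scope units that did not respond (e.g. refusals)
      U  = unresolved units (e.g. never reached)
    """
    r, nonresp, unres = responding, in_scope_nonresponding, unresolved
    return r / (unres + nonresp + r)

# Hypothetical dispositions for one mode of a multi-mode survey:
rate = response_rate(responding=800, in_scope_nonresponding=1200,
                     unresolved=2000)  # 800 / 4000 = 0.20
```

Under the recommended principle, a multi-mode design would need to support calculating such a rate (or a participation rate) for each mode and overall; designs that cannot produce either mandatory outcome rate would not be permitted.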

Mandatory survey report requirements for multi-mode surveys

The standard for reporting on data collection in sections 14.5.2/15.5.2 of mandatory survey report requirements should be updated to reflect the revised language in sub-section 1.2.4 #7 in proposal documentation.

Section 14.6.3/15.6.3 quality controls should be revised (a) to ensure decisions made about combining or not combining data across modes are clear, and (b) to require descriptions of any adjustments made to the data to mitigate mode effects.

Incentives in surveys of children, young people or vulnerable respondents

Section 6 data collection from children, young people or vulnerable respondents does not make any reference to whether or how incentives are used for this survey population. Section 7 data collection (7.5 [telephone]/7.6 [online]) that deals with incentives/honoraria also does not refer to this population.

Guidance should be added to section 7.5/7.6 to address such matters as who will receive the incentive and getting parental consent.

Privacy and security of data

The panel was asked to provide input on possible revisions/additions to the standards in several areas; recommended changes to the relevant sections are described below.

Passive data collection in online surveys

Online and mobile methodologies create possibilities for collecting various types of personal data “passively”, that is, without direct interaction with respondents. The issue considered was what passive data collection is allowed, and under what circumstances, in the context of surveys. The panel was asked to consider whether the current standards are sufficient to address these questions associated with passive data collection in surveys.

The panel endorsed a revision to the standards in data collection section 7.2 (a) to explicitly define “passive data collection” and provide examples of “personal information”, and (b) to note exceptions where the passive data collection is legally permissible.

Photographs and recordings

The online and telephone survey standards do not currently have any standards pertaining specifically to respondent photographs, videos or audio recordings.

The panel endorsed the addition of standards in section 5 retaining public confidence to clarify (a) that photographs and recordings are considered to be personal data and need to be treated as such, and (b) the responsibility of researchers when the survey involves asking respondents to generate photographs and/or recordings.

Telephone surveys: sensitivity to setting

The current telephone standards section 5.2.1 avoidance of harassment has a standard focused on sensitivity of the survey subject matter, but it does not directly address issues potentially caused by the setting of the interview. Because respondents are increasingly likely to answer calls on a mobile phone, they may be using the phone in problematic settings (e.g. driving, walking in a public space). On both mobile phones and fixed-location phones, they may be in a setting where they can be overheard.

Most panelists supported adding a guideline to determine if a telephone survey respondent is in a location where they can take the call, for both cell and landline users (section 2, questionnaire design).

Data breaches

The current standards in section 13/14 data security require taking steps to protect against data breaches (the loss of, or unauthorized access to or disclosure of, personal or organizational information). The relevant sections are: 13.2 (online)/14.2 (telephone) protection of data servers; 13.3/14.3 temporary storage of data on servers; and 13.6/14.5 in the event of any data breach.

The panel was asked to identify any revisions or additions to the standards, and/or any guidelines that should be included.

The panel concluded that the existing standards pertaining to privacy and security of data, including data breaches, are appropriate.

There were 2 main areas identified for additional standards or guidelines:

Cloud storage

The current standards in section 13/14 data security require that survey data be stored in Canada.

This is a complex area: it requires expertise in the legal and regulatory framework affecting data access and use, not only in Canada but also in other countries where servers might be located, and it requires an understanding of GC policies in this area. The panel members did not consider themselves experts in these areas. For the most part there were no suggested changes to the current standards. However, one suggestion was for the GC to maintain a pre-approved list of countries that satisfy the conditions set out in the current standards and that are acceptable for cloud storage of GC POR data.

Surveys and social media

The panel considered whether there are any additional standards required for surveys that use a social media venue as either a sample source or to administer a survey.

The current standards, together with the various changes recommended elsewhere by the panel, are sufficient to ensure that any such surveys meet the quality requirements for GC POR surveys. Therefore no additional standards are needed for surveys that use a social media venue for either sampling or survey administration.

Accessibility and literacy

The online and telephone standards do not contain any standards or guidelines pertaining to accessibility.

The panel considered whether a statement should be added to the standards about the importance of accessibility, and what if any specific guidelines might be provided for online and telephone surveys. Note that according to PORD, the Treasury Board Secretariat (TBS) is working on a proposed policy for accessibility standards specific to all devices used to access online surveys. The results of this development work will probably be available in a year or so. When the TBS policy is finalized, it will take precedence.

The majority of panelists supported adding a general guideline encouraging accessibility, including examples of steps that could be taken to improve accessibility in online or telephone surveys.