Secondary Research into Cell Phones and Telephone Surveys


December 2012

Prepared for Public Works and Government Services Canada

Phoenix Strategic Perspectives Inc. (SPI) is a "Gold Seal Certified" Corporate Member of the Marketing Research and Intelligence Association (MRIA)


Introduction

Public Works and Government Services Canada (PWGSC) commissioned Phoenix SPI to conduct secondary research to explore issues related to the inclusion of cell phones in telephone survey research.

Background and Objective

PWGSC's Public Opinion Research Directorate (PORD) provides coordination and advisory services for all Government of Canada public opinion research. The Directorate facilitates studies by guiding Government of Canada client departments through the research process to ensure that objectives are met and that the research undertaken conforms to Government of Canada policies and Treasury Board regulations, as well as industry standards. To fulfill its role, PORD needs to be aware of, and knowledgeable about, current developments in public opinion research (POR) in order to advise clients and develop or revise survey standards. One area that demands attention is cell phones and their impact on survey research in Canada.

For its part, PORD needs to understand, among other things, the current practices of survey research organizations vis-à-vis cell phones, the potential impact and implications of including cell phones in survey sample frames, best practices and industry standards related to cell phones (to the extent that they exist), as well as what is available in Canada in terms of cell phone sample: in particular, how the frames are created and the extent to which they can be considered representative. The objective of this secondary research assignment, therefore, was to undertake a literature review in order to provide PORD with a comprehensive picture of the current environment, with a supporting analysis of the implications for telephone surveys being conducted by the Government of Canada, as well as recommendations for PWGSC's consideration.

Scope of the Assignment

There are several different types of cell phone surveys, both interviewer-assisted and self-administered, used by researchers. Common approaches include standard random digit dial (RDD) landline telephone surveys that are also administered to cell phone users, web-based surveys completed on wireless-enabled cell phones (i.e., smartphones), web panel recruitment for cell phone surveys (i.e., completing the survey on a cell phone), and text messaging (SMS, or short message service) surveys (not pre-notification text messages of an upcoming cell phone survey).Footnote 1 The focus of this assignment was dual sample frame (cell phone and landline) telephone surveys.

Methods

Given the objective, a literature review was conducted, one that included a mix of perspectives from various sources, such as government, academia, industry, and relevant survey research associations. Some sources were contacted directly by email or telephone, while the websites, electronic databases, and publications of others were searched for relevant materials. The focus was on best practices, current standards and approaches to incorporating cell phones, academic and industry thinking in the area, as well as the results and methodologies of experimental research done in this area. The geographic scope of the literature review was primarily Canada and the United States, although not exclusively. For instance, the European Society for Opinion and Marketing Research, or ESOMAR, the international association for market and opinion research, offers guidance on leading-edge issues, including cell phone research. A comprehensive list of the sources and specific works consulted can be found in the appendix.

No primary public opinion research data collection was undertaken as part of this project.

Organization

The selected literature has been organized as follows:

  1. Current State of Affairs
  2. Experimental Findings
  3. Best Practices
  4. Cell Phone Samples in Canada
  5. Conclusions and Recommendations.

In addition, the literature review included an assessment of recent Government of Canada post-campaign advertising evaluations that incorporated cell phone samples. The findings, and implications, are presented in Section 2.

Notes to Reader

When reading the report, reviewers should be aware of the following:

  • All sources consulted and works cited are footnoted in the text, and are available as full, detailed citations in the appendix.

  • Acronyms are used throughout the report for convenience. The table below presents a complete list of acronyms, provided here for ease of reference:

Acronyms
AAPOR: American Association for Public Opinion Research
ACET: Advertising Campaign Evaluation Tool
CD: Census Division
CPI: Cost-per-completed interview
CPO: Cell phone only (household)
ESOMAR: European Society for Opinion and Marketing Research
GP: General public or general population
HRSDC: Human Resources and Skills Development Canada
IVR: Interactive voice response
MRA: Marketing Research Association
MRIA: Marketing Research and Intelligence Association
NADbank: Newspaper Audience Databank
NRCan: Natural Resources Canada
POR: Public opinion research
PORD: Public Opinion Research Directorate
PWGSC: Public Works and Government Services Canada
RDD: Random digit dial
SMS: Short message service
VAC: Veterans Affairs Canada

1. The Current State of Affairs

This section discusses the current state of affairs with respect to cell phones and telephone surveys. While much of the available literature pertains to the United States, where relevant, issues are discussed within the context of Canada as well.

1.1 Why Cell Phones Matter to Survey Research

With cell phone use increasing in Canada, and landline use decreasing, there is growing interest in the effect this is having on telephone surveys. Until recently, RDD landline-only (or fixed telephone) sampling resulted in very little coverage error, with virtually every Canadian household having a traditional landline (97% as recently as 2000). As the number of households with traditional landlines declines (e.g., from 91% in 2006 to 67% in 2010Footnote 2), and cell phone only (CPO) households increase (e.g., 13% in 2010 up from 8% in 2008), researchers need to explore aspects of cell use and the implications for survey research.Footnote 3

Understanding the methodological challenge of CPO households is all the more important because the number of cell phone only households in Canada is expected to reach 20% by the end of 2015.Footnote 4 In the United States, the problem of cell phones, and CPO households, is even more acute, with 2011 estimates suggesting that as many as 31% of American households have only a cell phone.Footnote 5

The impact of not including cell phone only households in survey research is obvious: coverage error resulting from an incomplete sampling frame. In practice, this means that the CPO segment of the general population will not be reached during a landline-only RDD telephone survey, and RDD landline sample frames can no longer be considered representative of the general population. The credibility of the resulting public opinion research (POR) data, in turn, may be questioned and, by extension, the policy, program and other decisions the data are used to support. While this issue has much greater significance at this time among survey researchers in the United States, there is no reason to suspect that the trajectory (increasing proportions of CPO households at the expense of landline households) will be any different in Canada.

The proportion of CPO households among the general population increases dramatically when the focus is on certain demographic sub-groups, in particular youth and ethnic minorities. This means that CPO households are a problem for surveys of the general population, but an even more pressing and pronounced problem for studies focused on some sub-groups. Using 2010 data from the National Health Interview Survey (NHIS), Blumberg and Luke (2010) offer a demographic profile of CPO households in the United States over time, from 2007 to 2010 (Figure 1 below).

Figure 1: Demographic Profile of Cell Phone Only HouseholdsFootnote 6
National Health Interview Survey January to June 2007 January to June 2010
Sex
Male 14% 26%
Female 12% 24%
Age
18 to 24 28% 40%
25 to 29 31% 51%
30 to 34 17% 40%
35 to 44 11% 27%
45 to 64 7% 17%
65 and older 2% 5%
Education
Less than high school 15% 29%
High school graduate 12% 24%
Some college 15% 27%
College graduate 11% 23%
Employment Status
Employed 15% 29%
Student 21% 33%
Homemaker 10% 23%
Household Type
Owner 7% 16%
Renter 31% 47%
Household Income
In poverty 9% 39%
Near poverty 11% 33%
Higher income 16% 22%

Telephone surveys that include cell phone samples, then, provide access to a different demographic mix of respondents.

The following table offers a different perspective. It provides unweighted demographic data for dual frame RDD telephone surveys conducted in 2010 by the Pew Research Center in the United States. As detailed in Figure 2, a greater proportion of the cell phone sample is male, under 30, Hispanic, and less educated (i.e., completed only high school). The implication is simple: if a survey design calls for an over-sampling of youth, for example, it will be increasingly difficult to achieve the targeted quotas using only a landline sample frame.

Figure 2: Sample Frame Comparison by DemographicsFootnote 7
Pew Research Center 2010 Surveys
Landline Sample (N=10,723); Cell Sample (N=5,352)
Sex
Male 41% 55%
Female 59% 45%
Age
18 to 29 7% 29%
30 to 49 25% 33%
50 to 64 34% 27%
65 and older 31% 10%
Race/Ethnicity
White 81% 71%
Black 10% 14%
Hispanic 6% 11%
Education
Less than high school 7% 9%
High school graduate 27% 37%
Some college 27% 29%
College graduate 39% 35%

In Canada, the findings are similar where age is concerned, with younger Canadians more likely to be cell phone users. Conservatively, at least half of all 18-34 year olds use only a cell phone, as published by Statistics Canada in December 2010Footnote 8 (although this proportion has undoubtedly grown in the past two years). As well, consistent with the U.S. data, a 2008 Canadian studyFootnote 9 found that CPO respondents in Canada are younger, more likely to be male and to live in smaller households, and more likely to be lower-income compared to the general population.

Weighting can correct for demographic deficiencies resulting from coverage error, but what post-data-collection statistical adjustment cannot do is control for the unknown: that is, how CPO Canadians compare to landline-only and to dual cell and landline Canadians in terms of attitudes and perceptions, which is the very core of public opinion research. Recent research suggests, moreover, that this is the case: CPO respondents in Canada do differ in attitudes and some behaviours (EKOS / Arcturus Solutions, 2008; Leger Marketing, 2011). For this reason, the issue of coverage error, as well as the potentially unique characteristics of CPO households, is compelling survey researchers to very seriously consider cell phones, in particular cell only households, when designing general public telephone surveys. As more people incorporate cell phones into their daily lives, fewer people will be accessible through RDD landline telephone surveys, which means that a cost-effective and credible alternative is a clear imperative.

1.2 The Current Environment

Considering the methodological challenge presented by cell phones, and cell phone only households, it comes as no surprise that cell phone sample frames increasingly are being incorporated in RDD telephone surveys. Doing so helps to ensure that telephone surveys are more representative of the general population, at least in terms of the unweighted demographic mix of respondents.

In the United States, there has been a marked shift away from landline-only RDD sampling to dual frame (landline and cell) RDD designs in the past few years.Footnote 10 Indeed, dual frame surveys have become standard practice when surveying the U.S. general public. Two leading survey organizations, the Pew Research Center and Gallup, both include cell phones in all their national telephone surveys of the American public. The latter has been doing so since January 2008.Footnote 11 In addition, two of the largest, longitudinal health-related surveys in the U.S., the Behavioral Risk Factor Surveillance System Survey (administered by the Centers for Disease Control and Prevention [CDC]) and the California Health Interview Survey (conducted by the University of California, Los Angeles (UCLA) Center for Health Policy Research), both include cell phones as part of the sampling frame.

Likewise, all the core political pollsters affiliated with the mainstream media outlets (e.g., the NBC News/Wall Street Journal poll or the ABC News/Washington Post poll) use dual frame sample designs.Footnote 12 As Paul Lavrakas, chair of the American Association for Public Opinion Research's (AAPOR) Cell Phone Task Force, contends, "exclusion of the mobile frame would lead to a low level of credibility for any telephone survey of the U.S. public that sampled only from the landline frame, unless the researchers could provide compelling evidence to the contrary".Footnote 13

In Canada, both the Government of Canada and MRIA, the country's leading association for market and public opinion research, have started to investigate cell phones and the implications for survey research. Government of Canada departments and agencies that conduct public opinion research are starting to request dual frame samples for their telephone surveys. While it is not possible to know how private sector organizations approach telephone survey research, it is worth noting that the Newspaper Audience Databank (NADbank), the research arm of the Canadian daily newspaper industry, began augmenting its telephone sample frame with CPO households in Toronto, Montreal and Vancouver in 2012.Footnote 14 This change in methodology was in response, at least in part, to Statistics Canada's 2010 Residential Telephone Service Survey, which found increasing proportions of CPO households in Toronto (16%), Montreal (14%), and Vancouver (20%). The rationale offered was twofold: to reduce coverage error and to increase the likelihood of reaching "mobile-oriented" adults who are under-represented in the landline sample.

In summary, dual frame sampling appears to have become the standard or accepted approach to telephone surveying in the United States. If the proportion of CPO households continues to grow in Canada, it seems inconceivable that survey researchers will not follow the example set by researchers in the United States and incorporate cell phone samples in national telephone surveys as a best practice.

1.3 Common Practices and Considerations

The subject of 'best practices' or standards is discussed in Section 3. However, based on a review of the available literature, it is worth outlining what appear to be common practices or considerations when conducting research with cell phones among the general population (considerations and practices will differ when research is focused on sub-segments of the population, e.g., youth or immigrants).

Figure 3: Common Practices and Considerations

Sample Ratios

The right proportion of landline and cell phone sample records.

  • Pew Research Center uses a 60% landline and 40% cell phone sample design for general public surveys. This tends to yield 16-18% CPO respondents (still much lower than the actual population proportion, which is closer to 30% in the U.S.).Footnote 15 Weighting is required to address the under-representation.

  • Recently (October 2012), Gallup changed the ratio to 50:50, equal proportions of landline and cell phone sample, to reduce the size of the weights applied post data collection (a greater number of CPO households are reached with this sample mix, which means that the CPO cases in the dataset do not have to be weighted up as much).Footnote 16
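
To make the link between sample mix and weight size concrete, the following rough calculation is a sketch only: the 30% population share and the 16-18% yield of a 60/40 design come from the figures above, while the 25% yield assumed for a 50:50 design is purely hypothetical.

```python
# Rough illustration: the weight applied to cell phone only (CPO) respondents
# is roughly their population share divided by their share of completed
# interviews. The 30% population share and ~17% yield of a 60/40 design come
# from the text; the 25% yield assumed for a 50:50 design is hypothetical.

def cpo_weight(population_share: float, sample_share: float) -> float:
    """Factor needed to bring CPO cases up to their population share."""
    return population_share / sample_share

print(round(cpo_weight(0.30, 0.17), 2))  # 60/40 design: CPO cases weighted up ~1.76x
print(round(cpo_weight(0.30, 0.25), 2))  # hypothetical 50:50 design: ~1.2x
```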

CPO Proportions

The right proportion of CPO completed surveys.

All researchers aim to complete surveys with a representative sample of the target population. Under-representation of sub-segments of the general population is addressed with weighting. Weights are applied to under-represented sub-segments to bring them in line with their actual distribution in the general population. Cell phone research conducted by Pew Research Center tends naturally to yield 16-18% CPO respondents (as described above).

In the United States, pollstersFootnote 17 are finding that CPO respondents have significantly different political views compared to landline respondents. While there is no consensus on the appropriate number of CPO respondents to target when polling, nor on the best approach to reach these people, the NBC News/Wall Street Journal polling methodology was changed in July 2012 so that 30% of completed interviews are to be conducted with CPO respondents.

Sample Frame Design

An overlapping versus non-overlapping approach.

It appears that some organizations screen their cell phone sample for cell phone only households (i.e., a non-overlapping sample frame design), while others do not (i.e., an overlapping design). The latter uses weights to bring the views of the cell phone sample in line with those of CPO households and to account for the fact that some units of the sample have multiple chances of selection. In the U.S., polls such as the ABC News/Washington Post poll and the NBC News/Wall Street Journal poll use a non-overlapping sample design and screen out all cell phone respondents who say they also have a landline. So too does the CDC's flagship study, the Behavioral Risk Factor Surveillance System Survey. Conversely, the New York Times/CBS News polls, Gallup, the Pew Research Center, and the California Health Interview Survey do not screen for landline telephone service and usage.Footnote 18

Remuneration

Are incentives required to complete an interview with a cell phone household?

At this time, there is no standard practice when it comes to remuneration. In the U.S., some organizations offer incentives, some provide them only if asked by respondents, and others provide no remuneration at all. Among the organizations that offer remuneration, two approaches are common:

  • Contingent Incentive: Incentives are used only to secure interviews or to respond to complaints. Interviewers are instructed to only mention the incentive if a respondent is reluctant to participate or complains about the call being a financial burden (i.e., causing her/him to incur a cost for the cell phone minutes).

  • Explicit Incentive: Incentives are offered by interviewers to everyone contacted on their cell phone. Potential respondents are told that a monetary incentive will be sent to them upon completion of the survey. Typical amounts being offered appear to be $5 or $10.Footnote 19

With that said, it is worth noting that the Pew Research Center recently changed its remuneration protocol. Respondents are now told that they can be sent $X for completing the interview (as opposed to being told that they will be sent the incentive).

"That's the end of the interview. If you would like to be reimbursed for your cell phone minutes, we can send you $5. I will need your full name and a mailing address where we can send you the money".

The organization is finding that many respondents (60-70%) decline the offer.Footnote 20

Household Sampling

Who should be interviewed? In landline surveys, the most recent birthday approach is often used.

At this time, it appears that most surveys involving cell phones do not follow any sampling procedures when reaching a cell phone household. The assumption is that a cell phone is a personal device—therefore, no household sampling is required as is the case with landline surveys.Footnote 21

Dialling

In the United States, automatic dialling of cell phones is prohibited by law (the Telephone Consumer Protection Act), meaning all cell phone numbers must be hand-dialled unless an organization has prior consent from the cell phone household. There is no similar legislation in Canada: cell phone numbers can be (and are) subject to auto-dialling just like landlines.

1.4 Summary

With a much higher proportion of CPO households, the U.S. provides a good example of how the erosion of the traditional RDD landline sample frame can be handled by research organizations. The trend in the United States is to use dual sample frames, both landline and cell phone frames, in order to increase the representativeness of telephone surveys of the general population. Currently, Canada seems to be on a parallel trajectory, with the number of CPO households in Canada growing quickly and efforts being taken to explore the utility of dual frames and the implications of surveying cell phone users on their mobile devices.

2. Findings from Experimental Research

The dramatic growth in cell phone only households and concern about the validity of survey findings based on landline-only sample frames have focused attention on advancing the research industry's knowledge in this area. Numerous experimental studies have been conducted within the last ten years, primarily in the United States, to better understand the impact of cell phone only households on survey estimates. Below, we describe the key findings of core experimental studies.

2.1 U.S. Studies

In the United States, much of the telephone survey research taking place right now can be considered experimental to the extent that organizations are 'fine-tuning' or 'tinkering with' their methodologies in order to determine optimal methods for conducting cell phone research. With this in mind, we look at what we can learn from recent research in terms of cell phone survey methods.

2.1.1 Non-coverage Bias

Earlier we described the impact of not including cell phones in surveys of the general population in terms of demographics. A key question for researchers is the extent to which the behaviours and attitudes of the cell phone only population differ from the general population. A recent study by the Pew Research Center for the People and the Press found that dual frame (cell and landline) telephone surveys, weighted demographically to match U.S. population characteristics, continue to provide accurate data on "most political, social and economic measures".Footnote 22 The Pew Research Center has been tracking this issue since the mid-2000s and this finding is consistent with previous research. Lee, Brick, Brown and Grant (2010) reported a similar result when assessing the extent of bias resulting from the exclusion of CPO households in the 2007 California Health Interview Survey. Weighting by demographics can offset the non-coverage bias, but it does not address the fact that CPO respondents are distinctive in some attitudes and behaviours. The key is knowing when these differences matter.

2.1.2 Incentives

The issue for consideration is whether or not incentives will yield cost savings for cell phone surveys. Earlier we indicated that, at this time, some research organizations are opting not to offer incentives, at least in any systematic way. Several studies provide evidence to support this practice.

Guterbock, Holmes, Bebel and Furia (2012) looked at the use of incentives with hard-to-reach segments of the general population. The incentive was a $10 gift card. What they found was that the gift card did not yield sufficient savings in terms of data collection costs to offset the cost of the incentive itself. In short, they found no evidence to suggest that survey incentives help to complete interviews with hard-to-reach groups.

The Pew Research CenterFootnote 23 has also experimented with different incentive amounts to examine the impact on response rates. The Center's 2008 study found virtually no difference in response rates among cell phone respondents, regardless of the incentive amount (the study used a $10 and $20 incentive). Another recent experimental study conducted with CPO respondents found that a $10 gift card had no observable effect on response rates (Oldendick and Lambries, 2010).

Call (2012) had similar results when experimenting with incentives and voice mail among a cell phone sample frame. Whether an incentive, a voice mail, or both in combination were used, there was no impact on response rates. Additionally, using a voice mail that referred to the incentive did not reduce the number of callback attempts required to complete an interview. The message was as follows:

Hello. This is not a sales or marketing call. We're calling to include your household in an important study about health insurance coverage in Minnesota. We will try to reach you again soon. You can also call us toll-free at 1-800-307-5184. If you are eligible for and complete the interview, we can send you a ($5) ($10) check to reimburse you for your cell minutes.

Consistent with the preceding evidence, when examining what motivates cell phone users to participate in survey research, Lutz and Losch (2012) found that 80% of cell phone respondents would have completed the interview even if they had not been offered the $10 incentive.

Conversely, a few studies have found that $10 cash incentives served to improve the rate of production in a call centre, or the number of completed surveys reported in an hour, when conducting cell phone surveys (Diop, Kim, Holmes and Guterbock, 2008; Diop, Kermer and Guterbock, 2008). In the latter study, a $10 incentive (as opposed to a $5 incentive) improved production to the point that the increased cost of the incentives was offset by the lower cost of interviewing. Nair and Gentry (2012) found that using an advance letter with a guaranteed cash incentive of $2 or $5 (a $1 incentive did not affect the cooperation rate) helped to gain cooperation when respondents were later reached to take part in the telephone survey.Footnote 24

Given the conflicting evidence, more research is needed to determine when incentives are cost-beneficial and actually serve to maximize response rates.
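
One way to read these mixed results is as a break-even question: the incentive pays for itself only if it raises interviewer productivity enough to offset its cost. The sketch below is purely illustrative; the $10 incentive echoes the studies cited, but the labour cost and completes-per-hour figures are hypothetical assumptions, not values reported in the literature.

```python
# Break-even view of a cell phone survey incentive. All numbers are
# hypothetical assumptions used only to illustrate the trade-off.

def cost_per_complete(hourly_rate: float, completes_per_hour: float, incentive: float) -> float:
    """Interviewer labour cost per completed interview plus any incentive paid out."""
    return hourly_rate / completes_per_hour + incentive

without_incentive = cost_per_complete(hourly_rate=45.0, completes_per_hour=1.5, incentive=0.0)
with_incentive = cost_per_complete(hourly_rate=45.0, completes_per_hour=2.0, incentive=10.0)

print(round(without_incentive, 2))  # 30.0
print(round(with_incentive, 2))     # 32.5: this productivity gain does not cover the $10
```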

2.1.3 Sampling Design

As mentioned earlier, there are two approaches to a dual sample frame design—overlapping and non-overlapping. With an overlapping approach, the cell phone sample is not screened for dual landline and cell users. Conversely, with a non-overlapping approach, the cell phone sample is screened for CPO households (i.e., overlapping units are removed from the sample frame). While the non-overlapping approach is attractive because it does not have the same statistical complexity as an overlapping design, Kelly, Montgomery, Barron and Koppelman (2012) found that using the overlapping sample is more cost effective. While both approaches produced similar demographic distributions, the CPI (cost-per-completed interview) for the overlapping design was slightly less than half that of the non-overlapping design.
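
The cost gap is easy to rationalize with a simple screening model. In the sketch below, the per-contact and per-interview costs are hypothetical (chosen only so that the resulting ratio lands in the neighbourhood of the Kelly et al. finding); the roughly 30% CPO incidence is the U.S. figure cited earlier.

```python
# Why screening for CPO households (a non-overlapping design) inflates the
# cost-per-completed interview (CPI): most cell frame contacts are dual
# users and are screened out. Cost figures are hypothetical assumptions.

def cpi(cost_per_screened_contact: float, cost_per_interview: float, keep_rate: float) -> float:
    """keep_rate is the share of screened cell contacts retained for a full interview."""
    return cost_per_screened_contact / keep_rate + cost_per_interview

overlapping = cpi(10.0, 10.0, keep_rate=1.0)       # no screening: every contact is eligible
non_overlapping = cpi(10.0, 10.0, keep_rate=0.30)  # keep only CPO contacts (~30% in the U.S.)

print(overlapping, round(non_overlapping, 1))      # 20.0 vs. 43.3
print(round(overlapping / non_overlapping, 2))     # ~0.46: overlapping CPI slightly under half
```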

Cost aside, another critical consideration is non-response error. Dual frames are used to limit coverage error, but a non-overlapping design may lead to non-response bias. For example, if dual users are unlikely to be reached on their landline, but are not eligible to complete a telephone survey if reached on their cell phone, the screening approach (i.e., a non-overlapping design) may increase non-response error if dual users differ from landline only respondents. While there is little exploratory research in this area, what is available is far from conclusive.

Alanya and De Keulenaer (2012) found that overlapping dual frame samples (with appropriate weights) produced better population estimates in Belgium and Spain. Conversely, Kennedy (2007) found comparable levels of bias for specific variables (post-weighting to adjust for telephone service and demographics) between the results of overlapping and non-overlapping designs (using data from a 2006 Pew Research Center/AP/AOL study). More experimental research is required to determine the impact of screening for CPO households on non-response bias.

2.1.4 Data Quality and Measurement

Common sense suggests that the quality of data collected via cell phones may not be as high as that collected via landlines. There are several reasons for this belief:Footnote 25

  • Audio quality (for the interviewer and respondent)
  • Distractions for the respondent
  • Respondent and/or interviewer rushing to complete the survey
  • Respondent multi-tasking.

Indicators of data quality include things like missing data (i.e., refusals or no opinion), refusals specific to sensitive questions, satisficing (i.e., choosing socially desirable responses, not differentiating ratings for battery-style questions, tending to agree or answer in the affirmative regardless of the question), or a lack of theoretically meaningful correlations in the data.

Most studies, nevertheless, have found little evidence to support the assumption that there are differences in data quality between landline and cell phone interviews. Kennedy (2010) randomly assigned dual cell and landline survey respondents to a follow-up interview on either a cell phone or landline to examine the effect of mode on data quality. While differences in data quality between landline and cell phone interviews were found, these differences tended to be small and limited in scope. In fact, only one of seven tests found clear evidence of a potential data quality issue—cognitive short-cutting (to avoid thinking or recalling to come up with an appropriate response). While encouraging, the study did not use large sample sizes, and the author cautions against over-interpreting the findings.

Similar results have been reported by other researchers. Witt, ZuWallack and Conrey (2009) found little modal difference between cell and landline data in item non-response or the richness of responses to open-ended questions. Brick et al. (2006) found no significant differences between cell and landline data in terms of missing data, length of open-ended responses, or, like Witt, ZuWallack and Conrey, responses to questions of a sensitive nature. And the Pew Research Center (2006), again like Witt, ZuWallack and Conrey, found no differences in the levels of item non-response between cell and landline respondents.

These findings notwithstanding, AAPOR (2010) maintains that more research in the area of measurement is needed to fully understand the impact of cell phones on survey data quality.

2.1.5 Operational Issues

Several studies have explored issues that are operational in nature.

  • Interview Length: When cell phone interviewing was first introduced, there was a general concern within the industry that longer interviews would not be possible, that cell phone users would not be as generous with their time as landline users. In fact, Brick, Edwards and Lee (2007) found respondents willing to complete a 30-minute questionnaire when reached on their cell phone (albeit a survey sponsored by a state government and on the topic of health, which may generate more interest among the general public). After several years of studies, Keeter (2011) concludes that there is no evidence that respondent break-offs occur more frequently with cell samples.

  • Voice mail Messages: Experimental research has found that leaving voice mail messages when conducting surveys with cell phone users does not increase the likelihood of completing an interview (Benford et al., 2010; Call, 2012). However, Benford et al. found that leaving voice mail messages decreased the likelihood of a refusal upon callback.

  • Response Rates: These tend to be lower with cell phone samples than comparable landline samples (at least based on data coming from the United States—Link et al. [2007]).

2.2 Canadian Studies

The United States is the clear leader when it comes to experimental research exploring the implications of cell phones for telephone survey research. That said, there are government as well as industry studies in Canada that contribute to the discussion.

The Government of Canada pilot tested different approaches to the inclusion of cell phone samples within telephone surveys of the general public. Each of the three pilot studiesFootnote 26 involved evaluations of advertising campaigns, which required the use of a standardized survey instrument, the Advertising Campaign Evaluation Tool (ACET). The rationale for including cell sample was to: 1) increase representativeness by reducing coverage error resulting from the growing number of cell phone only households; and/or 2) target youth in particular in order to increase the response rate among this sub-group of the general population. Figure 4 provides key highlights for each of these studies to facilitate comparisons.

Figure 4: ACET Cell Phone Pilot Studies—Technical SpecificationsFootnote 27
2011 Care and Recognition Advertising Campaign
  • Department: VAC
  • Cost: $27,831.90
  • Sample size: N=1,007
  • Average length: 9.5 minutes
  • Audience: Canadians 18+
  • Sample breakdown: 871 interviews with landline respondents; 131 interviews with cell phone only respondents
  • Regional breakdown: Regionally disproportionate
  • Sample source: Landline: ASDE Survey Sampler (listed and unlisted sample). CPO: sample drawn from Probit, EKOS' online hybrid panel.
  • Sample performance: Overall response rate: 16.3%. Call dispositions are not broken out by sample type.
  • Non-response bias: Analysis revealed some sources of systematic sample bias, in particular the under-representation of youth (4% vs. 12% in the population).

2011-12 ecoEnergy Retrofit Homes Advertising Campaign
  • Department: NRCan
  • Cost: $67,533.06
  • Sample size: N=1,000
  • Average length: 10 minutes
  • Audience: Canadians 18+
  • Sample breakdown: 870 interviews with landline respondents; 130 interviews with cell phone only respondents
  • Regional breakdown: Regionally disproportionate
  • Sample source: Landline: ASDE Survey Sampler (listed and unlisted sample). CPO: ASDE Survey Sampler; random sample based on cell phone exchanges; IVR used to screen for cell only households.
  • Sample performance: Notable differences between the two samples: No answer: 33% landline, 20% cell; Refusals (household): 22% each; Refusals (respondent): 6% landline, 10% cell.
  • Non-response bias: No analysis conducted.

2011 Better Jobs Advertising Campaign
  • Department: HRSDC
  • Cost: $64,802.68
  • Sample size: N=1,300 (300 youth; 1,000 general public)
  • Average length: 12 minutes (youth); 7 minutes (general public)
  • Audience: Canadians 16+
  • Sample breakdown: General public survey: 700 interviews with landline respondents and 300 with cell phone only respondents. Youth survey: 194 interviews with landline respondents and 106 with cell phone only respondents.
  • Regional breakdown: Regionally disproportionate
  • Sample source: Landline: ASDE Survey Sampler (listed and unlisted sample). CPO: ASDE Survey Sampler; random sample of active cell phone numbers.
  • Sample performance: Notable differences between the two samples: No answer: GP 43% landline, 32% cell; youth 28% landline, 22% cell. Answering machine: GP 11% landline, 6% cell; youth 10% landline, 9% cell. Refusals: GP 13% landline, 16% cell; youth 9% landline, 9% cell.
  • Non-response bias: Regarding non-response, the analysis revealed little to no difference between unweighted and weighted results.

The technical specifications available for each of these pilot tests are not sufficient to enable reliable conclusions. Nevertheless, it is worth noting the different approaches to sample generation—recruitment through a proprietary online panel, RDD coupled with IVR screening, and RDD—as well as the fairly similar refusal rates regardless of the frame (landline or cell). This suggests that respondents are similarly likely to refuse to participate in a survey, whether contacted on a landline or their cell phone.

Turning briefly to the survey data, U.S. studies have shown clear demographic differences between landline samples and CPO samples (see Section 1.1 above). The data from these ACET surveys permit a comparison of attitudes by sample type: landline and cell phone only.

Figure 5: ACET Cell Phone Pilot Studies—Comparison of Attitudinal DataFootnote 28
Q. How would you rate the performance of the Government of Canada in providing information about…?
  • 2011 Care and Recognition (landline/CPO): Negative 18%/21%; Neutral 30%/24%; Positive 48%/56%
  • 2011-12 ecoEnergy Retrofit Homes (landline/CPO): Negative 29%/27%; Neutral 34%/29%; Positive 34%/42%
  • 2011 Better Jobs (landline/CPO): Negative 22%/20%; Neutral 33%/26%; Positive 40%/50%

Q. And, using the same scale, how would you rate the performance of the Government of Canada in providing information to the public in general?
  • 2011 Care and Recognition (landline/CPO): Negative 35%/30%; Neutral 25%/23%; Positive 38%/47%
  • 2011-12 ecoEnergy Retrofit Homes (landline/CPO): Negative 33%/28%; Neutral 29%/27%; Positive 35%/42%
  • 2011 Better Jobs (landline/CPO): Negative 29%/29%; Neutral 28%/25%; Positive 41%/45%

Q. Generally speaking, how would you rate the overall performance of the Government of Canada?
  • 2011 Care and Recognition (landline/CPO): Negative 30%/30%; Neutral 23%/24%; Positive 45%/45%
  • 2011-12 ecoEnergy Retrofit Homes (landline/CPO): Negative 32%/29%; Neutral 26%/20%; Positive 40%/48%
  • 2011 Better Jobs (landline/CPO): Negative 26%/25%; Neutral 26%/28%; Positive 45%/46%

As evident in Figure 5, small differences in attitudes are noted between the two samples: overall, cell phone only respondents were more apt than landline respondents to offer positive ratings of the Government of Canada. This reinforces the growing importance of dual sample frame telephone surveys.

In addition to the ACET pilot studies, several other recent Government of Canada studies have incorporated a cell phone component—primarily for experimental purposes, but also to target youth and/or CPO households. Insights can be drawn from these studies, including, for example:

  • The 2011 Employment Insurance Tracking Survey (HRSDC) found that cell phone users differed in several significant ways from landline users and, especially, online panelists. In matters relevant to Employment Insurance (EI), these differences included holding a more positive view of the economy and labour market, a greater likelihood to be employed, a greater propensity to believe in the security of their jobs and their ability to find comparable work should their position be lost, and a greater likelihood of using EI in the event of unemployment. The conclusion: excluding cell phone users from the research (i.e., using a traditional landline frame) would not have presented an accurate reflection of the attitudes of the Canadian population.

  • The 2012 Arts and Heritage in Canada: Access and Availability Survey (Canadian Heritage) reported demographic results consistent with the U.S. studies—that is, RDD landline telephone surveys under-represent youth. The landline portion of the sample resulted in 12% of all interviews being completed with 18-34 year olds. Once the cell phone completes were added, this proportion increased to 19%. While still under-represented (youth account for approximately 28% of the Canadian population), this figure came much closer to the actual proportion of the population under 35. Also of note, almost half of the cell phone interviews completed were with Canadians under 35 years, which, again, is in line with the U.S. studies that have found cell phone users to be disproportionately young.

Similarly, Statistics Canada has experimented with the use of cell phones as part of the agency's data collection for the Canadian Tobacco Use Monitoring Survey. The sample for the pilot was taken from a list of cellular phone subscribers, 15 years or older residing in Ontario and Quebec (which essentially is a list-based sample, not an RDD cell sample). While the results of this pilot test are not publicly available (due to Statistics Canada's legislative environment), the fact that the agency is experimenting in this area is indicative of the growing importance of cell phones in survey research.

Also worth noting are the conclusions of an earlier pilot study—undertaken in 2008Footnote 29—commissioned by the Government of Canada. The project had several objectives, but the main goal was to collect information that would help government departments and agencies assess the feasibility of cell phone surveys. Numerous conclusions were drawn from the pilot, including:

  • that there are significant demographic differences between CPO households and the general population of Canadians—CPO individuals are younger, more likely to be male and to live in smaller households, and to be lower-income;

  • that there are some attitudinal and behavioural differences between CPO households and the general population (that may not be related to age and income necessarily);

  • that RDD (landline only) sampling is no longer a sufficient means of surveying the general population; and

  • that cell phone interviewing costs considerably more than other modes of data collection, a factor that should be taken into consideration when designing survey projects.

Almost five years later, these observations are still valid and consistent with the findings of other research conducted in the United States and Canada.

Finally, moving beyond public affairs and Government of Canada research, a study by Leger MarketingFootnote 30 found numerous attitudinal differences between CPO and landline households across a range of topics, including health, travel, shopping, and food. Most differences were explained by age—cell phone only users are younger—followed by household status (renting versus owning). That said, even when demographic differences were controlled for by weighting, some attitudinal differences remained (but these were quite random—for example, on the subject of travel, CPO Canadians prefer travel to the Bahamas and, regarding health, they were more likely to have made an effort to improve their health in the past year). Once again, this study tends to support the importance of combined cell and landline RDD telephone surveys. Researchers cannot be certain when attitudinal differences will be the result of cell only use; therefore, the prudent approach is to include cell sample in a survey sample frame.

2.3 Summary

Since the landscape is quickly evolving, experimental research in the U.S. is plentiful, as survey researchers and organizations attempt to fill the knowledge gaps that exist with respect to cell phones and CPO households. Recent research suggests that incentives are not cost-effective with cell phone samples, that an overlapping frame, despite the complexity of weighting, is more cost-effective, and that data quality is not compromised by cell phone data collection. In Canada, there is simply not enough empirical evidence available to help shape cell phone research. Until there is, the U.S. research organizations provide an excellent source of learning.

Figure 6 provides a snapshot of some of the advantages and disadvantages of dual landline and cell phone telephone surveys (based on the existing research).

Figure 6: Summary of Advantages and Disadvantages of Incorporating Cell Phones
Coverage
  Value-Added:
  • Reduces coverage error in a general public survey
  • Can significantly increase access to youth and some other sub-groups (e.g., renters, lower-income groups)
  Drawbacks:
  • Increases cost of data collection, especially when a non-overlapping frame is used (i.e., dual users are screened from the sample)
  • Careful weighting is required to ensure proper estimates, and the industry's understanding is still in its infancy

Response Rates
  Value-Added: --
  Drawbacks:
  • Lower response rates with cell sampling
  • Higher noncontacts and refusals
  • Refusal conversion techniques are less useful
  • Can be harder to calculate response rates because a greater proportion of numbers tend to remain unresolved at the end of the field period

Data Collection
  Value-Added:
  • May reduce field time (i.e., if less dialling and fewer callback attempts are required to reach youth)
  Drawbacks:
  • Requires special protocols
  • Requires specialized interviewer training
  • New call dispositions are needed

Cost
  Value-Added:
  • May reduce field costs with certain sub-groups like youth
  Drawbacks:
  • More expensive in general

3. Best Practices for Incorporating Cell Phone Samples

A review of the relevant industry associations in North America and Europe found an absence of any widely recognized 'formal' best practices in the area of cell phones and survey research. The information that follows in this section comes from publications issued by the leading industry associations, including the American Association for Public Opinion Research (AAPOR), the Marketing Research and Intelligence Association (MRIA), the European Society for Opinion and Marketing Research (ESOMAR), and the Marketing Research Association (MRA).Footnote 31 Among these associations, AAPOR has taken a lead role in this area, with the MRIA even referring to AAPOR as the most comprehensive source of information on this topic.Footnote 32 It is worth noting that, at the time of writing, the American Statistical Association, Statistics Canada and the U.S. Census Bureau had not published comparable documents.

3.1 Overview

What is most apparent in the publications is that the landscape is quickly changing and that the industry has yet to reach a consensus on standards or best practices. In fact, the 2010 AAPOR Cell Phone Task Force Report concludes that it is premature to establish standards or 'best practices' to address the issues surrounding cell phone survey research. The task force chair, Paul Lavrakas (Ph.D.), a leading researcher in the area, was more blunt when he recently stated that "Anyone who claims there's a best practice doesn't know what they're talking about. We as an industry don't know."Footnote 33 Accordingly, the language used in the publications examined includes expressions such as 'guidelines', 'recommendations', or 'considerations'. The MRA publication is alone in using the term 'best practices' in relation to cell phone research.

The guidelines, recommendations, and considerations offered by these associations cover the entire lifecycle of a telephone survey project, from design to reporting. While details and specifications appear to vary across the publications, there is considerable overlap in the types of considerations or issues identified by these associations. Where there are differences regarding suggested practices and cell phone sampling frames, these differences tend to complement rather than contradict one another. What follows below is a discussion of various guidelines, recommendations and considerations when conducting cell phone survey research, organized by theme or topic area. Once again, these are not universally accepted 'best', or even suggested, practices.

3.2 Design-related Considerations

According to the leading industry associations, the following issues should be given consideration when designing a telephone survey project:

  • Coverage and Sampling: Including an RDD cell phone frame in a telephone survey minimizes the potential for error resulting from inadequate coverage of the target population with an RDD landline sampling frame. When it comes to coverage, and the potential benefit of including cell phones when surveying the general public, there are several key issues that should be given consideration during the design phase:

    • Researchers need to decide whether the dual frame design will be overlapping (i.e., with no screening for landline telephone service and usage) or non-overlapping (i.e., screening cell phone sample for CPO households). At this time, the AAPOR Task Force feels that neither of the dual frame sample designs is consistently preferred by researchers.

      • An overlapping approach is less expensive, but requires the construction of a more complicated weighting scheme to reflect the multiple probabilities of respondent selection. This is further complicated by the issue of dual cell and landline users. Studies suggest that 'wireless-mostly' and 'landline-mostly' individualsFootnote 34 have different response propensities depending on which service they are contacted on for a survey. This can lead to differential non-response, and biased survey results, if such differences are not accounted for in the weighting scheme.

      • A non-overlapping approach makes weighting much simpler post-data-collection, but is a far more expensive option given the relatively low incidence of CPO households (estimated to be less than 15% in Canada at this time).

    • Consideration should be given to purchasing cell phone sample. A number of issues are identified by AAPOR: how the sample provider's frame has been constructed, how often the frame is updated, the types of wireless services included (e.g., dedicated, shared, special billing), the extent of non-coverage and overlap between the provider's landline and cell frames, how shared service numbers are handled, as well as the levels of geography available for sample selection and how they have been determined. See Section 4 for a discussion of cell samples in Canada.

    • Sample allocation is another decision to be made by researchers—that is, the relative proportions allocated to the cell phone and landline frames. In terms of guidance, AAPOR suggests that enough interviews be completed with the cell phone sample to avoid the need for large weights.

    • Within-household sampling is another issue that should be considered when conducting cell phone research. An RDD telephone survey typically uses the 'most recent birthday' approach to randomize respondent selection within a household. When a potential respondent is reached on a cell phone, the interview is typically conducted with this person, provided that the individual meets the study's eligibility criteria, because any attempt at respondent selection may result in a refusal. In the U.S., most organizations treat cell phones as personal, not household, devices.

  • Questionnaire Length: Questionnaire length is an important consideration for all survey research. However, it takes on even more importance when dealing with a cell phone sample frame. While evidence tends to be anecdotal, some researchers report that cell phone respondents are more difficult to keep on a call than respondents on landlines (i.e., respondent break-offs). Respondents speaking on their cell phone may be more easily distracted—for instance, they may be engaged in other activities like driving, shopping, or exercising. In addition, because of the mobile nature of cell phones, respondents' environments may change during the course of an interview. They may go from having no distractions to many, or from a safe environment to one where their personal safety (e.g., driving) or confidentiality (i.e., from a private to a public space) may be at risk during the interview. Given these concerns, AAPOR and ESOMAR suggest that researchers may want to consider whether the length of an interview conducted on a cell phone should be shorter than one conducted on a landline.

  • Remuneration: The issue of remunerating cell phone respondents stems from the ethical concern that researchers not do any harm to a respondent, including not causing the respondent to bear any financial burden on behalf of the researcher. Given the nature of cell phone billing (i.e., some plans bill customers by the minute), there may be a financial burden associated with responding to an incoming survey research call. When appropriate, these associations suggest that researchers give consideration to offering some form of remuneration to eliminate, or offset, the potential financial burden of participating in a survey.

  • Cost: The cost and anticipated benefit(s) of including cell phone sample in a telephone survey should be carefully considered by researchers. Including cell phone sample can be expensive (especially when targeting CPO households—i.e., using a non-overlapping sample frame approach), and the cost of doing so may not justify the perceived benefits of greater coverage. The Pew Research Center has found that it takes roughly 60% more working numbers to complete an interview with the cell phone sample. This is due, in large part, to higher ineligibility rates (e.g., ineligible minors).
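
As an illustration of the sample-ordering arithmetic behind that cost difference, the sketch below uses the "roughly 60% more working numbers" observation cited above; the landline yield and the 600/400 completion targets are hypothetical assumptions (the targets simply echo a 60/40 allocation).

```python
# Rough sample-ordering arithmetic for a dual frame design. The 1.6 factor
# reflects the Pew observation that a cell complete takes roughly 60% more
# working numbers than a landline complete; other figures are hypothetical.

LANDLINE_NUMBERS_PER_COMPLETE = 10.0      # assumed landline yield
CELL_NUMBERS_PER_COMPLETE = 10.0 * 1.6    # ~60% more working numbers per complete

def records_needed(target_completes: int, numbers_per_complete: float) -> int:
    """Working numbers to order for a given number of completed interviews."""
    return round(target_completes * numbers_per_complete)

print(records_needed(600, LANDLINE_NUMBERS_PER_COMPLETE))  # 6000 landline numbers
print(records_needed(400, CELL_NUMBERS_PER_COMPLETE))      # 6400 cell numbers
```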

3.3 Data Collection Considerations

Conducting a dual frame RDD landline and cell phone telephone survey requires careful consideration of myriad operational issues that affect the data collection process. These include, but certainly are not limited to, the following:

  • Respondent Safety: Above everything else, researchers need to consider respondent safety and take all reasonable precautions to ensure that respondents are not harmed or adversely affected as a direct result of participating in an interview. Respondents should be screened to ensure that it is safe and legal to conduct the interview at the time of the call. Furthermore, consideration should be given to the possibility of calling back at another (safer) time, possibly even on a landline telephone.

  • Data Quality: When developing data collection protocols for a dual frame RDD landline and cell phone telephone survey, researchers should be attentive to the issue of data quality. If a respondent is contacted while in an environment that is not conducive to providing full and accurate answers (e.g., in a public place when contacted to take part in a survey on a sensitive subject matter such as personal health), data quality may be compromised if the respondent is not comfortable providing truthful and accurate responses. Respondents in public or semi-private places should not be required to verbalize responses that could: 1) reasonably place them at risk of criminal or civil liability; 2) be damaging to their financial standing, employability, or reputation; or 3) otherwise violate their privacy. In these situations, it is generally best to schedule a callback with the respondent.

  • Calling Protocols: Since some people consider their cell phone to be a personal (or private) device, researchers need to be sensitive to potential privacy concerns. For this reason, when conducting a dual frame telephone survey, it is appropriate for calling protocols to vary by frame. To reduce the potential for irritating (and/or being seen as harassing) potential respondents in a cell phone sampling frame, AAPOR recommends that the total number of call attempts be limited to no more than 10, with the ideal range being six to 10 call attempts. In addition, the Cell Phone Task Force suggests that callbacks not be attempted too frequently (e.g., several call attempts within a 24-hour period).

  • Calling Times: Cell phones are mobile by nature and can be used in geographic areas other than the geographic area in which the cell phone was purchased. This means that a cell phone number registered in one time zone may, at the time of the survey call, be in a different time zone. AAPOR, ESOMAR and MRA suggest that calling windows may need to be modified to reduce the chances of reaching a respondent who has moved to, or is currently in, a different time zone at a time considered too early or too late for calling.

  • Call Dispositions: AAPOR suggests that these should be adjusted to accommodate cell phones. For example, it is standard to log a record as 'no answer' after six rings. However, when dealing with cell phones, this should be extended to eight rings before a call is logged as no answer. As well, consideration should be given to new codes based on interim outcomes that only apply to cell phones (e.g., 'not in service at this time', 'network busy', 'customer unavailable at this time').

  • Voice mail Messages: A common practice to help maximize the likelihood of interviewing a landline respondent is leaving a voice mail message. Leaving a voice mail message on the first call attempt to a cell phone can act as an important pre-alert of the survey. An important consideration, however, is whether a callback number is left in the voice mail message. Researchers should decide whether interviewers should leave a callback number in this message because the outbound number that appears on the cell phone's Caller ID may not be valid for an inbound callback.

  • Personal and/or Business Cell Phones: Respondents using a company-provided cell phone typically use the phone to take both business-related and personal telephone calls. For this reason, researchers should establish clear and consistent rules for interviewers to use to determine when a number should be assigned a disposition of "business phone".

  • Refusal Conversions: Refusal rates tend to be higher among respondents reached on a cell phone than among those reached by landline telephone surveys. As such, it is recommended that, until definitive research has been conducted, refusal conversion attempts be of a limited nature to reduce the potential for further agitating cell phone respondents. This is in large part because a callback on a cell phone reaches the same respondent who previously refused, not some other member of the sampling unit (household).

  • Interviewer Training: Interviewing respondents on a cell phone is a more complex task than interviewing a respondent on a landline. Therefore, researchers should ensure that interviewers are properly trained to handle these interviewing requirements and have the tools (e.g., scripts and other protocols) at hand to conduct a high quality interview when reaching a respondent on their cell phone.

  • Response Rate Calculations: AAPOR suggests that for dual frame surveys, response rates should be calculated first for each frame, and then for the survey overall, using weights that reflect the sample allocation proportions.
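
To make the combination concrete, the sketch below shows how frame-level rates might be rolled up into an overall rate using allocation-proportion weights. It is a minimal illustration only: the rates and allocation shares are hypothetical, and AAPOR's Standard Definitions should be consulted for the formal response rate formulas.

```python
# Illustrative sketch only: combining frame-level response rates for a dual
# frame survey using weights that reflect the sample allocation proportions.
# The rates and allocation shares below are hypothetical placeholders.

def overall_response_rate(rate_by_frame: dict, allocation: dict) -> float:
    """Weighted average of frame-level response rates."""
    assert abs(sum(allocation.values()) - 1.0) < 1e-9, "allocation shares must sum to 1"
    return sum(rate_by_frame[frame] * allocation[frame] for frame in rate_by_frame)

rate_by_frame = {"landline": 0.18, "cell": 0.12}  # hypothetical frame-level rates
allocation = {"landline": 0.70, "cell": 0.30}     # hypothetical allocation proportions

print(f"Overall response rate: {overall_response_rate(rate_by_frame, allocation):.3f}")
# 0.18 * 0.70 + 0.12 * 0.30 = 0.162 with these illustrative inputs
```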

3.4 Weighting

Weighting is another consideration, and one that is addressed at length by AAPOR. When conducting a dual frame RDD telephone survey, weights are needed to account for the different frames. With an overlapping design, weights must take into account the multiple probabilities of selection. Weighting is simpler when working with a non-overlapping design. Essentially the design is treated as a stratified sample with three strata: landline only, dual users ('wireless-mostly' and 'landline-mostly'), and cell only. Weights are first calculated for the landline sample (which includes dual users and must factor in response propensity by service) and the cell sample separately, and then combined. When it comes to suggested practices, the literature is generally silent, except to emphasize the importance of disclosing weighting procedures so the industry, collectively, can learn which approaches are the most effective.
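
As an illustration of the stratified, non-overlapping case described above, the sketch below post-stratifies a combined landline and cell file so that the three phone-status strata match external benchmark shares. It is a minimal sketch under simplifying assumptions: the records, base weights, and benchmark shares are hypothetical, the response-propensity adjustment mentioned above is omitted, and this is not a prescribed AAPOR or Government of Canada procedure.

```python
# A minimal sketch, under simplifying assumptions, of combining a screened
# (non-overlapping) dual frame sample treated as a stratified sample by phone
# status. All inputs are hypothetical; a real application would also adjust
# for response propensity by service, as noted above.

respondents = [
    # (sample frame, phone status reported by respondent, base design weight)
    ("landline", "landline_only", 1.20),
    ("landline", "dual_user",     0.95),
    ("cell",     "cell_only",     1.40),
    # ... more records ...
]

# Hypothetical population shares by phone status (must sum to 1); in practice
# these would come from an external benchmark such as a telephone service survey.
benchmark = {"landline_only": 0.10, "dual_user": 0.77, "cell_only": 0.13}

# Sum of base weights within each phone-status stratum.
stratum_totals = {}
for _, status, weight in respondents:
    stratum_totals[status] = stratum_totals.get(status, 0.0) + weight

total_weight = sum(stratum_totals.values())

# Post-stratification: scale each stratum so its weighted share matches the benchmark.
adjusted = [
    (frame, status, weight * benchmark[status] * total_weight / stratum_totals[status])
    for frame, status, weight in respondents
]

for record in adjusted:
    print(record)
```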

3.5 Legal Issues and Ethical Considerations

All of the documents consulted stressed the importance of legal and ethical considerations when conducting cell phone research. Researchers need to ensure compliance with existing laws. In the United States, for example, the Telephone Consumer Protection Act prohibits the use of an auto-dialer to call cell phones unless there is prior consent; in Canada, there are no comparable laws. Legal issues include compliance with any restrictions on calling cell phones, text messaging, and spam, as well as consideration of the possible implications of such laws.

Ethical considerations include:

  • Time-of-Day Calling Restrictions: Some people may be reached in a time zone other than the one the survey sample is meant to cover geographically. Therefore, interviewers who work cell phone survey samples should be trained about how to politely explain the inadvertent problem of reaching someone in a different time zone "too early" or "too late".

  • Taking Safety and Respondent Privacy into Account: Any researcher who conducts a survey that reaches people on a cell phone should take appropriate measures to help protect the safety of the respondent and anyone else who may be nearby. It is suggested that researchers leave the responsibility for determining safety to respondents themselves and encourage them to consider their own safety by asking about it directly (e.g., "Are you in a place where you can safely talk on the phone and answer my questions?"). To protect privacy, respondents in public or semi-private places should not be required to verbalize responses that could reasonably place them at risk of criminal or civil liability, be damaging to their financial standing, employability, or reputation, or otherwise violate their privacy.

  • Transmitting Accurate Caller ID Information: Given that cell phones routinely display the number of the calling party, researchers should avoid any inadvertent or purposeful falsification of Caller ID information, either in terms of the number displayed or the name of the calling party. MRA recommends that all researchers use calling equipment capable of transmitting Caller ID information and ensure that the telephone number and other identifying information transmitted allows the call recipient to identify the entity making, and/or responsible for, the call. It is also advised that the transmitted number be one that the respondent can call back.

  • Maintaining an Internal Do Not Call List: Non-response due to refusals is generally higher in cell phone surveys than in comparable landline surveys. For this reason, organizations that conduct cell phone surveys should consider maintaining an internal Do Not Call List of cell phone numbers to respect the wishes of owners who have requested not to be called again by the organization.

3.6 Summary and Implications

There are myriad issues associated with conducting survey research with cell phone users. As discussed, these issues range from methodological considerations, to legal and ethical concerns, to operational implications for survey field houses. While the issues identified in this section have implications across the entire lifecycle of a survey project, they clearly have a major impact on interviewer training. This was identified as an important issue in the AAPOR Cell Phone Task Force report but bears mentioning again, given that interviewing a respondent on a cell phone is a more complex task than interviewing a respondent on a landline (e.g., calling protocols, eligibility requirements, respondent safety, etc.). Finally, as far as best practices are concerned, there are, at this time, none to speak of. The research industry in North America is still grappling with cell phones and how best to incorporate them in order to deliver representative and reliable survey results. Any attempt by the Government of Canada to develop best practices regarding cell phones would, at this time, appear to be premature.

4. Cell Phone Sample Frames in Canada

Having discussed the importance of dual sample frame telephone surveys, the state of current practice, and best practices, the question now is: what is available in Canada in terms of cell phone sample frames?

4.1 Sample Providers

The majority of sample providers in Canada do not appear to have entered the cell phone market—at least not at the time of this literature review. Based on the listing of sample providers registered with the MRIA (as part of the Research Buyer's GuideFootnote 35), very few firms appear to offer cell phone samples for purchase.

4.2 Sample Frame Construction

There are several ways in which cell phone sample frames can be constructed—RDD, pre-screened cell phone numbers, and cell phone only households. Each approach has its own methodological issues and cost implications.

4.2.1 Random Samples

The least expensive cell phone samples available for purchase are RDD sample frames. Since directories of current cell phone numbers do not exist, cell phone sample frames are generated from the list of existing dedicated cell phone exchanges in Canada. One firm, for example, breaks cell phone exchanges into blocks of 100 numbers, weights the sample by geography, and then calculates the quantity of numbers needed per block. The numbers themselves are randomly selected by a computer program. The typical connection rate is 50-55%. These samples will include CPO households as well as dual phone users (households with both cell and landline service).
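
The block-based generation process described above can be sketched in a few lines of code. Everything in the sketch is hypothetical: the exchanges, the regional allocation (shown here as a simple equal split rather than the geographic weighting a supplier would apply), and the cooperation rate. It illustrates the general approach rather than any particular firm's procedure.

```python
import random

# Illustrative sketch only: generating an RDD cell phone sample from dedicated
# cell exchanges using 100-number blocks, loosely following the approach
# described above. All exchanges, allocations, and rates are hypothetical.

# Dedicated cell exchanges (area code + prefix), grouped into hypothetical regions.
cell_exchanges = {
    "region_a": ["613555", "343555"],
    "region_b": ["416555", "647555"],
}

target_completes = 400
assumed_connection_rate = 0.50   # the literature cites a typical 50-55% connection rate
assumed_cooperation_rate = 0.20  # hypothetical, for illustration only

# Gross up the number of records needed to reach the target number of completes.
numbers_needed = int(target_completes / (assumed_connection_rate * assumed_cooperation_rate))

# Allocate numbers across regions (a simple equal split for illustration).
per_region = numbers_needed // len(cell_exchanges)

sample = []
for prefixes in cell_exchanges.values():
    for _ in range(per_region):
        prefix = random.choice(prefixes)       # pick an exchange in the region
        block = random.randint(0, 99)          # 100-number block within the exchange
        within_block = random.randint(0, 99)   # random number within the block
        sample.append(f"{prefix}{block:02d}{within_block:02d}")

print(f"Generated {len(sample)} numbers, e.g., {sample[:3]}")
```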

The administrative information available with these sample frames is limited to the area code and the rate centre. This means that a cell number can be linked to a province and to the city in which the phone exchange switching station is located, and this information can be combined to link the number to a Census Division (CD) in Canada. The numbers cannot be linked to smaller geographic areas, such as a census subdivision (i.e., a municipality).

Pros

  • Least expensive type of cell sample.
  • Sample is a random probability sample.

Cons

  • Includes dual users, so screening needed to target CPO households.
  • Numbers can only be linked to CD; if weighting to a lower geographic level—e.g., city level—is needed, additional geographic information (e.g., city or postal code) needs to be collected from respondents.

4.2.2 Working Cell Numbers

Another option when it comes to cell phone samples is a sample frame of 'working' cell phone numbers. Some firms maintain a database of pre-screened working cell phone numbers. These numbers have been 'pre-dialled' and the Not in Service (NIS) records removed.

Pros

  • All numbers purchased should be working cell phone numbers.

Cons

  • More expensive than a random sample.
  • NOT a random probability sample.
  • Results cannot be considered probability-based.
  • Includes dual users.
  • Numbers can only be linked to CD.

4.2.3 CPO Households

The third option for cell phone samples is a sample frame of CPO households. CPO sample can be compiled from a database of working cell phone numbers; the only difference is that all of the numbers have been confirmed as CPO numbers, with screening done via an interactive voice response (IVR) system. As with the working cell number sample, some of the numbers included in a purchased CPO sample may no longer be in service. This is by far the most expensive type of cell phone sample at this time.

Pros

  • All numbers purchased should be working CPO numbers.
  • Screening for CPO households is not required, which results in a shorter interview.

Cons

  • Very expensive.
  • NOT a random probability sample.
  • Results cannot be considered projectible.
  • Numbers can only be linked to CD.
  • Poor geographic coverage.

4.3 Conclusions and Implications

Cell phone sample is available in Canada, but only through a very small number of providers (at the time of writing, two providers). While several different types of cell phone sample are available, only the random samples should be used when conducting dual frame RDD telephone surveys. Cell phone only households would need to be identified through screening, which naturally increases the quantity, and therefore the cost, of the sample required given the incidence of CPO households in Canada (see the illustrative calculation below). And, while convenience samples of both working cell phone numbers and CPO households are available, and have some advantages depending on the type of research being conducted, they are not random probability samples and will not generate survey results that can be considered projectible to the target population.
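
The screening arithmetic can be illustrated with a short sketch. The CPO incidence and cooperation rate below are hypothetical placeholders rather than figures drawn from the sources reviewed; the connection rate reflects the 50-55% range cited earlier. The point is simply to show why screening for CPO households inflates the quantity of random cell sample required.

```python
# Illustrative arithmetic only: how screening for CPO households inflates the
# quantity of random cell sample required. The CPO incidence and cooperation
# rate are hypothetical; the connection rate reflects the range cited above.

target_cpo_completes = 300
assumed_cpo_incidence = 0.15      # hypothetical share of connected numbers reaching CPO households
assumed_connection_rate = 0.50
assumed_cooperation_rate = 0.25   # hypothetical

records_needed = target_cpo_completes / (
    assumed_connection_rate * assumed_cooperation_rate * assumed_cpo_incidence
)

print(f"Random cell numbers required: about {records_needed:,.0f}")
# With these illustrative inputs, roughly 16,000 records are needed for 300 CPO completes.
```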

5. Conclusions and Recommendations

The issue of cell phones and telephone surveys is at the forefront of discourse within the public opinion research community in North America. Clear trends in declining landline use have been documented and the consensus view is that the proportion of CPO households will continue to grow. Interest in the issue stems largely from a concern about coverage error and the potential for non-response bias resulting from the systematic exclusion of a segment of the general population from national telephone surveys.

From this review of related literature and experimental studies, one of the most important pieces of learning is that the industry is in a state of flux because the implications of CPO households for survey research are not yet fully understood. While there is widespread agreement that the demographic profile of CPO households differs from landline households, there is no fulsome understanding of where, when or how often such differences manifest themselves in attitudes and opinions. In addition, there is no agreement on the most effective and efficient approaches to sampling cell phones and including them in telephone surveys. Currently, the industry is experimenting with, for example, sampling designs, data collection and interviewing protocols, as well as analytical issues stemming from dual frame (landline and cell phone) sampling. At this point in time, there is no 'right' methodological approach to govern the inclusion of cell phones in telephone surveys.

Based on the literature reviewed, we offer the following recommendations for PORD's consideration. It is understood that some of these actions and/or activities may already have been undertaken by the Directorate, but they are nevertheless included here in an effort to be comprehensive. PORD may want to consider the following:

  • Do not develop standards or best practices at this time. The findings suggest that it would be premature for the Government of Canada to establish a set of best practices regarding the use of cell phone samples in federal government-commissioned POR telephone surveys. As noted, other than the MRA, no industry organization has put in place best practices governing the inclusion of cell phone samples in telephone surveys. And, while the MRA refers to its one-pager for researchers as 'Best Practices', the document is very narrow in scope. It deals only with compliance with the Telephone Consumer Protection Act (TCPA) in the U.S. and ethical guidelines related to calling protocols and respondent safety.

  • The inclusion of cell phones in POR telephone surveys should be considered on a case-by-case basis. Guidelines to consider in making this decision include: 1) the need for nationally representative results; 2) the target audience; and 3) the available budget. Survey design should reflect a balanced approach that addresses each of these considerations. For example, if budget is limited and statistical precision is not the top priority for a particular study, it may be sufficient and valid to exclude cell phones from the sampling design.

  • Regarding target audience, POR telephone surveys targeting specific sub-groups of the Canadian population for which a high proportion of CPO households has been documented (e.g., youth) should consider including cell phone sample (to the extent that this is feasible). Surveys of this type that do not sample cell phones should be required to include a rationale explaining why not, as well as a discussion of the potential impact on the survey results.

  • For studies that do incorporate cell phone sample, there are several considerations that should be discussed with research firms, even though it is too early to establish best practices in these areas:

    • Sample design: When using RDD sampling, will the cell and landline frames overlap, or will screening be used to exclude those with a landline from the cell sample? PORD may want to ask research suppliers to clearly state in their proposals which approach will be used for the study and provide details about how it would be implemented.

    • Remuneration: Is there any new research or evidence that supports the use of incentives? If not, and a research supplier nevertheless recommends the use of incentives, PORD may want to have the supplier explain the reasoning for this approach, because the evidence available at the time of writing, while not definitive, suggests that incentives are not cost-effective.

    • Call protocols: PORD may wish to have research suppliers limit the number and frequency of callbacks when cell samples are used, to avoid harassment (real or perceived). At this time, AAPOR suggests six to 10 call attempts. It is important to note that there is a trade-off between the number of callbacks and response rates: in general, the fewer the callbacks, the lower the response rate. Since the Government of Canada is interested in maximizing response rates, too restrictive a callback regime will work at cross-purposes with this objective.

    • Weighting: Although the industry is still grappling with how to best weight surveys that include cell phones, and there is no single approach recommended, this is something that PORD may want to: 1) have addressed in proposals (e.g., will weights be applied? what is the anticipated approach?); and 2) require research firms to include relevant information in the methodological section of survey reports (i.e., how were the weights constructed and applied?).

  • Time zone differences should be taken into account when scheduling interview times. Given that cell phone subscribers are free to take their devices and numbers with them when they travel or relocate across Canada, there is potential for contacting subscribers at inconvenient times owing to time zone differences. PORD may want to ask suppliers to specify the times at which interviewing will be conducted from the call centre in order to ensure that no respondent is contacted at an inappropriate time (e.g., after 9:30 p.m. local time).

  • Interviewer training is critical when it comes to studies that incorporate cell phones, considering the additional sensitivities/requirements (i.e., respondent safety, data quality, technical issues, etc.). PORD may want to confirm with suppliers that interviewers assigned to cell phone studies receive appropriate training and/or stipulate this as a requirement to be addressed in statements of work for POR studies.

  • Continue to use telephone surveys, such as ACET studies, for experimental purposes. These studies are generally standardized in terms of methodology (i.e., sample design and questionnaire), which means that demographics and certain variables can be tracked over time to examine the impact of CPO households (i.e., their inclusion or exclusion) on survey results.

  • In order for the industry in Canada, and PORD specifically, to increase learning in this area and grow the base of knowledge available, Government of Canada research reports for studies that incorporate cell phones should fully disclose the methods used, in particular the methods that apply specifically to the cell phone sample. This includes the sampling design, technical specifications (i.e., number of callbacks, calling times, call dispositions, etc.), as well as how the two samples are merged and weighted to be representative of the target population. Consider specifying this as a requirement in statements of work for POR studies, as well as adding it to the Standards for the Conduct of Government of Canada Public Opinion Research - Telephone Surveys.

  • Monitor current thinking in the survey research community by keeping abreast of activities in Canada and elsewhere - in particular, the United States. For example, the AAPOR Dual Frame RDD Task Force is expected to begin work again in 2013. AAPOR has identified a need for more research to better understand non-response in dual frame surveys (and the potential for bias), as well as the most efficient way of contacting dual cell and landline users, among other things. Given this call for research, and AAPOR's efforts in this area, it is reasonable to expect new learning to be achieved, on a fairly regular basis, over the foreseeable future.

  • Keep track of legislative initiatives in Canada that might affect cell phone sampling. Recall that legislation in the United States (i.e., the TCPA) prohibits autodialling of cell phone numbers (all numbers must be manually dialled) unless the cell phone owner has given prior consent to be contacted this way.

  • Consider partnering with MRIA, the industry association in Canada, to explore issues related to CPO households and the impact of cell phones in various areas. For instance, AAPOR has indicated that standard call disposition codes, as well as response rate calculations, may need revision to accommodate cell phones. Callback protocols, interviewer training, and refusal conversion strategies may similarly require re-thinking for cell phone samples.

  • Consider further discussions with Statistics Canada to learn from the Agency's experience in this area. As Canada's national statistical agency, with a mandate "to provide statistical information" and "to promote sound statistical standards and practices", it stands to reason that Statistics Canada is exploring the impact of cell phones on telephone surveys from myriad perspectives—i.e., sampling, calling protocols, measurement error, non-response bias, etc. As such, much can be learned from Statistics Canada's experimental efforts in this area, as well as the Agency's expert analysis and commentary, which contributes to the international research community's base of methodological knowledge. Beyond learning about Statistics Canada's thinking and practices in the area of cell phones, it may be valuable for PORD to establish some form of information sharing with the Agency—e.g., identifying a key contact and nurturing a mutually beneficial (informal or otherwise) working relationship.

  • Cell phone sampling has cost implications. Data from the United States suggests that cell phone completes are roughly twice as expensive as landline completes, and that this ratio roughly doubles again when non-overlapping dual frame sample designs are used. Cost in the U.S. is also affected by legislation that prohibits auto-dialling of cell phone numbers (which is not the case in Canada). Even though auto-dialling is permitted in Canada, cell phone completes do cost more than comparable landline interviews (see the illustrative calculation below). Going forward, federal government departments and agencies should be aware of this and be prepared to adjust the budgets allocated to POR telephone survey projects to accommodate the additional costs of cell phone sampling. A move toward including cell phone sample without sufficient budget will drive down the quality of survey research.
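
As a purely illustrative calculation using the U.S. cost ratios cited above: the per-complete dollar figure and the sample allocation are hypothetical placeholders, and, as noted, the Canadian premium may be smaller because auto-dialling is permitted here.

```python
# Purely illustrative arithmetic based on the cost ratios cited above.
# The landline per-complete cost and the sample allocation are hypothetical.

landline_cost_per_complete = 40.00                      # hypothetical dollar figure
cell_cost_overlapping = 2 * landline_cost_per_complete  # "roughly twice as expensive"
cell_cost_non_overlapping = 2 * cell_cost_overlapping   # roughly doubles again with screening

completes = {"landline": 800, "cell": 200}              # hypothetical allocation

budget_landline_only = sum(completes.values()) * landline_cost_per_complete
budget_overlapping = (completes["landline"] * landline_cost_per_complete
                      + completes["cell"] * cell_cost_overlapping)
budget_non_overlapping = (completes["landline"] * landline_cost_per_complete
                          + completes["cell"] * cell_cost_non_overlapping)

print(f"All-landline design:         ${budget_landline_only:,.0f}")
print(f"Dual frame, overlapping:     ${budget_overlapping:,.0f}")
print(f"Dual frame, non-overlapping: ${budget_non_overlapping:,.0f}")
# With these inputs: $40,000 vs. $48,000 vs. $64,000; that is, even a modest 20%
# cell allocation requires a larger budget.
```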

Appendix

Sources Consulted and Works Cited

Footnotes

Footnote 1

MRIA (2007). Cell Phones and Surveys: Issues of Interest to Field Operations.

Footnote 2

Unless otherwise specified, data are from Statistics Canada's Residential Telephone Service Survey (2010).

Footnote 3

Of the rest, 16% reported various combinations of phone services (but do not have a traditional landline), 4% receive phone service exclusively via a cable or voice over Internet provider, and 1% have no phone service of any kind.

Footnote 4

Canadian Radio-Television and Telecommunications Commission (CRTC) (2010). Navigating Convergence: Charting Canadian Communications Change and Regulatory Implications. Convergence Consulting Group's forecast: footnote 62.

Footnote 5

Lavrakas (2011). Is the Exclusion of Mobile Phones from Telephone Surveys a Problem: The U.S. Experience. Prepared for the Australian Mobile Phone Survey Workshop.

Footnote 6

Blumberg and Luke (2010).

Footnote 7

Unweighted data from 2010 surveys conducted by the Pew Research Center, cited in Keeter (2011).

Footnote 8

Statistics Canada's Residential Telephone Service Survey (2010).

Footnote 9

EKOS / Arcturus Solutions (2008). A Survey of Cellular-Telephone-Only Households: The New Technologies ("Web 2.0") and Government of Canada Communications Project. POR-300-07. Prepared for Public Works and Government Services Canada.

Footnote 10

AAPOR (2010).

Footnote 11

Gallup website. Does Gallup call cell phones?

Footnote 12

Harwood, J. (August 5, 2012). Pollsters Struggle to Pin Down the Right (Cell) Number.

Footnote 13

Lavrakas (2011).

Footnote 14

NADbank (No date). Cellphone-Only Household Sample Added to Major Markets.

Footnote 15

Keeter, S. (2011). Adding Cell Phones to Your Telephone Surveys. An AAPOR webinar presentation.

Footnote 16

Weinger, M. (2012). Gallup ups cell phones to 50 percent.

Footnote 17

Harwood, J. (August 5, 2012). Pollsters Struggle to Pin Down the Right (Cell) Number.

Footnote 18

Ibid.

Footnote 19

AAPOR (2010). Cell Phone Task Force.

Footnote 20

Keeter, S. (2011). Adding Cell Phones to Your Telephone Surveys. An AAPOR webinar presentation.

Footnote 21

Ibid.

Footnote 22

Pew Research Center (2012). Assessing the Representativeness of Public Opinion Surveys.

Footnote 23

Keeter, S. (2011). Adding Cell Phones to Your Telephone Surveys. An AAPOR webinar presentation.

Footnote 24

Such a methodology is not possible with RDD sampling (at least not in terms of cost-effectiveness).

Footnote 25

Keeter, S. (2011). Adding Cell Phones to Your Telephone Surveys. An AAPOR webinar presentation.

Footnote 26

Full citations for each study are available in the appendix, in the Reference List.

Footnote 27

Projects 2 (NRCan) and 3 (HRSDC) cost more than Project 1 (VAC) because additional qualitative research was undertaken and targeted sampling/oversampling added to costs. More specifically, the ecoEnergy project contained the pretesting component for the advertising campaign, while the Better Jobs project was a post-campaign test only but included oversamples of youth and two waves of research.

Footnote 28

All data in the table are unweighted.

Footnote 29

EKOS / Arcturus Solutions (2008). A Survey of Cellular-Telephone-Only Households: The New Technologies ("Web 2.0") and Government of Canada Communications Project. POR-300-07. Prepared for Public Works and Government Services Canada.

Footnote 30

Leger Marketing (2011). Cellular-Only Householders vs. Landlines: Are there Attitudinal Differences? Presented by Barry Davis at the MRIA Conference in Kelowna, BC.

Footnote 31

AAPOR materials consulted include the 2008 and 2010 Cell Phone Task Force reports (the latter updating and adding to the 2008 report), a slide presentation on guidelines and considerations based on the task force study, and a short document titled Legal and Ethical Issues in RDD Cell Phone Surveys. Information from MRIA comes from a 2007 document titled Cell Phones and Surveys: Issues of Interest to Field Operations. ESOMAR's position on standards and best practices is summarized in a publication titled Guideline for Conducting Survey Research via Mobile Phone. Finally, the information from the MRA is in a one-pager titled Calling Cell Phones: Best Practices for Researchers. All works consulted are listed in the Appendix.

Footnote 32

MRIA (2007). Cell Phones and Surveys: Issues of Interest to Field Operations.

Footnote 33

Harwood, J. (August 5, 2012). Pollsters Struggle to Pin Down the Right (Cell) Number.

Footnote 34

These are individuals who, in theory, are accessible through both a cell and a landline frame but who, in practice, receive most calls on one or the other service.

Footnote 35

All firms contacted were advertised in the Sampling/Data/Analysis/Software section of the 2011-2012 MRIA Research Buyer's Guide.