Canadian Journal of Educational Administration and Policy, Issue #37, March 12, 2005. © 2005 by CJEAP and the author(s).

What Parents Know and Believe About Large-Scale Assessments

by Ming Mu and Ruth A. Childs, Ontario Institute for Studies in Education of the University of Toronto

Every year, millions of elementary school students take large-scale assessments. In Ontario alone, almost 300,000 students in grades 3 and 6 take assessments in reading, writing, and mathematics. Not surprisingly, many parents want information about the assessments their children take. In response, many jurisdictions prepare informational materials specifically for parents. These materials may provide information about when the testing will occur, suggestions for preparing the students to take the tests, rationales for the testing, information about how to interpret the results, or some combination of these.

The purpose of this study is to investigate, through a survey of parents of children in an urban Ontario school, to what extent parents access the information available about a large-scale assessment and how what they know and believe about the assessment is related to that information. Also examined is the relationship between parents’ beliefs and their own experiences taking large-scale assessments.

What Parents Believe

Some information is available about parents’ beliefs about large-scale assessments. For example, Livingstone, Hart, and Davie (2001), in a survey administered in 2000, found that 46% of Ontario parents, but only 12% of teachers, believed that “using province-wide tests to measure how students are doing” would “improve student achievement” at the elementary level (p. 25). At the secondary level, 76% of parents, but only 35% of teachers agreed “students should have to pass a provincial examination in each compulsory subject in order to graduate from high school” (p. 30). A 2004 survey of teachers and members of the public by the Ontario College of Teachers (Jamieson, 2004) found that 43% of the public, but only 11% of teachers believed that province-wide assessments were accurate or very accurate.

In a study conducted by Shepard and Bliem (1995), the majority of parents rated informal sources of information, such as talking to the teacher and seeing graded samples of their child’s work, as more useful than large-scale assessments for learning about their child’s progress in school and for judging the quality of education provided at their child’s school, though some parents thought such assessments could also be very informative. Other studies of parents’ opinions regarding large-scale assessments, however, found a high percentage of respondents in favour of such assessments (Elam, Rose & Gallup, 1992; Phelps, 1998). The 1992 Gallup Poll indicated that more than 70% of public school parents favoured using large-scale assessments to measure students’ academic achievement. Phelps (1998) reviewed three decades of public polls and surveys about student testing and concluded that parents generally supported large-scale assessments.

What Parents Want to Know

Little information is available about the sources of information on which parents base their beliefs about large-scale assessments. An exception is Barksdale-Ladd and Thomas’s (2000) study, in which they interviewed 20 parents and found that most had learned about the tests their children were to take from their children’s teachers. Parents also reported gleaning information from the media. However, as Koretz and Diebert (1993) found, many media reports of test results may be simplistic or incorrect.

More research has investigated what teachers want and need to know about large-scale assessments. For example, Grant (1999) found that teachers in his study wanted to understand the rationales for adopting specific assessment-based reforms. Brookhart (1999) has suggested that understanding how to interpret test results is particularly important for teachers, as parents are likely to turn to teachers for assistance in understanding their children’s results. Studies by Plake, Impara, and Fager (1993) and Marso and Pigge (1993) have investigated the extent of teachers’ knowledge.

Childs, Jaciw, and Schneid (2004) identified eight categories of information teachers require about large-scale testing programs, based on an analysis of The Standards for Teacher Competence in Educational Assessment of Students (American Federation of Teachers, National Council on Measurement in Education, & National Education Association, 1990), the Principles for Fair Student Assessment Practices for Education in Canada (Joint Advisory Committee, 1993), and the Standards for Educational and Psychological Testing (American Educational Research Association, American Psychological Association, and National Council on Measurement in Education, 1999). These are (1) purpose, (2) content and format, (3) administration, (4) practice items and test-taking strategies, (5) consequences, (6) using and explaining results, (7) technical quality, and (8) results from previous years. For parents, the first six of these categories affect how their children are prepared for the test, how the test is administered to them, and how their results are reported, and so are likely to be of interest. Where test results have consequences for individual students or for schools, parents may also be concerned about the seventh category, the technical quality of the test. Where schools are ranked on the basis of their test results, the eighth may also be of interest to parents.

Sources of Information

Although little research has examined the information available to parents, considerable effort has been expended by Ministries and Departments of Education, school boards, parent groups, and publishers to provide information. For example, the North Carolina Department of Education has created numerous publications for parents, including “Understanding Your Child’s End-of-Grade Test Scores” and “Myths About Testing and Accountability in North Carolina” (both at www.dpi.state.nc.us/parents.html). The Association of Test Publishers answers “Frequently Asked Questions” (at www.testpublishers.org). The U.S. Department of Education provides tips on how to “Help Your Child Improve in Test-Taking” (at www.ed.gov/pubs/parents/TestTaking).

In Ontario, the Education Quality and Accountability Office (EQAO), which develops and administers the province-wide assessments, provides information for parents through its website (www.eqao.com). For parents of grade 3 and 6 students, this includes a one-page “Parent Bulletin” describing the assessments, sample test items, and an “Assessment Results Guide” explaining how to interpret the test results. The “Frequently Asked Questions” page provides answers to a variety of questions parents may have. Through the website, parents also have access to materials that the provincial testing agency has prepared for teachers.

Ontario school boards and schools also provide information to parents by issuing newsletters and occasionally by running information sessions. The websites of school boards and schools give information about their test results and how their test results compare with the board and/or provincial averages. Some school board websites provide a link to the provincial testing agency’s website. Principals and teachers may also provide information.

The children’s sections of many bookstores include shelves of “educational materials” – workbooks focusing on curriculum areas or, in some cases, intended as preparation for large-scale assessments. Examples include Better Test-Taking Skills in 5 Minutes a Day: Fun Activities to Boost Test Scores for Kids and Parents on the Go (Pennington, 2001) and What Every Parent Needs to Know about Standardized Tests: How to Understand the Tests and Help Your Kids Score High! (Harris, 2002). Books for specific testing programs are also available: for example, the No-Stress Guide to the 8th Grade FCAT (Florida Comprehensive Assessment Test; Johnson & Johnson, 2000) and Help Your Child Prepare for Ontario Grade 3 Language Tests (Yeaman, 1999).

Other materials for parents provide critiques of large-scale assessment. Testing! Testing! What Every Parent Should Know About School Tests (Popham, 2000) advocates parents’ involvement in improving testing programs. Some materials are strongly against testing: for example, publications from the National Center for Fair and Open Testing (FairTest, www.fairtest.org) and The Case Against Standardized Testing: Raising the Scores, Ruining the Schools (Kohn, 2000).

The media also provide information about large-scale assessments. Ontario newspapers and magazines have published numerous articles criticizing the Ontario assessments as wasting taxpayers’ money and creating unnecessary anxieties in children. Opponents of the assessments have discussed possible bias in the assessments and the effects of over-emphasis on the improvement of scores and have challenged the validity and reliability of the tests (Cook, 2001; Lindgren, 2001; Macleod, 2001; Mulroney, 2002; Wake, 2001a, 2001b; Walters, 2001).

Why Parents Need Information

Positive effects of parental involvement in their children’s education have been demonstrated across a wide range of age levels and in a variety of ways (Fehrmann, Keith, & Reimers, 1987; Stevenson & Baker, 1987; Thurston, 1989). Studies of parental involvement have shown that parents’ active and appropriate involvement can improve children’s school performance and grades. The need for involvement extends to large-scale assessments. As Popham (2000) argues,

Parents cannot be adequately involved in their children’s education if the topic of school testing is off-limits. And school testing will be off-limits to parents if they do not know anything more about such testing than what they recall from their own school days. Educational testing has not only become more important in recent years, but it is almost certainly different from what parents recall from when they were students. (p. 5)

The Ontario assessments, for example, are aligned to Ontario’s curriculum and include many items that require lengthy responses. Parents, if they took large-scale assessments as children, most likely took tests that were created by commercial companies, were not specific to a province or state, and included only or mostly multiple-choice items. Many parents who went to school in Ontario have never taken any large-scale assessment, since Ontario had few such assessments between the mid-1960s and the mid-1990s.

As Cookson and Halberstam (1998) and Harris (2002) observe, accurate and accessible information is the parents’ best resource for ensuring that their children’s experiences with assessments are positive and educationally productive. Although a large amount of information about Ontario’s assessment and about assessments more generally is available on the Internet and in books about testing, it is not known if parents use these resources. Although the media provide information about the assessments, it is not known how critically parents evaluate that information. School principals and teachers also provide parents with some information; however, it is not known how their opinions influence parents’ attitudes towards the assessments. In brief, parents need accurate information about the assessments and some sources of information are available. However, it is not clear how accessible those sources of information are to parents, how well-informed parents are, and how they learn about the tests their children take.

This Study


It appears that no previous research has examined how parents’ beliefs about large-scale assessments are related to their sources of information. This study addresses the need for such research, through a survey of parents of students in an elementary school in Ontario about their knowledge and beliefs about the provincial assessments and where they obtained information about the assessments.

Method


Participants

Participants were parents of students attending an elementary school (kindergarten through grade 5) in Toronto. The majority of the parents had immigrated to Canada as adults. The most frequent first language was Punjabi, followed by English and Urdu. With the school principal’s permission, four hundred questionnaires with information/consent letters were sent home with students in grades 3, 4, and 5. Parents of children in these grades were selected because they were the most likely to know and have opinions about the Ontario assessments, which are administered in grades 3 and 6. One hundred and six parents responded to the questionnaire. Two questionnaires were discarded because the response patterns suggested that the respondents had not read the questions before responding. The remaining one hundred and four questionnaires (a 26% response rate) were analyzed.

According to the school's teachers, the school mentioned the tests at school meetings, but did not otherwise formally communicate with parents about the tests. When the school sent the test results to the parents, a letter from the principal was enclosed, providing some information about how to interpret the results and informing parents that they could review their child’s test responses.

Questionnaire

The questionnaire in this study was modeled after a questionnaire on pre-service teachers’ knowledge of and opinions about the Ontario assessments (Childs & Lawson, 2003). Like the questionnaire for pre-service teachers, this questionnaire had three sections. The first section concerned parents’ information and knowledge about the assessments. The second section asked about the sources of information parents had used to learn about the assessments. The last section asked for parents’ opinions about and attitudes towards the assessments.

In adapting the questionnaire for use with parents, some of the items in the first and third sections were reworded (e.g., “Having to take the test makes some students very anxious” became “Having to take the tests makes my child(ren) anxious”), while others were dropped because they were unlikely to be of interest to parents (e.g., “Having his or her class take the test makes some teachers very anxious”). The resulting set of items was checked against the eight categories of information listed previously – (1) purpose, (2) content and format, (3) administration, (4) practice items and test-taking strategies, (5) consequences, (6) using and explaining results, (7) technical quality, and (8) results from previous years – to ensure that all categories were included. Items such as “I can compare the test results of my child(ren)’s school for this year with those from previous years” and “The tests help improve my child(ren)’s learning” were added to ensure better coverage of all eight categories. For the second section, regarding sources of information, all the sources of information about the assessments that were available to parents were listed.

In the first and second sections, the parents responded using three choices: Yes, No and Not Sure. In the third section, the parents had five choices: Strongly Disagree, Disagree, Agree, Strongly Agree and Don’t Know. At the end of each section of questions, blank space was provided for the parents to describe other knowledge, list other sources of information and make additional comments. Background questions about which grade the students were in, the parents’ first languages and the parents’ educational qualifications were included at the beginning of the questionnaire.

Analyses

Descriptive statistics were computed for the responses to all the questions. Most of the written responses to the open-ended questions at the end of each section were elaborations of the preceding responses and so will be reported along with the parents’ responses to the relevant questions. For the descriptive statistics and the later chi-square analyses, third section responses of “Agree” and “Strongly Agree” were combined and responses of “Disagree” and “Strongly Disagree” were combined.

The parents’ total scores for each of the three sections were computed. In the first and second sections, responses of Yes were coded as 2, No as 0, and Not Sure as 1. For most questions in the third section, the responses of Strongly Disagree, Disagree, Agree, Strongly Agree, and Don’t Know received values of 1, 2, 4, 5, and 3, respectively. A few questions were worded so that Strongly Disagree indicated the most positive opinion: these were coded as 5, 4, 2, 1, and 3. Responses of Don’t Know were coded 3 so that when a parent indicated not knowing whether he or she agreed or disagreed with an opinion statement, the response would not affect the average total score. Coefficient alpha, an index of the reliability of each set of questions, was computed, as was the correlation of each item with the total for all the other items in that section (i.e., the corrected item-total correlation).
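The scoring, reliability, and item-total computations described above can be sketched as follows. This is a minimal illustration, not the authors’ actual analysis code, and the small response matrix is invented for demonstration:

```python
import numpy as np

def cronbach_alpha(scores):
    """Coefficient alpha for a (respondents x items) matrix of item scores."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]
    item_variances = scores.var(axis=0, ddof=1)
    total_variance = scores.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

def corrected_item_total(scores):
    """Correlation of each item with the sum of all the other items."""
    scores = np.asarray(scores, dtype=float)
    total = scores.sum(axis=1)
    return np.array([np.corrcoef(scores[:, j], total - scores[:, j])[0, 1]
                     for j in range(scores.shape[1])])

# The coding used for the first two sections: Yes = 2, Not Sure = 1, No = 0
coding = {"Yes": 2, "Not Sure": 1, "No": 0}
responses = [["Yes", "Yes", "Yes"],          # four invented respondents,
             ["No", "No", "No"],             # three invented items
             ["Yes", "No", "Yes"],
             ["Not Sure", "Not Sure", "Not Sure"]]
scores = [[coding[r] for r in row] for row in responses]

alpha = cronbach_alpha(scores)               # one reliability index per item set
item_totals = corrected_item_total(scores)   # one correlation per item
```

The same two quantities appear in Tables 1 and 2 (e.g., the .61 and .79 alphas and the per-item corrected item-total correlations).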

Chi-square tests were performed to test the relationships between the children’s grade levels and parents’ responses, between parents’ first language and their responses, and between parental education and their responses. To determine whether having a child who had already taken the assessments was related to parents’ information, use of sources of information, and opinions, grade 4 and grade 5 parents were grouped together for comparison with grade 3 parents, using further chi-square tests. Lastly, parents’ first languages were grouped into English and Non-English, and chi-square tests were used to determine whether there were significant differences between the responses of parents whose first language was English and those of parents whose first language was not English. Because almost half of the parents did not indicate their educational qualifications, no comparisons were performed by level of education.
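A Pearson chi-square statistic of the kind reported in the Results section can be computed from a contingency table as sketched below. The counts are invented for illustration and are not the study’s data; scipy.stats.chi2_contingency provides the same computation along with a p-value.

```python
import numpy as np

def pearson_chi_square(table):
    """Pearson chi-square statistic for a contingency table
    (no continuity correction)."""
    observed = np.asarray(table, dtype=float)
    row_totals = observed.sum(axis=1, keepdims=True)
    col_totals = observed.sum(axis=0, keepdims=True)
    expected = row_totals * col_totals / observed.sum()
    return ((observed - expected) ** 2 / expected).sum()

# Invented 2 x 2 table: rows = grade 3 vs. grades 4-5 parents,
# columns = used three or more sources vs. fewer than three
table = [[ 7, 13],
         [60, 17]]
statistic = pearson_chi_square(table)
```

The statistic is then compared against the chi-square distribution with (rows − 1) × (columns − 1) degrees of freedom, which is 1 for the 2 × 2 comparisons reported here.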

Finally, correlations among the total scores were computed to see if there were significant relationships between the total scores for the three sections.
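The correlations among the section totals can be computed directly from the three total scores, as in this sketch (the five respondents’ totals are invented, not the study’s data):

```python
import numpy as np

# Invented section totals for five respondents (illustration only)
information_total = np.array([18, 8, 14, 5, 11])    # Section 1: knowledge
sources_total     = np.array([12, 4, 9, 2, 7])      # Section 2: sources used
opinion_total     = np.array([38, 30, 35, 25, 33])  # Section 3: opinions

# Pairwise Pearson correlations among the three section totals
r = np.corrcoef([information_total, sources_total, opinion_total])
info_vs_sources = r[0, 1]
info_vs_opinion = r[0, 2]
```

np.corrcoef returns the full 3 × 3 correlation matrix; the off-diagonal entries correspond to the pairwise coefficients reported in the Results.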

Results

Description of the Sample


Of the parents participating in this study, 48.1% had children in grade 5, 20.2% had children in grade 4 and 25.0% had children in grade 3; 6.7% did not report their child's grade. Parents’ first language varied greatly: 19.2% reported speaking Punjabi; 17.3%, English; and 14.4%, Urdu, while the rest listed 14 other languages. Only 56.7% of the parents indicated their educational qualifications on the questionnaires: 13.5% had a Master’s degree, 22.1% had a Bachelor’s degree, and 21.1% had less than a Bachelor's degree.

Sources of Information

As summarized in Table 1, more than half of the parents had read some of the provincial testing agency’s printed materials or the newsletters about the assessments that were sent by the schools or the school boards. Also, more than half of the parents had talked with their children or with other people, such as other parents, about the assessments. However, less than a third of the parents had talked with their children’s teachers or principals about the assessments. Slightly more than a third of the parents had read, heard or seen news about the assessments in the media. Although more than two-thirds of the respondents were parents of grade 4 and grade 5 children, only about a third of all the respondents reported having seen a report of an individual student’s results; this is surprising, as such reports would have been sent home in the first semester of grade 4 for all students who took the assessment in grade 3. Fewer than 15% had visited the testing agency’s website. Barely 40% of the respondents had seen sample tests or previous years’ tests released by the testing agency. The reliability for the eight items in this section is a moderate .61. As Table 1 shows, the corrected item-total correlations for these items range from .25 to .46, except for the question about the media, which has a correlation of only .07 with the rest of the questions. This indicates that whether a parent heard, read, or saw references to the tests in the media was unrelated to whether they obtained information about the tests from the other sources.

In the space for written comments at the end of the questionnaire, several parents indicated that they wanted more information and suggested how that information might be provided. The following are direct quotes from their written comments:

• “Provide us more information from time to time to help raise the education standard of public schools in Ontario.”

• “I didn’t know other sources of information except what the school provides.”

• “Parents should be provided with samples of the tests so they can prepare their children for the tests.”

• “Results of the tests should be more accessible to the parents.”

• “[The] assessments are challenging, but what can I do to help my child?”

What Parents Know


Table 2 summarizes parents’ responses to the questions about knowledge about the provincial assessments. As the table shows, the majority of the parents knew the reasons why the assessments were administered, what subjects the assessments covered, and what the test results meant. More than half of the parents knew what item formats the tests included and that test developers tried to ensure the fairness of the tests. Approximately half of the parents knew where they could find the test results for their children’s school and that they could see their children’s responses or answers to the test questions once the test results were released to the school. However, only slightly more than a third of the parents knew that they could request a review of their children’s test results. More than half of the parents were either unsure or did not know that their children could be exempted from the assessments under certain conditions. About 60% to 70% of the respondents did not know or were not sure that they could compare the test results of their children’s school with those of other schools or compare the test results of this year with those of previous years in the same school. As shown in Table 2, the corrected item-total correlations range from .30 to .55. The coefficient alpha (reliability) for this set of 11 items is quite high: .79.

What Parents Believe

Table 3 is a summary of parents’ responses to the opinion questions in Section 3. As can be seen from the table, more than 70% of the responding parents believed that the assessments provided accurate evaluations of individual students, schools, and school boards.

Only about half of the parents reported that taking the assessments made their children anxious and that having their children take the assessments made the parents themselves anxious. Regarding test anxiety, two parents wrote on the questionnaire:

• “Taking the assessments makes my child nervous.”
• “I was nervous and anxious when I had to take similar tests.”

Both of these parents had used four of the information sources, including the media. One of them knew the reasons for the tests, the subjects covered, the item formats, the exemption conditions, the meaning of the test results, and parents’ right to request a review of the results. The other parent did not seem to know much about the assessments, and his or her comments also reflected an incorrect belief that students who failed the assessment could not be promoted to the next grade.

The majority of the parents agreed or strongly agreed that the assessments helped increase accountability of teachers, schools, and school boards and helped improve their children’s learning. More than half of the parents believed that comparison of the test results among schools and school boards should be encouraged. Most of the comments parents wrote on the questionnaire supported this view:

• “All the students should participate in the … test. It helps the judgment of students’ ability and our education system.”

• “It should be held, because it increases children’s sense of competition.”

• “This test is very important to know our children, this way we can guide them better.”

• “[The] test is very important to know about a child’s ability. This way teacher and parents can know about their children and guide her. Teacher should discuss parents about this test and about child’s result.”

• “[The tests] help the kids learn lots more than often.”

• “You should never stop giving [the tests] to kids, it helps go to a higher level and improve in their marks.”

• “By [the] tests, you will know how your child’s progressing and it helps you to choose the subjects in future.”

• “[The test] is a good way to acknowledge your child’s learning ability; it’s a good way to help students learn. I think the government should continue this.”

About 60% of the parents agreed or strongly agreed that the provincial testing agency should develop and administer tests at every grade level. One parent’s written comment gives a reason for this support:

• “This test should be administered at every grade level and whenever you find that most of the students have achieved this level very confidently, you can increase its level a little bit. The reason for this suggestion is that our children are living in a very competitive age and they must be smarter everyday to survive in this life…”

Only about 10% of the respondents agreed or strongly agreed that the assessments were a waste of taxpayers’ money, while the majority of the parents disagreed or strongly disagreed that the assessments were a waste of money.

The reliability of the scale comprised of these nine items was .76. The corrected item-total correlations ranged from .09 for the question about whether the tests make their children anxious to .62 for the belief that the tests increase accountability of teachers, schools, and boards.

Comparisons by Grade Level


Of the eight sources of information, 78% of the grade 4 and grade 5 parents had used three or more, compared to only 35% of the grade 3 parents, a difference that is statistically significant, χ²(1, N = 97) = 15.57, p < .001.

An analysis was conducted to evaluate whether a larger proportion of grade 4 and grade 5 parents than grade 3 parents had seen Individual Student Reports. About 15% of the grade 3 parents had seen an Individual Student Report, compared with 48% of the grade 4 and grade 5 parents, χ²(1, N = 97) = 8.43, p < .01. While more parents of grade 4 and 5 students had used most of the other sources of information, the differences were not significant.

Comparisons by First Language


Parents’ first language did not seem to make a significant difference in their knowledge, except in the responses to the question “I know what subjects the tests cover.” To that question, significantly more parents whose first language was not English responded “Yes” than parents whose first language was English, χ²(2, N = 80) = 6.99, p < .05. Nor did parents’ first language make a significant difference in their use of sources of information or in their opinions.

Relations Among Total Scores


Pearson correlation coefficients were computed among the total scores from the three sections. The total information parents had was significantly correlated with the number of sources of information they used (r = .48, p < .001) and with their positive opinions about the assessments (r = .29, p < .001). The better informed parents were, the more positive the opinions they held about the assessments; the more sources of information they used, the more information they had. The correlation between the number of sources of information and positive opinions is .19, which does not quite meet the criterion for statistical significance (p = .053).

Discussion

The school in which this research was conducted is in a large city in Ontario. The majority of the parents were first generation immigrants; this is quite typical for Toronto schools, but would be unusual elsewhere in the province. In addition, the sample for this study was neither random nor representative: although all parents of children in grades 3, 4, and 5 in the selected school were invited to participate, only 25% returned questionnaires. The resulting small sample size also limits the generalizability of the findings.

Bearing these limitations in mind, the study nevertheless provides important information about the sources of information, knowledge, and opinions of parents in this school, and about the relationships among the three. We found that these parents were knowledgeable about and generally supportive of the assessments. They believed that the assessments had helped increase the accountability of teachers, schools, and school boards and thus enhanced their children’s learning. Parents did not seem overly concerned about stress or other possible negative effects of large-scale assessments.

We expected that the more sources of information parents used, the more informed they would be and therefore the more positive their attitudes towards the assessments. Indeed, we found that the number of sources parents used, their knowledge, and positive opinions were positively correlated.

It is interesting that the parents in this community did not give much attention to the media reports on the assessments: fewer than 40% of the parents said they had read, heard, or seen news about the tests in the media, although the major newspapers for the area had published numerous stories about the tests during the preceding year. Fourteen parents either had not used, or were unsure whether they had used, any of the sources of information, and yet all of them expressed opinions about the assessments. While we might speculate that those parents’ opinions resulted from other personal experiences and beliefs, it is clear that these sources of information are not the only influences on parents’ attitudes towards the assessments.

The expectation that the more sources of information parents use, the better informed they are about the assessments is supported by the results. This suggests that jurisdictions with large-scale assessment programs would do well to emphasize providing information to parents if they want parents’ support. In Ontario, the provincial testing agency may need to advertise its website, which has a great deal of detailed information about the assessments. It is not clear why only 13.5% of the parents had visited the website. Parents may not have known of its existence, or many parents may not have had easy access to the Internet. If that were the case, the agency should make sure the information reaches parents in other ways.

It is noteworthy that only about 40% of the parents had seen the sample or released tests and knew the formats of the tests. Given the importance of parental involvement, including helping students with their homework and preparing them for the assessments (Fehrmann, Keith, & Reimers, 1987; Popham, 2000; Thurston, 1989), it would be good for all parents to see the sample or released tests, so that they could become familiar with the formats of the tests and the kinds of material tested. Parents would then be able to give their children more relevant support in their academic studies. If any of the parents’ negative opinions resulted from an assumption that the assessments consist mostly of multiple-choice questions, seeing the samples might also change their attitudes, since multiple-choice questions are only a small part of the assessments.

Most parents did not seem to be concerned about potential negative effects of the assessments on their children. In the written comments, a few parents mentioned that it was good that the assessments encouraged competition. However, other parents believed that encouraging competition was one of the drawbacks of having the provincial assessments.

Conclusion

In spite of the limitations discussed above, this study provides an important overview of what parents in one school knew and believed about the large-scale assessments their children take. In general, parents who knew more about the assessments were more supportive, a finding with clear implications for jurisdictions seeking parents’ support for their assessment programs.

Future research should include a larger sample of parents from a wider variety of backgrounds. It would also be valuable to interview parents to gain more insight into their perspectives on the assessments and the reasons for their opinions.

Table 1
Responses to Sources of Information Questions

Question    Percent Agreeing    Corrected Item-Total Correlation

I’ve read some of the provincial testing agency’s printed materials (e.g., brochures, letters) for parents and/or teachers.    53.8%    .46
I’ve read some newsletters about the tests that are sent by the schools or the school board.    57.7%    .40
I’ve visited the provincial testing agency’s website.    13.5%    .29
I’ve seen a provincial testing agency report card for an individual student.    36.5%    .42
I’ve read, heard, or seen news about the tests from the media.    39.2%    .07
I’ve talked with my child(ren)’s teachers and/or the principal about the tests.    29.4%    .32
I’ve talked with my children and/or other people (such as other parents) about the tests.    51.0%    .45
I have seen sample tests or real tests that were released by the provincial testing agency.    39.4%    .25

Note. Because of missing responses, the number of responses per question varied between 101 and 104.

Table 2
Responses to Knowledge Questions

Question    Percent Agreeing    Corrected Item-Total Correlation

I understand the reason(s) why the tests are administered.    80.8%    .36
I know that test developers take precautions to ensure the fairness of the tests.    65.4%    .38
I know what subjects the tests cover.    71.2%    .50
I know what formats the test items have (e.g., multiple choices, short answers).    58.7%    .47
I am aware that students can be exempted from taking the tests under some conditions.    43.3%    .30
I know what the test results mean (e.g., Level 3 is meeting the provincial standard, and Level 4 is above the provincial standard).    72.8%    .39
I know where I can find the test results of my child(ren)’s school.    48.1%    .51
I can compare the test results of my child(ren)’s school with those of the other schools.    29.8%    .52
I can compare the test results of my child(ren)’s school for this year with those from previous years.    39.4%    .55
I can see my child(ren)’s responses/answers on the tests, once the test results are released (to schools).    47.1%    .47
I can request a review of my child(ren)’s test results if I don’t agree with them.    37.5%    .48

Note. Because of missing responses, the number of responses per question varied between 101 and 104.

Table 3
Responses to Opinion Questions

Question    Disagree/Strongly Disagree    Agree/Strongly Agree    Don’t Know    Corrected Item-Total Correlation

The tests provide accurate assessment for individual students.    10.9%    76.1%    13.0%    .50
The tests provide accurate evaluation for schools and boards.    7.9%    72.1%    20.0%    .53
Having to take the tests makes my child(ren) anxious.    30.0%    54.0%    16.0%    .09
Having my child(ren) take the tests makes me anxious.    50.0%    44.0%    6.0%    .28
The tests help increase accountability of teachers, schools and school boards.    9.9%    78.1%    12.0%    .62
The provincial testing agency should develop and administer tests at every grade level.    21.8%    62.4%    15.8%    .49
The tests are a waste of taxpayers’ money.    75.3%    9.8%    14.9%    .49
The tests help improve my child(ren)’s learning.    8.0%    85.1%    6.9%    .61
Comparison of the test results among schools/boards should be encouraged.    10.9%    65.3%    23.8%    .52

Note. Because of missing responses, the number of responses per question varied between 101 and 104.



References

American Educational Research Association, American Psychological Association, & National Council on Measurement in Education. (1999). Standards for educational and psychological testing. Washington, DC: AERA.

American Federation of Teachers, National Council on Measurement in Education, & National Education Association. (1990). Standards for teacher competence in educational assessment of students. Washington, DC: NCME.

Barksdale-Ladd, M. A., & Thomas, K. F. (2000). What’s at stake in high-stakes testing: Teachers and parents speak out. Journal of Teacher Education, 51, 384-397.

Brookhart, S. M. (1999). Teaching about communicating assessment results and grading. Educational Measurement: Issues and Practice, 18(1), 5-13.

Childs, R. A., Jaciw, A. P., & Schneid, A. (2004). Telling teachers about tests: Education Departments’ uses of the Internet to communicate about large-scale assessments. International Electronic Journal for Leadership in Learning, 9(2). [Available on-line: http://www.ucalgary.ca/~iejll]

Childs, R. A., & Lawson, A. (2003). What do teacher candidates know about large-scale assessments? What should they know? Alberta Journal of Educational Research, 49, 354-367.

Cook, M. (2001, October 16). Setting a standard for student tests. The Ottawa Citizen, p. D3.

Cookson, P. W., Jr., & Halberstam, J. (1998). A parent’s guide to standardized tests in school. New York: Learning Express.

Elam, S. M., Rose, L. C., & Gallup, A. M. (1992). The 24th annual Gallup-Phi Delta Kappan Poll of the public’s attitude toward the public schools. Phi Delta Kappan, 74, 41-53.

Fehrmann, P. G., Keith, T. Z., & Reimers, T. M. (1987). Home influence on school learning: Direct and indirect effects of parent involvement on high school grades. Journal of Educational Research, 80, 330-337.

Grant, S. G. (1999). Teachers and tests: Elementary and secondary school teachers’ perceptions of changes in the New York State testing program. Paper presented at the annual meeting of the National Council for the Social Studies, Orlando, FL.

Harris, J. (2002). What every parent needs to know about standardized tests: How to understand the tests and help your kids score high! New York: McGraw-Hill.

Jamieson, B. (2004, September). State of the teaching profession 2004: Confidence in public education on the rise. Professionally Speaking, 28-37.

Johnson, C., & Johnson, D. (2000). The no-stress guide to the 8th grade FCAT. New York: Kaplan.

Joint Advisory Committee. (1993). Principles for fair student assessment practices for education in Canada. Edmonton, Alberta: Author.

Kohn, A. (2000). The case against standardized testing. Portsmouth, NH: Heinemann.

Koretz, D., & Diebert, E. (1993). Interpretations of National Assessment of Educational Progress (NAEP) anchor points and achievement levels by the print media in 1991. Santa Monica, CA: Rand.

Lindgren, A. (2001, October 19). How politics is put to the test. The Ottawa Citizen, p. D3.

Livingstone, D. W., Hart, D., & Davie, L. E. (2001). Public attitudes towards education in Ontario 2000. The 13th OISE/UT survey (ORBIT Monograph). Toronto: Ontario Institute for Studies in Education of the University of Toronto.

Macleod, I. (2001, October 15). Parents snap up exam guides: Anxiety over Ontario standard tests creates best-sellers. The Ottawa Citizen, pp. A1, A7.

Marso, R. N., & Pigge, F. L. (1993). Teachers’ testing knowledge, skills, and practices. In S. L. Wise (Ed.), Teacher training in measurement and assessment skills (pp. 129-185). Lincoln, NE: Buros Institute of Mental Measurements, University of Nebraska-Lincoln.

Mulroney, C. (2002, May 13). Standardized tests just a tool for ranking schools. Toronto Star, p. A6.

Pennington, M. (2001). Better test-taking skills in 5 minutes a day: Fun activities to boost test scores for kids and parents on the go. Roseville, CA: Prima.

Phelps, R. P. (1998). The demand for standardized student testing. Educational Measurement: Issues and Practice, 17(3), 5-23.

Plake, B. S., Impara, J. C., & Fager, J. J. (1993). Assessment competencies of teachers: A national survey. Educational Measurement: Issues and Practice, 12(4), 10-12, 39.

Popham, W. J. (2000). Testing! Testing! What every parent should know about school tests. Boston, MA: Allyn-Bacon.

Shepard, L. A., & Bliem, C. L. (1995). Parents’ thinking about standardized tests and performance assessments. Educational Researcher, 24(8), 25-32.

Stevenson, D. L., & Baker, D. P. (1987). The family-school relation and the child’s school performance. Child Development, 58, 1348-1357.

Thurston, L. P. (1989). Helping parents tutor their children: A success story. Academic Therapy, 24, 579-587.

Wake, B. (2001a, October 15). The dispute over standardized testing. The Ottawa Citizen, p. B3.

Wake, B. (2001b, October 17). Anger over testing sweeps the U.S. The Ottawa Citizen, p. C3.

Walters, J. (2001, November 30). Glitches plague test scores. Toronto Star, p. A23.

Yeaman, E. J. (1999). Help your child prepare for Ontario grade 3 language tests. Barrie, ON: Benchmark.