UBC Theses and Dissertations

The Northern Saskatchewan Norms Project: Obtaining Northern Lights School Division Norms on Group Tests of Scholastic Achievement and Ability. Tomusiak, Raymond Frank, 1984.


THE NORTHERN SASKATCHEWAN NORMS PROJECT: OBTAINING NORTHERN LIGHTS SCHOOL DIVISION NORMS ON GROUP TESTS OF SCHOLASTIC ACHIEVEMENT AND ABILITY

by

RAYMOND FRANK TOMUSIAK
B.Sc., University of Alberta, 1972
B.Ed., University of Saskatchewan, 1978

A THESIS SUBMITTED IN PARTIAL FULFILMENT OF THE REQUIREMENTS FOR THE DEGREE OF MASTER OF ARTS in THE FACULTY OF GRADUATE STUDIES, Department of Educational Psychology

We accept this thesis as conforming to the required standard

THE UNIVERSITY OF BRITISH COLUMBIA
April 1984
© Raymond Frank Tomusiak, 1984

In presenting this thesis in partial fulfilment of the requirements for an advanced degree at the University of British Columbia, I agree that the Library shall make it freely available for reference and study. I further agree that permission for extensive copying of this thesis for scholarly purposes may be granted by the head of my department or by his or her representatives. It is understood that copying or publication of this thesis for financial gain shall not be allowed without my written permission.

Department of Educational Psychology
The University of British Columbia
1956 Main Mall
Vancouver, Canada
V6T 1Y3
Date: April 1984

Abstract

The purpose of the present study was to develop regional norms on tests of scholastic achievement and ability. The tests considered for norming were the Gates-MacGinitie Reading Tests - Canadian Edition, the Canadian Tests of Basic Skills Mathematics or Mathematics Computation subtests, and the Otis-Lennon School Ability Test. The region of interest was that under the jurisdiction of the Northern Lights School Division (NLSD) in northern Saskatchewan. The rationale for obtaining regional norms was based on the premise that the NLSD population is largely atypical of the national school population with respect to salient characteristics affecting academic performance.
Regional norms accompanying national norms would enhance the interpretation of test scores for educational decision-making by providing another important perspective by which to examine the scores. Furthermore, the dual norming of achievement and ability tests would provide useful diagnostic information about NLSD students, as the normative data would be built upon equivalent scales of measurement. The tests were administered to 70 to 92 percent of students in four grades and four age levels within the Northern Lights School Division. The grades assessed were three, five, seven, and nine; age levels assessed were 9 years 0 months to 9 years 5 months, 11 years 0 months to 11 years 5 months, 14 years 0 months to 14 years 5 months, and 16 years 0 months to 16 years 5 months. The assessment period occurred in the fall of 1982. Statistical tests comparing the means of the NLSD groups to means or medians of the national standardization samples revealed the NLSD groups scored significantly lower than the national samples in all cases. Furthermore, examination of the NLSD test score distributions at points corresponding to the national 10th, 20th, 50th, and 80th percentiles exposed large differences between the NLSD and national distributions. Results of item analysis revealed the tests to be reliable measures of NLSD student performance with good discriminating power. Based on item analysis, the large majority of test items were regarded as having fair to very good values of difficulty and discrimination. Finally, correlational analyses showed strong relationships between achievement scores and ability scores within all groups. As a result of the preceding findings, regional norm tables were constructed for each test for seven of the eight groups. These consisted of percentile ranks, stanines, and T-scores associated with raw scores.
Tables of derived scores were not established for the 16 year old group as the sample number was considered too low for useful percentile information.

Research Supervisor: Dr. W.T. Rogers

Table of Contents

Abstract ii
List of Tables vi
List of Figures vii
Acknowledgement viii
Chapter I: INTRODUCTION 1
Value of Local Versus National Norms 3
Issues Involving the Development and Use of Norms 5
Number of Subjects 5
Representativeness 6
Relevancy 7
Region and Population of Interest in the Present Study 8
The Problem 10
Limitations of the Study 13
Chapter II: REVIEW OF THE LITERATURE 14
Two Canadian Regional Norming Studies 14
Saskatchewan Norming Study 14
Norming of Intelligence Tests in British Columbia 15
Technical Data of the Tests Used in This Study 16
Gates-MacGinitie Reading Tests - Canadian Edition 17
Canadian Tests of Basic Skills 18
Otis-Lennon School Ability Test 19
Chapter III: METHODOLOGY 22
Subjects 22
Tests 24
Testing Preparations and Instructions 30
Test Administration 31
Data Preparation and Processing 32
Data Analysis 33
Comparison of the Regional Test Results with the Results from the National Standardizations 33
Item Characteristics of the Selected Tests 35
Inter-test Correlations 36
Preparation of Regional Norms 37
Chapter IV: RESULTS 39
Rate of Response 39
Demographic Characteristics of the Obtained Samples 41
Grade Results
Item Analysis
Inter-test Correlations 48
Comparison of Regional Test Results to National Standardization Information 50
Percentile Rank Comparisons 51
1. Reading 52
2. Mathematics 53
3. School Ability 54
Statistical Comparisons on Measures of Central Tendency and Variability 55
Age Results 58
Chapter V: SUMMARY AND DISCUSSION 59
Summary 59
Limitations of the Study 63
Applicability of NLSD Norms 64
1. NLSD Representativeness
2. NLSD Population Stability 65
Availability of National Raw Score Data 66
Non-Normality of Test Score Distributions 68
Test Bias 69
Directions for Future Research 71
REFERENCE NOTES 74
REFERENCES 75
APPENDIX A: Test Administrators' Inservice 78
APPENDIX B: Tables of NLSD Norms on Achievement and Ability Tests 107
APPENDIX C: Investigation of the Reading and Mathematics Achievement of Student Absentees 158
APPENDIX D: NLSD Test Result Information by Age 170

List of Tables

1. Levels and Forms of the Achievement and Scholastic Ability Tests Administered to Students in the Four Grade and Four Age Groups in the Northern Lights School Division 27
2. Description of the Number of Students Assessed by Grade and Age in the Northern Lights School Division According to Gender, Affiliation, and School Size 40
3. Summary Item Statistics of Achievement and Scholastic Ability Tests Administered to Students in Four Grade Groups in the Northern Lights School Division 42
4. Test Items with Item Difficulty Levels Below the Chance Level or Item Discrimination Indices Below .20 on Tests of Achievement and Scholastic Ability Administered to Four Grade Groups in the Northern Lights School Division 45
5. Pearson Correlation Coefficients Among Tests Administered to Students in the Northern Lights School Division Within Four Grade Groups 49
6. Percentage of Students by Grade in the Northern Lights School Division Scoring at or Below Selected Percentile Ranks from the National Norm Tables on the Administered Tests 51
7. Results of the One Sample t-Tests, Kolmogorov-Smirnov One Sample Tests, and the Chi-Square Tests for Students Assessed by Grade in the Northern Lights School Division 55

List of Figures

1. Location of Northern Lights School Division Schools in the Province of Saskatchewan 9
2.
Within-Level and Pertinent Out-of-Level Uses of Test Levels Used in This Study 29

Acknowledgement

The author wishes to express his indebtedness to the Northern Lights School Division, the Saskatchewan School Trustees Association, and the Saskatchewan Teachers Federation for their support in funding this research project. An expression of gratitude is made to my research supervisor, Dr. Todd Rogers, for his unending support and his professional expertise in guiding this study. I also wish to thank committee members, Dr. Don Allison and Dr. Bryan Clarke, for their support and valuable advice. Special thanks is also due to Dr. Roy Travis for his generous assistance and advice early in the study. Special appreciation is in order to the many teachers who volunteered their time to serve as test administrators. And to the superintendents, school administrators, classroom teachers, and consultants, recognition of their cooperation and support is made explicit. I am also extremely grateful for the excellent services provided by Sheila Flory in data processing; by Irene Roy, Trudy Duncan and Karen Dvorak as clerical assistants; and by Reid Spencer for his technical assistance in preparing this manuscript to its final form. Finally, a sincere thank you is due to the many friends and acquaintances who offered their encouragement and help in this project in one way or another.

Chapter I
INTRODUCTION

For many years, school age children in various regions of Canada have been compared to reference groups on standardized norm-referenced tests. These reference groups, sometimes referred to as standardization groups or normative samples, have been purported to be representative of national and provincial populations, as well as smaller regional, local or special group populations. But on many of the tests used in Canada, the reference groups have been foreign (quite often American).
The types of reference groups used in individual and group comparisons can have a direct bearing on the power of test score interpretations and the value of consequent educational decisions (Angoff, 1971; Stanley & Hopkins, 1972; Salvia & Ysseldyke, 1981). The view held in the present study is that available norms are of limited usefulness for the population of interest, namely students in the Northern Lights School Division, Saskatchewan, Canada. If it were found that this particular population scored differently from the standardization groups on which national norms were constructed, then a prima facie case would exist for the construction of local norms. Use of these norms would greatly enhance the interpretations of these students' test scores on tests of achievement and ability.

Normative information is often provided in norm tables, where its value lies in giving interpretive meaning to raw test scores. Raw scores are transformed to normative scales such as percentile ranks, standard scores, and age or grade equivalents based on the distribution of raw scores in the standardization sample. Norms should not be confused with standards or goals of performance; rather, they are descriptions of typical performance of a standardization group representing a designated population to which an individual or group score can be compared (Hopkins & Stanley, 1981).

Certain populations, national or otherwise, have been accurately represented on some standardized tests through carefully designed statistical sampling techniques. Similar populations have not been so well represented on other tests, even though these tests may be well-known and frequently used (see Salvia & Ysseldyke (1981) for reviews of the Wide Range Achievement Test, the Slosson Intelligence Test, and the Durrell Analysis of Reading Difficulty).
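The raw-score transformation described above can be made concrete in a few lines. The sketch below uses a small hypothetical norm sample (not data from any test named in this study) to derive midpoint percentile ranks, PR = 100(cb + 0.5f)/N, and linear T-scores of the kind that appear in norm tables; the midpoint formula is one common convention, not necessarily the one used by each publisher.

```python
from collections import Counter

def percentile_ranks(raw_scores):
    """Midpoint percentile rank for each observed raw score:
    PR = 100 * (count below + 0.5 * count at score) / N."""
    n = len(raw_scores)
    freq = Counter(raw_scores)
    table, below = {}, 0
    for score in sorted(freq):
        table[score] = 100 * (below + 0.5 * freq[score]) / n
        below += freq[score]
    return table

def t_score(raw, norm_sample):
    """Linear T-score (mean 50, SD 10) relative to the norm sample."""
    n = len(norm_sample)
    mean = sum(norm_sample) / n
    sd = (sum((x - mean) ** 2 for x in norm_sample) / n) ** 0.5
    return 50 + 10 * (raw - mean) / sd

# Hypothetical norm sample of raw scores
sample = [12, 15, 15, 18, 20, 20, 20, 23, 25, 30]
pr = percentile_ranks(sample)
print(pr[20])                         # 55.0 - a raw score of 20 sits near the median
print(round(t_score(20, sample), 1))  # 50.4
```

Age and grade equivalents require distributions at each age or grade, so they are omitted from this sketch.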
In these cases, the norming samples used have been inadequate with respect to the characteristics of the populations they are purported to represent, or the populations themselves have remained ambiguous and undefined. Under these conditions meaningful interpretations of individual and group test scores can only be made with caution at best; at worst, they may be grossly inaccurate. Furthermore, even if the norm group is truly representative of some clearly defined population, comparisons with certain groups of individuals may be inappropriate or lacking in relevancy for the particular kinds of information required. Examples of this may occur in comparing a Canadian student's test scores to the norming sample of the original Peabody Picture Vocabulary Test, 4000 children and adolescents in Nashville, Tennessee in 1958; or to the norming sample of the Raven's Standard Progressive Matrices, which consisted of all the children living in Colchester, England, in 1943 whose surnames began with the letters A or B (Holmes, 1981).

Value of Local Versus National Norms

Representative national or provincial norms are often useful in a broad perspective. For example, nationally normed tests of achievement and ability provide one important frame of reference to be used to assess the academic or intellectual level of a student with respect to his national peers. This may lead to predictions of where the child is heading academically, especially when in competition with other children on a national basis for the same academic goals or rewards. However, as stated by Angoff (1971), national norms are often too general for specific action. Furthermore, "to the extent that educational practices vary throughout the country, the problem remains that no single set of national norms would be entirely appropriate and applicable in a particular community" (Angoff, 1971, p. 538).
The value of local norms has been expressed elsewhere (Stanley & Hopkins, 1972; American Psychological Association, 1974; MacGinitie, 1980; Otis & Lennon, 1979b; Salvia & Ysseldyke, 1981). On the one hand, they may be of particular value to a school division when current decisions are necessary on such matters as student programming and placement. In these cases the student's test scores are compared to those of his own group. Angoff (1971) stated that these norms have the advantage of being based on homogeneous groups of individuals who have characteristics that are familiar and meaningful to the user. Local norms are particularly valuable if a local group is quite different from the population at large. These differences between the two groups show up in the distributions of the test scores as significant deviations in central tendency or variability. The differences are produced by an atypical proportion of certain characteristics in the local group which systematically affect the test scores. Otis and Lennon (1979b) asserted that local norms are especially desirable for atypical groups because they ensure that local curricular emphasis and the characteristics of the local student population receive proper consideration. Basically, the value of local norms lies in improving the interpretation of test scores for decision making within the regional or local setting.

Issues Involving the Development and Use of Norms

Salvia and Ysseldyke (1981) raised three important issues concerning norms which relate directly to the rationale behind the purpose of this study and the methods used to implement it. These are the number of subjects, representativeness, and relevancy.

Number of Subjects

In any norming study, the number of subjects used in each distinct norming group, be it by grade or age, has implications for the degree of confidence held in the resultant test norms.
Ghiselli (1964) has stated that in order to maintain norm stability the norming group must not be too small; otherwise marked sampling fluctuations in the norms could occur. This problem is of particular concern when constructing norms for a local population. Some populations are of such a small size that an inadequate number of subjects is available to construct tables of norms for use across time. Previous years' norms may be inaccurate and inappropriate for use in subsequent years due to the instability created by the small size of the norm sample.

Salvia and Ysseldyke (1981) drew attention to another matter of consequence resulting from norm group size. In order to reduce interpolations and extrapolations in norm tables, a sufficient number of subjects is required. Small normative samples may create large gaps in the norm tables, as many test score values may not be observed and therefore have no actual corresponding transformed score other than through interpolation or extrapolation. Salvia and Ysseldyke stated that a minimum of 100 subjects is necessary in each norming group. They explained that this is the minimum number with which a full range of percentiles can be computed, and standard scores computed between ±2.3 standard deviations of the mean without extrapolation, in a normally distributed set of scores.

Representativeness

If a population group is complex, such as a national population, then stratifying the population according to key demographic variables to obtain a representative sample is often necessary. If members from each stratum are to be proportionately represented, and norms with acceptable levels of precision established, then a large number of subjects is required in each stratification cell. However, Hopkins and Stanley (1981) have asserted that the size of the norming group is much less critical than the degree to which it is representative of a relevant population.
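Salvia and Ysseldyke's minimum of 100 subjects connects to the ±2.3 figure through the normal curve: when each of 100 examinees occupies one percentile point, the most extreme ranks assignable without extrapolation are 1 and 99, and their standard-score equivalents are about ±2.33. This can be checked with the Python standard library:

```python
from statistics import NormalDist

nd = NormalDist()  # standard normal: mean 0, SD 1

# With 100 subjects, the extreme percentile ranks computable without
# extrapolation are 1 and 99; their standard-score (z) equivalents:
z_low = nd.inv_cdf(0.01)
z_high = nd.inv_cdf(0.99)
print(round(z_low, 2), round(z_high, 2))  # -2.33 2.33
```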
Scores compared to a representative group can be interpreted with greater confidence than scores compared to a large sample of uncertain representativeness, with its associated undetermined amount of error due to sampling bias. The basic concern here is whether or not characteristics of the norm sample which systematically affect test scores are present in the same proportion as in the population it is purported to represent. In a large heterogeneous population, such as at the national level, the population is stratified according to the demographic variables believed most influential. Traditional variables used to stratify these large populations include geographic region, ethnicity, urban-rural residence, socio-economic status, size of school or school system, and gender. These demographic variables probably form a complex intercorrelated network with other possible correlates of school success such as educational opportunities and experiences, motivation and attitude towards school learning, first language competence, pre-school readiness, and social-emotional well-being of the student.

Relevancy

The issue of relevancy as it concerns the use of norms ultimately involves the purpose for which the tests are to be used. Assuming norms have been derived for tests which possess adequate psychometric qualities of reliability and validity, the norms still may not provide for useful or meaningful interpretations even though these norms are stable and representative of a particular population. This is true when the comparative group fails to provide a perspective on individual or group performance which is deemed necessary in order to make certain decisions, such as those concerning programming and placement. Norm relevancy is often discussed in a context of comparing two types of norming groups, such as national versus local norms. Thus both types of norm groups can be relevant to a particular group, each adding a different dimension to the picture.
As Angoff (1971) stated, "the availability of a single norm table tends to obscure the fact that a percentile rank is not unique but represents only one of many possible evaluations of a test score" (p. 536). Perhaps one relevant feature of local norms is that together with national norms, they can lead to more comprehensive evaluations by skilled personnel, thereby leading to decisions more likely to produce successful outcomes.

Region and Population of Interest in the Present Study

The setting for this study was the northern part of Saskatchewan, specifically the Northern Administrative District of the province. This region consists of over 100,000 square miles of vast, sparsely inhabited tracts of forest and lakes. Communities tend to be relatively small and far apart, ranging in size from less than 100 to approximately 3,000 residents. Many communities are isolated to the extent that they are accessible only by air or winter road. The great majority of northern residents are of Native ancestry. Employment is mainly in mining and forestry and in the government and service sectors. Traditional occupations such as trapping and commercial fishing occupy a small percentage of the total labour force. Presently, the unemployment rate among northern residents is approximately 40 percent (Note 1). Close to 70 percent of the total northern Saskatchewan school population attend provincial schools, while the remaining 30 percent attend federally funded schools. Of particular interest to this study are students within the Northern Lights School Division, who account for about 80 percent of all students attending provincial schools in the Northern Administrative District and nearly 60 percent of the entire northern Saskatchewan school population (see Figure 1).

Figure 1. Location of Northern Lights School Division schools in the province of Saskatchewan.
Presently there are 30 schools within the Northern Lights School Division. School enrollments range from approximately 10 to 600 students. Grade ranges vary from kindergarten only to kindergarten through grade 12 inclusive. Approximately 90 percent of the almost 4500 students within the Northern Lights School Division are Native, with the large majority designated as Non-Status or Metis aboriginal people.

The Problem

A major problem which has confronted assessment personnel such as educational psychologists in the Northern Lights School Division is centered upon the interpretation of northern students' test scores on nationally and/or provincially normed, standardized tests of achievement and ability. The available norms fail to give a complete picture of the direction to be taken for accurate and realistic programming and placement of students within the region or school division. This is the case when one realizes that, using available norms, northern children's test scores are compared to standardization or norm groups which represent the national or provincial situation, but which fail to adequately represent the population to which these children belong and in which educational decisions regarding programming and placement are made.

Thus, the problem addressed in the present study was to compare the performance of Northern Lights School Division students with the performance of the national standardization samples. It was hypothesized that differences in test scores obtained from standardized achievement and school ability tests exist between the Northern Lights School Division student population and the national school population. If the hypothesis stated above was confirmed, the intent of the study was then to be directed towards the construction of regional norms. Regional norms accompanied by national norms would aid greatly in the interpretation of test scores and assist in decisions regarding further diagnosis, placement, and programming within the NLSD.
The acquisition of regional norms would help serve the following specific purposes:

1) Regional norms would provide for comparisons of individuals within a relatively homogeneous group, that is, with similar experiences and background. This has implications with respect to how a child is developing in terms of other children from a similar setting.

2) Regional norms would be used for screening potential special needs students. These students are identified by their extreme percentile rank scores in the regional population, or by significant differences detected in their achievement scores compared to their higher ability scores normed on the same sample from the regional population. This procedure of dual norming achievement and ability tests is especially desirable as it ensures that discrepancies between the two types of tests are based on comparable scales resulting from a common sample. This information will lead to decisions regarding further diagnosis and special programming.

3) Along with other important sources of information, regional norms would be used to help determine the appropriate group or classroom environment in which the student would work for maximum educational benefit. Whether a homogeneous or a heterogeneous grouping scheme is desired, the regional norms should aid in deciding upon general placement arrangements. Used in conjunction with national norm figures, these norms should give an indication of the levels the students are at, and thus allow for realistic expectations despite grade placement.

4) Regional norms could be used to assess trends or changes in groups of students on achievement and ability tests over months and years in relation to their own increased skills and abilities, and relative to the 1982 regional norming group.

Limitations of the Study

This study was limited to school age children in four grade and age groups attending schools operated by the Northern Lights School Division.
Consequently, the regional norms provided in this study apply only to students in these age and grade levels attending schools under the jurisdiction of the Northern Lights School Division.

Chapter II
REVIEW OF THE LITERATURE

The literature reviewed in this chapter is organized into two sections. In the first, two recent Canadian norming studies which gave major impetus for the present study are examined. In both studies the case for relevant regional (provincial) norms was made, which subsequently led to the construction of provincial norm tables. The second section contains reviews of three tests considered in the present study. Particular emphasis is placed on descriptions of the characteristics of the standardization samples and the psychometric properties of these tests.

Two Canadian Regional Norming Studies

Saskatchewan Norming Study

Randhawa (1979a) conducted the 1978 Provincial Testing Program, which led to the production of Saskatchewan provincial norms that "should provide a meaningful and realistic picture when a score of a pupil or an average of a school is compared with the representative Saskatchewan group of pupils or schools" (Randhawa, 1979b, p. 2). A ten percent random sample of classrooms by school jurisdiction in Saskatchewan in each of grades four, seven, and ten was selected for testing on achievement and ability tests. The tests chosen for grades four and seven students were appropriate levels of the Vocabulary, Reading, Language, and Mathematics subtests on the Canadian Tests of Basic Skills (CTBS), Form 4M; and the Otis-Lennon Mental Ability Test, Form J. Grade ten students were administered Reading, Mechanics of Writing, English Expression, Mathematics Computation, and Mathematics Basic Concepts on the Sequential Tests of Educational Progress (STEP), Series II, Form 2A; and the Otis-Lennon Mental Ability Test, Form J, Advanced Level.
Results of statistical comparisons between the performance of the Saskatchewan sample and the Canadian or American standardization samples indicated that on almost all tests, means or variances were significantly different. The Saskatchewan sample showed higher mean scores on the achievement and ability tests and, generally, smaller variance. In providing the rationale for the use of Saskatchewan norms, Randhawa (1979b) stated that test interpretations could then be based in terms of Saskatchewan's own unique situation.

Norming of Intelligence Tests in British Columbia

Holmes (1981) assessed a large, representative British Columbia sample on five individually administered intelligence tests.¹ As part of an equating study, Holmes also wished to determine the relevancy of existing norms on these tests for use in British Columbia. The tests Holmes administered and analyzed were the Wechsler Intelligence Scale for Children - Revised (WISC-R), the Peabody Picture Vocabulary Test (PPVT), the Slosson Intelligence Test (SIT), the Standard Progressive Matrices (SPM), and the Mill Hill Vocabulary Scale (MHVS). She compared the B.C. sample, consisting of children aged seven and a half, nine and a half, and eleven and a half, to the published standardization samples for each of the five tests. In most cases, significantly higher test score means and lower variances for the B.C. sample were found on statistical tests of central tendency and variability. Based on her findings, Holmes constructed current and relevant British Columbia provincial norms for use in that province.

¹ Holmes delimited her study to exclude Native Indians, children not fluent in English, the physically handicapped, the emotionally disturbed, and the trainable mentally handicapped.

Technical Data of the Tests Used in This Study

Two tests of academic achievement and one scholastic ability test were administered in this study.
Both achievement tests, the Gates-MacGinitie Reading Tests - Canadian Edition (MacGinitie, 1979) and the Mathematics or Mathematics Computation subtests of the Canadian Tests of Basic Skills (King, 1981), were normed on a Canadian national population. The Otis-Lennon School Ability Test (Otis & Lennon, 1979a) was normed on a representative American population.

Gates-MacGinitie Reading Tests - Canadian Edition

The Gates-MacGinitie Reading Tests - Canadian Edition are based on the American edition of the test published in 1978. The Canadian edition was empirically normed on a national sample of 46,000 students in the fall of 1978 on Form 1 of Basic R (Readiness) and Levels B through F. Level A, Form 2 was normed in the winter of 1979. Between 3000 and 4500 students at each level from kindergarten to grade twelve were tested. The norming group included only students attending schools in which most of the instruction was given in English. The provinces and territories were proportionately represented on the basis of their total school enrollment. In general, each of the above regions was stratified by size of population in urban-rural areas and by the type of school board (public or separate). Students were tested from school boards which had been randomly selected from the defined strata. Comparisons of the Canadian score distributions with grade-equivalent United States score distributions from the 1977-78 US standardization, together with the results of equating studies conducted in conjunction with norming the tests in the United States, were used to complete Canadian Fall, Midyear, and Spring norms on all levels and forms of the tests (MacGinitie, 1980).

MacGinitie (1980) reported internal consistency reliability coefficients (K-R 20) for the various levels of the Gates-MacGinitie Reading Tests to range from .85 to .94. Indications were given that the tests are power tests; thus the tests should not demonstrate inflated internal consistency reliability coefficients due to varying response rates of students. Evidence for test validity included careful selection of items based on special studies of words found in commonly used reading series and lists of words frequently used in school reading material; the inclusion of items to suit the knowledge and wide range of interests in various subject matter of children; and the use of items of appropriate difficulty and discrimination. The initial item pool was examined by a group of Canadian educators for item suitability, and several items for the final test forms were modified or rejected based on their recommendations. Furthermore, items were screened by minority consultants for offensive and inappropriate material. A major consideration in the selection of test items was that the content should be within the experience of most students of diverse cultural background, regional settings, and different sex (MacGinitie, 1980).

Canadian Tests of Basic Skills

The Canadian Tests of Basic Skills (CTBS) are adapted from the 1978 edition of the Iowa Tests of Basic Skills. A national sample of approximately 3000 students per grade (K to 12) was used to empirically norm Form 5 of the Canadian Tests of Basic Skills in the fall of 1980. Only students attending schools where English was the major language of instruction were included in the norming sample. Generally, schools were stratified according to province and size of school and then
Indications were given that the tests are power tests; thus the tests should 18 not demonstrate inflated internal consistency reliability coefficients due to varying response rates of students. Evidence for test validity included careful selection of items based on special studies of words found in commonly used reading series and lists of words frequently used in school reading material; the inclusion of items to suit the knowledge and wide range of interests in various subject matter of children; and the use of items of appropriate difficulty and discrimination. The initial item pool was examined by a group of Canadian educators for item suitability, and several items for the final test forms were modified or rejected based on their recommendations. Furthermore, items were screened by minority consultants for offensive and inappropriate material. A major consideration in the selection of test items was that the content should be within the experience of most students of diverse cultural background, regional settings, and different sex (MacGinitie, 1980). Canadian Tests of Basic Skills The Canadian Tests of Basic Skills (CTBS) are adapted from the 1978 edition of the Iowa Test of Basic Skills. A national sample of approximately 3000 students per grade (K to 12) was used to empirically norm Form 5 of the Canadian Tests of Basic Skills in the fall of 1980. Only students attending schools where English was the major language of instruction were included in the norming sample. Generally, schools were stratified according to province and size of school and then 19 randomly sampled and weighted to obtain a final weighted sample. Raw score equivalence of Forms 5 and 6 were made in a separate study, and different levels and forms of the test were tied together through grade equivalents or expanded standard scores (Note 2) . Based on the 1980 Fall standardization, K-R 20 reliability coefficients for the Math Computation subtest ranged from .83 to .88 (Note 2). 
However, no indication was given whether or not speededness may have been a factor on this subtest. The case for content validity was presented through mention that the test was constructed to reflect currently accepted curriculum practices in Canada. The items were systematically reviewed by Canadian educators and curriculum specialists, and the placement and content of the test items reflect the contributions made by these consultants. All items were also evaluated by representatives of various cultural groups (King, 1982).

Otis-Lennon School Ability Test

The Otis-Lennon School Ability Test was standardized on 130,000 students within 70 American school systems in the fall of 1977. School systems were stratified according to geographic region, public school system size, and socio-economic characteristics. The socio-economic index was formed from a composite of median family income and the percentage of adults in the family with high school graduation. The sample was also shown to correspond proportionately to the United States population broken down into four ethnic groups. Native groups were included in the two percent of the sample designated 'Other' (Otis & Lennon, 1979b). The authors of the Otis-Lennon School Ability Test reported both internal consistency and test-retest reliability data. All K-R 20 reliability coefficients were above .90, while test-retest reliability estimates ranged from .84 to .92. The test was constructed to yield a power measure of school ability, minimally influenced by speed of work (Otis & Lennon, 1979b). Both predictive and concurrent validity studies were reported by the authors of the Otis-Lennon School Ability Test. Correlations between Fall test scores and end-of-year teacher-assigned grades, as well as correlations of this test with a measure of achievement, revealed that the test was measuring the behaviors for which it was designed.
Items were examined to meet rigorous specifications with respect to content, difficulty, and discrimination. During test development, the items were separately analyzed for ethnic bias using a modified chi-square analysis procedure; items showing a low probability of being unbiased were eliminated (Burrill & Wilson, 1980). As well, items were reviewed by an advisory panel of minority educators for appropriateness and possible ethnic bias. All items were also reviewed for sex bias, and balanced sex references were used in the test. According to the authors, "Regardless of prior exposure and of variations in motivation, Otis-Lennon scores for examinees at any given time reflect the examinees' status with respect to those abilities that make for school learning success. It is a virtue of the Otis-Lennon tests that they capture the consequences of environmental limitations that affect students' ability to master schoolwork at the time of testing" (Otis & Lennon, 1979b, p. 4). The need for and value of regional norms on scholastic tests expressed in the above two studies and elsewhere led to the development and implementation of the present norming study. The three tests selected met acceptable standards of test standardization, reliability, and content validity.

Chapter III

METHODOLOGY

The principal objective of this study was to construct regional grade and age norms based on statistical and psychometric analysis of test data collected from a northern Saskatchewan school population. Briefly, tests of achievement and scholastic ability were administered to four grade and four age groups throughout schools in the Northern Lights School Division. Completed test data were returned from each school to be checked, coded, scored, and analyzed. Following data analysis, tables of norms were produced to facilitate the interpretation of test scores. Specific information pertaining to the methods utilized in this study is given under separate headings below.
Subjects

The subjects consisted of students from northern Saskatchewan enrolled in the Northern Lights School Division. Four grades and four age groups were considered. The grade levels were three, five, seven, and nine; the age groups were 9 years 0 months to 9 years 5 months, 11 years 0 months to 11 years 5 months, 14 years 0 months to 14 years 5 months, and 16 years 0 months to 16 years 5 months. Initially a plan was conceived to administer tests to all students enrolled in Year 1 in each of Divisions I, II, III, and IV.2 However, due to the level of difficulty of the tests, and the large number of children learning English as a second language or requiring an extra year of academic readiness, grade three was concluded to be a better time at which to begin norm-referenced testing. Similarly, upon consultation with teachers and consultants in the School Division regarding the Division II tests, it was generally agreed that the level of difficulty of the tests was more suited to grade five students in the Division. Grade nine, the last year of Division III, was chosen because of the small number of students enrolled in first year Division IV programs. The four age groups were selected on the basis of the likely typical age of children attending each of the above selected grades in the School Division. Obtaining age norms was thought to be important in providing yet another channel for comparative assessment of students in the School Division. Rather than testing a sample selected from the population of interest, it was decided that all subjects in the eight respective grade and age groups would be assessed.
Utilizing a random sampling technique to obtain a representative sample suitable for norming from a population which itself is small and scattered across a large area would save little in terms of cost and time, and would create difficult organizational problems in test administration. Also, testing the entire population benefits the School Division by providing individual and school performance information on all of its students in the designated test groups.

2 In Saskatchewan, Division I refers to grades one, two and three; Division II - grades four, five, and six; Division III - grades seven, eight and nine; and Division IV - grades ten, eleven and twelve. Year 1 refers to grades one, four, seven, and ten in each respective Division.

The September Class List of Enrolled Students sent to the Division central office by each school administrator early in the school year was used to develop the lists of students to be assessed in the eight grade and age categories. In the few instances where the school enrollment lists were delayed in being sent to the central office, provisions were made to enable school staff to determine the proper students to assess. Student names were later verified by the author once the enrollment forms were received from these schools.

Tests

A search was made for suitable survey tests which would accurately reflect the present status in school achievement and ability of students attending schools operated by the Northern Lights School Division. The following criteria were used to identify and subsequently select the tests:

A. Basic Criteria:

1) The tests must be frequently used or well known by test administrators or school staff in Saskatchewan.

2) The tests must be psychometrically sound, that is, possess adequate characteristics of reliability and validity, and exhibit representative national norms.

3) The tests must contain content largely appropriate or relevant to the Northern Lights School Division.
4) The tests must be suitable for group administration.

5) The total test administration time should not exceed 200 minutes - a maximum of one full day's testing with adequate breaks.

6) The tests should not be overly complex with regard to format and style (i.e., long and figurative reading passages; math skills relying largely on reading comprehension or English language skills; ability testing in which the student is dependent mainly upon being able to read the directions).

B. Secondary Criterion:

1) The tests meeting the basic criteria should already possess Canadian norms.

The Gates-MacGinitie Reading Tests - Canadian Edition (MacGinitie, 1979) matched all the criteria and were chosen to measure achievement in reading in all grades and at all age levels. The Mathematics Computation subtest of the Canadian Tests of Basic Skills (CTBS) (King, 1981) also met all the criteria, and was chosen as an appropriate indicator of mathematics performance for grades three, five, and seven, and the three younger age groups. The Mathematics subtest of the CTBS chosen for the grade nines and 16 year olds extended the math skills to include math concepts and problem solving as well as computation. The Otis-Lennon School Ability Test (OLSAT) (Otis & Lennon, 1979a) was selected as the measure of scholastic ability or readiness for a particular level of academic work. While not conforming to the secondary criterion, the OLSAT was nonetheless chosen, as no other comparable Canadian-normed test of school learning ability offered the same shortened time frame required for test administration. Upon selection of these tests, the content of each was critically evaluated further by a selected group of teachers and consultants with previous working experience within the School Division. Generally the tests were well received by the seven reviewers.
It was as a result of the evaluators' comments pertaining to the degree of difficulty of some test levels, and their concerns about the complexity of vocabulary in some levels, that the Division I and II tests, initially identified for grade two and grade four students, were administered to grade three and grade five students instead. The final selection of levels and forms of the achievement tests and the ability test is shown in Table 1. A particular level and form of each test was matched with the appropriate grade as well as an age level believed to be fairly typical for that grade in the School Division.

Table 1
Levels and Forms of the Achievement and Scholastic Ability Tests Administered to Students in the Four Grade and Four Age Groups in the Northern Lights School Division

                                  Grade 3 and   Grade 5 and   Grade 7 and   Grade 9 and
TESTS                             9yr 0mo to    11yr 0mo to   14yr 0mo to   16yr 0mo to
                                  9yr 5mo       11yr 5mo      14yr 5mo      16yr 5mo

Gates-MacGinitie Reading Tests    Level B       Level D       Level D       Level E
- Canadian Edition (1979)         Form 1        Form 1        Form 2        Form 2
  a. Vocabulary
  b. Comprehension

Canadian Tests of Basic
Skills (1981)
  a. Math. Computation            Level 7       Level 10      Level 13
     (levels 7, 10, 13)           Form 5        Form 6        Form 6
  b. Mathematics                                                            Level 15
     (level 15 only)                                                        Form 5

Otis-Lennon School                Primary II    Elementary    Intermediate  Advanced
Ability Test (1979)               Form R        Form R        Form R        Form R

Therefore, children in grade three and children aged from 9 years 0 months to 9 years 5 months were administered the same levels and forms of each respective test. Grade five children were given the same test levels and forms as children aged 11 years 0 months to 11 years 5 months; similarly, grade seven students and students aged 14 years 0 months to 14 years 5 months were given the same tests, as were grade nine students and students aged 16 years 0 months to 16 years 5 months. This created a situation in which some students were scheduled to be tested twice on different test levels.
For example, some children aged between 9 years 0 months and 9 years 5 months were in grade five; thus they were administered the Gates-MacGinitie Reading Tests, Level B, Form 1, during one testing session, and the Gates-MacGinitie Reading Tests, Level D, Form 1, during another testing session. Of the test levels chosen to assess students in the selected grade and age categories, four were administered out-of-level with respect to the usual or intended administration range. However, wherever this occurred, test levels did contain the appropriate out-of-level national norms. Figure 2 shows the commonly intended ranges and the pertinent out-of-level uses of the test levels used in this study.

[Figure 2, a grade-by-grade chart, appeared here; its key marked X for the intended or usual assessment range of each level and O for out-of-level use in this study. The chart noted that Gates-MacGinitie Level E and CTBS Level 15 carry no national age norms, and that the national age norms for the Otis-Lennon levels are all within-level with respect to the test levels used to assess the age groups in this study. Figure 2. Within-level and pertinent out-of-level uses of test levels used in this study.]

It can be seen that the grade three students were administered out-of-level reading and math tests. Grade five students were administered an out-of-level math test, while grade seven students were administered an out-of-level reading test. It was expected that students would have somewhat less difficulty with these out-of-level tests. On the other hand, more difficulty could be expected for students taking within-level tests, especially those where the student's
grade occurred at the beginning of the intended grade range (for example, grade nine students taking the Advanced level of the OLSAT).

Testing Preparations and Instructions

Once student lists for each school were completed, test booklets and answer sheets were counted and packaged for each school based on the number of students in each list. Where class enrollment lists were delayed in being sent out from certain schools, an estimate of the number of students to be assessed in each category was made based on the previous year's enrollment. The corresponding number of tests was then packaged and prepared for delivery. All test administrators, consisting of teachers from Division schools, attended a half-day inservice on October 14, 1983. The purpose of this inservice was to help familiarize the testers with the tests, the testing schedule, and the test administration procedures. The test instructions provided to the testers and the answer sheets to be used by the students were carefully reviewed. Test administrators were also provided with the student lists (if available) for their schools, the test booklets, and the answer sheets. In the few cases where student lists were not available due to class enrollment lists not being received prior to the inservice, the testers were provided with information to determine which students to test. Particulars concerning the Test Administrators' Inservice are available in Appendix A.

Test Administration

With two exceptions, test administration was conducted by school staff between the third week of October and the end of November, 1983. In order for the author to acquaint himself directly with any complexities involved in the test administration for the purpose of presentation at the inservice, he administered tests to students enrolled in one of the Division's small schools in mid-September, and later that month, to students in a larger school.
Originally it was hoped that testing could be completed in a two-week period in October; however, this proved difficult considering each test administrator's own unique teaching schedule and each school's particular organization. For similar reasons, combinations and variations of the two daily test schedules presented at the Inservice (see Appendix A) were accommodated to meet individual school needs. The presented test schedules called for a particular order and time of test administration, either to be completed in one day, or over a four-day period. In some cases different times of the day than those specified were used; in other cases, a different ordering of test presentation was utilized. During the fall testing period, the author frequently corresponded with test personnel from the schools to determine how the testing was proceeding, to answer concerns, and to attempt to solve any problems which arose. The fall testing allowed the students at least three weeks' time to settle into their new year's work. Fall assessment had the advantage of making it possible to provide student norms early in the school year for a formative evaluation of the students' progress. All students except those in grade three and those students aged 9 years 0 months to 9 years 5 months taking Division I test levels coded their test answers on NCS F4521 Answer Sheets. Grade three students and the 9 year olds recorded their answers directly in the test booklets. Completed test data were returned to the central district office from each school as soon as possible after the completion of testing. As the materials were received from the schools, a letter of acknowledgement was forwarded back to the schools indicating that the materials had been received.

Data Preparation and Processing

Clerical assistants coded the grade three and nine year old answers from the returned test booklets onto answer sheets.
As well, each answer sheet coded by the older students was carefully checked to make sure answer spaces were properly filled in, pencil marks were reasonably dark, and stray pencil marks removed. The identification information, including name, age, grade, and test code number, was checked for accuracy against the prepared student lists. The remaining values associated with the descriptive variables of group, student and school identification, size of school, and affiliation were then coded on the answer sheets in final preparation for machine scoring. The raw data on the answer sheets were scored and processed using the NCS Scanner and computer facilities at the University of Saskatchewan. Eight different data sets corresponding to each grade and age group were created and stored on magnetic tape in eight sequential files.

Data Analysis

Comparison of the Regional Test Results with the Results from the National Standardizations

The raw test score means for each group were compared to the data supplied by the test publishers and authors of the national norming studies. The one-sample t-test was used to test for differences between the regional sample means and the mean or median values of the national standardizations. The formula that was followed is given as

t = (X̄ - a) / (s / √n)     (Glass & Stanley, 1970, p. 293)

where X̄ is the regional sample mean found in this study, a is a value designating the national mean or median, s is the standard deviation of the regional sample, and n is the regional sample size. The one-sample t-test allows for a probabilistic statement regarding the degree of confidence which can be made about the equality of a regional population mean (estimated by X̄) compared to a value (a) depicting a national mean or median. Only those group test results which differed significantly from national standards were considered in the construction of regional norms. Statistical significance was set at the .05 level for all comparisons.
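The one-sample comparison above can be sketched in a few lines of code. This is a minimal modern illustration of the formula with invented scores (the original computations were done with SPSS and locally developed programs); the function name is ours.

```python
import math

def one_sample_t(regional_scores, national_value):
    """t = (xbar - a) / (s / sqrt(n)), as in Glass & Stanley (1970, p. 293).

    regional_scores: raw scores for one regional group
    national_value:  the national mean (or median, where no mean was published)
    """
    n = len(regional_scores)
    xbar = sum(regional_scores) / n
    # Sample standard deviation with n - 1 in the denominator.
    s = math.sqrt(sum((x - xbar) ** 2 for x in regional_scores) / (n - 1))
    return (xbar - national_value) / (s / math.sqrt(n))

# Illustrative scores only; a negative t indicates a regional mean
# below the national value.
t = one_sample_t([12, 15, 14, 10, 13, 16, 11, 14], 15.0)
```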
Reported national mean values were used whenever possible for comparative purposes. However, as national norms for the groups of interest were not empirically derived for several of the test forms or levels used in this study, national raw score mean values were not always available. For these tests, the median values obtained from the published norm tables were substituted for the unavailable means. The median was deemed a suitable and close approximation to the mean, as it was assumed that the national distributions tended towards normality. The χ² statistic was used to compare the equality of the regional sample variances with the corresponding national variances, whenever the latter information was available. The formula used was

χ² = (n - 1)s² / b     (Glass & Stanley, 1970, p. 301)

where s² is the regional sample variance, b is the national variance, and n is the regional sample size. As the condition of normality is an important consideration before making this comparative test, the Kolmogorov-Smirnov One Sample Test was used to determine whether or not the regional test data could reasonably have come from a normal distribution (Glass & Stanley, 1970; Hull & Nie, 1981). The .20 level of significance was used in this case to protect against committing a Type II error. The Type II error, failure to reject the hypothesis of normality when it is false in actuality, was deemed the more serious error in this case. The level of significance set for the χ² statistic was .05. In addition to making statistical comparisons on measures of central tendency and variability, the NLSD test data were compared to the national test data at selected percentile points. The percentile points chosen represented the national 10th, 20th, 50th, and 80th percentiles. Regional percentile ranks corresponding to the selected national percentile points were recorded and compared.
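The variance comparison can be sketched the same way. Again this is an illustration with invented numbers, not the original analysis; the resulting statistic would be referred to a χ² table on n - 1 degrees of freedom.

```python
def variance_chi_square(regional_scores, national_variance):
    """chi2 = (n - 1) * s^2 / b, as in Glass & Stanley (1970, p. 301).

    national_variance is b, the published national variance; the
    statistic is evaluated on n - 1 degrees of freedom.
    """
    n = len(regional_scores)
    xbar = sum(regional_scores) / n
    s2 = sum((x - xbar) ** 2 for x in regional_scores) / (n - 1)  # sample variance
    return (n - 1) * s2 / national_variance

# Illustrative data: the sample variance of [1, 2, 3, 4, 5] is 2.5, so a
# national variance of 2.5 gives chi2 = n - 1 = 4.
chi2 = variance_chi_square([1, 2, 3, 4, 5], 2.5)
```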
Performing this kind of comparison permitted an assessment of the degree of difference between NLSD and national results at different points along the test distributions.

Item Characteristics of the Selected Tests

Item analysis procedures were used to help determine the usefulness or suitability of each test for assessment purposes in the Northern Lights School Division. The average item difficulty (p) and average item discrimination (D) (point-biserial correlation) of each test were computed, and individual test items were investigated for low indices of item difficulty and item discrimination. Items with indices of discrimination below .20 were recorded; Ebel (Hopkins & Stanley, 1981) considered items with indices of discrimination below .10 as being poor and subject to rejection or revision, while items with discrimination indices between .10 and .19 were considered marginal, usually requiring some improvement. Items showing difficulty values at or below the chance values, assuming all options are equally attractive, were also recorded. The chance value for each test, and therefore each item, was derived by c = 1/A, where A is the number of options offered for each test item (Hopkins & Stanley, 1981). Closely tied in with each test's item difficulty and item discrimination indices is its internal consistency reliability coefficient. Because of the potential value these tests hold for screening purposes, a K-R 20 reliability coefficient of .80 was regarded as the minimum before preparation of regional norm tables would be considered (Nunnally, 1970; Salvia & Ysseldyke, 1981).

Inter-test Correlations

Pearson product-moment correlations were calculated between pairs of all possible test combinations within each group. Of particular interest was the degree of relationship shown between the achievement test results and the school ability test results.
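The item indices described above can be sketched as follows. This is a hypothetical illustration, not the item analysis program used at the University of Saskatchewan; the point-biserial expression is the standard formula for scoring an item against the total score.

```python
import math
import statistics

def item_statistics(responses, n_options):
    """Per-item difficulty, point-biserial discrimination, and a flag.

    responses: one list of 0/1 item scores per examinee
    n_options: number of answer choices A, giving chance level c = 1/A

    An item is flagged when its difficulty is at or below chance or its
    discrimination is below .20, the recording criteria used in this study.
    """
    totals = [sum(r) for r in responses]
    mean_t = statistics.mean(totals)
    sd_t = statistics.pstdev(totals)
    chance = 1 / n_options
    results = []
    for i in range(len(responses[0])):
        item = [r[i] for r in responses]
        p = sum(item) / len(item)  # item difficulty: proportion correct
        if 0 < p < 1 and sd_t > 0:
            # Point-biserial: (M_pass - M_all) / SD_all * sqrt(p / (1 - p))
            m_pass = statistics.mean(t for t, x in zip(totals, item) if x)
            d = (m_pass - mean_t) / sd_t * math.sqrt(p / (1 - p))
        else:
            d = 0.0
        results.append((p, d, p <= chance or d < .20))
    return results

# Four examinees, two four-choice items (toy data).
stats = item_statistics([[1, 1], [1, 0], [0, 0], [1, 1]], n_options=4)
```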
Preparation of Regional Norms

Regional norm tables were prepared for seven of the eight grade and age groups on all tests, as the statistical and psychometric conditions mentioned in the previous subsections were met. Norm tables were not established for the 16 year old group, as the number of cases taking each test was too low - below a predetermined cut-off set at 100 students (Salvia & Ysseldyke, 1981). Separate norm tables are provided in Appendix B for the Vocabulary Subtest, Comprehension Subtest, and the total test of the Gates-MacGinitie Reading Tests. Also in Appendix B are the regional norm tables for the Mathematics Computation Subtest (Mathematics Subtest for grade nine students) of the Canadian Tests of Basic Skills, and norm tables for the Otis-Lennon School Ability Test. Percentile ranks, stanines, and T-scores were calculated by computer to correspond with the raw scores in the tables. The midpoint percentile ranks were calculated using the following formula adapted from Ferguson (1976):

PR = CP + .5P

where CP is the cumulative percentage of cases occurring below the raw score in question and P is the percentage of cases occurring at the raw score in question. Values were rounded off to the nearest whole number. Stanines for percentile rank values were obtained from Ferguson (1976) and applied in ascending order, thus giving the bottom four percent of cases a stanine of one. T-scores rounded to the nearest whole number were calculated for each raw score to transform the test mean to 50 and its standard deviation to 10. All computations were completed utilizing the computer facilities at the University of Saskatchewan. Several SPSS (Statistical Package for the Social Sciences; Nie et al., 1975; Hull & Nie, 1981) programs, including frequencies, crosstabs, breakdown, condescriptive, Pearson correlations, and nonparametric tests, were used. As well, programs developed at the University of Saskatchewan, such as the item analysis program, were used.
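The norm-table computation can be sketched as below. This is hypothetical illustrative code, not the programs developed for the study, and it omits the stanine lookup from Ferguson's percentile-rank cut-points; note also that this sketch rounds half-values to the nearest even integer, whereas the study simply rounded to the nearest whole number.

```python
import statistics
from collections import Counter

def norm_table(raw_scores):
    """Midpoint percentile rank PR = CP + .5P (adapted from Ferguson,
    1976) and T-score (mean 50, SD 10) for each observed raw score."""
    n = len(raw_scores)
    counts = Counter(raw_scores)
    mean = statistics.mean(raw_scores)
    sd = statistics.pstdev(raw_scores)
    table = {}
    below = 0  # running count of cases below the current raw score
    for score in sorted(counts):
        p = 100 * counts[score] / n   # percent of cases at this raw score
        cp = 100 * below / n          # percent of cases below it
        pr = round(cp + 0.5 * p)      # midpoint percentile rank
        t = round(50 + 10 * (score - mean) / sd)
        table[score] = (pr, t)
        below += counts[score]
    return table

table = norm_table([1, 2, 2, 3])  # toy score distribution
```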
Some programs were developed especially for this study, including programs to obtain the norm tables and the student reports.

Chapter IV

RESULTS

This chapter begins with the presentation of summary information on the rate of response and demographic characteristics of each of the groups tested in this study. This is followed by a presentation of the test results by grade. In this section, results of item analysis and test correlation analysis within each of the four NLSD grade groups are examined, followed by results of the comparisons between the NLSD and national grade groups. Finally, information is presented on the test results of the four age groups.

Rate of Response

As previously mentioned in Chapter III, the decision was made to test all subjects in the Northern Lights School Division who belonged to the eight selected grade and age groups. However, to the extent that student absenteeism and organizational problems in testing some students within a few schools arose, a certain percentage of students from each group could not be assessed. Table 2 shows the actual number and percentage of students tested in each grade and age category. Greater difficulty in assessing older students by age is indicated by the lower total percentages of students assessed in the three older age groups.

Table 2
Description of the Number of Students Assessed by Grade and Age in the Northern Lights School Division According to Gender, Affiliation, and School Size

                         Number Tested                     Affiliation                    School Size
                         (% of total                Native:      Native:   Non-     Large:       Small:
Group                    possible)   Males Females  Non-Treaty   Treaty    Native   >150 Students <110 Students

Grade 3                  366 (84.9%)  186   180     252          67        47       309          57
Grade 5                  324 (92.6%)  171   153     220          55        49       276          48
Grade 7                  219 (84.6%)  113   106     156          30        33       198          21
Grade 9                  165 (88.2%)   75    90     101          35        29       146          19
9-0 to 9-5 yrs of age    158 (87.8%)   73    85     115          27        16       138          20
11-0 to 11-5 yrs of age  131 (72.8%)   59    72      96          17        18       108          23
14-0 to 14-5 yrs of age  115 (73.7%)   69    46      81          22        12        96          19
16-0 to 16-5 yrs of age   57 (69.5%)   32    25      41          10         6        52           5

Achievement information was later gathered on the absent students and examined in relation to the results of the tested students to ascertain the impact of this form of nonresponse. This information is presented in Appendix C. As discussed there, had the absent students been involved in the testing, a slight lowering of test means probably would have occurred in most groups.

Demographic Characteristics of the Obtained Samples

Table 2 gives a further breakdown of the number of students tested in each of the eight grade and age categories by gender, affiliation, and school size. As shown in this table, more males than females were assessed in five of the eight categories, and slightly more males were assessed overall. Most of the students tested were Native; of these, the large majority were Non-Treaty or Metis. From 82 to 91 percent of all students assessed in each group were registered in schools exceeding 150 enrolled pupils.

GRADE RESULTS

Item Analysis

Results of item analysis showed the large majority of items on all tests administered to the students in the four grade groups to be satisfactory both with respect to item difficulty and item discrimination. Table 3 provides a summary of the item characteristics of each test administered at the four grade levels.

Table 3
Summary Item Statistics of Achievement and Scholastic Ability Tests Administered to Students in Four Grade Groups in the Northern Lights School Division

                                      Number of  Average  Average   Reliability  Standard
Test                                  Test Items Item     Item      Coefficient  Error of
                                                 Diff.    Discrim.  (K-R 20)     Meas.

Grade Three (n = 361, 361, 359, 351, 347)1
Gates-MacGinitie Vocabulary, B1         45       .549     .583      .928         2.72
Gates-MacGinitie Comprehension, B1      40       .565     .560      .913         2.55
Gates-MacGinitie Reading Total, B1      85       .558     .552      .957         3.75
CTBS Mathematics Computation, 7-5       26       .850     .349      .844         1.60
Otis-Lennon School Ability, Pri.II-R    75       .569     .445      .919         3.72

Grade Five (n = 321, 316, 316, 316, 317)1
Gates-MacGinitie Vocabulary, D1         45       .369     .451      .891         2.69
Gates-MacGinitie Comprehension, D1      43       .383     .416      .845         2.86
Gates-MacGinitie Reading Total, D1      88       .377     .396      .923         3.97
CTBS Mathematics Computation, 10-6      42       .571     .539      .919         2.48
Otis-Lennon School Ability, Ele-R       70       .468     .447      .918         3.57

Grade Seven (n = 216, 215, 215, 216, 206)1
Gates-MacGinitie Vocabulary, D2         45       .579     .526      .914         2.69
Gates-MacGinitie Comprehension, D2      43       .587     .500      .887         2.81
Gates-MacGinitie Reading Total, D2      88       .584     .481      .943         3.90
CTBS Mathematics Computation, 13-6      45       .378     .454      .897         2.58
Otis-Lennon School Ability, Int-R       80       .443     .419      .922         3.75

Grade Nine (n = 157, 157, 155, 159, 152)1
Gates-MacGinitie Vocabulary, E2         45       .487     .520      .904         2.86
Gates-MacGinitie Comprehension, E2      43       .613     .445      .867         2.71
Gates-MacGinitie Reading Total, E2      88       .548     .448      .935         3.78
CTBS Mathematics, 15-5                  48       .406     .446      .874         3.01
Otis-Lennon School Ability, Adv-R       80       .358     .327      .885         3.78

1 n's are reported for each test in the same order as the tests are presented within each group.

As shown, all but one of the tests possess difficulty levels ranging within ±.15 of .50, the level of difficulty which holds the greatest potential for maximally discriminating respondents. Greater difficulty was shown by grade five students on the reading tests, and by grade seven and nine students on the mathematics and ability tests.
More difficulty was expected on these tests for students from the above grades, as the content of the tests was designed to be more difficult for these students relative to the content of the tests used to assess other subject areas or students from other grades (this is explained in Chapter III). The CTBS Mathematics Computation subtest, level seven, form five, proved to be an easy test for NLSD grade three students, who scored an average of 85 percent correct responses. This test was essentially designed as a grade two test, but one which also possessed out-of-level national grade three norms.

Indices of item discrimination were generally positive for all tests. As shown in Table 3, the average item discriminations (point-biserial correlations) for all but three tests were above .40. For the remaining three tests, the corresponding values were above .34. Table 4 contains a list of the few items from each test which revealed either item difficulty levels below the chance level for each test (.25 for four-choice item tests; .20 for five-choice item tests) or item discrimination indices below .20. Of the items listed, several were easy items with associated low indices of item discrimination (for example, almost all items on the grade seven reading test).

Table 4
Test Items with Item Difficulty Levels Below the Chance Level[1] or Item Discrimination Indices[2] Below .20 on Tests of Achievement and Scholastic Ability Administered to Four Grade Groups in the Northern Lights School Division

Gates-Mac Reading Total, B1 (Grade 3)
  Item   Diff.   Discrim.
  1V     .955    .133
  14C    .293    .178
  39C    .164    .278
  40V    .203    .378

Gates-Mac Reading Total, D1 (Grade 5)
  Item   Diff.   Discrim.
  25C    .161    .051
  30C    .304    .063
  42V    .051    .089
  13C    .478    .089
  33C    .136    .101
  40C    .193    .101
  30V    .228    .114
  44V    .095    .127
  39V    .142    .165
  37V    .149    .165
  31C    .171    .152
  35C    .209    .152
  41V    .117    .177
  29C    .165    .177
  26C    .279    .190
  27C    .402    .190
  1V     .902    .190
  6C     .247    .430

Gates-Mac Reading Total, D2 (Grade 7)
  Item   Diff.   Discrim.
  1V     .949    .000
  2V     .954    .130
  21C    .651    .130
  44V    .209    .185
  7C     .902    .185

Gates-Mac Reading Total, E2 (Grade 9)
  Item   Diff.   Discrim.
  15C    .748    .154
  7C     .955    .077
  19C    .890    .077
  33C    .303    .128
  31C    .923    .154
  3V     .897    .154
  3C     .542    .154
  9C     .877    .180
  13V    .703    .180
  36V    .181    .230
  43V    .181    .330

CTBS Math Computation, 7-5 (Grade 3)
  Item   Diff.   Discrim.
  11     .932    .125
  13     .940    .182
  17     .946    .182

CTBS Math Computation, 10-6 (Grade 5)
  Item   Diff.   Discrim.
  14     .848    .139
  13     .914    .177
  18     .918    .190

CTBS Math Computation, 13-6 (Grade 7)
  Item   Diff.   Discrim.
  74     .125    .204
  87     .102    .315
  89     .102    .204
  90     .093    .074
  91     .189    .315
  92     .199    .389
  93     .088    .148
  94     .062    .333
  95     .074    .093
  96     .051    .093
  97     .056    .204
  98     .032    .074
  99     .051    .093

CTBS Mathematics, 15-5 (Grade 9)
  Item   Diff.   Discrim.
  16     .239    -.025
  41     .277    .125
  15     .164    .175
  29     .170    .175
  9      .384    .175
  1      .774    .150
  27     .151    .250

O-L School Ability, Pri.II-R (Grade 3)
  Item[3]  Diff.   Discrim.
  27c      .141    .115
  32c      .219    .218
  1a       .931    .092
  2a       .948    .092
  5a       .870    .184

O-L School Ability, Ele-R (Grade 5)
  Item   Diff.   Discrim.
  69     .170    .076
  58     .208    .139
  68     .243    .152
  50     .170    .177
  63     .174    .177
  62     .104    .241
  70     .148    .268

O-L School Ability, Int-R (Grade 7)
  Item   Diff.   Discrim.
  50     .228    .015
  78     .151    .114
  80     .068    .115
  1      .898    .119
  9      .733    .142
  73     .063    .173
  76     .223    .189
  71     .151    .191
  6      .913    .197
  72     .136    .365

O-L School Ability, Adv-R (Grade 9)
  Item   Diff.   Discrim.
  42     .286    .026
  37     .270    .132
  57     .086    .239
  61     .125    .316
  63     .118    .237
  67     .086    .237
  70     .118    .158
  71     .112    .079
  72     .105    .026
  73     .065    .026
  74     .204    .105
  75     .072    .080
  76     .132    -.015
  77     .079    .000
  78     .099    -.026
  79     .066    .079
  80     .086    .105

Note: C designates an item from the Comprehension test; V designates an item from the Vocabulary test.
[1] Based on 1/A, where A is the number of choices for an item, and assuming all choices are equally attractive.
[2] The point-biserial correlation of the examinees' scores on an item with their total test scores.
[3] An item with subscript 'a' has come from Part I of this test; an item with subscript 'c' has come from Part III.

Only four items adversely affected the discriminatory power of their respective tests through slightly negative values of discrimination. These items, all occurring in tests at the grade nine level, were number 15 of the Comprehension test, number 16 of the Mathematics test, and numbers 76 and 78 of the Ability test. Items 76 and 78 belong to a cluster of 11 items appearing at the end of the test which contributed little to the test's overall usefulness in discriminating high-ability from low-ability students. Subsequent analysis revealed that approximately 50 percent of both low-scoring and high-scoring students failed to respond to these items, suggesting a speed factor confounding the basic power element of this test.

A cluster of 11 items forming the last part of the Mathematics Computation test administered to grade seven students also revealed either low levels of item difficulty or item discrimination. The no-response rate for these items ranged from 50 to 75 percent. Indications are that speed of response may have contributed to the test scores to some degree, along with the ability of students to solve varied and increasingly difficult arithmetic operations.

Internal consistency K-R 20 reliability estimates for the tests ranged from .84 (CTBS Math Computation, grade 3) to .96 (G-Mac Reading Total, grade 3), as indicated in Table 3. The speed factor noted above notwithstanding, the reliability values are comparable to those reported by either the test authors or publishers.
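The classical indices reported above (item difficulty as the proportion answering correctly, item discrimination as the point-biserial correlation with the total score, and the K-R 20 internal-consistency estimate) can be sketched as follows. This is an illustrative reconstruction with invented response data, not the computation actually used in the study.

```python
# Illustrative sketch of the classical item statistics discussed above,
# run on invented 0/1 response data (not the study's actual computations).
from statistics import mean, pstdev

def item_analysis(responses):
    """responses: one list of 0/1 item scores per examinee."""
    n_items = len(responses[0])
    totals = [sum(row) for row in responses]
    sd_total = pstdev(totals)
    difficulties, discriminations = [], []
    for j in range(n_items):
        item = [row[j] for row in responses]
        p = mean(item)                 # item difficulty: proportion correct
        difficulties.append(p)
        # Item discrimination: point-biserial (Pearson) correlation of the
        # item score with the examinee's total test score.
        cov = mean(x * t for x, t in zip(item, totals)) - p * mean(totals)
        discriminations.append(cov / (pstdev(item) * sd_total))
    # Kuder-Richardson formula 20 internal-consistency estimate.
    k = n_items
    pq = sum(p * (1 - p) for p in difficulties)
    kr20 = (k / (k - 1)) * (1 - pq / sd_total ** 2)
    return difficulties, discriminations, kr20

# Four examinees by four items of invented data.
diffs, discs, kr20 = item_analysis(
    [[1, 1, 1, 1], [1, 1, 0, 0], [1, 0, 1, 0], [0, 0, 0, 1]])
```

An item with difficulty below the chance level (1/A for A choices) or discrimination below .20 would be flagged, as in Table 4. Note that an item answered correctly by everyone (or by no one) has zero variance and would need to be skipped before the division.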
Inter-test Correlations

Correlation matrices for the test pairs within each grade group are presented in Table 5. High correlations were found between the two reading subtests. As the Vocabulary subtest at the grade three level is actually a decoding test, this may account for its higher correlation with the Comprehension subtest compared to the coefficients for the other grades. The computation tests showed a fair degree of correlation with the reading tests in spite of being virtually nonverbal. The Mathematics test at the grade nine level, however, comprising math concepts, verbal math problems, and computational problems, showed a considerably higher correlation with the reading tests. The ability tests revealed strong correlations with all achievement tests, although lower correlations with reading at the grade three level and with math computation at grades three, five, and seven were noted. The Ability test at the grade three level is completely free of student reading involvement, which may account for its lower correlation with reading. The ability tests at the other grade levels have a large verbal component, including a reading element; this may account for their higher correlations with the reading tests and with the Mathematics test at the grade nine level.

Table 5
Pearson Correlation Coefficients Among Tests Administered to Students in the Northern Lights School Division Within Four Grade Groups

Test                                    Test 1    Test 2    Test 3    Test 4

1. G-Mac Vocabulary
   a. Grade 3 (x=24.70, s=10.15)
   b. Grade 5 (x=16.62, s=8.15)
   c. Grade 7 (x=26.06, s=9.17)
   d. Grade 9 (x=21.90, s=9.04)

2. G-Mac Comprehension
   a. Grade 3 (x=22.60, s=8.65)         .8495
   b. Grade 5 (x=16.49, s=7.27)         .7177
   c. Grade 7 (x=25.25, s=8.35)         .7549
   d. Grade 9 (x=26.36, s=7.51)         .7415

3. G-Mac Reading Total
   a. Grade 3 (x=47.41, s=18.08)        .9676     .9551
   b. Grade 5 (x=33.17, s=14.29)        .9352     .9178
   c. Grade 7 (x=51.39, s=16.35)        .9423     .9310
   d. Grade 9 (x=48.23, s=15.44)        .9457     .9193

4. CTBS Mathematics
   a. Grade 3 (x=22.11, s=4.05)         .3719     .3810     .3865
   b. Grade 5 (x=23.97, s=8.74)         .3247     .3762     .3749
   c. Grade 7 (x=17.02, s=8.05)         .3942     .3947     .4146
   d. Grade 9 (x=19.47, s=8.36)         .6819     .6528     .7177

5. O-L School Ability
   a. Grade 3 (x=42.67, s=13.07)        .4979     .5966     .5638     .4784
   b. Grade 5 (x=32.73, s=12.45)        .7113     .6389     .7307     .4943
   c. Grade 7 (x=35.46, s=13.44)        .6822     .7380     .7579     .5384
   d. Grade 9 (x=28.63, s=10.75)        .7283     .6702     .7448     .7825

Note: Means (x) and standard deviations (s) are reported in parentheses.

Comparison of Regional Test Results to National Standardization Information

Comparisons of the NLSD test results to national standardization information were made in two basic ways. First, NLSD test data were compared to national test data at selected percentile points. Second, test means and variances from the NLSD samples were statistically compared to the corresponding measures from the national studies where possible. The results of these comparisons are discussed below.

Percentile Rank Comparisons

Selected percentile ranks from the national norms were compared to the corresponding Northern Lights School Division percentile ranks representing the same test score. This information for each test at each of the four grade levels is presented in Table 6. It can be seen that generally a much larger proportion of NLSD students than of the national samples scored at or below the test scores represented by the national percentile ranks.

1. Reading. As illustrated in Table 6, approximately 40 percent of NLSD students in grades seven and nine corresponded to a national percentile rank of ten on the reading tests. This percentage increases by about ten percent for grade five students and decreases by about ten percent for grade three students.
This result indicates that approximately 30 to 50 percent of NLSD students scored at or below a test score in reading which only ten percent of the national sample scored at or below.

Table 6
Percentage[1] of Students by Grade in the Northern Lights School Division Scoring at or Below Selected Percentile Ranks from the National Norm Tables on the Administered Tests

                                        Selected Percentile Ranks from
                                          the National Norm Tables
Test                                       10     20     50     80

Grade Three
Gates-MacGinitie Vocabulary, B1            31     48     74     92
Gates-MacGinitie Comprehension, B1         29     45     81     93
Gates-MacGinitie Reading Total, B1         32     50     80     93
CTBS Mathematics Computation, 7-5          36     45     71     93
Otis-Lennon School Ability, Pri.II-R       21     36     75     94

Grade Five
Gates-MacGinitie Vocabulary, D1            48     66     86     95
Gates-MacGinitie Comprehension, D1         43     65     87     95
Gates-MacGinitie Reading Total, D1         48     69     87     96
CTBS Mathematics Computation, 10-6         25     39     71     92
Otis-Lennon School Ability, Ele-R          23     42     81     96

Grade Seven
Gates-MacGinitie Vocabulary, D2            42     56     82     93
Gates-MacGinitie Comprehension, D2         36     60     80     92
Gates-MacGinitie Reading Total, D2         40     62     82     92
CTBS Mathematics Computation, 13-6         33     43     72     90
Otis-Lennon School Ability, Int-R          13     28     75     91

Grade Nine
Gates-MacGinitie Vocabulary, E2            42     61     80     93
Gates-MacGinitie Comprehension, E2         36     58     80     93
Gates-MacGinitie Reading Total, E2         44     64     81     92
CTBS Mathematics, 15-5                     38     54     84     93
Otis-Lennon School Ability, Adv-R           9     25     69     94

[1] The percentages given are equivalent to the NLSD midpoint percentile ranks.

As the national tenth percentile is indicative of very low reading achievement for these grades, a substantial proportion of NLSD students showed very low reading performance. At the national 20th percentile, the NLSD percentages increased to nearly 60 percent for grades seven and nine. Again, grade five percentages are higher, while grade three percentages are lower, than the grade seven and nine values. Approximately 50 to 70 percent of NLSD students showed low reading scores, indicated by their correspondence with the national 20th percentile. Results from Table 6 further reveal that only between 13 and 26 percent of NLSD students scored above the national 50th percentile or median (approximating the national average) in reading. Again, only five to eight percent of the NLSD students tested scored above the national 80th percentile, indicative of high achievement. Taken together, these results show considerable differences between the NLSD groups and the national standardizations, with the NLSD results displaced towards lower achievement in reading.

2. Mathematics. The results of the mathematics tests at the grade nine level, and to a lesser extent at the grade three level, revealed percentages similar to those found for reading at the selected national percentiles. The mathematics computation tests at the grade five and seven levels, however, showed smaller proportions of NLSD students scoring at or below the selected national percentiles than the percentages indicated on the reading tests. Nonetheless, compared to the national values, the NLSD results for these two grade levels continued to reveal consistently higher proportions of students at the selected national percentiles, indicating lower performance in mathematics.

3. School Ability. On the ability tests, the Northern Lights School Division students tested generally showed less of a discrepancy with their national (American) counterparts, particularly at the grade seven and nine levels, than was the case for the achievement measures. Approximately the same proportion of students in the grade seven and nine NLSD groups and the national groups showed very low school ability scores, at or below the national tenth percentile. A larger percentage of NLSD students at the grade three and five levels was indicated at the tenth percentile.
Somewhat larger proportions of NLSD students than national students were indicated at the national 20th percentile, while approximately 70 to 80 percent of NLSD students scored below the national median. Only four to nine percent of NLSD students indicated high school ability scores, above the national 80th percentile.

Statistical Comparisons on Measures of Central Tendency and Variability

NLSD and national test means and standard deviations, degrees of freedom, and values for t, Kolmogorov-Smirnov Z, and chi-square are reported in Table 7. Regional means were compared to national means whenever possible; national median scores were used for comparison wherever national raw score means were unavailable.

Table 7
Results of the One Sample t-Tests, Kolmogorov-Smirnov One Sample Tests, and the Chi-Square Tests for Students Assessed by Grade in the Northern Lights School Division

                                            Mean             Stan. Dev.
Test                                    NLSD   Nat.[1]     NLSD   Nat.[2]     df     t**    KSz[3]    X²

Grade Three
Gates-MacGinitie Vocabulary, B1        24.70   (33)       10.15     -        360    15.5    1.34*      -
Gates-MacGinitie Comprehension, B1     22.60   (31)        8.65     -        360    18.5    1.13*      -
Gates-MacGinitie Reading Total, B1     47.41   (64.5)     18.08     -        358    17.9    1.12*      -
CTBS Mathematics Computation, 7-5      22.11   (24.7)      4.05     -        350    12.0    3.52*      -
Otis-Lennon School Ability, Pri.II-R   42.67   50.68      13.07   13.57      346    11.4     .99     321.0

Grade Five
Gates-MacGinitie Vocabulary, D1        16.62   26.41[4]    8.15    8.38[4]   320    21.5    2.22*      -
Gates-MacGinitie Comprehension, D1     16.49   25.11[4]    7.27    8.08[4]   315    21.1    2.05*      -
Gates-MacGinitie Reading Total, D1     33.17   51.56[4]   14.29   15.6[4]    315    22.9    2.40*      -
CTBS Mathematics Computation, 10-6     23.97   (30)        8.74     -        315    12.3    1.18*      -
Otis-Lennon School Ability, Ele-R      32.73   43.16      12.45   14.67      316    14.9    1.03     227**

Grade Seven
Gates-MacGinitie Vocabulary, D2        26.06   (35)        9.17     -        215    14.3     .76       -
Gates-MacGinitie Comprehension, D2     25.25   (33)        8.35     -        214    13.6     .70       -
Gates-MacGinitie Reading Total, D2     51.39   (68.5)     16.35     -        214    15.3     .68       -
CTBS Mathematics Computation, 13-6     17.02   22.68[5]    8.05    8.30[5]   215    10.3    1.01     202.2
Otis-Lennon School Ability, Int-R      35.46   42.77      13.44   15.94      205     7.8    1.15*      -

Grade Nine
Gates-MacGinitie Vocabulary, E2        21.90   (31)        9.04     -        156    12.6    1.27*      -
Gates-MacGinitie Comprehension, E2     26.36   (33)        7.51     -        156    11.1    1.02       -
Gates-MacGinitie Reading Total, E2     48.23   (63.5)     15.44     -        154    12.3    1.08*      -
CTBS Mathematics, 15-5                 19.47   26.34[6]    8.36    8.11[6]   158    10.4     .98     168.0
Otis-Lennon School Ability, Adv-R      28.63   34.26      10.75   14.00      151     6.5    1.18*      -

[1] Median values are given in parentheses where means were unavailable. Median scores were obtained from norm tables in the test manuals, and from out-of-level norm tables for the G-Mac B1 and D2 supplied by Cameron (Note 3).
[2] Standard deviations from the national standardizations are given where available.
[3] Kolmogorov-Smirnov Z (Hull & Nie, 1981).
[4] Values supplied by MacGinitie (Note 4).
[5] Values supplied by Hieronymus (Note 2) based on national equating studies.
[6] Values supplied by Hieronymus (Note 5).
* p<.20  ** p<.001

Results of the one sample t-tests revealed significant differences between the NLSD test means and the corresponding national mean or median values on all tests at all grade levels. In all instances, NLSD means were lower than the comparative national values. Regional variances were compared to national variances where the latter were available; no statistical comparisons of variance were made in the remaining instances. National variances for the reading tests at the grade five level, the mathematics tests at the grade seven and nine levels, and the ability tests at all grade levels were available for comparison. Prior to determining whether or not the observed variances were comparable to the corresponding national values, tests for normality were performed on all test distributions.
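The three statistics just named reduce to short formulas on summary data. In the sketch below, the one sample t and chi-square values use the grade five Otis-Lennon summary figures from Table 7, and the normality screen is fed an invented score list; the Kolmogorov-Smirnov Z is the square root of n times the largest gap between the empirical and fitted normal distribution functions.

```python
# Sketch of the three comparisons described above. The grade five
# Otis-Lennon summary figures are taken from Table 7; the short score
# list for the normality screen is invented illustrative data.
import math

n = 317                                   # yields the tabled df of 316
nlsd_mean, nlsd_sd = 32.73, 12.45
nat_mean, nat_sd = 43.16, 14.67

# One sample t-test of the regional mean against the national mean.
t = (nlsd_mean - nat_mean) / (nlsd_sd / math.sqrt(n))

# Chi-square comparison of the regional variance with the national
# variance; values well below the df indicate a smaller regional variance.
chi2 = (n - 1) * nlsd_sd ** 2 / nat_sd ** 2
# abs(t) comes out near 14.9 and chi2 near 227.6, in line with Table 7.

def ks_z(scores):
    """Kolmogorov-Smirnov Z: sqrt(n) times the largest gap between the
    empirical CDF and a normal CDF fitted to the sample mean and SD."""
    xs = sorted(scores)
    m = len(xs)
    mu = sum(xs) / m
    sigma = math.sqrt(sum((x - mu) ** 2 for x in xs) / m)
    d = 0.0
    for i, x in enumerate(xs):
        f = 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))
        d = max(d, abs(f - i / m), abs((i + 1) / m - f))
    return math.sqrt(m) * d

ksz = ks_z([12, 15, 17, 18, 20, 21, 22, 24, 25, 28])
```

One caveat on the normality screen: strictly, when the mean and standard deviation are fitted from the same sample, Lilliefors-adjusted critical values apply rather than the standard Kolmogorov-Smirnov ones.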
As shown in Table 7, the observed distributions of the grade five reading tests and the ability tests at the grade seven and nine levels fell below the .20 probability level (based on the Kolmogorov-Smirnov Z) set as the critical value for making the necessary assumptions of normality (see Chapter III). Consequently, only the variances of the mathematics tests at grades seven and nine and the ability tests at grades three and five were compared to national values. The results of the chi-square tests revealed only one significant difference between regional and national variances: for the ability test at the grade five level, the NLSD variance was lower than the national variance.

Based on the above findings and results from previous subsections, regional norm tables were constructed for all four grades on all tests administered. These tables are supplied in Appendix B.

AGE RESULTS

Tabular information pertaining to the test results of the four age groups can be found in Appendix D. It should be noted that national standardization information by age was available only for the Otis-Lennon School Ability Test. Since raw score means and standard deviations were not supplied for the specific age ranges tested, NLSD means were compared to the median values for these age groups found in the norm tables. Comparisons of variances were not conducted. Results of one sample t-tests (see Appendix D) show the NLSD means to be significantly lower than the values corresponding to the national medians. Given the overlap between the grade and age samples, it is not unexpected that the results of the item analyses and of the regional and national comparisons were comparable. Norm tables were constructed for three of the four age groups on the achievement and ability tests. Tables were not constructed for the 16 year old age group, as the number of students in that group was considered too small to provide useful percentile information.
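The norm tables themselves rest on midpoint percentile ranks (see the footnote to Table 6): the rank assigned to a raw score is the percentage of the group scoring below it plus half the percentage scoring exactly at it. A minimal sketch with invented raw scores:

```python
# Minimal sketch of building a norm table of midpoint percentile ranks
# from a group's raw scores. The scores below are invented.
from collections import Counter

def midpoint_percentile_ranks(raw_scores):
    """Map each observed raw score to its midpoint percentile rank."""
    n = len(raw_scores)
    freq = Counter(raw_scores)
    table, below = {}, 0
    for score in sorted(freq):
        # Percent strictly below, plus half the percent exactly at score.
        table[score] = 100.0 * (below + 0.5 * freq[score]) / n
        below += freq[score]
    return table

norms = midpoint_percentile_ranks([10, 12, 12, 15, 15, 15, 18, 20, 20, 25])
# A raw score of 15 exceeds 3 of the 10 examinees and sits at the midpoint
# of 3 more, so its percentile rank is 45.0.
```

This is also why a very small group, such as the 82 sixteen year olds, yields coarse and unstable percentile ranks: each examinee shifts the ranks by more than a full percentile point.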
Norm tables by age have been placed together with the norm tables by grade in Appendix B.

Chapter V
SUMMARY AND DISCUSSION

This final chapter begins with a summary of the purpose, procedures, and results of the study. Topics of particular interest and concern are then discussed, and finally, a view towards further research is given.

Summary

This study was introduced with a brief account of the fundamental importance of the reference groups associated with norm-referenced tests. The relative merits of national and regional norms were examined, and three issues concerning norms were addressed: size of sample, representativeness, and relevancy. Of particular concern was that existing national norms on tests of achievement and scholastic ability, by themselves, often lacked the interpretive information required to make effective programming decisions for students enrolled in the Northern Lights School Division. It was hypothesized that because NLSD students are atypical of national norming groups on essential school success-related variables, these differences would become apparent on measures of central tendency and variability of the student test score distributions. The purpose of the present study, then, was to construct regional norms on suitable tests of achievement and school ability wherever significant differences were found between the national norms and the NLSD group test scores, thus greatly facilitating the interpretation of student test scores. The subjects were students enrolled in the Northern Lights School Division in grades three, five, seven, and nine, and students aged 9 years 0 months to 9 years 5 months, 11 years 0 months to 11 years 5 months, 14 years 0 months to 14 years 5 months, and 16 years 0 months to 16 years 5 months. Efforts were made to assess all students in each of the eight respective grade and age groups; in actuality, coverage ranged from 69.5 percent of the 16 year olds to 92.5 percent of the grade fives.
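Coverage short of 100 percent matters for norms only insofar as the untested students differ systematically from those tested. As a purely illustrative sensitivity check (with invented numbers, and not the nonresponse bias procedure the study itself used), one can bound how far a group mean would drift under a pessimistic assumption about the absent students:

```python
# Illustrative sensitivity check with invented numbers: how far would a
# group mean drift if the untested students had scored well below the
# tested students?
def adjusted_mean(tested_mean, n_tested, n_absent, assumed_absent_mean):
    """Full-group mean under an assumed mean for the absent students."""
    total = tested_mean * n_tested + assumed_absent_mean * n_absent
    return total / (n_tested + n_absent)

# 360 tested with mean 47.4; 40 absentees assumed to average 9 raw-score
# points (roughly half a standard deviation) lower.
m = adjusted_mean(47.4, 360, 40, 47.4 - 9.0)
# Under this assumption the full-group mean drops by only 0.9 points.
```

Even a fairly strong assumption about the absentees moves the mean by under a point here, which is the sense in which modest nonresponse can leave norms essentially intact.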
Among the criteria used to identify and select suitable tests were that the tests possess adequate characteristics of reliability and validity and exhibit representative national norms. The tests finally selected for the study were appropriate levels and forms of the Gates-MacGinitie Reading Tests, Canadian Edition, the Mathematics or Mathematics Computation subtest of the Canadian Tests of Basic Skills, and the Otis-Lennon School Ability Test. The testing period occurred in the fall of 1982.

The raw test score means for each group in this study were compared to the data supplied by the test publishers and the authors of the national norming studies. The one sample t-test was used to test for differences between the regional sample means in this study and the mean or median values of the national standardizations. The median values from the national norm tables were used for comparative purposes whenever national means were unavailable. Similarly, the chi-square statistic was used to compare the equality of the regional sample variances with the corresponding national variances whenever the latter information was reported. As the condition of normality is an important consideration when making this comparative test, the Kolmogorov-Smirnov One Sample Test was used to determine whether or not the regional test data could reasonably have come from a normal distribution.

The regional data were analyzed further by determining the percentage of students in each group scoring at or below selected national percentile ranks on each test. This kind of analysis provided more detailed information on the extent of the differences between the regional and national groups. Furthermore, item analysis procedures were used to help determine the usefulness or suitability of each test for future assessment purposes in the Northern Lights School Division, and thus the value of constructing regional norms on these tests.
Finally, Pearson product-moment correlations between all possible pairs of tests within each group were calculated.

Results of the one sample t-tests revealed significant differences between the NLSD test means and the corresponding national mean or median values at all grade levels. Significant differences were also found on the Otis-Lennon School Ability Test at all age levels (no national test data were supplied by age for the two achievement tests). In all instances, NLSD means were lower than the comparative national values. Only the variances of the mathematics tests at the grade seven and nine levels and the ability tests at the grade three and five levels were compared. The results of the chi-square tests showed only one significant difference between NLSD and national variances; this occurred on the ability test at the grade five level, where the NLSD variance was smaller. Further analysis of the results revealed the following:

1) Only from 13 percent (Reading Comprehension and Reading Total, grade five) to 26 percent (Reading Vocabulary, grade three) of the NLSD students scored above the national median on the reading tests.

2) Generally, the NLSD students performed better on the mathematics computation tests than in reading; however, only 28 to 29 percent of the NLSD students scored above the national median in grades three, five, and seven.

3) On the grade nine mathematics test, involving math problem solving, math concepts, and computation, only 16 percent of NLSD students scored above the national median; this result is very similar to how the grade nines performed on the reading tests.

4) NLSD school ability levels, while lower than the national (American) standards, were generally higher than expected when related to the students' reading scores (compared to Canadian norms).
5) Nearly one-third or more of NLSD students scored at or below the national tenth percentile on the achievement measures, while approximately one-half or more scored at or below the national 20th percentile.

6) From four percent (Reading Total, grade five) to ten percent (Math Computation, grade seven) of NLSD students scored in the top 20 percent nationally.

7) With respect to the test results of the four age groups, the significantly lower scores on the school ability tests by all four groups compared to national (American) norms indicate that achievement scores for these groups are also much below what would be expected nationally for these age groups.

Based on the above findings and on an analysis of the psychometric properties of the tests with the NLSD groups, regional norm tables were constructed for seven of the eight grade and age groups on the achievement and ability tests. Tables with derived scores were not constructed for the 16 year old age group, as the number of students in that group was considered too small to provide useful percentile information.

Limitations of the Study

This study was influenced by several limiting factors which affected both the methodology and the results to some degree: student absenteeism, small group sizes, the absence of desired national data, and test score distributions which deviated from the normal. Issues pertaining to these limiting factors are discussed below.

Applicability of NLSD Norms

This section discusses two limitations of the study, missing cases and small group population sizes, which raised some concern regarding the interpretive use or applicability of the obtained Northern Lights School Division norms. They are discussed below in terms of their effect on the representativeness and stability of the newly constructed norms.

1. NLSD Representativeness. It was planned that all NLSD students from each of the eight grade and age groups would be assessed.
However, mainly due to student absenteeism, not every student was assessed. The proportion of students unavailable for testing ranged from approximately ten percent for the grade five, grade seven, and nine year old groups to approximately 30 percent for the three older age groups. Whether or not this resulted in systematic bias in the group norms (for example, by raising the obtained NLSD norms through a disproportionate number of low achievers being absent from testing) was investigated in a nonresponse bias study presented in Appendix C. It was concluded from that substudy that some degree of nonresponse bias may have occurred in some of the groups, particularly in the grade three and seven groups and the three younger age groups. Had the absent students from these groups been assessed, some lowering of the obtained norms would probably have resulted. Because the evidence indicated that this downward shift in the present norms could be expected to be slight, this researcher is confident that the existing norms, barring minor sampling error, are for all intents and purposes representative of the NLSD populations studied. Nonetheless, it must be cautioned that the norms developed in the present study are limited to NLSD students in the designated grade and age groups, and prudence is advised when assessing students of very limited attendance within the Division.

2. NLSD Population Stability. Of concern here is how applicable the 1982 NLSD norms constructed in this study will be in subsequent years. As the population of students in each group was relatively small for norming, ranging from 82 (16 year olds) to 431 (grade threes), the matter of norm stability takes on special significance. Every year, the number and types of students entering each grade or age category in the NLSD can be expected to vary somewhat.
Particularly as innovations occur in School Division policies, curricula, and instructional methods and techniques, and as the attitudes and expectations of teachers, parents, students, and community members change, student progress can be expected to shift. Where the size of a population is small to begin with, any slight change in its composition is more likely to have an effect on the existing norms. As the Northern Lights School Division and the communities it serves are perhaps more involved in a state of transition or flux than most areas of the country, changes in academic status might therefore be more likely to occur. In light of this, student progress in the Division should be monitored and evaluated on a continuing basis for any possible changes in the existing norms. It is recommended that the 1982 norms be evaluated frequently and, if necessary, the tests renormed within a five year period.

Availability of National Raw Score Data

An intention of the present study was to compare raw score means and variances of the NLSD and national samples directly. However, it was subsequently discovered that much of the data produced for the national groups were theoretically derived through equating studies and further statistical manipulations; therefore direct comparisons of raw score means and variances were not always possible. In such cases, the one sample t-test was used to compare the NLSD group mean to a value representing the median of the corresponding national group. The assumption was made that the national sample distributions were approximately normal, so that their mean and median values could be expected to be very similar. Consideration was given to estimating the standard error of the median for each group test and constructing confidence intervals around these values, thus directly comparing the NLSD medians to the national medians. But as normality of the parent population for each NLSD group is an important assumption when using this statistic (Ferguson, 1976), this approach was not tried. On the other hand, estimating the standard error of the mean and utilizing a one sample t-test requires no such assumption of normality. According to the central limit theorem, "the sampling distribution of means gets closer and closer to the normal form as sample size increases despite departures from normality in the population distributions...except perhaps in the case of gross departures in the population from normality" (Ferguson, 1976, p. 141).

In the case of comparing variances, national variance data in raw score form were available for only nine of the 20 tests used to assess students in the various groups. Furthermore, chi-square analyses were performed on only four of the nine tests, as the important assumption of normality was violated on the remaining five tests based on a conservative probability level. Given the prevalent use of theoretical methods to norm tests in national testing programs, the researcher would be wise to consider transforming the experimental raw score data to the standard scores used in the large national studies. This would allow for direct comparisons of such parameters as means and variances using more sophisticated statistical tests, such as the two sample t-test for means and the F-test for variances (Kirk, 1968; Glass & Stanley, 1970). In cases where within-group cell data are complete in comparing two populations on multiple test scores, the two-sample multivariate statistic utilizing Hotelling's T² would be most appropriate (Tatsuoka, 1971).

Non-Normality of Test Score Distributions

Approximately one-half of the test distributions in this study showed marked deviations from normality (see Table 7, Chapter IV, for the test distributions which deviated significantly from normality in each of the grade groups).
Most of these non-normal test score distributions were either positively skewed or platykurtic. The positive skewness was no doubt an artifact of the fairly high degree of difficulty posed by the tests for the population being measured, rather than an indication of the distribution of the trait itself. Where positive skewness was pronounced, future norming of somewhat easier tests should reduce skewness, as well as raise the test difficulty index and the students' interest in the tests by allowing for more successful outcomes. The platykurtic distributions of some of the test scores were possibly produced by test items generally being of moderate difficulty. While non-normal, this kind of distribution can indicate tests of good discriminating power (Allen & Yen, 1979). Two test distributions of some concern were those of the grade three and nine year olds on the CTBS Math Computation, Level 7, Form 5. In both cases, the distributions were greatly negatively skewed and highly leptokurtic. The gross departure from normality placed some question on the value of the t-test statistic, used to compare the NLSD mean to the national median at the grade three level. However, while the statistical result may be in question, Table 6 in Chapter IV revealed how different 69 the NLSD and national distributions were with respect to four major corresponding percentile values; thus the need for regional norm tables becomes apparent. Although this test produced a ceiling effect with the two groups it was administered to, it does offer good discrimination of the low-achievers in each group. In order to spread the test score distributions of these groups to a form approximating the normal, the CTBS Math Computation, Level 8, should be considered for norming at a future date. Test Bias An issue which commonly arises when tests are administered to a population consisting mainly of members from a minority culture is that the tests may demonstrate bias against the minority group. 
Arguments are made that the content of the test items is largely outside the range of the examinees' experiences, that the vocabulary of the test instructions is too difficult, or that the examinees have not had an equal opportunity to learn the appropriate subject matter. This places these students at a disadvantage with respect to how well they will perform on certain tests, so that measures such as those of achievement and ability are likely to be underestimates. Hopkins and Stanley (1981) have considered this issue and stress content validity, or content relevance, as the critical element in determining the fairness of achievement tests. "Ideally, the items on an achievement test should be a representative sample of the content and process objectives of the curriculum" (Hopkins & Stanley, 1981, p. 387). Furthermore, they have asserted that "an achievement test is biased if a given cultural or social group would be expected to perform differently on this set of items than on a representative set of items" (Hopkins & Stanley, 1981, p. 414). Chapter II reviewed the steps taken by the developers of the tests used in the present study to assure that test content was relevant and appropriate both to school curricula and to minority groups. In every case content validation was concluded to be favourable. Furthermore, qualified representatives from within the Northern Lights School Division were engaged to review the tests' content for inappropriate material. These reviews proved to be generally favourable, and as a result the tests were used in the study. When considering scholastic aptitude tests, Hopkins and Stanley (1981) have emphasized construct validity as being of most importance: "In construct validation...many criteria are required to confirm what the test does and does not measure" (Hopkins & Stanley, 1981, p. 105).
The developers of the Otis-Lennon School Ability Test (OLSAT) presented predictive, concurrent, and content validation studies to provide convincing evidence that the test is a valid measure of school ability (see Chapter II). Further evidence of concurrent validity for the OLSAT was provided in the present study through an analysis of the relationship of scores on this ability test to scores on each of the achievement measures. Moderately high to high correlations were observed between all OLSAT and achievement pairings (see Table 5 of Chapter IV and Appendix D). As the major purpose of administering the OLSAT in the School Division is to assist in identifying students whose scholastic potential significantly exceeds their current achievement levels, this test appears to hold some merit.

Directions for Future Research

An obvious follow-up to this study would be the norming of the presented tests for all grade and age groups in the School Division. Procedures could be used to tie the tests together in each separate domain by means of extended scale scores. Midyear and end-of-year norms could be constructed through techniques involving interpolation and extrapolation. A systematic study of this nature would provide the School Division with a comprehensive set of norms with which to monitor and evaluate its entire student body. In Saskatchewan, preferential high-cost funding is provided for students who meet the criteria established for at least one of eight severely handicapped designations (Saskatchewan Education, 1982). One of these designations, the learning-disabled category, requires that students show a significant discrepancy between their measured I.Q. and their academic achievement as determined by individually administered, government-approved tests.
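The concurrent-validity evidence cited earlier rests on Pearson product-moment correlations between OLSAT scores and each achievement measure. A minimal sketch of the computation, using hypothetical paired scores rather than the Table 5 data:

```python
import math

def pearson_r(x, y):
    """Pearson product-moment correlation between two paired score lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))  # cross-products
    sxx = sum((a - mx) ** 2 for a in x)                    # sum of squares, x
    syy = sum((b - my) ** 2 for b in y)                    # sum of squares, y
    return sxy / math.sqrt(sxx * syy)

# Hypothetical paired scores: an ability measure vs. a reading measure.
ability = [20, 25, 30, 35, 40, 45, 50, 55]
reading = [18, 22, 29, 33, 41, 40, 52, 53]
r = pearson_r(ability, reading)
```

A high positive r of this kind is what the study would take as evidence that the ability test and the achievement test order students similarly.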
As it can be quite time consuming for authorized personnel to assess large numbers of students in this manner, it is imperative that adequate screening occur so that the most likely learning-disabled students can be identified. The group tests used in the study offer potential in this regard. There is a need for correlational and discriminant analysis studies to substantiate the value of using these group tests to accurately predict and identify learning-disabled students as defined by the individually-administered tests. In-depth studies of the test items and the NLSD population would provide valuable information as to why many of the test score distributions took on non-normal shapes. Indications were that many of the distributions exhibited bimodal or multimodal appearances. Exploring this avenue could give more meaning to the test score findings. Further, studies examining the comparability of the variances of the NLSD and national populations in the domains of reading, mathematics, and scholastic ability would be of some value. The topic of test bias often tends to be a controversial if not a strongly emotional subject. Certain statistical and psychometric approaches applied to test data further collected on NLSD students and other groups would help provide evidence for or against the presence of bias in the tests and provide further assurances in content validation. Utilization of such techniques as Angoff's delta plot procedures, item-test correlations for different groups, modified chi-square analysis, and methods based on latent trait theory are some approaches that could be tried (see Burrill & Wilson, 1980). The value of removing as much uncertainty as possible in this sensitive area has scientific as well as pragmatic appeal.

REFERENCE NOTES

1) Department of Northern Saskatchewan, Extension Services Branch, Unpublished Report, 1981.
2) Hieronymus, A.N. Personal Communication, March 22, 1983.
3) Cameron, P.D.
(for Nelson Canada) Personal Communication, March 22, 1983.
4) MacGinitie, W.H. Personal Communication, March 28, 1983.
5) Hieronymus, A.N. Personal Communication, March 29, 1983.

REFERENCES

Allen, M.J. & Yen, W.M. Introduction to measurement theory. Monterey, Ca.: Brooks/Cole Pub. Co., 1979.
American Psychological Association, American Educational Research Association, & National Council on Measurement in Education. Standards for educational and psychological tests. Washington, D.C.: American Psychological Association, 1974.
Angoff, W.H. Scales, norms, and equivalent scores. In R.L. Thorndike (Ed.), Educational measurement (2nd Edition). Washington, D.C.: American Council on Education, 1971.
Burrill, L.E. & Wilson, R. Test service notebook 36: Fairness and the matter of bias. New York, N.Y.: The Psychological Corporation, 1980.
Ferguson, G.A. Statistical analysis in psychology and education (4th Edition). Toronto, Ont.: McGraw-Hill, 1976.
Ghiselli, E. Theory of psychological measurement. Toronto, Ont.: McGraw-Hill, 1964.
Glass, G.V. & Stanley, J.C. Statistical methods in education and psychology. Toronto, Ont.: Prentice-Hall, Inc., 1970.
Holmes, B.J. Individually-administered intelligence tests: an application of Anchor test norming and equating procedures in British Columbia. Unpublished doctoral dissertation, University of British Columbia, Vancouver, B.C., 1981.
Hopkins, K.D. & Stanley, J.C. Educational and psychological measurement and evaluation (6th Edition). Toronto, Ont.: Prentice-Hall, Inc., 1981.
Hull, C.H. & Nie, N.H. SPSS update 7-9: New procedures and facilities for releases 7-9. Toronto, Ont.: McGraw-Hill, 1981.
King, E.M. (Ed.) Canadian tests of basic skills. Toronto, Ont.: Nelson Canada Ltd., 1981.
King, E.M. (Ed.) Canadian tests of basic skills: teacher's guide. Toronto, Ont.: Nelson Canada Ltd., 1982.
Kirk, R.E. Experimental design: procedures for the behavioral sciences. Belmont, Ca.: Brooks/Cole Pub. Co., 1968.
MacGinitie, W.H. Gates-MacGinitie reading tests - Canadian edition. Toronto, Ont.: Nelson Canada Ltd., 1979.
MacGinitie, W.H. Gates-MacGinitie reading tests - Canadian edition: teacher's manual. Toronto, Ont.: Nelson Canada Ltd., 1980.
Nie, N.H., Hull, C.H., Jenkins, J.G., Steinbrenner, K. & Bent, D.H. Statistical package for the social sciences (2nd Edition). New York, N.Y.: McGraw-Hill, 1975.
Nunnally, J.C., Jr. Introduction to psychological measurement. Toronto, Ont.: McGraw-Hill, 1970.
Otis, A.S. & Lennon, R.T. Otis-Lennon school ability test. New York, N.Y.: Harcourt Brace Jovanovich, Inc., 1979a.
Otis, A.S. & Lennon, R.T. Otis-Lennon school ability test: manual for administering and interpreting. New York, N.Y.: Harcourt Brace Jovanovich, Inc., 1979b.
Randhawa, B.S. Achievement and ability status of grades four, seven and ten pupils in Saskatchewan. Regina, Sask.: Department of Education, 1979a.
Randhawa, B.S. Norms booklet: 1978 provincial testing program. Regina, Sask.: Department of Education, 1979b.
Salvia, J. & Ysseldyke, J.E. Assessment in special and remedial education (2nd Edition). Boston, Mass.: Houghton Mifflin Co., 1981.
Saskatchewan Education. Special education: A manual of legislation, regulations, policies and guidelines - revised. Regina, Sask.: Department of Education, 1982.
Stanley, J.C. & Hopkins, K.D. Educational and psychological measurement and evaluation. Toronto, Ont.: Prentice-Hall of Canada Ltd., 1972.
Tatsuoka, M.M. Multivariate analysis: techniques for psychological and educational research. Toronto, Ont.: John Wiley & Sons, Inc., 1971.

APPENDIX A

TEST ADMINISTRATORS' INSERVICE

AGENDA
TEST ADMINISTRATORS' INSERVICE
1982 N.L.S.D. SCHOOL TESTING PROGRAM
THURSDAY, OCTOBER 14, 9:30 A.M.
ROOM E (SASKATOON ROOM), HOLIDAY INN

9:30 A.M. Welcome to test administrators. Pass out test packages.
9:35 A.M. Purpose of the 1982 School Testing Program.
10:00 A.M.
Familiarization with the tests used in the Testing Program: nature of the tests.
10:15 A.M. - 11:45 A.M. Standardization of test administration procedures: instructions for administering the tests used in the 1982 School Testing Program.
11:45 A.M. Discussion.

Below is the list of tests used in the 1982 School Testing Program.

LIST A. Grade 3, and ages 9 years 0 months to 9 years 5 months.
i) Gates-MacGinitie Reading Tests, Level B, Form 1 (1979) - Vocab. 20 Min.; Comp. 35 Min.
ii) Mathematics Computation subtest of CTBS, Level 7, Form 5 (1981) - 22 Min.
iii) Otis-Lennon School Ability Test, Primary II, Form R (1979) - Part I 15 Min.; Part II 12 Min.; Part III 20 Min.

LIST B. Grade 5 and ages 11 years 0 months to 11 years 5 months.
i) Gates-MacGinitie Reading Tests, Level D, Form 1 (1979) - Vocab. 20 Min.; Comp. 35 Min.
ii) Mathematics Computation subtest of CTBS, Level 10, Form 6 (1981) - 20 Min.
iii) Otis-Lennon School Ability Test, Elementary, Form R (1979) - 45 Min.

LIST C. Grade 7 and ages 14 years 0 months to 14 years 5 months.
i) Gates-MacGinitie Reading Tests, Level D, Form 2 (1979) - Vocab. 20 Min.; Comp. 35 Min.
ii) Mathematics Computation subtest of CTBS, Level 13, Form 6 (1981) - 20 Min.
iii) Otis-Lennon School Ability Test, Intermediate, Form R (1979) - 45 Min.

LIST D. Grade 9 and ages 16 years 0 months to 16 years 5 months.
i) Gates-MacGinitie Reading Tests, Level E, Form 2 (1979) - Vocab. 20 Min.; Comp. 35 Min.
ii) Mathematics subtest of the CTBS, Level 15, Form 5 (1981) - 40 Min.
iii) Otis-Lennon School Ability Test, Advanced, Form R (1981) - 40 Min.

ANSWER SHEET CODING
J=1 G-Mac Vocabulary
J=2 G-Mac Comprehension
J=3 CTBS Math
J=4 Otis-Lennon School Ability

POINTS TO NOTE:
1. Make sure that there will be no disturbances in the testing room during the time of testing.
2. Use HB pencils when the answer sheets are used. The marks should be dark and neat.
3.
If any students arrive after the commencement of instructions, they should not be included in that test session.
4. The maximum number of students tested at one sitting should not exceed 20. If there are more than 12 students being tested at a sitting, a monitor should be used to help the test administrator. If by chance the number at a sitting exceeds 24, two monitors should be used.
5. Make a systematic check of the students during the testing session. Make sure students have started with the correct item and are marking their answers in the appropriate place on the answer sheet. Check to see that the answer sheets are being marked darkly and neatly. Help students if they don't quite understand what they're supposed to do, but don't give away the answers.

TESTING DATES: The tests should be administered to the selected students between October 18th and October 29th, 1982. Please try as much as possible to conform with these dates.

TEST ADMINISTRATION TIME: Total time to administer the tests to a group of selected students is approximately 3½ hours.

EXAMPLES OF DAILY TEST SCHEDULES

MODEL 1
Grade 5, 7, or 9:
Day 1:
9:15 A.M. - 10:10 A.M.: Vocabulary (Steps 1-3)
(5 minute break)
10:15 A.M. - 11:05 A.M.: Comprehension (Step 4)
(15 minute break)
11:40 A.M. - 12:00 NOON: Math (Step 5)
* Make sure grade 9's are ready to start absolutely no later than this time.
1:15 P.M. - 2:15 P.M.: Ability

Grade 3:
Day 1:
9:15 A.M. - 10:00 A.M.: Vocabulary
(1 hour break)
11:00 A.M. - 12:00 NOON: Comprehension
1:10 P.M. - 1:35 P.M.: Math
1:40 P.M. - 2:30 P.M.: Ability (Parts I & II)
Day 2:
9:15 A.M. - 9:45 A.M.: Ability (Part III)

MODEL 2
Grade 5, 7, or 9:
Day 1: 9:15 A.M. - 10:10 A.M.: Vocabulary (Steps 1-3)
Day 2: 9:15 A.M. - 10:05 A.M.: Math
Day 3: 9:15 A.M. - 10:05 A.M.: Comprehension
Day 4: 9:15 A.M. - 10:15 A.M.: Ability

Grade 3:
Day 1: 9:15 A.M. - 10:00 A.M.: Vocabulary
Day 2: 9:15 A.M. - 10:15 A.M.: Comprehension
Day 3: 9:15 A.M. - 9:40 A.M.: Math
Day 4: 9:15 A.M. - 10:05 A.M.: Ability (Parts I & II); 11:00 A.M. - 11:30 A.M.: Ability (Part III)

TEST ADMINISTRATOR'S COMMENT SHEET
Comment if any problems occurred during testing.

STEP 1: Presented to - Students of all grade and age groups being tested.

Mention that they have been selected to take part in the N.L.S.D. Testing Program. Students in all other N.L.S.D. schools of the same grade or age as themselves will also be tested. These tests have nothing to do with whether or not they pass or fail the year, but will provide the N.L.S.D. with some valuable information about its students. Mention to the students not to worry, but to try to do the best they can by answering as many questions as they can. Mention that these tests are also used with older students and students in higher grades; therefore they are not expected to answer all of the questions or even most of the questions. Some students may even find all of the questions difficult. Say, "Remember, just try your best and don't worry. If you're not sure of an answer, it's o.k. to make a guess at the best answer after you've thought about it."

STEP 2: Using computer-scored answer sheets - Presented to Grade 5 (and 11 year olds), Grade 7 (and 14 year olds), and Grade 9 (and 16 year olds).

Pass out an answer sheet to each student. While passing out the answer sheets, mention to the students not to mark the answer sheet in any way until told to do so. Mention that these are special answer sheets which are going to be computer scored; therefore it is important not to fold, bend, tear or improperly mark the answer sheets. With an HB pencil, tell the students to write their name (last name first, then first name) in the blocks at the top of the page, one letter per block.
Demonstrate on board, giving an example:

H A N S E N   J O E

Have monitors check to see that this is being done correctly. When this seems to be in order, show students how to fill in the circles below the block letters. Note that the letters in the circles are in alphabetical order. Demonstrate on board.

IMPORTANT: MENTION TO THE STUDENTS TO FILL IN THE WHOLE CIRCLE, NOT MORE OR LESS. THE CIRCLES MUST BE FILLED IN DARK ENOUGH THAT THE LETTERS INSIDE ARE HARDLY VISIBLE. IF A MISTAKE IS MADE, ERASE THE ERROR AS MUCH AS POSSIBLE, THEN FILL IN THE CORRECT CIRCLE.

Tell the students to be as neat as possible. Have monitors check and help students out. Don't go on until students have this done correctly. Now ask the students to look at the place on the answer sheet marked SEX (point). Tell them to fill in the appropriate circle; M for male or boy, F for female or girl. Monitors check. Next, ask them to find the section on their answer sheet where they see the letters A B C D ... etc. above the blocks (point). Tell them to find the block below the letter J and write the number 1 inside the block. Demonstrate on board. Then, ask them to fill in the circle with the number 1 inside it, neat and dark. Mention this means that this answer sheet will be used to record their answers to the first test. Monitors check. Proceed to test 1.

STEP 3: Test 1: Instructions for administering the Gates-MacGinitie VOCABULARY subtest.

IMPORTANT: DOUBLE CHECK TO MAKE SURE THAT YOU HAVE THE CORRECT TEST LEVEL AND FORM FOR THE GROUP YOU ARE TESTING.

A. Administering test 1 to Grade 3's and 9 year olds: Test time: 20 mins.
Follow the instructions given on the yellow copies taken from the manual. Please read the test instructions over carefully before administering the test.

B. Administering test 1 to Grade 5 and 11 year olds, OR Grade 7 and 14 year olds, OR Grade 9 and 16 year olds: Test time: 20 mins.
Write the following examples on the board exactly as given below:

V-1 REPAIR: (A) replace, (B) cut in two, (C) fix, (D) pull out, (E) lift
V-2 WEAK: (F) heavy, (G) skinny, (H) crying, (I) gone, (J) not strong
101. (1) (2) (3) (4) (5)
102. (1) (2) (3) (4) (5)

Before passing out the test booklets, ask students to turn to SIDE 2 of their answer sheets (it is marked on the sheet). Ask the students to point to ROW 101. Mention that this is where they will mark their answer to the first example when it's time to begin. Pass out the test booklets. Mention to the students not to open their test booklets. When everyone has a copy, ask them to turn their booklets over to the back cover. Mention that this is the sample page. Say, "The test which you are going to do is a Vocabulary test which measures your knowledge of words - your vocabulary." Ask them to look at Row V-1, the first example, and the dark word beside it. Say, "This word is repair. There are 5 words or phrases below this word. They are A - replace, B - cut in two, C - fix, D - pull out, and E - lift. One of these choices means the same or nearly the same as repair. Raise your hand if you know the answer." (Ask someone) "Yes, the answer is C - fix (point to the answer on the board). Fix means the same as repair. Now look. Fix is one-two-three (point while counting) .... three choices down, therefore you should fill in the circle with the 3 inside it in Row 101." (Do this on the board) Ask everyone to fill in circle 3 in Row 101 of the answer sheet. Answer any questions. Monitors check. When everything is in order say, "O.k., let's do example V-2 using Row 102 of our answer sheet. The dark word is weak (point). The 5 choices are F - heavy, G - skinny, H - crying, I - gone, and J - not strong. Which choice means the same or nearly the same as weak? Raise your hand if you know." (Ask someone) "Yes, the answer is J - not strong (point). Not strong is one-two-three-four-five (point while counting) .... five choices down, therefore you should fill in the circle with the number 5 inside it in Row 102." (Demonstrate on board) Check to see that this is done correctly by all students.

Now tell the students to turn their answer sheets over to SIDE 1 and flip their test booklets over to the front cover. Mention to them that the work they are to do in the test booklet is just like the two examples they have completed. Mention that they start at Row 1 on their answer sheet and work down and across to Row number 45 (point). (Open Administrator's test booklet). Point to item number 1, indicate that they are to start here and mark their answer on Row 1, and do all the others the same way. Then turn the page and indicate where they are to go and where to stop. Tell them not to write in the test booklets. Tell students they have 20 minutes to work on this. Stress that they work carefully, yet not spend too much time on any one question. Remind them to try their best. Tell them that if they have any questions about what they are supposed to do to raise their hand, and they will be assisted. If they are finished early, they should check over their work and wait quietly for the others to finish. Tell them to open their booklets and start.

TIME STARTED: (Record)

Test administrators and monitors should walk through the test area making sure students understand what they are to do. After 10 minutes write the time remaining on the board. After exactly 20 minutes collect the answer sheets. Tell students to close test booklets.

STEP 4: Test 2: Instructions for Administering the Gates-MacGinitie COMPREHENSION subtest.

IMPORTANT: DOUBLE CHECK TO MAKE SURE THAT YOU HAVE THE CORRECT TEST LEVEL AND FORM FOR THE GROUP YOU ARE TESTING.

A. Administering test 2 to Grade 3's and 9 year olds: Test time: 35 minutes.
Follow the instructions given on the yellow copies taken from the manual.
Please read the test instructions over carefully before administering the test.

B. Administering test 2 to Grade 5's and 11 year olds, OR Grade 7's and 14 year olds, OR Grade 9's and 16 year olds: Test time: 35 minutes.

Write the following on the board exactly as given below before proceeding to the examples:

C-1: 101. (1) (2) (3) (4)
C-2: 102. (1) (2) (3) (4)

Pass out an answer sheet to each student. Tell students to write their name in the blocks at the top of the sheet as done previously (they need not fill in the circles below). Then, in the block below the letter J, write in 2, finally filling in the circle with the number 2 inside it below this block. (Demonstrate on board). Monitors check. Ask students to turn to SIDE 2 of their answer sheets. Ask students to find ROW 101. This is where they will answer the first example. Now pass out test booklets (if they were collected previously) and ask students to turn them over to the sample page on the back cover. Say, "This test measures how well you understand what you read - your reading comprehension." (Hold up your copy of the test booklet, point to the comprehension sample and say) "Read the sample story to yourself as I read it aloud." (Read passage) Then read question C-1 and each of the 4 choices. Say, "Raise your hand if you know the answer." (Ask someone). "Yes, the answer is D - the baby bird breaks open the shell. This choice is one-two-three-four (point on the board while counting) .... four choices down, therefore you should fill in the circle with the 4 inside it in ROW 101." (Do this on the board). Monitors check. When everything is in order say, "O.k., let's do example C-2 (point to it in test booklet)." (Read it and the 4 choices). "Raise your hand if you know the answer." (Ask someone). "Yes, the answer is F - tooth. This choice is one-two (point on the board while counting) ... two choices along, so fill in the circle with the 2 inside it in ROW 102." (Demonstrate). Monitors check. "Now let's pretend that the answer was H - pip. This is one-two-three-four (point on the board while counting) ... the fourth choice, therefore you would fill in the circle with the 4 inside it. (Demonstrate on ROW 102 on board). Watch again how I count. (Count again). You must always count this way too when doing your work here. Does everyone understand?"

Now tell the students to turn their answer sheets over to SIDE 1 and flip their test booklets over to the front cover. Mention to them that the work they are to do in the test booklet is just like the two examples they have completed. Mention that they start at Row 1 on their answer sheet and work down and across to Row number 43 (point). (Open Administrator's test booklet). Point to item number 1, indicate that they are to start here and mark their answer on Row 1, and do all the others the same way. Then turn the page and indicate where they are to go and where to stop. Tell them not to write in the test booklets. Tell students they have 35 minutes to work on this. Stress that they work carefully, yet not spend too much time on any one question. Remind them to try their best. Tell them that if they have any questions about what they are supposed to do to raise their hand, and they will be assisted. If they are finished early, they should check over their work and wait quietly for the others to finish. Tell them to open their booklets and start.

TIME STARTED: (Record)

Test administrators and monitors should walk through the test area making sure students understand what they are to do, and are marking their answers darkly and neatly. After 15 minutes write the time remaining on the board. After exactly 35 minutes collect the answer sheets. Tell students to close test booklets.

STEP 5: Test 3: Instructions for Administering the CTBS Mathematics subtest.
(Make sure students have scratch paper.)

IMPORTANT: DOUBLE CHECK TO MAKE SURE THAT YOU HAVE THE CORRECT TEST LEVEL AND FORM FOR THE GROUP YOU ARE TESTING.

A. Administering test 3 to Grade 3's and 9 year olds: Test time: approx. 22 minutes.
Follow the instructions given on the yellow copy taken from the manual. Please read the test instructions over carefully before administering the test.

B. Administering test 3 to Grade 9's and 16 year olds: Test time: 40 minutes.
Follow the instructions given on the pink copy taken from the manual. Please read over the test instructions carefully before administering the test.

C. Administering test 3 to Grade 5's and 11 year olds, OR Grade 7's and 14 year olds: Test time: 20 minutes.

Write the following examples on the board:

Ex. 1   2 + 3 =   1) 8   2) 1   3) 5   4) N     101. (1) (2) (3) (4)
Ex. 2   4 - 1 =   1) 6   2) 2   3) 0   4) N     102. (1) (2) (3) (4)

Pass out an answer sheet to each student. Tell students to write their name in the blocks at the top of the sheet as done previously. Then, in the block below the letter J, write in 3, finally filling in the circle with the number 3 inside it below this block. (Demonstrate on board). Monitors check. Ask students to turn to SIDE 2 of their answer sheets. Ask students to find ROW 101. This is where they will answer the first example. Now pass out the math test booklets (face down). Say, "We are going to work on our third test - the math test." Point to the first example on the board. Say, "Look at this example. Two plus three equals .... (point to answers) ... eight, ... one, ... five, ... none of the above choices. Raise your hand if you know the answer." (Ask someone). "Yes, the answer is five ... which is the third choice (point to numerical choice marker), therefore you should fill in the third circle in Row 101, the circle with the 3 inside it." Monitors check. "Let's do the second example. Use ROW 102 of your answer sheet for your answer. Four minus one equals .... (point to answers) .... six, ... two, ... zero, ... none of the above choices. Raise your hand if you know the answer." (Ask someone). "Yes, the answer is N, none of the above choices. Four minus one equals three, and three is not given - it is not shown. Look, N is the fourth choice (point to numerical choice marker), therefore you should fill in the fourth circle in Row 102, the circle with the 4 inside it." Monitors check.

Grade 5 or 11 year olds:
"Now turn your answer sheet to SIDE 1. Point to Row 13. (Monitors check). This is where you are going to start, because you are starting with number 13 in your test booklet." (Show the students where no. 13 is in the test booklet by holding up your booklet and pointing). Say, "You see, no. 13 is right below where it says, Level 10 start here." Now ask students to open their test booklets and find no. 54. Say, "Right here where it says Level 10 stop here."

Grade 7 or 14 year olds:
"Now turn your answer sheet to SIDE 1. Point to Row 55. (Monitors check). This is where you are going to start, because you are starting with number 55 in your test booklet." (Show the students where no. 55 is in the test booklet by holding up your booklet and pointing). Say, "You see, no. 55 is right below where it says, Level 13 begin here." Now ask students to open their test booklets and find no. 99. Say, "Right here where it says Level 13 stop here."

Grade 5 or 11 year olds:
Tell them that this is the last problem they will solve if they have time. Monitors check. Now ask them to turn to no. 13 in their test booklet. Tell them when it is time to begin, to work carefully, yet quickly, not spending too much time on any one problem. If they have any questions about what they are to do, to raise their hand. Remind them to begin at no. 13 of their answer sheet. Write 20 minutes on the board and tell them to begin: Time started: (Record) Monitors check. Write the time left after 10 minutes. STOP exactly after 20 minutes.
Grade 7 or 14 year olds:
Tell them that this is the last problem they will solve if they have time. Monitors check. Now ask them to turn to no. 55 in their test booklet. Tell them when it is time to begin, to work carefully, yet quickly, not spending too much time on any one problem. If they have any questions about what they are to do, to raise their hand. Remind them to begin at no. 55 of their answer sheet. Write 20 minutes on the board and tell them to begin: Time started: (Record) Monitors check. Write the time left after 10 minutes. STOP exactly after 20 minutes.

Circulate among the students to make certain that they have started with the correct item and are marking their answers in the appropriate place on the answer sheet. If any students finish extremely early, check to be certain that they have worked to the end of their test. You should make a systematic check of the students' work during the testing period.

PINK COPY (CTBS directions for administering, Levels 15-18):

After students have worked exactly 40 min, say: Stop. Close your test booklet and turn your answer sheet over. We will take a rest of ___ minutes before starting the new test.

Use this table to find where your level is to BEGIN and STOP on this test. You will have 40 minutes for this test.

Mathematics
LEVEL   BEGIN With           STOP After
15      Page 22, Item 1      Item 48, Page 27
16      Page 23, Item 17     Item 64, Page 29
17      Page 25, Item 33     Item 80, Page 31
18      Page 27, Item 49     Item 96, Page 32

DIRECTIONS FOR ADMINISTERING: Mathematics

When the students have returned to their assigned places, distribute two sheets of scratch paper to each student. Say: Now you are going to take a mathematics test. First, find your place on the answer sheet for Test 2: Mathematics. Pause. Now turn to page 21 in your test booklet. Read the directions silently while I read them aloud.

MATHEMATICS
Directions: The purpose of this test is to find out how well you are developing your knowledge, understanding, and skills in the area of mathematics. In many of the items, you will need to do some computations. Use the scratch paper which you have been given for your computations; do not make any marks in the test booklet. Read the item and do any necessary computations. Then select the answer which is correct or clearly better than the others and mark your choice on your answer sheet. Always be sure to mark your answer sheet on the numbered space corresponding to the item you are working on. If you do not understand an item, omit it and go on to those which you do know. You may return to omitted items if time permits. Now study the Sample Item on this page. It is an example of the items in this test. Pause.

S1. SAMPLE ITEM: What is 75% of $12.00?
A. $3.00   B. $4.00   C. $8.00   D. $9.00   E. None of the above
(Sample answer sheet shown with the circle for D marked.)

If you finish early, you may check your answers. Then close the test booklet, turn your answer sheet over, and sit quietly until the end of the test is announced. You may not look at any other test. Now look at the table at the end of the directions. Find where your level is to begin. [If all students in the room are in the same level, say: Turn to page ___ and begin with item number ___. (See table.)] Pause. Remember, if you do any figuring, do it on scratch paper and not in the booklet. When everyone is ready to start, say: Turn to the correct page for your level. Ready, you may begin.

Start your stopwatch, or set your wristwatch on an even hour. Also note the time on the wall clock, and jot it down. Start Time: ___ Stop Time: ___

Circulate among the students to make certain that they have started with the correct item and are marking their answers in the appropriate place on the answer sheet. If any students finish extremely early, check to be certain that they have worked to the end of their test. You should make a systematic check of the students' work during the testing period. After students have worked exactly 40 min, say: Stop.
Close your test booklet and turn your answer sheet over. This completes the first session. Sit quietly while the test booklets and answer sheets are collected. Collect the booklets and answer sheets according to the procedure you developed as part of your advance preparations.

DIRECTIONS FOR ADMINISTERING Written Expression

If you are following the recommended schedule, Test 3 begins the second testing session. When the students have returned to their assigned places, pass out the test booklets, answer sheets, and scratch paper.

TEST ADMINISTRATORS' INSERVICE 1982
N.L.S.D. SCHOOL TESTING PROGRAM

STEP 6: Test 4: Instructions for Administering the Otis-Lennon School Ability Test.

IMPORTANT: DOUBLE CHECK TO MAKE SURE THAT YOU HAVE THE CORRECT TEST LEVEL AND FORM FOR THE GROUP YOU ARE TESTING.

A. Administering test 4 to Grade 3's and 9 year olds: Test time: Part I - 15 minutes; Part II - 12 minutes; Part III - approximately 20 minutes. Follow the instructions given on the yellow copies taken from the manual. Please read the test instructions over carefully before administering the test.

B. Administering test 4 to Grade 5's and 11 year olds, OR Grade 7's and 14 year olds: Test time: 45 minutes. OR Grade 9's and 16 year olds: Test time: 40 minutes.

Write the following on the board exactly as given before proceeding to the examples:

        a    b    c    d    e
        1    2    3    4    5
        f    g    h    j    k
V 101. (1)  (2)  (3)  (4)  (5)
W 102. (1)  (2)  (3)  (4)  (5)
X 103. (1)  (2)  (3)  (4)  (5)
Y 104. (1)  (2)  (3)  (4)  (5)
Z 105. (1)  (2)  (3)  (4)  (5)

Pass out an answer sheet to each student. Tell students to write their name in the blocks at the top of the sheet as done previously. Then in the block below the letter J, write in 4, finally filling in the circle with the number 4 inside it below this block. (Demonstrate on board) Monitors check. Ask students to turn to SIDE 2 of their answer sheets. Ask students to find ROW 101. This is where they will answer the first example.
Now pass out the test booklets and ask students to turn over to the practice page on the back cover. Mention that this is a test in which the students are asked to solve different kinds of problems. Proceed to the instructions on the pink copies.

IMPORTANT: THESE EXAMPLES AND THE EXPLANATIONS GIVEN FOR THE ANSWERS AND HOW TO MARK THE ANSWER SHEET ARE VERY IMPORTANT. PROCEED SLOWLY.

NOTE: Example V is different for the grade 5's and 11 year olds compared to example V for the grade 7's (and 14 year olds) and the grade 9's (and 16 year olds). All other examples are common to these 3 groups.

Manual for Administering and Interpreting, O-LSAT Elementary (PINK)

Practice Problems: Directions Common to All Answer Documents

Continue by SAYING: Find the place on the back cover of your booklet where it says "Practice Problems." Look at the first practice problem, the one with the letter V in front. Read it to yourself as I read it aloud. Do not call out the answer. "The opposite of big is — a good b large c little d soft e light" In this problem, we are looking for a word that means the opposite of "big." Opposites are words like old and new, up and down, beginning and end, open and close. Now think about which one of the five words in Problem V is the opposite of big. (Pause.) The correct word is "little," which is choice c.

[Handwritten annotation, partly illegible: the examiner counts off the answer choices aloud while pointing to the board, and students fill in the matching circle in Row 101.]

In a series, the letters, numbers, or drawings go together by following a rule. In this series, every third letter of the alphabet has been skipped. Notice how the series starts with AB, skips C, continues with DE, skips F, continues with GH, skips I, and continues with JK. If this series were continued beyond K, the L would be skipped, and the next letter would be M, which is choice b. Does anyone have a question?
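The skipping rule described for the letter-series problem can be checked mechanically. The sketch below is illustrative only (the function name `letter_series` is not from the manual): it keeps two letters and drops every third one.

```python
import string

def letter_series(n):
    """Build the practice series by keeping two letters and
    skipping every third: A B (skip C) D E (skip F) ..."""
    kept = [c for i, c in enumerate(string.ascii_uppercase) if i % 3 != 2]
    return kept[:n]

print(letter_series(9))   # ['A', 'B', 'D', 'E', 'G', 'H', 'J', 'K', 'M']
```

Continuing the printed series A B D E G H J K one step gives M, matching choice b.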
[Handwritten annotation, partly illegible: directions for counting off the choices on the board and marking the answer circle in the correct row.]

Answer students' questions. If students seem to have difficulty understanding what opposites are, you may give the following examples: north and south, fat and skinny, happy and sad, come and go, pretty and ugly. Then SAY: Now look at Problem W. Read it to yourself as I read it aloud. Do not call out the answer. "The numbers in the box go together in a certain way. Find the number that goes where you see the question mark (?) in the box." Pause while students try to figure out the answer. Then SAY: To find out what number is missing in the box, you must discover how the numbers go together. You can do this by looking either across or down the box. If you look across the box, you will notice that each number is 1 more than the one before it, so the missing number is 6. Each number going down the box is 1 less than the number above it, so again the answer is 6.

[Handwritten annotation, partly illegible: point to the board while counting; students fill in the answer circle in Row 102.]

Now look at Problem Y. Read it to yourself as I read it aloud. "Hat is to head as shoe is to — f sock g toe h buckle j leg k foot"

[Handwritten annotation, partly illegible: raise your hand if you know the answer ... the answer is foot, choice k]

... since a shoe is worn on the foot just as a hat is worn on the head.

[Handwritten annotation, partly illegible: fill in circle 5 in Row 104.]

Answer students' questions. Then go on to Problem X, SAYING: Now look at Problem X. Read it to yourself as I read it aloud. Try to figure out the problem, but do not mark or call out the answer. "What letter comes next in this series? A B D E G H J K ?" Pause while students look over the problem. Then SAY:

Answer students' questions. Then go on to Problem Z, SAYING: Now look at Problem Z. Read it to yourself as I read it aloud. "The drawings in the first part of the row go together to form a series.
In the next part of the row, find the drawing that goes where you see the question mark (?) in the series." Mark your answer in Row 105. (Pause.) You should have filled in circle 1. In each box of the series, there are three triangles. In the first box, the first triangle is black; in the second box, the middle triangle is black; in the third box, the last triangle is black; and in the fourth box, the series starts to repeat itself. If this series were continued, the first triangle in the fifth box would be white, the middle triangle would be black, and the last triangle would be white, so drawing a is correct.

[Handwritten annotation, partly illegible: you should have filled in circle 1 in Row 105; if you marked the wrong circle or the wrong row, erase your mark completely and fill in the correct one.]

Practice Problems: Directions Common to All Answer Documents

Continue by SAYING: Find the place on the back cover of your booklet marked "Directions." Read the directions silently as I read them aloud. "This is a test to see how well you can do different kinds of problems. First, do the practice problems below, choosing the best answer from among the five choices. The answers to the first two problems have already been marked." Now read the first practice problem, the one with the letter V in front, and decide on an answer. Pause. Then SAY: The large white triangle is to the small black triangle as the large white square is to the small black square, so choice c is correct.

[Handwritten annotation, partly illegible: OTIS — Grade 7 (and 9).]

In a series, the letters, numbers, or drawings go together by following a rule. In this series, every third letter of the alphabet has been skipped. Notice how the series starts with AB, skips C, continues with DE, skips F, continues with GH, skips I, and continues with JK. If this series were continued beyond K, the L would be skipped, and the next letter would be M, which is choice b. Does anyone have a question?
[Handwritten annotation, partly illegible: the examiner counts off the answer choices on the board while explaining the marking.]

Are there any questions? Answer students' questions, explaining further if necessary. Then SAY: Now look at Problem W. Read it to yourself as I read it aloud. Do not call out the answer. "The numbers in the box go together in a certain way. Find the number that goes where you see the question mark (?) in the box." Pause while students try to figure out the answer. Then SAY: To find out what number is missing in the box, you must discover how the numbers go together. You can do this by looking either across or down the box. If you look across the box, you will notice that each number is 1 more than the one before it, so the missing number is 6. Each number going down the box is 1 less than the number above it, so again the answer is 6.

[Handwritten annotation, partly illegible: point to the board while counting; students fill in the answer circle in Row 102.]

Now look at Problem Y. Read it to yourself as I read it aloud. "Hat is to head as shoe is to — f sock g toe h buckle j leg k foot"

[Handwritten annotation, partly illegible: raise your hand if you know the answer ... the answer is foot, choice k]

... since a shoe is worn on the foot just as a hat is worn on the head.

Answer students' questions. Then go on to Problem Z, SAYING:

[Handwritten annotation, partly illegible: ... Row 104 ...]

Answer students' questions. Then go on to Problem X, SAYING: Now look at Problem X. Read it to yourself as I read it aloud. Try to figure out the problem, but do not mark or call out the answer. "What letter comes next in this series? A B D E G H J K ?" Pause while students look over the problem. Then SAY:

Now look at Problem Z. Read it to yourself as I read it aloud. "The drawings in the first part of the row go together to form a series. In the next part of the row, find the drawing that goes where you see the question mark (?) in the series." Mark your answer in Row 105. (Pause.) You should have filled in circle 1.
In each box of the series, there are three triangles. In the first box, the first triangle is black; in the second box, the middle triangle is black; in the third box, the last triangle is black; and in the fourth box, the series starts to repeat itself. If this series were continued, the first triangle in the fifth box would be white, the middle triangle would be black, and the last triangle would be white, so drawing a is correct.

[Handwritten annotation, partly illegible: drawing a is the choice ... fill in circle 1 ... (GO TO PAGE ___).]

Answer students' questions. Then SAY: You are to do all the problems in this booklet the same way as you did the practice problems. Read each problem carefully and choose the one answer that you think is best. Then find the row that has the same number as the problem you are working on, and fill in the circle for the answer you have chosen.

Answer students' questions. If students are using a separate answer sheet, remind them that they are to mark all their answers on the answer sheet and that they are not to mark on their booklets. Then SAY: Now open your booklet to page 2 and begin working.

Enter the starting and ending times in the box below.

Use a pencil to mark all your answers. Make your mark heavy and dark and completely fill the answer space. You may mark an answer even if you are not absolutely sure it is correct, but do not guess blindly. To change an answer, you must erase your mark completely and then fill in the space for the answer you think is right. You will have 45 minutes to work on this test. You are not expected to be able to do all the problems, but try to get as many problems right as you can. If you get stuck on a problem, skip it and go on to the next one. When you come to the end of a page, go on to the next page and continue working until you come to the end of the test. Raise your hand if you need another pencil, and I will give you one.
If you finish before I say "Stop," you may go back to work on any problem you skipped, or you may go over the ones you have done. Are there any questions?

Starting Time: ___ (Hour, Minutes)    Time Limit: 45 minutes    Ending Time: ___

During the test, move quietly about the room to make sure that students are marking their answers correctly. Also make sure that students are continuing with the test after they come to the end of a page. At the end of 45 minutes, SAY: Stop work now. Put your pencil down. Close your booklet and leave it on your desk with the front cover up. Collect the test booklets and answer sheets, if used. This concludes the testing session.

AFTER TESTING — See the After Testing section in the "General Directions for Administering" for the type of answer document being used.

APPENDIX B

TABLES OF NLSD NORMS ON ACHIEVEMENT AND ABILITY TESTS

Grade 3

TABLE 1
Northern Lights School Division Norms for Grade 3 Students tested in the Fall on the Vocabulary Subtest of the Gates-MacGinitie Reading Tests - Canadian Edition, Level B, Form 1.

Percentile Ranks, Stanines, and T-Scores Corresponding to Raw Scores.

Raw Score  PR  Sta   T    Raw Score  PR  Sta   T
    1       1   1   27        26     56   5   51
    2       1   1   28        27     58   5   52
    3       1   1   29        28     61   6   53
    4       1   1   30        29     65   6   54
    5       2   1   31        30     68   6   55
    6       2   1   32        31     70   6   56
    7       4   1   33        32     71   6   57
    8       5   2   34        33     74   6   58
    9       6   2   35        34     76   6   59
   10       7   2   36        35     80   7   60
   11       8   2   37        36     84   7   61
   12      11   2   37        37     87   7   62
   13      14   3   38        38     89   7   63
   14      16   3   39        39     91   8   64
   15      19   3   40        40     93   8   65
   16      23   3   41        41     95   8   66
   17      26   4   42        42     97   9   67
   18      29   4   43        43     98   9   68
   19      31   4   44        44     99   9   69
   20      35   4   45        45     99   9   70
   21      38   4   46
   22      43   5   47
   23      46   5   48
   24      49   5   49
   25      53   5   50

Raw Score: Mean 24.70, S.D. 10.15. Number of Students - 361 (84% of entire NLSD grade 3 population)

Grade 3

TABLE 2
Northern Lights School Division Norms for Grade 3 Students tested in the Fall on the Comprehension Subtest of the Gates-MacGinitie Reading Tests - Canadian Edition, Level B, Form 1.
Percentile Ranks, Stanines, and T-Scores Corresponding to Raw Scores. Raw '• Score PR Sta T Raw Score PR Sta T 1 1 1 25 26 64 6 54 2 1 1 26 27 67 6 55 3 1 1 27 28 71 6 56 4 1 1 29 29 75 6 57 5 1 1 30 30 78 7 59 6 1 1 31 31 81 7 60 7 2 1 32 32 84 7 61 8 3 1 33 33 87 7 62 9 5 2 34 34 90 8 63 10 8 2 35 35 92 8 64 11 10 2 37 36 94 8 65 12 13 3 38 37 95 8 67 13 16 3 39 38 97 9 68 14 20 3 40 39 98 9 69 15 23 3 41 40 99 9 70 16 26 4 42 17 29 4 44 18 32 4 45 19 35 . 4 46 20 38 4 47 21 42. 5 48 22 46 5 49 23 50 ' 5 50 24 54 5 52 25 60 5 53 Raw Score: l\fean 22.60 S.D. 8.65 Number of Students - 361 (84% of entire NLSD grade 3 population) 110 Grade 3 TABLE 3 Northern Lights School Division Norms for Grade 3 Students tested in the Fall on the Reading Total of the Gates-MacGinitie Reading Tests -Canadian Edition, Level B, Form 1. Percentile Ranks, Stanines, and T-Scores Corresponding to Raw Scores. Raw Score PR Sta T" Raw Score PR Sta T 1 26 14 3 38 2 27 16 3 39 3 28 17 3 39* 4 29 19 3 40 5 1 1 27 30 21 3 40 6 1 1 27 31 23 3 41 7 .1 1 28 32 25 4 41 8 1 1 28 33 26 4 42 9 1 1 29 34 26 4 43 10 1 1 29 35 27 4 43 11 1 1 30 36 28 4 44 12 2 1 30 37 30 4 44 13 2 1 31 38 32 4 45 14 2 •1 ' 32 39 34 4 45 15 3 1 32 40 35 4 46 16 3 1 33 41 36 4 46 17 4 1' 33 42 38 4 • 47 18 5 2 34 43 40 4 48 19 5 •2 34 44 42 5 • 48 20 6 2 35 45 45 5 49 21 7 2 35 46 47 5 49 22 8 2 36 47 49 5 50 23 9 2 36 48 51 5 50 24 . 11 2 37 49 55 5 51 25 13 3 38 50 57 5 51 Grade 3 111 TABLE 3 (cont'd) Percentile Ranks, Stanines, and T-Scores Corresponding to Raw Scores. Raw '• Score PR Sta T Raw Score PR Sta T 51 58 5 52 76 94 8 66 52 59 5 53 77 96 8 66 53 61 6 53 78 97 9 67 54 62 6 54 79 98 9 67 55 63 6 54 80 99 9 68 56 64 6 ' 55 81 99 9 69 57 66 6 55 82 99 9 69 58 69 6 56 83 99 9 70 59 71 6 56 84 99 9 70 60 73 6 57 85 99 9 71 61 75 6 58 62 77 6 58 63 79 7 59 64 80 7 59 65 81 7 60 66 81 7 60 67 83 7 61 68 84 7 61 69 86 . 7 62 70 87 7 63 71 88. 7 63 72 89 . 7 64 73 90 8 64 74 92 8 65 75 93 8 65 Raw Score: Mean 47.41 S.D. 
18.08 Number of Students - 359 ( 83% of entire NLSD grade 3 population) Blanks for derived scores at the extreme ranges of the scale indicate that the corresponding raw scores were not obtained in the standard ization. 112 Grade 3 TABUS 4 Northern Lights School Division Norms for Grade 3 Students tested in the Fall on the Mathematics Computation Subtest of the Canadian Tests' of Basic Skills, Level 7, Form 5. Percentile Ranks, Stanines, and T-Scores Corresponding to Raw Scores. Raw Score PR Sta Raw Score PR Sta 1 2 1 1 1 3 1 1 3 4 1 1 5 5 . 1 1 8 6 i ' 1 10 7 l 1 13 8 l 1 15 9 2 1 18 10 2 1 20 11 3 1 23 12 3 1 25 13 3 1 27 14 4 1 30 15 6 2 32 16 9 2 35 17 12 3 37 18 14 . 3 40 19 17 3 42 20 21 3 45 21 28. 4 47 22 36 4 50 23 45 5 52 24 59 5 55 25 77 6 57 26 93 60 Raw Score: Mean 22.11 S.D. 4.04 Number of Students - 351 (81% of entire NLSD grade 3 population) Blanks for derived scores at the extreme ranges of the scale indicate that the corresponding raw scores were not obtained in.the standard ization. 113 Grade 3 TABLE 5 Northern Lights School Division Norms for Grade 3 Students tested in the Fall on the Otis-Lennon School Ability Test, Primary II, Form R. Percentile Ranks, Stanines, and T-Scores Corresponding to Raw Scores. Raw • Score PR Sta T Raw Score PR Sta T 1 1 1 26 12 3 37 2 1 1 27 13 3 38 3 1 1 28 15 3 39 4 1 1 29 17 3 40 5 1 1 30 19 3 40 6 1 1 31 21 3 41 7 1. 1 32 23 3 42 8 1 1 33 26 4 43 9 1 1 34 28 4 43 10 1 1 35 30 4 44 11 1 1 36 32 4 45 12 1 1 37 34 4 46 13 1 1 38 37 4 46 14 1 1 39 40 4 47 15 1 1 <30 40 43 5 48 16 1 1 30 41 45 5 49 17 1 1 30 42 47 5 49 18 2 1 31 43 50 5 50 19 3 1 32 44 52 5 • 51 20 4 1 33 45 55 5 52 21 4 1 33 46 58 5 53 22 5 2 34 47 60 5 53 23 6 2 35 48 63 6 54 24 8 2 36 49 66 6 55 25 9 2 36 50 68 6 56 Grade 3 114 TABLE 5 (Cont'd) Percentile Ranks, Stanines, and T-Scores Corresponding to Raw Scores. Raw Score PR Sta Raw Score PR Sta 51 71 6 56 52 . 
75 6 57 53 78 7 58 54 81 7 59 55 83 7 59 56 85 n 60 57 86 7 61 58 87 • 7 62 59 88 7 62 60 89 7 63 61 90 8 64 62 92 8 65 63 94 8 66 64 95 8 66 65 96 8 67 66 96 8 68 67 98 9 69 68 98 9 69 69 99 9 70 70 99 9 71 71 99. 9 72 72 73 74 75 Raw Score: IVfean 42.67 S.D. 13.07 Number of Students - 347 (81% of entire NLSD grade 3 population) Blanks for derived scores at the extreme ranges of the scale indicate that the corresponding raw scores were not obtained in the standard ization . 115 GRADE 5 TABLE 6 Northern Lights School Division Norms for Grade 5 Students tested in the Fall on the Vocabulary Subtest of the Gates-MacGinitie Reading Tests - Canadian Edition, Level D, Form 1. Percentile Ranks, Stanines, and T-Scores Corresponding to Raw Scores. Raw Score PR Sta T Raw Score PR Sta T 1 1 31 26 86 7 62 2 1 1 32 27 88 7 63 3 1 1 33 28 90 8 64 4 2 1 35 29 91 8' 65 . 5 3 1 36 30 92 8 66 •6 .4 1 37. 31 93 8 68 7 7 2 38 32 94 8 69 8 10 2 39 33 95 8 70 9 16 3 41 34 96 8 71 10 ' 23 3 42 35 97 9 73 11 28 4 43 36 98 9 74 12 33 4 44 37 98 9 75 13 39 4 46 38 99 9 76 14 45 5 47 39 99 • 9 77 15 51 5 48 40 99 9 79 16 56 5 49 41 99 9 80 17 60 5 50 42 99 9 81 18 64 6 52 43 99 • 9 82 19 67 6 53 44 99 9 • 84 20 70 .6 54 45 21 73 6 55 22 76 6 57 23 79 7 58 • 24 82 7 59 25 85 ' 7 60 Raw Score: Mean 16.62 S.D. 8.15 Number of Students - 321 (92% of entire NLSD grade 5 population) Blanks for derived scores at the extreme ranges of the scale indicate that the corresponding raw scores were not obtained in the standard ization. 116 GRADE 5 TABLE 7 Northern Lights School Division Norms for Grade 5 Students tested in the Fall on the Comprehension Subtest of the Gates-MacGinitie Reading Tests - Canadian Edition, Level D, Form 1. Percentile Ranks, Stanines, and T-Scores Corresponding to Raw Scores. 
Raw Score PR Sta T Raw Score PR Sta T 1 26 89 7 63 2 1 1 30 27 90 8 64 3 1 1 31 28 92 8 66 4 1 1 33 29 93 8 67 5 2 1 34 30 94 8 69 6 • .3 1 36 31 94 8 70 7 4 1 37 32 95 8 71 8 ' 7 2 38 33 96 8 73 9 12 3 40 34 97 9 74 10 17 3 41 35 97 9 75 11 23 3 42 36 98 9 77 12 29 4 44 37 99 9 78 13 36 4 45 38 99 9 80 14 43 5 47 39 99 • 9 81 15 49 5 48 40 99 9 82 16 55 5 49 41 99 9 84 17 62 6 51 42 99 9 85 18 67 6 52 43 19 71 6 53 20 75 6 55 21 78 7 56 22 80 7 58 23 83 7 59 24 86 7 60 25 87 7 62 Raw Score: Blanks for derived scores at the 16 49 extreme ranges opf the scale indicate ean * that the corresponding raw scores S.D. 7.27 ' were not obtained in the standard-Number of Students - 316 (90% of ization. entire NLSD grade 5 population) 117 GRADE 5 TABLE 8 Northern Lights School Division Norms for Grade 5 Students tested in the Fall on the Reading Total of the Gates-MacGintie Reading Tests - Canadian -Edition, Level D, Form 1. Percentile Ranks, Stanines, and T-Scores Corresponding to Raw Scores. Raw Score PR Sta T Raw Score PR Sta T 1 26 36 4 45 2 27 40 4 46 3 28 44 5 46 • ' 4 29 48 5 47 . 5 1 1 30 30 52 5 48 6 1 1 31 31 • 55 5 48 7 • 1 . 1 32 32 57 5 49 8 1 1 32 33 . 60 5 50 9 1 1 33 34 64 6 51 10 1 1 34 35 66 6 51 11 2 1 34 36 . 67 6 52 12 2 1 35 37 70 6 53 13 2 1 36 38 72 6 53 14 3 . 1 37 39 73 6 ' 54 15 4 1 37 40 75 6 55 - 16 5 2 38 41 76 6 55 • 17 6 2- 39 42 77 6 56 18 9 2 39 43 79 7 57 19 11 2 40 44 81 7 - 58 20 . 14 3 41 45 83 7 58 21 . 17 3 41 46 84 7 59 22 20 3 42 47 85 7 60 23 24 4 43 48 86 7 60 24 . 28 4 44 49 86 7 61 25 31 4 44 50 87 7 62 118 TABLE 8 (Cont'd) Percentile Ranks, Stanines, and T-Scores' Corresponding to Raw Scores. Raw Score PR Sta T ' Raw Score PR Sta T 51 87 7 62 76 99 9 80 52 87 7 63 77 99 9 81 53 88 7 64 78 99 9 81 54 90 8 65 79 99 9 • 82 55 90 8 65 80 99 9 83 56 91 8 66 81 99 9 83 57 91 8 67 82 99 9 84 58 92 8 67 83 99 . 9 85 ' 59 93 8 68 84 99 9 86 60 .94- 8 . 
69 85 99 9 86 61 94 8 69 86 99 9 87 62 95 8 70 87 63- 95 8 71 88 64 96 8 72 65 96 8 72 66 96 8 73 67 96 8 74 68 97 9 74 69 97 9 75 70 98 9 76 71 98 9 76 72 98 9 77 .73 98 9 78 74 98 9 79 75 99 9 79 Raw Score: Mean S.D. 33.17 14.29 Number of Students - 316 (90% of entire NLSD grade 5 population) Blanks for derived scores at the extreme ranges of the scale indicate that the corresponding raw scores were-not obtained in the' standard ization . 119 GRADE 5 TABLE 9 Northern Lights School Division Norms for Grade 5 Students tested in the Fall on the Mathematics Computation Subtest of the Canadian Tests of Basic Skills, Level 10, Form 6. Percentile Ranks, Stanines, and T-Scores Corresponding to Raw Scores. Raw Score PR Sta T Raw Score PR Sta T 1 26 , 57 5 52 2 27 61 6 53 3 1 1 26 28 65 6 55 4 1 1 27 29 68 6 56 5 1 1 28 30 71 6 57 6 1 1 29 •• 31 75 6 58 7 2 . 1 31 32 79 7 59 8 '3 1 32 33 82 7 60 9 4 1 33 34 86 7 61 10 6 2 34 35 89 7 63 11 8 2 35 36 92 8 64 12 10 2 36 37 94 8 65 13 13 3 37 38 95 8 66 14 17 . 3 • 39 39 97 9 67 15 19 3 40 40 99 9 68 16 22 3 41 41 99 9 69 17 25 4 42 42 18 28 4 43 19 32 4 44 20 . 34 4 45 21 37 • 4 47 22 40 4 48 23 43 5 49 24 46 5 50 25 • 52 5 51 Raw Score: Mean 23.97 S.D. 8.74 Number of Students - 316 (90% of entire NLSD grade 5 population) Blanks for derived scores at the extreme ranges of the scale indicate that the corresponding raw scores were not obtained in the standard ization. 120 GRADE 5 TABLE 10 Northern Lights School Division Norms for Grade 5 Students tested in the Fall on the Otis Lennon School Ability Test, Elementary, Form R. Percentile Ranks, Stanines, and T-Scores Corresponding to Raw Scores. Raw Score PR Sta T Raw Score PR Sta T 1 26 33 4 45 2 27 36 4 45 3 1 1 26 28 39 4 46 4 1 1 27 29 42 5 • 47 5 1 1 28 30 44 5 48 6 1 1 . 29 .. 31 48 5 49 7 1 1 29 32 50 5 49 8 1 1 30 33 53 5 50 9 1 1 31 34 56 5 51 10 2 1 . 
32 35 59 5 52 11 2 1 33 36 62 6 53 12 2 1 33 37 65 6 53 13 3 1 34 38 68 6 54 14 4 1 • 35 39 69 6 55 15 5 2 36 40 72 6 56 16 -7 2 37 41 75 6 57 17 8 2- 37 42 77 6 57 18 11 2 38 43 78 7 58 19 15 3 39 44 80 7 . 59 20 18 3 40 45 82 7 60 21 20 3 41 46 84 7 61 22 •23 3 41 47 86 7 61 23 26 4 42 48 88 7 62 24 ' 28 4 43 49 • 89 7 63 25' • 30 4 44 50 90 S 64 121 TABLE 10 (Cont'd) Percentile Ranks, Stanines, and T-Scores Corresponding to Raw Scores. Raw Score PR Sta Raw Score PR Sta 51 92 8 65 52 93 8 65 53 93 8 66 54 94 8 67 55 95 8 68 56 96 8 69 57 96 8 69 58 97 9 70 59 97 9 71 60 98 9 72 61 99 9 73 62 99 9 74 63 99 9 74 64 99 9 75 65 99 9 76 66 99 9 77 67 68 69 70 Raw Score: Mean 32.73 S.D. 12.45 Number of Students - 317 (9U of entire NLSD grade 5 population) Blanks for derived scores at the extreme ranges of the scale indicate that the corresponding raw scores were-not obtained in the standard ization . 122 Grade 7 TABLE 11 Northern Lights School Division Norms for Grade 7 Students tested in the Fall on the Vocabulary Subtest of the Gates-MacGinitie Reading Tests - Canadian Edition, Level D, Form 2. Percentile Ranks, Stanines, and T-Scores Corresponding to Raw Scores. ' Raw Score PR Sta T Raw Score PR Sta T 1 26 51 5 50 2 1 1 24 27 53 5 51 3 1 1 25 28 56 5 . 52 4 1 1 26 29 60 5 53 5 1 1 27 30 65 6 54 6 1 ' 1 28 31 69 6 55 7 1 1 29 32 73 6 56 8 2 1 30 33 76 6 58 9 4. 1 31 34 79 7 59 10 6' 2 32 35 82 7 60 11 6 2 34 36 84 7 61 12 7 2 35 37 85 7 62 13 9 2 : 36 38 87 7 63 14 11 2 37 39 91 ' 8 64 15 13 3 38 40 93 8 65 16 14 3 39 41 96 8 66 17 17 3 40 42 98 9 ' 67 18 20 3 41 43 99 9 68 19 23 3 42 44 99 9 • 70 20 26 4 43 45 99 9 • 71 21 28 4 44 22 32 4 46 23 36 4 47 24 42 5 48 25 48 5 49 Raw Score: Mean 26.06 S.D. 9.17 Number of Students - 216 (83% of entire NLSD grade 7 population) Blanks for derived scores at the extreme ranges of the scale indicate that the corresponding raw scores were not obtained in the standard ization. 
123 Grade 7 TABLE 12 Northern Lights School Division Norms for Grade 7 Students tested in the Fall on the Comprehension Subtest of the Gates-MacGinitie Reading Tests - Canadian Edition, Level D, Form 2. Percentile Ranks, Stanines, and T-Scores Corresponding to Raw Scores. Raw Score PR Sta T Raw Score PR Sta T 1 26 55 5 51 2 27 60 5 52 3 28 62 • 6 53 4 29 66 6 54 . 5 30 70 6 56 6 31 73 6 57 7 1 1 28 32 76 6 58 8 2 1 29 33 80 7 59 9 3 1 31 34 82 7 60 10 4 1 32 35 85 7 62 11 5 2 33 36 87 7 63 12 6 2 34 37 90 8 64 13 7 2 35 38 92 8 65 14 9 2 37 39 95 8 66 15 11 2 38 40 98 9 68 16 - 14 3 39 41 99 9 69 17 17 3 40 42 99 9 . 70 18 20 3 41 43 19 25 4 43 20 29 . 4 44 - 21 32 4 45 22 36 4 46 23 41 5 47 24 46 5 49 25 50 5 50 Raw Score: Mean 25.25 S.D. 8.35 Number of Students - 215 (83% of entire NLSD grade 7 population) Blanks for derived scores at the extreme ranges of the scale indicate that the corresponding raw scores were not obtained in the standard ization . 124 Grade 7 TABLE 13 Northern Lights School Division Norms for Grade 7 Students tested in the Fall on the Reading Total of the Gates-MacGinitie Reading Tests - Canadian Edition, Level D, Form 2. Percentile Ranks, Stanines, and T-Scores Corresponding to Raw Scores. ' Raw Score PR Sta T Raw Score PR Sta T 1 26 7 2 34 2 27 7 2 35 3 28 8 2 . 36 4 29 10 2 36 5 30 10 2 37 6 31 10 2 38 7 32 12 3 38 8 33 14 3 39 9 s 34 16 3 39 10 35 17 3 40 11 36 18 3 41 12 37 20 3 41 13 1 1 27 38 21 3 42 14 1 1 27 39 22 ' 3 42 15 1 1 28 40 23 3 43 16 1 1 28 41 25 • 4 44 17 1 1 29 42 29 4 44 18 1 1 30 43 31 4 45 19 1 1 30 44 33 4 45 20 2 1 31 45 35 4 ' 46 21 2 1 31 46 38 4 47 . 22 4 1 32 47 40 4 47 23 5 2 33 48 43 5 48 24 6 2 33 49 47 5 49 25 7 2 34 50 50 5 49 125 TABLE 13 (Cont'd) Percentile Ranks, Stanines, and T-Scores Corresponding to Raw Scores. 
Raw Score PR Sta T ' Raw Score PR Sta T 51 52 5 50 76 91 8 65 52 54 5' 50 77 92 8 66 53 56 5 51 78 94 8 66 54 59 5 52 79 96 8 67 55 60 5 52 80 97 9 67 56 62 6 53 81 98 9 68 57 65 6 53 82 98 9 69 58 67 6 54 83 99 9 69 59 68 6 55 84 99 9 70 60 70 6 55 85 99 9 71 61 73 6 56 86 99 9 71 62 75 6 56 87 99 9 72 63 76 6 57 88 64 77 6 5.8 65 77 6 58 66 79 7 59 67. 80 7 60 68 82 7 60 .69 82 7 . 61 70 83 7 61 .. 71 85 7 62 72 86 7 63 73 87 7 63 74 88 7 64 75 89 7 64 Raw Score:. Mean 51.39 S.D. 16.35 Number of Students - 215 (83% of entire NLSD grade 7 population) Blanks for derived scores at the extreme ranges of the scale indicate that the corresponding raw scores were- not obtained in the standard ization. 126 Grade 7 TABLE 14 Northern Lights School Division Norms for Grade 7 Students tested in the Fall on the Mathematics Computation Subtest of the Canadian Tests of Basic Skills, Level 13, Form 6. Percentile Ranks, Stanines, and T-Scores Corresponding to Raw Scores. ' Raw Score PR Sta T Raw Score PR Sta T 1 1 1 30 26 86 7 61 2 1 1 31 27 88 7 62 3 3 1 33 28 90 8 64 4 4 1 34 29 92 8 • 65 5 6 2 35 30 94 8 66 6 8 . 2 36 31 95 8 67 7 10 2 38 32 96 8 ' 69 8 12 3 39 33 97 9 70 9 16 3 40 34 98 9 71 10 •20' 3 41 . 35 98 9 72 11 23 3 43 36 98 9 74 12 28 4 44 37 98 9 75 13 33 4 45 38 99 9 76 14 38 4 46 39 99 . 9 77 15 43 5 47 40 99 9 79 16 49 5 49 41 99 9 80 17 54 5 50 42 99 9 81 18 58 5 51 43 99 9 82 19 62 6 52 44 20 67 6 54 45 21 72 6 55 i 22 75 6 56 23 78 7 57 24 • 81 7 59 25 83 7 60 Raw Score: Mean 17.02 S.D. 8.05 Number of Students - 216 (83% of • entire NLSD grade 7 population) Blanks for derived scores at the extreme ranges of the scale indicate that the corresponding raw scores were not obtained in the standard ization. 127 Grade 7 TABLE 15 Northern Lights School Division Norms for Grade 7 Students tested in the Fall on the Otis-Lennon School Ability Test, Intermediate, Form R Percentile Ranks, Stanines, and T-Scores Corresponding to Raw Scores. 
Raw Score PR Sta T Raw Score PR Sta T 1 26 25 4 43 2 27 28 4 44 3 28 31 4 44. 4 1 1 27 29 33 4 45 5 1 1 27 30 36 4 46 6 1 1 28 31 38 4 47 7 1 1 29 32 41 5 47 8 1 1 30 33 44 5 48 9 1 1 30 34 48 5 49 10 1 1 31 35 52 5 50 11 1 1 32 36 55 5 50 12 1 1 33 37 58 5 51 13 1 1 33 38 61 6 52 14 2 1 34 39 66 6 53 15 3 1 35 40 69 6 53 16 4 1 36 41 72 6 54 17' 6 2 36 42 75 6 55 18 8 2 37 43 77 6 56 19 9 2 38 44 79 7 ' 56 20 10 2 38 45 80 7 57 " 21 13 3 39 46 82 7 58 22 17 3 40 47 84 7 59 23 20 3 41 48 86 7 59 24 22 3 41 49 87 7 60 25 23 3 42 50 87 7 61 128 TABLE 15 (Cont'd) Percentile Ranks, Stanines, and T-Scores Corresponding to Raw Scores. Raw Score PR Sta Raw Score PR Sta 51 88 7 62 76 99 9 80 52 89 7 62 77 99 9 81 53 90 8 63 78 99 9 82 54 90 8 64 79 99 9 82 55 90 8 65 80 99 9 83 56 57 58 59 60 61 62 63-64 65 66 67 68 69 70 71 72 73 74 75 90 91 92 93 94 95 95 95 96 96 96 97 97 98 99 99 99 99 99 99 8 8 8 8 8 8 8 8 8 8 8 9 9 9 9 9 9 9 9 9 65 66 67 68 68 69 70 70 71 72 73 73 74 75 76 76 77 78 79 79 Raw Score: Mean 35.46 S.D. 13.44 Number of Students - 206 (80% of entire NLSD grade 7 population) Blanks for derived scores at the extreme ranges of the scale indicate that the corresponding raw scores were-not obtained in the' standard ization . 129 Grade 9 TABLE 16 Northern Lights School Division Norms for Grade 9 Students tested in the Fall on the Vocabulary Subtest of the Gates-MacGinitie Reading Tests - Canadian Edition, Level E, Form 2 Percentile Ranks, Stanines, and T-Scores Corresponding to Raw Scores. 
Raw Score PR Sta T Raw Score PR Sta T 1 26 73 6 55 2 27 74 6 56 3 28 76 6 57 4 1 1 30 29 78 7 58 5 1 1 31 30 79 7 59 6 1 1 32 31 80 7 60 7 1 1 34 32 82 7 61 8 1 1 35 33 83 7 62 9 2 1 36 34 86 7 63 10 4 1 37 35 89 7 64 11 8 2 38 36 91 8 66 12 13 3 39 37 92 8 67 13 18 3 40 38 94 8 68 14 23 3 41 39 95 •8 69 15 27 4 42 40 96 8 70 16 30 4 43 41 97 9 71 17 33 4 45 42 98 9 72 18 37 4 46 43 98 9 73 19 42 5 47 44 99 9 74 20 49 5 48 45 21 53 5 49 22 57 5 50 23 62 6 51 24 67 6 52 25 70 6 53 Raw Score: Mean 21.90 S.D. 9.04 Number of Students - 157 (84% of entire NLSD grade 9 population) Blanks for derived scores at the extreme ranges of the scale indicate that the corresponding raw scores were not obtained in the standard ization. 130 Grade 9 TABLE 17 Northern Lights School Division Norms for Grade 9 Students tested in the Fall on the Comprehension Subtest of the Gates-MacGinitie Reading Tests - Canadian Edition, Level E, Form 2. Percentile Ranks, Stanines, and T-Scores Corresponding to Raw Scores. ' Raw Score PR Sta T Raw Score PR Sta T 1 26 53 5 50 2 27 56 5 51 3 28 59 5 52 4 29 62 6 ' 54 5 30 64 6 55 6 31 68 6 56 7 32 74 6 58 8 33 80 • 7 59 9 34 83 7 60 10 1 1 28 35 85 7 62 11 • 1 . 1 30 36 87 7 63 12 2 1 31 37 90 8 64 13 3 1 32 38 93 8 66 14 4 1 34 39 95 • 8 67 15 6 2 35 40 97 9 68 16 9 2 36 41 99 9 69 17 11 2 38 42 99 9 71 18 13 3 39 43 99 9 72 19 17 3 40 20 21 3 42 21 27 4 43 22 32 4 44 23 36 4 46 24 41 5 47 25 47 5 48 Raw Score: Mean 26.36 S.D. 7.51 Number of Students - 157 (84% of . entire NLSD grade 9 population) Blanks for derived scores at the extreme ranges of the scale indicate that the corresponding raw scores were not obtained in the standard ization. 131 Grade 9 TABLE 18 Northern Lights School Division Norms for Grade 9 Students tested in the Fall on the Reading Total of the Gates-MacGinitie Reading Tests -Canadian Edition, Level E, Form 2. Percentile Ranks, Stanines, and T-Scores Corresponding to Raw Scores. 
Raw-Score PR Sta T Raw Score PR Sta T 1 26 5 2 36 2 27 5 2 36 3 28 7 2 37 4 29 10 2 38 5 30 12 3 38 6 31 13 3 39 7 32 14 3 39 8 33 15 3 40 9 34 17 3 41 10 35 18 3 41 11 36 21 3 42 12 37 26 4 43 13 38 29 4 43 14 39 32 4 44 15 40 36 4 45 16 41 39 4 45 17 42 42 5 46 18 43 44 5 ' 47 . 19 44 46 5 47 20 45 47 5 ' 48 21 46 49 5 49 22 1 1 33 47 51 5 49 23 2 1 34 48 53 5 50 24 3 1 34 49 56 5 50 25 . 4 1 35 50 60 5 51 132 Grade 9 TABLE 18 (Cont'd) Percentile Ranks, Stanines, and T-Scores Corresponding to Raw Scores. Raw Score PR Sta T ' Raw Score PR Sta T 51 63 6 52 76 93 8 68 52 65 6 52 77 94 8 69 53 67 6 53 78 94 8 69 54 70 6 54 79 95 8 70 55 72 6 54 80 96 8 71 56 73 6 •55 81 97 9 71 57 74 ' 6 56 82 98 9 72 58 75 6 56 83 99 9 73 . 59 76 6 57 84 99 9 73 60 76 m 6 58 85 61 77 •6 58 86 62 78 7 59 87 63 81 7 60 88 64 83 7 60 65 84 7 61 66 84 7 62 67 85 7 62 68 85 7 63 69 87 7 63 70 88 7 64 71 90 8 65 72 91 8 65 73 91 8 66 74 92 8 67 75 93 8 67 Raw Score: Mean 48.23 S.D. 15.46 Number of Students - 155 (83% of entire NLSD grade 9 population) Blanks for derived scores at the extreme ranges of the scale indicate that the corresponding raw scores were-hot obtained in the' standard ization . 133 Grade 9 TABLE 19 Northern Lights School Division Norms for Grade 9 Students tested in the Fall on the Mathematics Subtest of the Canadian Testsof Basic Skills, Level 15, Form 5. Percentile Ranks, Stanines, and T-Scores Corresponding to Raw Scores. Raw Score PR Sta T Raw Score PR Sta . T 1 26 81 7 58 2 27 85 7 59 3 28 87 7 60 4 29 89 7 61 5 1 1 33 30 91 8 63 6 3 1 34 31 91 8 64 7 5 2 35 32 92 8 65 8 8 2 36 33 92 8 66 9 10 2 37 34 93 8 67 10 13 3 39 35 94 8 69 11 16 3 40 36 95 8 70 12 19 3 41 37 97 9 71 13 24 4 42 38 97 9 72 14 28 4 43 39 97 9 73 15 33 4 45 40 98 9 75 16 38 4 46 41 98 9 76 17. 42 5 47 42 98 9 77 18 46 5 48 43 99 9 ' 78 . 19 50 5 49 44 20 55 5 51 45 . 21 58 5 52 46 22 62 6 53 47 23 68 6 54 48 24 73 6 55 25 . 77 6 57 Raw Score: Mean 19.47 S.D. 
8.36 Number of Students - 159 (85% of entire NLSD grade 9 population) Blanks for derived scores at the extreme ranges of the scale indicate that the corresponding raw scores were not obtained in the standard ization. 134 Grade 9 TABLE 20 Northern Lights School Division Norms for Grade 9 Students tested in the Fall on the Otis-Lennon School Ability Test, Advanced, Form R. Precentile Ranks', Stanines, and T-Scores Corresponding-to Raw Scores. Raw Score PR Sta T Raw Score PR Sta T 1 26 44 5 48 2 27 49 5 48 3 28 54 5 -49 4 29 59 5 50 5 30 62 6 51 6 31 64 6 52 7 1 1 30 32 67 6 53 8 1 1 31 33 70 6 54 9 1 1 32 34 72 6 55 10 1 ' 1 33 35 75 6 56 11 2 1 34 36 80 7 57 12 2 1 35 37 84 7 58 13 3 1 35 38 86 7 59 14 4 1 36 39 87 . 7 60 15 5 2 37 40 88 7 61 16 9 2 38 41 88 7 62 17 12 3 39 42 89 7 62 18 14 3 40 43 89 7 63 19 17 3 41 44 90 8 64 20 22 3 42 45 90 8 65 21 27 4 43 46 92 8 66 22 31 4 44 47 94 8 67 23 34 4 45 48 • 95 8 68 24 36 4 46 49 95 8 69 25 39 4 47 50 95 8 70 135 TABLE 20 (Cont'd) Percentile Ranks, Stanines, and T-Scores Corresponding to Raw Scores. Raw Raw Score PR Sta T ' Score PR Sta T 51 96 8 71 76 52 96 8 72 77 53 96 8 73 78 54 97 9 74 79 55 98 9 75 80 56 98 9 75 57 98 9 76 58 98 9 77 • 59 98 9 78 60 99 9 79 61 99 9 80 62 99 9 81 63 99 9 82 64 99 9 83 65 99 9 84 66 99 9 85 67 68 69 70 71 -• 72 73 74 75 Raw Score: Mean 28.63 S.D. 10.75 Number of Students - 152 (81% of entire NLSD grade 9 population) Blanks for derived scores at the extreme ranges of the scale indicate that the corresponding raw scores were-not obtained in the standard ization . 136 9 Years 0 Months to 9 Years 5 Months TABLE 21 Northern Lights School Division Norms for Students Aged 9 Years 0 Months to 9 Years 5 Months tested in the Fall on the Vocabulary Subtest of the Gates-MacGinitie Reading Tests - Canadian Edition, Level B, Form 1. Percentile Ranks, Stanines, and T-Scores Corresponding to Raw Scores. 
Raw • Score PR Sta T Raw Score PR Sta T 1 1 1 32 26 57 5 53 2 2 1 33 27 59 5 54 3 4 1 34 28 62 6 55 4 5 2 35 29 64 6 56 5 7 2 36 30 67 6 56 6 9 2 37 31 70 6 57 7 11 . 2 37 32 73 6 58 8 12 3 38 33 77 6 59 9 13 3 39 34 80 7 60 10 17 3 40 35 82 7 60 11 22 3 41 36 84 7 61 12 25 4 42 37 87 7 62 13 28 4 42 38 88 7 63 14 32 4 43 39 89 7 64 15 35 4 44 40 90 8 65 16 39 4 45 41 92 8 65 17 42 5 46 42 94 8 66 18 44 5 46 43 97 9 67 IS 46 5 47 44 99 9 • 68 20 48 5 48 45 99 9 69 21 50 . 5 49 22 52 5 50 23 53 • 5 51 24 54 5 51 25 56 5 52 Raw Score: Mean 22.32 S.D. 12.14 Number of Students - 150 (83% of entire NLSD population aged 9 years 0 months to 9 years 5 months.) 9 Years 0 Months 137 _ to ~ ' 9 Years 5 Months TABLE 22 Northern Lights School Division Norms for Students Aged 9 Years 0 Months to 9 Years 5 Months tested in the Pall on the Comprehension Subtest of the Gates-MacGinitie Reading Tests - Canadian Edition Level B, Form 1. Percentile Ranks, Stanines, and T-Scores Corresponding to Raw Scores. Raw Score PR Sta T Raw Score PR Sta T 1 1 1 30 26 67 6 55 2 1 1 31 27 71 6 56 3 1 1 32 28 74 6 57 4 1 1 33 29 79 7 58 5 2 1 34 30 83 7 59 6 3 1 35 31 83 7 60 7 8 2 36 32 84 7 61 8 11 2 37 33 85 7 62 9 14 3 38 34 87 7 63 10 20 3 39 35 90 8 64 11 24 4 40 36 92 8 65 12 27 4 41 37 94 8 66 13 31 4 42 38 96 8 67 14 32 4 43 39 98 9 68 15 35 4 44 40 99 9 69 16 38 4 45 17 ' 41 5 46 18 , 44 5 47 19 45 5 48 20 47 5 49 21 49 5 50 22 53 5 51 23 56 5 52 24 59 5 53 25 63 6 54 • • Raw Score: Mean 20.63 S.D. 10.01 Number of Students - 149 (83% of entire NLSD population aged 9 years 0 months to 9 years 5 months.) 138 9 Years 0 Months to 9 Years 5 Months TABLE 23 Northern Lights School Division Norms for Students Aged 9 Years 0 Months to 9 Years 5 Months tested in the Fall on the Reading Total of the Gates-MacGinitie Reading Tests - Canadian Edition, Level B, Form 1. -Percentile Ranks, Stanines, and T-Scores Corresponding to Raw Scores. 
Raw Score PR Sta T Raw Score PR Sta T 1 26 30 4 42 2 1 1 31 27 31 4 43 3 1 1 31 28 33 4 43 4 1 . 1 32 29 34 4 43 5 1 1 32 30 36 4 44 6 i 1 33 31 39 4 44 7 i 1 33 32 40 4 45 8 2 1 34 33 41 5 45 9 .3 1 34 34 42 5 46 10 4 1 35 35 43 5 46. 11 4 1 35 36 44 5 47 12 4 1 36 37 45 5 47 13 5 2 36 38 46 5 48 14 7 2 37 39 46 5 48 15 9 2 37 40 47 5 49 16 9 2 37 41 47 5 49 17 • 10 2 38 42 49 5 50 18 12 3 38 43 50 5 50 . 19 14 . 3 39 44 51 5 ' 50 20 17 3 39 45 52 5 51 21 20 3 40 46 53 5 51 22 22 3 40 47 55 5 52 23 25 4 41 48 57 5 52 24 27 4 41 49 59 5 53 25 28 4 42 50 59 5 53 139 TABLE 23 (Cont'd) Percentile Ranks, Stanines, and T-Scores Corresponding to Raw Scores. Raw '• Score PR Sta T Raw Score PR Sta T 51 60 5 54 76 92 8 65 52 61 6 54 77 94 8 66 53 62 6 55 78 95 8 66 54 62 6 55 79 96 8 67 55 63 6 56 80 96 8 67 56 64 6. 56 • 81 97 9 68 57 66 . 6 56 82 98 9 68 58 68 6 57 83 99 9 69 59 72 6 57 84 60 75 6 58 85 61 78 7- 58 62 81 7 59 63 82 7 59 64 82 7 60 65 82 7 60 66 82 7 61 67 83 7 61 68 83 7 62 69 84 7 62 70 87 7 62 71 88 7 63 72 89 7 63 73 90 8 64 74 91 8 64 75 91 8 65 Raw Score: Blanks for derived scores at the Mean S.D. 43.05 21.57 Number of Students - 147 (82% of entire NLSD population aged 9 years 0 months to 9 years 5 months.) extreme ranges of the scale indicate that the corresponding raw scores were not obtained in the standard ization. 140 9 Years 0 Months to 9 Years' 5 Months TABLE 24 Northern Lights School Division Norms for Students Aged 9 Years 0 Months to 9 Years 5 Months tested in the Fall on the Mathematics Computation Subtest of the Canadian Testsof Basic Skills, Level 7, Form 5. Percentile Ranks, Stanines, and T-Scores Corresponding to Raw Scores. Raw Score PR Sta Raw Score PR Sta 1 2 3 4 1 1 15 5 1 1 17 6 1 1 19 7 1 1 21 8 2 1 23 9 3 1 25 10 4 1 27 11 6 2 29 12 8 2 31 13 10 2 33 14 11 2 35 15 12 3 37 16 14 3 40 17 17 3 42 18 20 3 44 19 26 . 4 46 20 32 4 48 21 39 4 50 22 47 5 52 23 55 ' 5 54 24 65 6 56 25 80 7 58 26 94 60 Raw Score: Mean 21.03 S.D. 
4.82 Number of Students - 145 entire NLSD population aged 9 years 0 months to 9 years 5 months.) % of Blanks for derived scores at the extreme ranges of the scale indicate that the corresponding raw scores were not obtained in the standard ization. 141 9 Years 0 Months to * 9 Years 5 Months TABLE 25 Northern Lights School Division Norms for Students Aged 9 Years 0 Months to 9 Years 5 Months tested in the Fall on the Otis-Lennon School Ability Test, Primary II, Form R. Percentile Ranks, Stanines, and T-Scores Corresponding to Raw Scores. Raw Score PR Sta T Raw Score PR Sta T 1 26 15 3 40 2 27 17 3 40 3 28 19 3 41 4 29 21 3 42 5. 30 25 4 43 6 i • 1 26 31 28 4 43 7 1 1 27 32 29 4 44 8 1 1 28 33 33 4 45 9 • 1 1 28 34 36 4 45 10 1 1 29 35 37 4 46 11 1 1 30 36 39 4 47 12 1 1 30 37 43 5 47 13 2 1 31 38 47 5 48 14 2 1 32 39 50 5 49 15 3 1 32 40 53 5 49 16 3 1 33 41 55 5 50 17 4 1 34 42 57 5 51 18 5 2 34 43 58 5 51 . 19 8 . 2 35 44 59 5 ' 52 20 8 2 36 45 62 6 53 21 8 2 36 46 64 6 53 22 9 2 37 47 65 6 54 23 10 2 38 48 66 6 55 24 12 3 38 49 69 6 55 25 14 3 39 50 70 6 56 142 TABLE 25 (Cont'd) Percentile Ranks, Stanines, and T-Scores Corresponding to Raw Scores. Raw Score PR Sta Raw Score PR Sta T 51 72 6 57 52 74 6 57 53 77 6 58 54 78 7 59 55 80 7 60 56 81 7- 60 57 83 7 61 58 85 7 62 59 88 7.. 62 60 90 8 63 61 90 8 64 62 91 8 64 63 92 8 . 65 64 93 8 66 65 94 8 66 66 95 8 67 67 95 8 68 68 95 8 68 69 96 8 69 70 98 9 70 71 99 9 70 72 99 9 71 73 99 9 72 74 99 9 72 75 Raw Score: Mean 41.0 S.D. 14.71 Number of Students - 140 (78% of entire NLSD population aged 9 years 0 months to 9 years 5 months.) Blanks for derived scores at the extreme ranges of the scale indicate that the corresponding raw scores were•not obtained in the standard ization. 
143 11 Years 0 Months to 11 Years 5" Months TABLE 26 Northern Lights School Division Norms for Students Aged 11 Years 0 Months to 11 Years 5 Months tested in the Fall on the Vocabulary Subtest of the Gates-MacGinitie Reading Tests, Level D, Form 1. Percentile Ranks, Stanines, and T-Scores Corresponding to Raw Scores. Raw Score PR Sta T Raw Score PR Sta T 1 1 1 33 26 79 7 58 2 3 1 34 27 82 7 59 3 5 2 35 28 83 7 60 4 6 2 36 29 84 7 61 5 8 2 37 30 86 7 62 6 12 3 38 31 88 7 63 7 15 3 39 32 89 7 64 8 17 3 40 33 92 8 65 9 20 3 41 34 93 8 66 10 25 4 42 35 93 8 67 11 28 4 43 36 95 8 68 12 32 4 44 37 97 9 69 13 36 4 45 38 98 9 70 14 40 4 . 46 39 99 9 71 15 45 5 47 40 99 9 72 16 50 5 48 41 99 9 73 . 17 52 5. 49 42 99 9 74 18 57 5 50 43 99 9 75 19 61 6 51 44 99 9 76 20 64 6 52 45 21 66 6 53 22 68 6 54 23 69 6 55 24 72 6 56 25 76 6 57 Raw Score: Mean 17.70 S.D. 9.94 Number of Students - 130 (72% of entire NLSD population aged 11 years 0 months to 11 years 5 months.) Blanks for derived scores at the extreme ranges of the scale indicate that the corresponding raw scores were not obtained in the standard ization. 144 11 Years 0 Months to 11 Years 5 Months TABLE 27 Northern Lights School Division Norms for Students Aged 11 Years 0 Months to 11 Years 5 Months tested in the Fall on the Comprehension Subtest of the Gates-lfecGinitie Reading Tests, Level D, Form 1. Percentile Ranks, Stanines, and T-Scores Corresponding to Raw Scores. Raw Score PR Sta T Raw Score PR Sta T 1 2 1 32 26 85 7 62 2 3 1 33 27 86 7 63 3 5 2 34 28 87 7 64 4 6 2 35 29 88 7 65 5 7 2 37 30 90 8 66 6 8 2 38 31 92 8 68 7 10 2 39 32 93 8 69 8 12 3 40 33 95 8 70 9 17 3 41 34 97 9 71 10 23 3 43 35 98 9 72 11 27 4 44 36 99 9 73 12 31 4 45 37 99 9 75 13 35 4 46 38 99 9 76 14 41 5 ' 47 39 99 9 77 15 48 5 48 40 99 9 78 16 55 5 50 41 99 9 79 17 61 6 51 42 18 67 6 52 43 19 71 6 53 20 • 75 6 54 21 78 7 56 22 81 7 57 23 84 7 58 24. 85 7 59 25 85 7 60 0 Raw Score: Mean 16.28 S.D. 
8.41 Number of Students - 130 (72% of entire NLSD population aged 11 years 0 months to 11 years 5 months.) Blanks for derived scores at the extreme ranges of the scale indicate that the corresponding raw scores were not obtained in the standard ization . 145 11 Years 0 Months to " 11 Years 5 Months TABLE 28 Northern Lights School Division Norms for Students Aged 11 Years 0 Months to 11 Years 5 Months tested in the Eall on the Reading Total of the Gates MacGinitie Reading Tests, Level D, Form 1. Percentile Ranks, Stanines, and T-Scores Corresponding to Raw Scores. Raw Score PR Sta T Raw Score PR Sta T 1 26 34 4 45 2 1 1 32 27 36 4 46 3 1 1 32 28 39 4 . 47 4 2 1 • 33 29 43 5 47 5 3 1 33 30 46 5 48 6 3 1 34 31 50 5 48 7 3 1 34 32 52 5 49 8 3 1 35 33 55 5 49 9 . 4 1 36 34 57 5 50 10 5 2 36 35 58 5 51 11 5 2 37 36 61 6 51 12 6 2 37 37 63 6 52 13 7 2 38 38 65 6 52 14 10 2 38 39 68 ' 6 53 15 14 3 39 40 71 6 53 16 16 3 40 41 73 6 54 17- 17 3 40 42 74 6 55 18 20 3 41 43 74 6 55 19 22 3 41 44 75 6 ' 56 20 22 3 42 45 77 6 56 21 24 4 43 46 78 7 57 22 27 4 43 47 80 7 58 23 28 4 44 48 81 7 58 24 30 4 44 49 82 7 59 25 33 4 45 50 82 7 59 146 TABLE 28 (Cont'd) Percentile Ranks, Stanines, and T-Scores Corresponding to Raw Scores. Raw Score PR Sta T ' Raw Score PR Sta T 51 82 7 • 60 76 98 9 74 52 83 7 60 77 99 9 74 53 85 7 61 78 99 9 75 54 86 7 62 79 99 9 75 55 86 7 62 80 99 9 76 56 86 7 63 81 99 9 77 57 86 7 63 82 99 9 77 58 86 7 64 83 99 9 78 ' 59 87 7 64 84 99 9 78 60 89 7 65 85 99 9 79 61 90 8 66 86 62 91 8 66 87 63 92 8 67 88 64 92 8 67 65 94 8 68 66 95 8 68 67 95 8 69 68 96 8 70 69 97 9 70 70 97 9 71 .71 98 9 71 72 98 9 72 73 98 9 72 74 98 9 73 75 98 9 73 Raw Score: Mean 33.98 S.D. 17.36 Number of Students - 130 (72% of entire NLSD population aged 11 years 0 months to 11 years 5 months.) Blanks for derived scores at the extreme ranges of the scale indicate that the corresponding raw scores were•not obtained in the standard ization . 
147 11 Years 0 Months to 11 Years 5 Months TABLE 29 Northern Lights School Division Norms for Students Aged 11 Years 0 Months to 11 Years 5 Months tested in the Pall on the Mathematics Computation Subtest of the Canadian Tests of Basic Skills, Level 10, Form 6. Percentile Ranks, Stanines, and T-Scores Corresponding to Raw Scores. Raw Score PR Sta T Raw Score PR Sta T 1 1 1 30 26 52 5 52 2 2 1 31 27 57 5 53 . 3 3 1 32 28 61 6 54 4 6 2 33 29 63 6 55 5 7 2 33 30 65 6 55 6 8 2 34 31 ' 67 6 56 7 9 2 35 32 70 6 57 8 11 2 36 33 72 6 58 9 13 3 37 34 74 6 59 10 15 3 38 35 78 7 60 11 16 3 39 36 83 7 61 12 18 3 40 37 85 7 62 13 20 3 41 38 87 7 62 14 23 3 41 39 90 8 63 15 26 4 42 40 95 8 64 16 29 4 43 41 99 9 65 17 31 4 44 42 18 33 4 45 19 35 4 46 20 39 4 47 21 42 5 48 22 44 5 48 23 45 5 49 24- 46 5 50 25 49 5 51 Raw Score: . Mean 23.79 S.D. 11.39 Number of Students - 126 (70% of entire NLSD population aged 11 years 0 months to 11 years 5 months.) Blanks for derived scores at the extreme ranges of the scale indicate that the corresponding raw scores were not obtained in the standard ization . 0 148 11 Years 0 Months to ll Years 5 Months TABLE 30 Northern Lights School Division Norms for Students Aged 11 Years 0 Months to 11 Years 5 Months tested in the Fall on the Otis-Lennon School Ability Test, Elementary, Form R. Percentile Ranks, Stanines, and T-Scores Corresponding to Raw Scores. Raw Score PR Sta T Raw Score PR Sta T 1 26 37 4 45 2 1 1 30 27 40 4 46 3 2 1 31 28 41 5 . 46 4 2 1 32 29 43 5 47 . 5 3 1 32 30 46 5 48 6 ' 3 1 33 31 48 5 48 7 3 1 33 32 50 5 49 8 4 1 • 34 33 52 5 49 9 . 
4 1 35 34 52 5 50 10 4 1 35 35 52 5 51 11 5 2 36 36 54 5 51 12 6 2 36 37 56 5 52 13 7 2 37 38 57 5 53 14 8 2 38 39 58 5 53 15 11 2 38 40 60 5 54 16 14 3 39 41 63 6 54 17' 17 3 40 42 64 6 55 18 19 3 40 43 65 6 56 19 24 4 41 44 67 6 ' 56 20 28 4 41 45 70 6 57 21 31 4 42 46 73 6 58 22 33 4 43 47 75 6 58 23 34 4' 43 48 76 6 59 24 35 4 44 49 77 6 59 25 35 4 45 50 79 7 60 149 TABLE 30 (Cont'd) Percentile Ranks, Stanines, and T-Scores Corresponding to Raw Scores. Raw Score PR Sta Raw Score PR Sta 51 81 7 ' 61 52 83 7 61 53 86 7 62 54 88 7 63 55 89 7 63 56 90 8 64 57 91 8 64 58 92 8 65 59 93 8 66 60 94 8 66 61 97 9 67 62 99 9 67 63 64 65 66 67. 68 69 70 Raw Score: Mean 33.82 S.D. 16.12 Number of Students - 127 (71% of entire NLSD population aged 11 years 0 months to 11 years 5 months.) Blanks for derived scores at the extreme ranges of the scale indicate that the corresponding raw scores were-not•obtained in the standard ization . 14 Years 0 Months 150 ' to 14 Years 5 Months TABLE 31 Northern Lights School Division Norms for Students Aged 14 Years 0 Months to 14 Years 5 Months tested in the Fall on the Vocabulary Subtest of the Gates-MacGinitie Reading Tests, Level D, Form 2. Percentile Ranks, Stanines, and T-Scores Corresponding to Raw Scores. Raw •. Score PR Sta T Raw Score PR Sta T 1 2 1 32 26 54 5 52 2 3 1 33 27 55 5 53 3 4 1- 34 28 57 5 54 4 4 1 34 29 61 6 55 5 5 2 35 30 64 6 56 6 6 2 36 31 68 6 ' 56 7 10 . 2 37 32 72 6 57 8 14 3 38 33 75 6 58 9 17 3 38 34 78 7 59 10 20 3 39 35 80 7 60 11 22 3 40 36 82 7 60 12 24 4 41 37 83 7 61 13 26 4 42 38 85 7 62 14 29 4 ' 43 39 87 7 63 15 32 4 43 40 90 8 64 16 34 4 44 41 93 8 64 17 38 4 45 42 95 8 65 18 40 4 46 43 97 9 66 19 42 5 47 44 98 9 ' 67 20 43 5 47 45 99 9 68 21 45 5 48 22 47 '. 5 49 23 48 • 5 50 24 50 5 51 25 53 5 51 Raw Score: Mean 23.18 S.D. 12.33 Number of Students - 112 (72% of entire NLSD population Aged 14 years 0 months to 14 years 5 months.) 
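The derived scores in the norm tables above follow the standard definitions: the T-score is a linear transformation of the raw-score distribution to mean 50 and standard deviation 10, and stanines partition the percentile scale into nine bands of 4, 7, 12, 17, 20, 17, 12, 7, and 4 percent. A minimal sketch of the conversions, using the mean and standard deviation printed beneath each table (the midpoint percentile-rank formula is a common convention assumed here, not a procedure quoted from the thesis):

```python
import bisect

# Upper percentile bound (inclusive) of stanines 1-8; stanine 9 takes the rest.
STANINE_CUTS = [4, 11, 23, 40, 60, 77, 89, 96]

def percentile_rank(raw, scores):
    """Midpoint percentile rank, clamped to the conventional 1-99 range."""
    below = sum(s < raw for s in scores)
    at = sum(s == raw for s in scores)
    pr = round(100 * (below + 0.5 * at) / len(scores))
    return min(99, max(1, pr))

def stanine(pr):
    """Map a percentile rank (1-99) onto the nine-point stanine scale."""
    return bisect.bisect_left(STANINE_CUTS, pr) + 1

def t_score(raw, mean, sd):
    """Linear T-score, rounded to the nearest whole number."""
    return round(50 + 10 * (raw - mean) / sd)
```

Checked against Table 31 (mean 23.18, S.D. 12.33): a raw score of 35 gives t_score(35, 23.18, 12.33) = 60, and its tabled percentile rank of 80 falls in stanine 7, both matching the table.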
151 14 Years 0 Months to -.14 Years 5 iMonths TABLE 32 Northern Lights School Division Norms for Students Aged 14 Years 0 Months to 14 Years 5 Months tested in the Fall on the Comprehension Subtest of the Gates-MacGinitie Reading Tests, Level D, Form 2. Percentile Ranks, Stanines, and T-Scores Corresponding to Raw Scores. Raw Score PR Sta T Raw Score PR Sta T 1 1 1 28 26 56 5 53 2 1 1 29 27 61 6 54 3 1 1 30 28 65 6 55 4 1 1 31 29 68 6 56 5 2 1 32 30 71 6 57 6 2 1 33 31 74 6 57 7 4 1 34 32 75 6 58 8 6 2 35 33 78 7 59 9 "9 2 36 34 82 7 60 10 11 2 37 35 86 7 61 11 12 3 38 36 88 7 62 12 13 3 39 37 90 8 63 13 16 3 40 38 92 8 64 14 19 3 41 39 94 8 65 15 23 3 42 40 96 8 66 16 . 27 4 43 41 99 9 67 17 32 4 44 42 99 9 68 18 36 4 45 43 19 39 4 46 20 41 ' 5 47 ' 21 43 5 48 22 45 5 49 23 46 5 50 24 47 5 51 25 51 5 52 Raw Score: Mean 23.49 S.D. 10.01 Number of Students - 112 (72% of entire NLSD population Aged 14 years 0 months to 14 years 5 months.) Blanks for derived scores at the extreme ranges of the scale indicate that the corresponding raw scores were-not obtained in the standard ization. 152 14 Years 0 Months to 14 Years 5 Months TABLE 33 Northern Lights School Division Norms for Students Aged 14 Years 0 Months to 14 Years 5 Months tested in the Pall on the Reading Total of the Gates-MacGinitie Reading Tests, Level D, Form 2. Percentile Ranks, Stanines, and T-Scores Corresponding to Raw Scores. Raw . 
Score PR Sta T Raw Score PR Sta T 1 1 1 29 26 24 4 41 2 1 1 30 27 24 4 41 3 1 1 30 28 25 4 41 4 1 1 30 29 26 4 42 5 1 1 31 30 27 4 42 6 1 1 31 31 29 4 43 7 2 1 32 32 31 4 43 8 3 1 32 33 34 4 44 9 3 1 33 34 36 4 44 10 4 1 33 35 37 4 45 11 4 1 34 36 38 4 45 12 4 1 34 37 38 4 46 13 4 1 35 38 39 4 46 14 5 2 35 39 • 41 5 47 15 6 2 36 40 43 5 47 16 6 2 36 41 45 5 47 17 7 2 36 42 46 5 48 18 8 2 37 43 46 5 48 19- 10 2 37 44 46 5 ' 49 20 11 2 38 45 47 5 49 21 15 3 38 46 48 5 50 22 17 ' 3 39 47 49 5 50 23 19 ; 3 39 48 50 5 51 24 20 3 40 49 50 5 51 25 22 3 40 50 51 5 52 153 TABLE 33 (Cont'd) Percentile Ranks, Stanines, and T-Scores Corresponding to Raw Scores. Raw Score PR Sta T Raw Score PR Sta • T 51 53 5 52 76 90 8 64 52 55 5 52 77 92 8 64 53 55 5 53 78 94 8 64 54 58 5 53 79 95 8 65 55 59 5 54 80 96 9 65 56 60 5 54 81 96 8 66 57 64 6 55 82 96 9 66 58 67 6 55 83 96 9 67 59 68 6 56 84 97 9 67 60 70 6 56 85 99 9 68 61 70 6 57 86 99 9 68 62 7.1 6 57 87 63 73 6 . 58 99 64 74 6 58 65 75 6 58 66 77 6 59 67 79 7 59 68 80 7 60 69 80 7 60 70 81 7 61 71 81 7 61 72 84 • 7 62 73 86 7 62 74 88 7 63 75 89 7 63 Raw Score: Mean 46.56 S.D. 21.8G Number of Students - Ho (71% of entire NLSD population Aged 14 years 0 months to 14 years 5 months.) Blanks for derived scores at the extreme ranges of the scale indicate that the corresponding raw scores were-not•obtained in the standard ization . 154 .14 Years 0 Months to 14 Years 5 'Months TABLE 34 Northern Lights School Division Norms for Students Aged 14 Years 0 Months to 14 Years 5 Months tested in the Fall on the Mathematics Computation Subtest of the Canadian Tests of Basic Skills, Level 13, Form 6. Percentile Ranks, Stanines, and T-Scores Corresponding to Raw Scores. Raw Score PR Sta T Raw Score PR Sta T 1 3 1 34 26 76 6 58 2 5 2 35 27 79 7 59 3 7 2 36 28 82 7 60 4 8 2 37 29 85 7 61 5 10 2 38 30 86 7 62 6 13 3 39 31 87 7 63 7 15 3 40 32 89 7 64 8 18 3 41 33 89 7 65 9 24 4,. 
42 34 90 8 66 10 30 4 43 35 93 8 67 11 34 4 ' 44 36 94 8 68 12 37 4 45 37 96 8 69 13 40 4 . 46 38 97 9 70 14 43 5 47 39 98 9 71 15 46 5 48 40 98 9 72 .16 48 5 49 41 99 9 73 17 50 5 50 42 99 9 74 18 54 5 51 43 19 60 5 52 44 20 65 6 52 45 '21 68 6 53 22 70 6 54 23 72 6 55 24 73 6 56 25 75 6 57 Raw Score: Mean 17.41 S.D. 10.43 Number of Students - llo (71% of entire NLSD population Aged 14 years 0 months to 14 years 5 months.) Blanks for derived scores at the extreme ranges of the scale indicate that the corresponding raw scores were-not•obtained in the standard ization . 155 14 Years 0 Months to 14 Years 5 Months TABLE 35 Northern Lights School Division Norms for Students Aged 14 Years 0 Months to 14 Years 5 Months tested in the Fall on the Otis-Lennon School Ability Test, Intermediate, Form R. Percentile Ranks, Stanines, and T-Scores Corresponding to Raw Scores. Raw . Score PR Sta T Raw Score PR Sta T 1 26 33 4 45 2 27 35 4 45 3 1 1 31 28 36 4 46 4 1 1 31 29 38 4 46 5 1 1 32 30 . 40 4 47 6 2 1 33 31 42 5 48 7 2 1 33 32 45 5 48 8 2 1 34 33 48 5 49 9 3 1 34 34 50 5 49 10 3 1 35 35 52 5 50 11 4 1 36 36 55 5 50 12 5 2 36 37 57 5 51 13 7 2 37 38 59 5 52 14 10 2 37 39 60 5 52 15 12 3 38 40 62 6 53 16 14 3 39 41 65 6 53 17 16 3 39 42 67 6 54 18 19 3 40 43 68 6 55 19 22 3 40 44 68 6 ' 55 20 24 4 41 45 70 6 56 21 24 4 42 46 72 6 56 22 26 . 4 42 47 74 6 57 23 28 • 4 43 48 77 6 58 24 30 4 43 49 79 7 58 25 32 4 44 50 81 7 59 156 TABLE 35 (Cont'd) Percentile Ranks, Stanines, and T-Scores Corresponding to Raw Scores. Raw Score 'PR Sta Raw Score PR Sta 51 83 7 59 52 84 7 60 53 84 7 61 54 85 7 61 55 87 7 62 56 88 7 62 57 88 7 63 58 90 8 64 59 92 8 64 60 93 8 65 61 93 8 65 62 93 8 66 63 94 8 . 66 64 94 8 67 65 94 8 68 66 94 8 68 67 95 8 69 68 96 8 69 69 96 8 70 70. 96 8 71 71 97 • 9 71 72 98 9 72 73 98 9 72 74 99 9 73 75 99 9 74 76 77 78 79 80 99 74 Raw Score: Mean 35.21 S.D. 16.85 Number of Students - 109 (70% of entire NLSD population Aged 14 years 0 months to 14 years 5 months.) 
Blanks for derived scores at the extreme ranges of the scale indicate that the corresponding raw scores were not obtained in the standardization.

TABLE 36

Means and Standard Deviations for Northern Lights School Division Students Aged 16 Years 0 Months to 16 Years 5 Months on Achievement and School Ability Tests.

TEST                                     N     MEAN    S.D.
Gates-MacGinitie Reading Tests,
Level E, Form 2
    Vocabulary                           55    18.86   10.29
    Comprehension                        52    24.40   10.03
    Reading Total                        52    42.98   19.45
Canadian Tests of Basic Skills,
Level 15, Form 5
    Mathematics                          54    16.56   10.45
Otis-Lennon School Ability Test,
Advanced, Form R                         54    25.98   14.62

APPENDIX C

INVESTIGATION OF THE READING AND MATHEMATICS ACHIEVEMENT OF STUDENT ABSENTEES

Although an effort was made to assess all students in the designated grade and age groups, nonresponse did exist due to student absenteeism and some school organizational difficulties. This raised the question of whether the obtained results for the students tested were systematically altered, such that they were not representative of the original population. In order to provide information as to how the group norms in this study might have differed from the group norms had all the students in the eight groups been assessed, an investigation was made of the reading and math achievement of the students absent during the fall test administration. To the extent that differences in academic success-related variables existed between absentees and tested students, the theoretical group score distributions on the tests of interest could be expected to differ from those actually obtained. This study was limited to students who had been absent from the entire fall norming study. Administrative difficulties prevented an assessment of the achievement status of the few students from each group who missed only a portion of the battery of tests administered during the fall.
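The possible effect of this nonresponse on a group mean can be framed as a weighted average of the tested and untested subgroups. The sketch below uses purely hypothetical numbers: the function name and the illustrative absentee mean are not from the thesis.

```python
def full_population_mean(tested_mean, tested_n, absent_mean, absent_n):
    """Mean the whole group would have shown had the absentees also been tested."""
    total = tested_mean * tested_n + absent_mean * absent_n
    return total / (tested_n + absent_n)

# Hypothetical illustration: 206 tested students averaging 35.46, plus 40
# absentees assumed (for illustration only) to average five points lower.
adjusted = full_population_mean(35.46, 206, 30.46, 40)  # about 34.6
```

Even a five-point gap among a modest absentee fraction moves the group mean by less than a point here, which is the sense in which a "slight downward swing" can still matter when regional and national means are being compared.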
A five-point rating scale was devised on which teachers could appraise the reading and math achievement of the absent students in relation to the tested students. In each of the two achievement domains, an absent student was rated by his teacher as being either (1) much worse, (2) worse, (3) about the same, (4) better, or (5) much better compared to a designated tested student. The appraisal proceeded as follows:

A. Relative ratings:

1) A relative comparison was made between an absent student and a designated tested student who scored between the 30th and 70th percentile (preferably at the 50th percentile level) on the regional norms. Used as a reference person, this tested student was given a relative value of "3", and the absent student was given a relative rating from "1" to "5" depending upon the appraisal of the teacher.

2) If no tested student in the 30th to 70th percentile range could be found for comparative purposes, the absent student was compared to a tested student who scored in the 11th to 29th percentile range on the regional norms. This tested student was given a relative value of "2". The absent student was then given a rating of "2" if judged by the teacher to be at about the same level of achievement; a rating of "1" if worse or much worse than the tested student; a rating of "3" if better; and a rating of "4" if much better. The value of "5" was not utilized on the scale in this circumstance.

3) Similarly, if no tested student could be used
4) If no tested student could be found for comparative purposes in any of the above percentile ranges, one was chosen who scored from the 1st to 10th percentile on the regional norms. This student was given a relative value of "1". An absent student who was viewed by his teacher to be at the same achievement level as the tested student was also given a rating of "1". A rating of "1" was also given if the student was judged to be worse or much worse in the achievement area. A rating of "2" was given if the student was appraised as being better, and a "3" if much better. Values of "4" and "5" were not utilized. 5) Unable to find a tested student in the above percentile range, a student was sought who fell into the 91st to 99th percentile range. This student was given a rating of "5". The absent student was also given a rating of "5" if rated the same or better in the achievement area as the tested student. A rating of "4" was given if worse, and a rating of "3" given if judged to be much worse. Values of "1" and "2" were not considered in this case. B. General ratings: 6) Finally, where no tested students could be found to compare the absent student to, a general rating was made by his teacher. The teacher was asked to to judge the achievement of the absent student according to the teacher's knowledge of the student as a school achiever in reading and math. In this case, the five-point scale was collapsed to a three-point scale. This took the form of (3) average, (2) below average, and (4) above average. While steps two to six show increasing undesirability for exacting results, taken in their entirety, it was believed that these data provided a reasonable general indication of differences. The results of the relative and general ratings were plotted for each group in each achievement area, and a mean of the relative ratings and a mean combining both relative and general ratings were calculated (see Figures 1-8). 
Drawn onto each plot was the shape of the distribution based on the relative ratings. Also transcribed was a hypothetical expected distribution based on the premise that, of the students who were relatively rated, 40 percent (i.e., those between the 30th and 70th percentiles on regional norms) would receive a rating of "3", 20 percent would receive a rating of "2" and of "4" respectively, and only 10 percent would each receive a rating of "1" and of "5". Considering the distributions of the plots, the means of the data, and the weighting of each absentee group (that is, the percentage of absentees in their total group population), certain conclusions were formulated. The results give some indication that the grade five, grade nine, and age sixteen norms would not have been affected, or would have been affected only minimally, had all the students been tested. However, the results also indicated that a disproportionate number of the students not assessed in the other groups would have scored below the obtained regional means. Specifically, grades three and seven and the three younger age groups would probably have been affected by a downward swing in the norms. This lowering of group norms would have produced even greater differences between the regional and national group means. As this swing would probably have been slight, the value of the obtained norms should be judged accordingly.

FIGURE 1

Illustrations showing the teacher ratings of the grade three absentees in Reading and Mathematics achievement.

Total number of absentees: 65
Number of absentees accounted for: 61-59*
Population size: 431
Percentage of absentees in pop.: 15.1
[Plot: expected vs. actual (relative) rating distributions; relative ratings n = 40-38*, general ratings n = 21]
Reading:     Mean 2.4 (Relative); Mean 2.7 (Combined)
Mathematics: Mean 2.6 (Relative); Mean 2.8 (Combined)

* Number accounted for in Reading, followed by Math.
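The hypothetical expected distribution has a mean of exactly 3.0, which gives a simple benchmark: a group whose mean relative rating falls below 3.0 was judged weaker, on balance, than the tested comparison students. A sketch (assuming the reported means are plain weighted averages of the rating counts):

```python
# Expected proportions for ratings 1-5 under the 10/20/40/20/10 premise.
EXPECTED = {1: 0.10, 2: 0.20, 3: 0.40, 4: 0.20, 5: 0.10}

expected_mean = sum(rating * p for rating, p in EXPECTED.items())  # 3.0

def mean_rating(counts):
    """Mean of a {rating: count} distribution of teacher ratings."""
    return sum(r * c for r, c in counts.items()) / sum(counts.values())
```

Several of the figures report relative means in the low 2s, well below the 3.0 benchmark, which is what underlies the conclusion that nonresponse would have pulled those groups' norms downward.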
FIGURE 2

Illustrations showing the teacher ratings of the grade five absentees in Reading and Mathematics achievement.

Total number of absentees: 26
Number of absentees accounted for: 16
Population size: 350
Percentage of absentees in pop.: 7.4
[Plot: expected vs. actual (relative) rating distributions; relative ratings n = 10, general ratings n = 6]
Reading:     Mean 3.2 (Relative); Mean 3.1 (Combined)
Mathematics: Mean 3.0 (Relative); Mean 2.9 (Combined)

FIGURE 3

Illustrations showing the teacher ratings of the grade seven absentees in Reading and Mathematics achievement.

Total number of absentees: 40
Number of absentees accounted for: All
Population size: 259
Percentage of absentees in pop.: 15.4
[Plot: expected vs. actual (relative) rating distributions; relative ratings n = 31, general ratings n = 9]
Reading:     Mean 2.3 (Relative); Mean 2.4 (Combined)
Mathematics: Mean 2.3 (Relative); Mean 2.4 (Combined)

FIGURE 4

Illustrations showing the teacher ratings of the grade nine absentees in Reading and Mathematics achievement.

Total number of absentees: 22
Number of absentees accounted for: All
Population size: 187
Percentage of absentees in pop.: 11.8
[Plot: expected vs. actual (relative) rating distributions; relative ratings n = 15, general ratings n = 7]
Reading:     Mean 3.0 (Relative); Mean 2.8 (Combined)
Mathematics: Mean 2.4 (Relative); Mean 2.7 (Combined)

FIGURE 5

Illustrations showing the teacher ratings of the age nine absentees in Reading and Mathematics achievement.

Total number of absentees: 22
Number of absentees accounted for: 21
Population size: 180
Percentage of absentees in pop.: 12.2
[Plot: expected vs. actual (relative) rating distributions; relative ratings n = 12, general ratings n = 9]
Reading:     Mean 2.2 (Relative); Mean 2.3 (Combined)
Mathematics: Mean 1.9 (Relative); Mean 2.1 (Combined)

FIGURE 6

Illustrations showing the teacher ratings of the age eleven absentees in Reading and Mathematics achievement.
Total number of absentees: 49
Number of absentees accounted for: 31
Population size: 180
Percentage of absentees in pop.: 27.2
[Plot: expected vs. actual (relative) rating distributions; relative ratings n = 7, general ratings n = 24]
Reading:     Mean 2.3 (Relative); Mean 2.9 (Combined)
Mathematics: Mean 2.1 (Relative); Mean 2.6 (Combined)

FIGURE 7

Illustrations showing the teacher ratings of the age fourteen absentees in Reading and Mathematics achievement.

Total number of absentees: 41
Number of absentees accounted for: 34
Population size: 156
Percentage of absentees in pop.: 26.3
[Plot: expected vs. actual (relative) rating distributions; relative ratings n = 6, general ratings n = 28]
Reading:     Mean 2.5 (Relative); Mean 2.5 (Combined)
Mathematics: Mean 2.5 (Relative); Mean 2.5 (Combined)

FIGURE 8

Illustrations showing the teacher ratings of the age sixteen absentees in Reading and Mathematics achievement.

Total number of absentees: 25
Number of absentees accounted for: 17-19*
Population size: 82
Percentage of absentees in pop.: 30.1
[Plot: expected vs. actual (relative) rating distributions; relative ratings n = 8-10*, general ratings n = 9]
Reading:     Mean 2.8 (Relative); Mean 2.9 (Combined)
Mathematics: Mean 3.1 (Relative); Mean 3.0 (Combined)

* Number accounted for in Reading, followed by Math.

APPENDIX D

NLSD TEST RESULT INFORMATION BY AGE

Table A

Means and Standard Deviations on Tests of Achievement and School Ability for Students in Four Age Groups in the Northern Lights School Division

9-0 to 9-5 years of age
TEST                                      N     Mean    S.D.
Gates-MacGinitie Vocabulary B1            150   22.32   12.14
Gates-MacGinitie Comprehension B1         149   20.63   10.01
Gates-MacGinitie Reading Total B1         147   43.05   21.57
CTBS Math Comp., Level 7, Form 5          145   21.03    4.82
Otis-Lennon School Ability, Pri II-R      140   41.0    14.71

11-0 to 11-5 years of age
TEST                                      N     Mean    S.D.
Gates-MacGinitie Vocabulary D1            130   17.70    9.94
Gates-MacGinitie Comprehension D1         130   16.28    8.41
Gates-MacGinitie Reading Total D1         130   33.98   17.36
CTBS Math Comp., Level 10, Form 6         126   23.79   11.39
Otis-Lennon School Ability, Ele-R         127   33.82   16.12

14-0 to 14-5 years of age
TEST                                      N     Mean    S.D.
Gates-MacGinitie Vocabulary D2            112   23.18   12.33
Gates-MacGinitie Comprehension D2         112   23.49   10.01
Gates-MacGinitie Reading Total D2         110   46.56   21.80
CTBS Math Comp., Level 13, Form 6         110   17.41   10.43
Otis-Lennon School Ability, Int-R         109   35.21   16.85

16-0 to 16-5 years of age
TEST                                      N     Mean    S.D.
Gates-MacGinitie Vocabulary E2            55    18.86   10.29
Gates-MacGinitie Comprehension E2         52    24.40   10.03
Gates-MacGinitie Reading Total E2         52    42.98   19.45
CTBS Math Comp., Level 15, Form 5         54    16.56   10.45
Otis-Lennon School Ability, Adv-R         54    25.98   14.62

Table B

Summary Item Statistics of Achievement and Scholastic Ability Tests Administered to Students in Four Age Groups in the Northern Lights School Division

Columns: Number of Test Items; Ave. Item Diff.; Average Item Discrim.; Reliability Coefficient (K-R 20); Standard Error of Meas.

9-0 to 9-5 years
Gates-MacGinitie Vocabulary, B1           45   .496   .683   .952   2.66
Gates-MacGinitie Comprehension, B1        40   .516   .642   .937   2.51
Gates-MacGinitie Reading Total, B1        85   .507   .638   .971   3.67
CTBS Mathematics Computation, 7-5         26   .809   .437   .869   1.74
Otis-Lennon School Ability, PriII-R       75   .547   .501   .938   3.66

11-0 to 11-5 years
Gates-MacGinitie Vocabulary, D1           45   .393   .563   .929   2.65
Gates-MacGinitie Comprehension, D1        43   .379   -      -      -
Gates-MacGinitie Reading Total, D1        88   .386   .501   .950   3.72
CTBS Mathematics, 10-6                    42   .567   .698   .954   2.44
Otis-Lennon School Ability, Ele-R         70   .483   .571   .954   3.46

14-0 to 14-5 years
Gates-MacGinitie Vocabulary, D2           45   .515   .701   .956   2.59
Gates-MacGinitie Comprehension, D2        43   .546   .596   .925   2.74
Gates-MacGinitie Reading Total, D2        88   .529   .634   .970   3.78
CTBS Mathematics Computation, 13-6        45   .387   .590   .940   2.55
Otis-Lennon School Ability, Int-R         80   .440   .540   .953   3.65

16-0 to 16-5 years
Gates-MacGinitie Vocabulary, E2           45   .419   .562   .931   2.70
Gates-MacGinitie Comprehension, E2        43   .568   .589   .929   2.67
Gates-MacGinitie Reading Total, E2        88   .488   .561   .961   3.84
CTBS Mathematics, 15-5                    48   .348   .542   .933   2.70
Otis-Lennon School Ability, Adv-R         80   .325   .429   .943   3.49

Table C

Test Items with Item Difficulty Levels Below the Chance Level or Item Discrimination Indices Below .20 on Tests of Achievement and Scholastic Ability Administered to Four Age Groups in the Northern Lights School Division

Columns: Item Number; Item Diff.; Item Discrim.

9-0 to 9-5 years of age
Gates-Mac Reading Total, B1
  45V.   .340   .162
  39C.   .095   .189
  37C.   .177   .270
CTBS Math Computation, 7-5
  11.    .890   .139
  16.    .931   .139

11-0 to 11-5 years of age
Gates-Mac Reading Total, D1
  37C.   .169   .086
  45V.   .123   .090
  44V.   .108   .120
  10C.   .439   .171
  40C.   .200   .179
  42V.   .115   .211
  25C.   .146   .211
  29C.   .146   .240
  31C.   .162   .331
CTBS Math Computation, 10-6
  13.    .889   .131

14-0 to 14-5 years of age
Gates-Mac Reading Total, D2
  22C.   .436   .021
CTBS Math Computation, 13-6
  93.    .073   .106
  98.    .073   .213
  89.    .109   .321
  74.    .173   .392

16-0 to 16-5 years of age
Gates-Mac Reading Total, E2
  38V.   .154   .000
  36C.   .115   .154
  13V.   .346   .154
  45V.   .135   .231
CTBS Mathematics, 15-5
  41.    .241   .060
  17.    .685   .165
  29.    .148   .203
  38.    .111   .214
  32.    .130   .280
  43.    .093   .357
  45.    .093   .357
  22.    .130   .429
  44.    .130   .500

O-L School Ability, PriII-R    O-L School Ability, Ele-R    O-L School Ability, Int-R    O-L School Ability, Adv-R
1a . 27c . 30c.
900 . 193 . 243 .171 . 229 .600 50. 69 . . 252 . 173 .094 . 156 80. 39 . 73 . 64 . . 138 . 303 .092 . 165 . 185 . 185 '. 333 . 407 79 . 77 . 37 . 42 . 70. 80. 63 . 38 . 78 . 68 . 67 . 39 . 7 1 . 62 . 65 . 35 . 57 . .019 .111 .315 . 333 . 185 .111 . 148 .278 .093 .111 .074 . 185 . 093 . 130 . 130 . 185 . 130 - .077 .060 .12 1 . 126 . 132 . 137 . 137 . 189 . 209 . 209 .2 14 . 280 . 286 . 286 . 286 .423 . 500 Note: c designates an item from the Comprehension test; V designates an item from the Vocabulary test. 1 Based on 1/A, where A is the number of choices for an item, and assuming all choices are equally attractive. 1 The point-biseria 1 correlation of the examinees' scores on an item with the their total test scores. 1 An item with subscript 'a' has come from Part I of this test; an item with subscript 'C has come from Part III. 1 75 Table D Pearson Correlation Coefficients Among Tests Administered to Students in the Northern Lights School Division Within Four Age Groups Test Test 1 Test 2 Test 3 Test 4 Test 5 1. G-Mac Vocabulary 2. G-Mac Comprehension 9-0 to 9-5 years: 11-0 to 11-5 years: 14-0 to 14-5 years: 16-0 to 16-5 years: .9085 .7899 .8883 .8209 3. G-Mac Reading Total 9-0 to 9-5 years: 11-0 to 11-5 years: 14-0 to 14-5 years: 16-0 tO 16-5 years: .9812 .9549 .9773 .8941 .9721 .9364 .9654 .9532 4. CTBS Mathematics 9-0 to 9-5 years: 11-0 to 11-5 years: 14-0 to 14-5 years: 16-0 to 16-5 years: .6120 .7053 .6985 .7690 .6274 .6378 .7336 .7110 . 6335 .71 23 .7362 .7091 5. 
O-L School Ability 9-0 to 9-5 years: 11-0 to 11-5 years: 14-0 to 14-5 years: 16-0 to 16-5 years: .6501 .7854 .8322 .8091 .7066 .7360 .8688 .7633 .6899 .8079 .8787 .7349 .541 5 .7054 .8332 .8987 1 76 Table E Percentage1 of Students by Age in the Northern Lights School Division Scoring at or Below Selected Percentile Ranks from the National Standardization of the Otis-Lennon School Ability Test Selected Percentile Ranks from the National Standardization Test 10 20 50 80 9-0 t -o 9-5 3 'ears o: : age Otis-Lennon School Ability, Pri.II-R 39 58 83 95 11-0 to 11-5 years of age Otis-Lennon School Ability, Ele-R 37 52 77 94 14-0 to 14-5 years of age Otis-Lennon School Ability, Int-R 38 59 84 94 16-0 to 16-5 years of age Otis-Lennon School Ability, Adv-R 42 62 86 90 1 The percentages given are equivalent to the NLSD midpoint percentile ranks. 1 77 Table F Results of the One Sample t-Tests For Students Assessed by Age on the Otis-Lennon School Ability Test in the Northern Lights School Division Test NI JSD Nat ional Mean Stan. Dev. Median 1 df t** 9-0 to 9-5 years of age Otis-Lennon School Ability, Pri.II-R 41 .00 14.71 58 1 39 1 2.87 11-0 to 11-5 years of age Otis-Lennon School Ability, Ele-R 33.82 16.12 49 1 26 9.92 14-0 to 14-5 years of age Otis-Lennon School Ability, Int-R 35.21 1 6.85 53 1 08 1 0.40 16-0 to 16-5 years of age Otis-Lennon School Ability, Adv-R 25.98 14.62 40 53 7.05 1 The reported median values lie between the median scores indicated for the X-0 to X-2 years of age and X-3 to X-5 years of age subgroups found under each age (X) in the national norm tables. ** p < .001 
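The statistics reported in Tables A through C (item difficulty, the point-biserial discrimination index of footnote 2, K-R 20 reliability, and the standard error of measurement) can be illustrated with a short sketch. The response matrix below is invented purely for illustration and is not NLSD data; note also that the standard errors of measurement in Table B are recoverable from the Table A standard deviations and the K-R 20 coefficients as SD x sqrt(1 - r), e.g. 12.14 x sqrt(1 - .952) = 2.66 for Vocabulary B1.

```python
import math

# Invented 0/1 item-response matrix: rows = examinees, columns = items
# (1 = correct). Illustration only; not NLSD data.
responses = [
    [1, 1, 0, 1],
    [1, 0, 0, 1],
    [1, 1, 1, 1],
    [0, 0, 0, 1],
    [1, 1, 0, 1],
    [0, 0, 0, 0],
    [1, 1, 1, 1],
    [1, 0, 0, 1],
]
n_examinees = len(responses)
n_items = len(responses[0])
totals = [sum(row) for row in responses]  # total test score per examinee

def item_difficulty(j):
    """Proportion of examinees answering item j correctly."""
    return sum(row[j] for row in responses) / n_examinees

def point_biserial(j):
    """Correlation of the 0/1 item score with the total test score --
    the discrimination index described in Table C, footnote 2."""
    p = item_difficulty(j)
    mean_t = sum(totals) / n_examinees
    sd_t = math.sqrt(sum((t - mean_t) ** 2 for t in totals) / n_examinees)
    mean_pass = (sum(t for row, t in zip(responses, totals) if row[j] == 1)
                 / sum(row[j] for row in responses))
    return (mean_pass - mean_t) / sd_t * math.sqrt(p / (1 - p))

def kr20():
    """Kuder-Richardson formula 20 reliability, as in Table B."""
    k = n_items
    pq = sum(item_difficulty(j) * (1 - item_difficulty(j)) for j in range(k))
    mean_t = sum(totals) / n_examinees
    var_t = sum((t - mean_t) ** 2 for t in totals) / n_examinees
    return k / (k - 1) * (1 - pq / var_t)

def sem(sd, reliability):
    """Standard error of measurement: SD * sqrt(1 - r)."""
    return sd * math.sqrt(1 - reliability)

# Table C's flagging rule: difficulty below the chance level (1/A for an
# A-choice item; A = 4 assumed here) or discrimination below .20.
flagged = [j for j in range(n_items)
           if item_difficulty(j) < 1 / 4 or point_biserial(j) < 0.20]

# Cross-check against the reported values for Vocabulary B1:
# SD 12.14 (Table A) and K-R 20 = .952 (Table B) give SEM of about 2.66.
print(round(sem(12.14, 0.952), 2))  # -> 2.66
```

Applying the same flagging rule to each test's full response matrix is what would generate listings of the kind shown in Table C.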
