UBC Theses and Dissertations
Investigating intermediate teachers' CDROM selection criteria. McPherson, Keith. 1998.

INVESTIGATING INTERMEDIATE TEACHERS' CDROM SELECTION CRITERIA

by

KEITH MCPHERSON

B.A., The University of British Columbia, 1994

A THESIS SUBMITTED IN PARTIAL FULFILLMENT OF THE REQUIREMENTS FOR THE DEGREE OF MASTER OF ARTS in THE FACULTY OF GRADUATE STUDIES (Department of Curriculum and Instruction)

We accept this thesis as conforming to the required standard

THE UNIVERSITY OF BRITISH COLUMBIA
May 1998
© Keith McPherson, 1998

In presenting this thesis in partial fulfilment of the requirements for an advanced degree at the University of British Columbia, I agree that the Library shall make it freely available for reference and study. I further agree that permission for extensive copying of this thesis for scholarly purposes may be granted by the head of my department or by his or her representatives. It is understood that copying or publication of this thesis for financial gain shall not be allowed without my written permission.

Department of Curriculum and Instruction
The University of British Columbia
Vancouver, Canada

ABSTRACT

This study investigates the current selection criteria and processes that experienced intermediate educators (teachers, teacher librarians, technology teachers, teacher librarian/technology teachers, and teacher/technology teachers) in the Vancouver metropolitan area use to assess and evaluate CDROM software for use with intermediate students. The study was exploratory and descriptive in nature and employed a survey approach. The survey was administered using questionnaire and interview instruments. The questions for the study were developed through input from a panel of three CDROM experts, and further refined in a pilot study. The research instruments were administered by the researcher to twenty educators experienced with CDROM selection. During the interview, subjects were observed and tape recorded evaluating two CDROM encyclopedias. All audio recordings were transcribed. Content analysis of the interview transcriptions objectively, systematically, and quantitatively identified and described references to CDROM selection criteria, factors affecting the development of these selection criteria, and resources consulted when evaluating and selecting CDROM software.

Subjects identified forty-six CDROM selection criteria, six factors influencing their evaluation and selection of CDROMs, and sixteen resources which they consulted when selecting CDROMs. It was found that groups of educators (e.g., teachers, teacher librarians, and technology teachers) differed on the types of selection criteria they used to evaluate CDROMs. Educators also used different criteria when evaluating a specific CDROM as opposed to the criteria they listed for selecting all CDROMs. The recommendations for this study included: (a) ensuring educators are given time to preview, evaluate, and test CDROMs with their students; (b) training educators how to select and use CDROM software in groups of educators both similar and different to their teaching specialties; (c) increasing the involvement of provincial and district resource specialists; and (d) the establishment of a previewing site on the internet to assist educators in their evaluation and selection of CDROM software.
TABLE OF CONTENTS

Abstract
Table of Contents
List of Tables
List of Figures
Dedication and Acknowledgments
CHAPTER ONE: Introduction
  Identification of a Problem
  Rationale
  Purpose of the Study
  Research Questions
  Significance of the Study
  Definitions
  Organization of the Thesis
CHAPTER TWO: Review of Related Literature
  The Need for Evaluation and Selection Criteria
  Responsibility for Selecting Educational CDROMs
  Development of CDROM Selection Criteria
  Methodology and Theoretical Frameworks
    Methodology
    Theoretical Framework
  Summary
CHAPTER THREE: Methodology
  Instrument Development
    Stage One
    Stage Two
    Stage Three
  Sample Selection
  Administration of Final Instruments and Data Collection
    Stage One
    Stage Two
    Stage Three
    Stage Four
  Data Analysis
CHAPTER FOUR: Findings
  Descriptive Data of Research Sample
  Findings in Relation to Research Questions
    Research Question One
    Research Question Two
      Stage One
      Stage Two
      Comparison, Stages One and Two
    Research Question Three
    Research Question Four
      Human Information Resources
        Other Educators
        Professional Development Events
        Publishers and Retailers
        Parents and Students
      Nonhuman Information Resources
        Periodicals
        Provincial and District Resources
        Internet
        School-Based Resources
      Summary
    Research Question Five
      Cost
      Time
      Hardware and Software Compatibility
      Previewing
      Usage and Training
      Summary
    Research Question Six
CHAPTER FIVE: Discussion and Recommendations
  Discussion
    Research Question One
    Research Question Two
    Research Question Three
    Research Question Four
    Research Question Five
    Research Question Six
  Recommendations for Administrators and Practitioners
  Limitations of the Study
  Recommendations for Further Research
BIBLIOGRAPHY
APPENDIX A: Draft Questionnaire
APPENDIX B: Draft Interview Guide Questions
APPENDIX C: Letter Requesting Panel Experts' Assistance
APPENDIX D: CDROM Research Study Overview
APPENDIX E: Panel Expert's Covering Letter
APPENDIX F: Panel Expert's Rating Form
APPENDIX G: Final Questionnaire
APPENDIX H: Final Interview Guide Questions
APPENDIX I: Pilot Study Research Consent Form
APPENDIX J: Letter of Initial Contact
APPENDIX K: Research Consent Form
APPENDIX L: Permissions to Conduct Research Examples
APPENDIX M: Initial Set of CDROM Selection Criteria Codes
APPENDIX N: Final Set of CDROM Selection Criteria Codes
APPENDIX O: Example of Content Analysis

LIST OF TABLES

Table 1: CDROM Selection Criteria Types as Listed in Published Checklists
Table 2: Example of CDROM Classification by Type
Table 3: Changes to the Questionnaire Resulting from Panel Experts' Reviews
Table 4: Changes to Interview Guide Questions Resulting from Panel Experts' Reviews
Table 5: Revisions to the Initial List of CDROM Selection Criteria Codes
Table 6: Rank Order of Percentage of Educators to Mention Specific CDROM Selection Criteria Prior to CDROM Evaluation Activity (Stage 1)
Table 7: Rank Order of Percentage of Educators to Mention Specific CDROM Selection Criteria During CDROM Evaluation Activity (Stage 2)
Table 8: Percentage of Sample to Use the Same Criteria in Both Stages One and Two

LIST OF FIGURES

Figure 1: Types of Criteria Listed in Current CDROM Selection Literature
Figure 2: Sequencing of the Interview Instrument
Figure 3: Average Number of Computers Located in Educators' Classroom Displayed by Type of Educator
Figure 4: Types of CDROMs Being Used by Sample
Figure 5: Categories of Research Subjects' CDROM Selection Criteria. Percentages are Based on the Number of Educators to Mention Each of the Forty-Six Criteria
Figure 6: Top Eleven Selection Criteria Given the Highest Overall Scores of Importance by the Sample Before Hands-on Computer Activity
Figure 7: Twelve Most Popular CDROM Selection Criteria as Indicated by Percentage of Total Sample Throughout Part Two of the Study
Figure 8: Categorical Breakdown of Subjects' Total Number of Written CDROM Selection Criteria
Figure 10: Average Number of CDROM Selection Criteria Mentioned During the Interview Displayed by Type of Educator
Figure 11: Categorical Breakdown of Subjects' Total Number of Verbal CDROM Selection Criteria by Type of Educator
Figure 12: Average Number of CDROM Selection Criteria Mentioned Displayed by Type of Educator
Figure 13: Categorical Breakdown for the Total Number of CDROM Criteria Mentioned by Educators
Figure 14: Percentage of Types of Educators to Use Top Seven CDROM Selection Criteria (Stage 1)
Figure 15: Percentage of Types of Educators to List and Use CDROM Selection Criteria Ranked Eighth to Fifteenth (Stage 1)
Figure 16: Categorical Breakdown of Percentage of Educators' CDROM Selection Criteria
Figure 17: Average Number of Listed CDROM Selection Criteria Displayed by Type of Educator
Figure 18: Categorical Breakdown for the Percentage of Educators to Mention CDROM Selection Criteria
Figure 19: Percentage of Types of Educators to Use Top Seven CDROM Selection Criteria (Stage Two)
Figure 20: Percentage of Types of Educators to Use CDROM Selection Criteria Ranked Eighth to Fourteenth (Stage Two)
Figure 21: Difference Between Samples' Emphasis on Categories of Criteria in Stage One and Stage Two
Figure 22: Comparison of the Content Categories of CDROM Selection Criteria Mentioned by Educators in Stage One, Applied in Stage Two, and the Content Categories of Criteria Mentioned in the Literature
Figure 23: Resources Educators Consulted when Evaluating and Selecting Intermediate Educational CDROMs During the Interview
Figure 24: Resources Sample Consulted when Evaluating and Selecting Intermediate Educational CDROM Software (Questionnaire)
Figure 25: Factors Most Responsible for Influencing the Development of the Research Subjects' Educational CDROM Software Selection Criteria
Figure 26: Percentage of Sample to Mention Cost as an Important Difficulty Educators Must Face when Selecting CDROMs Displayed by Type of Educator
Figure 27: Percentage of Sample to Mention Cost Prior to and During the CDROM Activity Displayed by Type of Educator
Figure 28: Percentages of Sample to Cite Time as an Important Difficulty Educators Face when Selecting CDROMs Displayed by Type of Educator
Figure 29: Percentages of Educators to Cite Hardware and Software Compatibility as a Factor which Influences their Evaluation and Selection of Educational Intermediate CDROM Software
Figure 30: Percentages of Educators to Cite Previewing as an Important Difficulty which Educators Face when Selecting CDROMs

DEDICATION AND ACKNOWLEDGMENTS

This thesis is dedicated to Pye. A great friend and joy. Until we meet again.

I would also like to acknowledge: my advisors, Ann Lukasevich and Lee Gunderson, for their sincere encouragement and guidance through this project and over the many years we have known each other; Dr. Marv Westrom for his assistance as third reader of my thesis; Gary Johnson, for his hours of transcribing; Marion Crowhurst for believing in me; Carol Kelly, Anna Ip, Simone Goguen, Teresa O'Shea, Anne White, and Lise Winsemann for their patience and understanding; my friends for accepting my need to be an antisocial hermit during this time; my 'can do' parents, Jack and Shirley Johnson, for their ability to listen intently and truly care; my parents-in-law, Sophia and Bruce Welna, for those comforting Sunday dinners (and for helping me side-step research grant applications); the district supervisors for facilitating the selection process; the teachers who voluntarily participated in my research; and lastly, but most importantly, my wife, Janet McPherson, counsellor, proof-reader, critical advisor, supporter, comforter, and life-friend extraordinaire. We did it.

CHAPTER ONE
INTRODUCTION

The use of computers in BC schools has increased greatly over the past decade. As early as 1987, the Provincial Advisory Committee on Computers to the Minister of Education (British Columbia Ministry of Education, 1987) noted that computers could assist teachers in developing the appropriate skills and attitudes needed to enable students to successfully adapt and participate in a society undergoing rapid technological and informational change. The Advisory Committee also stated that computer technology should be integrated into all existing curriculum strands. For the next eight years, most BC curricular reform documents made reference to this report. Unfortunately, appropriate government funding to purchase adequate computer equipment and to train teachers in this area did not follow, making it unclear how educators could fully implement the 33 recommendations without the tools.

Eight years later, the BC Ministry of Education established the Provincial Advisory Committee on Education Technology [PACET] to survey and report on the state of technology in BC public schools (British Columbia Ministry of Education, 1995). The committee reported that even though the use of technology in schools was increasing, additional financial support was needed to purchase computer hardware and to help educate British Columbians so that they could "function effectively in a global market place, using computers, multi-media and information technology to do business" (p. 2).
In response to the PACET report, the Ministry of Education revivified its technology policy as outlined in the Technology in British Columbia Report and Action Plan (British Columbia Ministry of Education, 1995). It stated:

The BC Government and the Ministry of Education are committed to sustaining the use of technology to support the needs of students for modern and relevant learning tools in our public schools. This means providing new equipment, links to services such as the Internet and the training and support needed to utilize technology most effectively...the government of British Columbia is confident that our public school system will deliver the technology and skills that students need to succeed in the information economy. (p. 14)

The policy also included five-year funding of $100 million, with the expectation that schools would purchase enough computers so as to provide "one computer for every three students in secondary schools, and one computer for every six students in elementary schools - or better" (British Columbia Ministry of Education, 1995, p. 1). Finally, significant funds were in place that would help educators purchase the computers needed to meet ministerial curriculum recommendations.

This plan to increase the ratio of computers to students in public schools was not a phenomenon isolated to British Columbia. Most Canadian provincial education ministries have mandated clear technology policies aimed at increasing the number of computers in public schools (e.g., Manitoba Education and Training, 1997; New Brunswick Department of Education, 1997; Newfoundland Ministry of Education and Training, 1997; Nova Scotia Department of Education and Culture, 1996; Ontario Ministry of Education and Training, 1995; Prince Edward Island Department of Education, 1997). In addition, these provincial policies were supported by Canadian federal technology policies. For example, in 1993 the Federal government designed the Computers for Schools program. The program's objective was to place over 100,000 corporately donated and recycled computers into Canadian schools and public libraries by the year 2000. To date, the program has placed 54,991 computers (Government of Canada, 1998). As a result of such initiatives, many public schools are using government funding and programs to purchase as many computers and software programs as possible for their students.

Identification of a Problem

During 1995, PACET also reported on a survey of elementary teachers' existing computer resources and projected technological needs (British Columbia Ministry of Education, 1995). In their survey of existing computer resources, PACET found that the number of CDROM players in BC public schools had increased sixfold, from 400 players in 1991 to 2,500 players in 1995. And in their report on projected technological needs, PACET concluded that, "For the future, elementary schools anticipate that WPWP [Writing, Publishing & Word Processing] will remain their highest-use application. They also anticipate moving more into CD-ROM information access" (British Columbia Ministry of Education, 1995, p. 5). This projected trend for the increased use of CDROMs in BC public schools was verified a year later in a Learning Resources Branch (LRB) survey published in the spring of 1996 (British Columbia Ministry of Education, 1996a).
The LRB surveyed the learning resource needs of almost 600 BC educators (of which 456 were teachers, teacher librarians, and counselors) and found that 23% of the respondents were already using CDROMs "Often or Always" (section 4, p. 2), 29.9% felt that they "needed more funding for equipment software/CDROM" (section 4, p. 5), and 56.3% of district staff felt that support was needed in assisting schools to evaluate and select CDROMs (section 4, p. 8). The authors of the LRB survey concluded that, "The increasing importance of technological resources [Internet, software, CDROM], combined with the rapid change in curriculum, has resulted in a significant increase in the quantity and type of learning resources available to educators" (British Columbia Ministry of Education, 1996a, p. 1).

A movement towards an increased use of CDROM software and hardware has also been identified in the United States. McCarthy (1993) reported that from 1991-1993 the number of K-12 educational CDROMs in U.S. schools had more than quadrupled. Berger and Kinnell (1994a, p. 27) found that "In 1989, there were only a handful of CD-ROM titles; now in 1994, there are over 5,000." They also reported that "In 1988-1989, only three percent of all schools were using CD-ROMs; by 1993-1994, that number had gone up to 43 percent" (Berger & Kinnell, 1994b, p. 26). Hays (1995, p. 46) noted that, "The number of public schools using CD-ROM has increased nearly 250% since 1988. In the most recent year for which data are available (1993-94), the number of schools using CD-ROM increased 80% [and] CD-ROM drives are used in 37% of public schools, accounting for more than 15 million students in the United States."

The reasons why elementary teachers are moving towards an increased use of CDROMs in their classrooms are quite varied and are outlined in numerous books, articles, and research studies (Berger & Kinnell, 1994a, 1994b; Lamy, 1994; McCarthy, 1993; Safford, 1994; Shade, 1994). Most authors attribute this increased use of CDROMs in schools to three common factors: (a) the recent drop in CDROM hardware and software prices; (b) the current proliferation of educational titles; and (c) the CDROM's large storage capacity and multimedia capabilities. In British Columbia, the increasing number of CDROMs in the elementary setting can also be attributed to an increased desire among teachers to use CDROMs in their classrooms, and to the financial support and computer program development coming from both provincial and federal governments.

As the number of elementary teachers using CDROM hardware increases, and the selection of educational CDROM software increases, so does the need for the careful evaluation and selection of instructionally effective CDROM resources. Unfortunately, much of the CDROM technology in elementary schools is fairly new to many teachers, and it is difficult for district and school administrators to review a significant percentage of the computer software on the market (Reiser & Kegelmann, 1994). Consequently, little is known, and even less written, about how teachers develop selection criteria, what factors affect their criteria, or even what their selection criteria may include when choosing CDROMs for use in their classrooms.

Of the literature that does exist on CDROM selection criteria, much of it has been written for and by teacher librarians. In her M.Ed.
thesis investigating the evaluation criteria for World Wide Web documents, Baker (1997) reviewed the literature on teacher-librarians' CDROM selection criteria and found that teacher librarians evaluated both the content and technical design of CDROMs. Baker further reported that teacher librarians used traditional print resource criteria to evaluate a CDROM's content and that criteria for evaluating a CDROM's technical design "were brief and not too specific for consistent application" (p. 50). Although teacher librarians are often regarded as pioneers in the evaluation, selection, and dissemination of information technology resources like CDROMs (Gibson, 1997), they too have not fully developed distinct criteria for assessing and evaluating characteristics and features particular to CDROMs, and would probably agree with librarians Richards and Robinson (1993) who state "Very little has been written about how to evaluate CD-ROM software" (p. 92).

In addition to the literature, British Columbian teachers have traditionally been able to rely on the Ministry of Education's Learning Resource Branch (LRB) to assist in evaluating and selecting resources. Unfortunately, data from the recent LRB survey (British Columbia Ministry of Education, 1996a, section 5.5, pp. 3-6) found "only 43.6% of the teachers were satisfied with their existing resources selection process". The report further stated:

Less than 48.4% of the [526] teachers and district staff surveyed felt that the LRB was proactive in identifying technology-based learning resources, [and that] several respondents questioned the LRB's ability to be a leader in this area given the poor quality of electronic products produced by the Branch in the past. (p. 6)

Without adequate proactive support and guidance from the LRB, and a lack of literature focusing on the selection of educationally sound CDROM software, elementary teachers using this new media are faced with the daunting task of selecting educationally sound CDROM software from an ever-increasing list of possible choices. In addition, without explicit criteria to evaluate and select educationally sound CDROM software, educational software developers will not have adequate guidelines with which to create new CDROM software or improve existing educational CDROMs.

Rationale

It is this researcher's belief that in order to help educators identify instructionally effective CDROM software for their students, and to assist educational software developers in producing educationally sound materials, the criteria used by educators to select CDROMs must be researched and made more explicit, and the factors which shape the development of these criteria should be researched and understood as well as possible.

Purpose of the Study

The purpose of this study is to investigate the current selection criteria and processes that experienced teachers, teacher librarians, technology teachers, teacher librarian/technology teachers, and teacher/technology teachers in the Vancouver metropolitan area use to assess and evaluate CDROM software for use with intermediate students.

Research Questions

Specifically, the researcher will seek to answer the following questions:

1. What selection criteria do experienced intermediate educators use to evaluate CDROMs to be used with intermediate students?
2. Which of these selection criteria do experienced intermediate educators use when evaluating two encyclopedia CDROMs to be used with intermediate students?

3. How different or similar are the criteria selected by experienced intermediate educators to those criteria listed in the current literature?

4. What resources do experienced intermediate educators consult when evaluating and selecting CDROM software for intermediate students?

5. What factors are most responsible for influencing the development of experienced intermediate educators' criteria when evaluating and selecting CDROMs for intermediate students?

6. Do different groups of educators (teachers, teacher librarians, technology teachers, teacher librarian/technology teachers, and teacher/technology teachers) use similar CDROM selection criteria, and are they influenced by similar factors when selecting intermediate educational CDROM software?

Significance of the Study

This study has significance for educators, software designers, and educational curriculum developers. Firstly, since CDROMs are, and most likely will be, a relatively new resource in many schools, the selection criteria identified in this study will assist educators in identifying and selecting educational CDROMs. Secondly, educational software designers and curriculum developers could use the criteria identified in this study to assist them in developing CDROM software more closely aligned to teachers' needs and educational objectives.

Definitions of Terms Used

Key terms in this study are as follows:

CDROM (or CD-ROM) - This is an acronym for the expression Compact Disk Read Only Memory. A CDROM is a silver platter onto which digital information is stored in the form of physical fluctuations. This digital information is read by a CDROM player and can be translated into text, sound, animation, pictures, or a combination thereof.

user-defined selective criteria - This term is adapted from Barry (1994). It is used to describe criteria developed and employed by educators when selecting software, in this case CDROM software for use with intermediate students.

intermediate students - It refers to grade four, five, six, and seven students attending the British Columbia public school system.

assessment - It is the process of gathering evidence that supports the educational value of a resource.

evaluation - It is the process of interpreting the evidence and making decisions and judgements based on the evidence gathered through assessment.

selection (selecting) - It is the final act of choosing a resource to be used in an educational setting.

selection criteria - It is a set of standards or ideal qualities that are established for judging, rating, or gauging the educational value of a CDROM resource. They are used to make judgements during evaluation.

influencing factor - Is an element which causes teachers to change the manner in which they evaluate and select CDROMs. Factors are not selection criteria (although subjects in this study referred to some factors as criteria) and thus are not used to evaluate CDROM resources.

educator - Refers to each individual teacher, teacher librarian, technology teacher, teacher/teacher librarian, or teacher/technology teacher participating in this study.
technology teacher (or computer teacher) - Is a person whose main responsibility in a school is to facilitate the students' and the teaching staff's acquisition of the necessary skills, attitudes, and knowledge in the field of technology, and more specifically, in computer technology. Often, the computer teacher works in a classroom comprised of many computers (often called the computer lab), and is responsible for teaching technology to most, if not all, the classes, for all the grades in a school. Usually, technology teachers do not have a class of their own in which they instruct a variety of subjects through the day. Typical computer teachers in this study spent more than 90% of their instructional time teaching technology.

teacher librarian - Is a person whose main responsibility in a school is to collaborate with all the members of the school community in an effort to develop and implement school curriculum, and to facilitate the students' and teaching staff's acquisition of the necessary skills, attitudes, and knowledge in the field of information processing (e.g., research and reporting skills, critical evaluation of resources). A teacher librarian is also responsible for managing the school's library resources and is based out of the school library. Like technology teachers, teacher librarians often instruct all classes in a school, and therefore do not have an assigned group of students. Typical to this study, a teacher librarian spent 90% of their day conducting their duties as a teacher librarian and/or instructing within an information processing curriculum.

teacher - Is a person whose main responsibility in a school is to meet the learning outcomes specific to his/her class of students. Unlike technology teachers and teacher librarians, teachers are assigned one group of students for the year. Generally speaking, teachers work in their own classroom and instruct their students in a wide variety of subjects. Typically, teachers in this study spent 90% or more of their instructional time with their own class.

teacher librarian/technology teacher (or TL/TT) - Is a person whose main responsibilities are divided between the library and the computer lab. In this study, a TL/TT was a full-time teacher librarian, but due to his/her expertise, or as a result of financial cutbacks, and/or a host of other reasons, he/she was given the role of computer teacher for the school as well. In this study, a typical TL/TT spent his/her morning as the school's teacher librarian, and the afternoon as the school's technology teacher.

teacher/technology teacher (or T/TT) - Is a person whose main responsibilities are divided between his/her own class and the computer lab. A typical T/TT in this study was a full-time teacher who had developed a personal interest in computers and who assisted (and still assists) and facilitated the development of the computer lab and technology resources for the school. As a natural extension of his/her interest, he/she had also taken on the job of teaching technology to a wide variety of classes and grades. A typical T/TT in this study exchanged his/her class with other teachers, or was relieved by the Vice Principal or Principal so the T/TT would be free to instruct in the computer lab. A T/TT spent anywhere from one hour to several hours a day instructing in the computer lab. In this study, a teacher was categorized as a T/TT if he/she stated that they spent more than 35% of their weekly instructional time in the computer laboratory.
Organization of the Thesis

The thesis is organized into five chapters. The first chapter provides the background information related to the study, outlines the problem, explains the rationale for seeking information on the problem, defines the purpose of the study, states the study's research questions, and explains the significance of the study. The second chapter reviews literature related to the thesis. The review will examine four aspects of the literature: (a) the need for CDROM evaluation and selection, (b) the development of educators' CDROM selection criteria and the factors shaping the development of these criteria, (c) who should be responsible for selecting educational CDROMs, and (d) the methodology and theoretical framework for selecting CDROMs. Chapter three describes the study's methodological design. The discussion explores the development of the questionnaire and interview guide instruments, subject selection, data collection methods, and data analysis. Chapter four presents and summarizes the data collected from administering the survey and interview instruments. Chapter five summarizes the findings in the study, and explores the study's limitations and implications for CDROM software selection and development. Finally, taking into account the limitations of the study, some recommendations for further research into the topic of intermediate CDROM software selection criteria are presented.

CHAPTER TWO
REVIEW OF RELATED LITERATURE

The purpose of this study was to investigate intermediate educators' CDROM selection criteria and the factors shaping these criteria. In order to place this study into context, literature dealing with the development of educators' CDROM selection criteria and the factors shaping the development of these criteria is reviewed. The researcher begins by documenting the need for evaluation and selection criteria for CDROMs. The responsibility for selecting educational CDROMs is documented and the development of CDROM selection criteria is discussed. In addition, the methodology and theoretical framework for selecting CDROMs is addressed.

The Need for Evaluation and Selection Criteria

The subject of integrating technology in Canadian curriculum has risen to the forefront of current curriculum development and implementation at all levels. In an effort to increase the number of computers and internet connections in public schools, school systems presently receive funding from federal and provincial organizations. Along with this rapid increase of hardware has come a need for software, including CDROMs. The number of available titles has increased exponentially over the past twelve years (Berger & Kinnell, 1994b; McCarthy, 1993). In 1994, Berger and Kinnell estimated that there were more than five thousand CD-ROMs on the market. This increase has made it increasingly difficult for school administrators, educators, district resource centers, and governmental agencies to evaluate a significant percentage of the software. Consequently, "it has become increasingly important that educators be able to select and have their students use software that is instructionally effective"
In 1996, the L R B surveyed 600 B C  educators and found that the majority of these teachers felt that the L R B was falling behind in the job of identifying and evaluating technology based resources (British Columbia Ministry of Education, 1996a, section 5.5). This situation parallels Russell's (1994) findings in Australia, where state incurred financial restraints and reductions in computer education services had decreased the amount and availability of computer software evaluative information, and the number of educators (teacher librarians, computer co-ordinators, and principals) trained to evaluate and select computer software.  Russell termed this lack of evaluative software information an "advice  vacuum," and warned that when faced with this situation, teachers may make uncritical software choices which may include two types of unsuitable software. The first type may include potentially useful software for which no suitable teaching processes have been developed by that teacher. Such examples could include the well-known Where in the World is Carmen San Diego which the writer has seen used in some schools as an entertaining reward for those students who finish first. The second type may include a return to the teacher-proof software which surfaced in the United States in the post-sputnik period of educational change, where the design of the software reduced teachers opportunities for independence and creativity, (p. 159)  14  Russell argued that in the presence of an advice vacuum and an ever-increasing selection of educational software programs, teachers must be trained to critically evaluate, select, and use educationally sound computer software, or the potential of the computer will be hampered by the selection and use of noneducational games and teacher-proof software. Gill, Dick, Reiser, and Zahner (1992) add that the benefits of having teachers trained to evaluate and select their own software is that "teachers can be more confident about how a particular software package will work with their students, [and] will likely provide teachers with additional insights about the software and ideas about how to implement it in their curriculum" (p. 43). Collura (1995) took Russell's argument one step farther and proposed that all educators should actively and continuously participate in the critical evaluation and selection of potential computer software for their school.  Whether the school is  experiencing an advice vacuum or not, Collura noted that new software should be reviewed throughout the year by a school software selection committee. The committee would consist of a maximum of six members consisting of parents, teachers, administrators, and technology teachers, half of whom would be replaced after a maximum term of two to three years. Collura believed that such an approach to software selection would help reduce impulse buying of software, increase the number of programs that are educationally beneficial to students, and claimed that "when the effort and responsibility of software selection is shared by a committee the software used in a school can become a unifying feature that allows for smooth transitions between courses and grade levels" (p. 33). In addition, Berger and Kinnell (1994a) recommended involving students in the  15  evaluation and selection of C D R O M s , arguing "students should be encouraged to become knowledgeable consumers of electronic information. What better way than to be involved in evaluating and selecting C D R O M s ? " (p. 45). 
This approach to CDROM evaluation and selection not only creates informed consumers of information but also is reflective of particular key learning goals and objectives as outlined in several British Columbia Ministry of Education documents, such as developing independent learners, helping students become managers of information rather than memorizers of information, developing student-centered experiences and activities, and encouraging students to think critically, creatively, and reflectively (British Columbia Ministry of Education, 1991, pp. 2-8; 1992, p. 28; 1996, p. 5).

Another rationale for a more careful evaluation and selection of educational software like CDROMs is that the information and advice contained in journals may not provide the type of information educators are seeking. Zink (1991, p. 16) stated that the "reviews of CD-ROM products in the professional literature have largely been uncritical in their discussion of the user interface or have been limited to subjective general comments about difficulty of use." Richards and Robinson (1993, p. 92) explained that "this can be attributed to the fact that existing guidelines for the evaluation process are essentially lists of features to look for." In addition, Russell (1994) argued that more emphasis should be placed on developing software selection criteria derived from the educational context. He cautioned educators that "the use of criteria in computer education journals is less satisfactory, as the criteria may be decontextualized" (p. 159). For example, if educators are to develop their CDROM selection criteria from lists of criteria presented in journals, they must be aware that the suggested criteria may not (a) be applicable to the educational setting (e.g., the CD-ROM Professional periodical does not focus on educational issues), (b) be reflective of the needs of Canadian students and the curricular goals and teaching methods of Canadian teachers (e.g., the Journal of Educational Research is an American journal focusing on American educational contexts), and (c) be specific to the needs and context of their own classroom setting or expertise (e.g., articles in the School Library Journal are geared to teacher-librarians and their teaching environment, not computer teachers, classroom teachers, or their instructional environments).

Nordgren (1993), Richards and Robinson (1993), and Zink (1991) further pointed out that many of the periodical reviews of computer software, including CDROMs, are far from critical. Shade (1992) highlighted this deficiency for educators in an analysis of twenty award-winning software titles, in which he found very little correlation existed between winning a parents' or editors' award and the software being developmentally appropriate. Shade warned that without the careful development and knowledge of evaluation processes and selection criteria reflective of the teachers' instructional goals, students' needs, and educationally sound research, an educator may select software on the basis of slick technical features. In addition, Shade (1996) addressed the importance of the teacher's role in critically evaluating and selecting educational software like CDROMs, stating "the most critical decision a teacher can make is that of software selection" (p. 17). Collura (1995) expands on this statement, commenting:

It has never been the hardware that determined educational potential.
That potential has always been realized through the quality of the material presented to the students. The facts on the transparency, the content of the videocassette and certainly the substance of the computer software - as well as how they are presented - determine a successful lesson. For this reason, software for the classroom must be selected through a thoughtful, well designed process. (p. 33)

Responsibility for Selecting Educational CDROMs

According to Heller (1991), the variety of players involved in the evaluation and selection of educational software resources is diverse, but can be divided into five main groups: (a) students, (b) classroom educators, (c) librarians, (d) administrators, and (e) professional associations including district software specialists and governmental agencies. Of these five groups, Heller argued that "teachers are thought by most to be the most appropriate software reviewers. They know the day-to-day informational and pedagogical needs of the students as well as curricular needs" (p. 286). Without exception, all of the literature on CDROM selection criteria either assumed or directly stated that teachers were, or should be, one of the key people responsible for evaluating and selecting CDROM software (e.g., Berry, 1992; Ekhami & Brown, 1993; Graf, 1996).

The umbrella category of teachers was further divided into three specific groups: classroom teachers, teacher librarians, and technology teachers. This grouping represents the three audiences targeted in the reviewed articles, chapters, and books on CDROM selection criteria. For example, of the twenty-three journal articles reviewed, thirteen were written for audiences of teacher librarians or librarians (e.g., Emergency Librarian, School Library Media Quarterly), six were written for a general teacher audience (e.g., Educational Leadership, Australian Journal of Education), and four were intended for technology teachers (e.g., CD-ROM Professional, Computers in Education).

The emphasis in the literature on teacher librarians as the educators best suited for evaluating and selecting educational CDROMs is not surprising (Baker, 1997). Teacher librarians have traditionally been responsible for the evaluation and selection of a wide variety of learning resources and instructional media for many years. In BC, for example, teacher librarians have been mandated by the Ministry of Education and contracted by their school district to implement resource-based learning in their schools, assist in the development of independent learners, and are trained and expected to "develop and implement criteria for the evaluation and selection for a wide range of resources" (British Columbia Ministry of Education, 1991). Computer software, including CDROMs, fits into this wide range of resources. During the Learning Resources Branch (LRB) Needs Assessment Project (British Columbia Ministry of Education, 1996a), a focus group of librarians reminded the ministerial organization that the "LRB should promote the idea that teacher-librarians are key resources that should be used by school/district staff when considering various resources" (p. 4).

Classroom teachers are the second most addressed audience in the literature with regard to CDROM software evaluation and selection.
This finding is supported by research conducted by Reiser and Kegelmann (1994), who reviewed (a) the software evaluation and selection procedures listed in thirty journal articles on educational software evaluation, and (b) the practices of eighteen American and Canadian educational software review organizations (e.g., government resource branches, software research laboratories, and nonprofit organizations). Almost all of the thirty articles recommended teachers as the individuals who should be responsible for selecting a school's software. In addition, sixteen of the eighteen organizations employed teachers to evaluate software programs.

The perception that teachers are the best suited to select software was further supported by Ely (1996, p. 37) in an extensive content analysis of the technology literature in U.S. journals, university dissertations, conferences, and ERIC catalogue entries. Ely reported the trend that "there is a new insistence that teachers must become technologically literate" (p. 31) and trained in the use of technology. Several articles and chapters reviewed for this study reflected this trend (e.g., Russell, 1994; Schank & Cleary, 1995; Squire & McDougall, 1994). In these studies, teachers were also portrayed as best suited to the task of evaluating and selecting software like CDROMs. However, these same teachers also required some training, or retraining, so as to sharpen their software evaluation skills and update their selection criteria.

With the increasing numbers of computers, computer labs, and computer software in schools, it is not surprising that "the fastest growing [teaching] specialty is school technology co-ordinator" (Heinich, Molenda, Russell, & Smaldino, 1996, p. 358). It is also not surprising that a large proportion of the literature refers to school media specialists, or technology teachers, as important players in the process of evaluating and selecting educational CDROM software. Their training and experience with technology alone makes them prime candidates for assisting with the evaluation and selection of CDROMs for their school. In BC, the decrease in the number of teacher librarians, and/or in the number of hours a teacher librarian works per week, may also account for the technology specialists' increased involvement in the evaluation and selection of educational CDROMs.

Although teachers were cited as the individuals best suited for evaluating and selecting educational software like CDROMs, Berger and Kinnell (1994a) argued that educators, especially library resource specialists, should not take full responsibility for selecting the schools' or classrooms' CDROM software. They felt that students should share in the selection process by actively evaluating and selecting CDROMs with the teacher, explaining that "students should be encouraged to become knowledgeable consumers of electronic information [and that] it is very helpful to hear the different perspectives offered by faculty and students" (p. 45). Bakker and Piper (1994), Flagg (1990), and Barry (1994) agreed that the most effective method of determining whether or not a computer program can successfully meet the teacher's curriculum goals while engaging the students was to evaluate it with students.

Collura (1995) argued that the responsibility of evaluating and selecting educational computer software should involve even more school stakeholders.
He recommended forming a software selection committee consisting of parents, teachers, administrators, parents, and technology coordinators. He stated that each member has an important role in the selection process, proposing that (a) teachers will assure the content is appropriate and the method of presentation is acceptable, (b) administrators will guide the committee on how the software will fit into the existing curriculum and budget, (c) technology coordinators will ensure the software is compatible with the hardware and fits the school's technology curriculum, and, (d)  21  parents will offer a fresh perspective and keep the committee grounded in the larger community picture. Collura concludes that the advantage of a committee is that most of the school's software needs will be met and the students will benefit from quality educational software. The B C Ministry of Education's Learning Resource Branch (British Columbia Ministry of Education, 1996b, p. 60) also recommended that software be selected by committees at both the district level and school level. These committees of educators (teachers, administrators, teacher librarians, and subject specialists) should be formed to take responsibility for evaluating and selecting district and school software resources. This approach removes the responsibility of software selection from individual educators and improves the quality of resources. The L R B employs B C educators on ministerial software selection committees whose mandate is to evaluate and select specific C D R O M resources which are then listed as recommended teacher resources on the internet (British Columbia Ministry of Education, 1998). Development of C D R O M Selection Criteria In the early developmental years of C D R O M s , a great deal was written about their limitations (Arnold, 1987; Herther, 1989; Pearce, 1988; Reese & Steffey, 1988). Reese and Steffey (1988) summarized these limitations in a list of seven deadly sins, which focused on answering the question, "What are the technical problems associated with a C D R O M player and software?"  In response to this question they  recommended that C D R O M s should (a) be hardware and (b) software compatible, (c) not consume hard disk space, (d) should require as little training as possible, (e) be  22  secure, (f) inexpensive, and (g) should come with a flexible site licence.  Although  not presented as selection criteria, they represent the earliest attempt at listing C D R O M selection criteria found in the literature. The criteria also focused exclusively on the technical aspects of C D R O M s . By the early 1990s, the number of educational C D R O M titles that teachers could choose from had increased (McCarthy 1993; Richards and Robinson, 1993), while their cost had decreased (Shade 1994). With this increased availability came a shift in the type of criteria that teachers applied to C D R O M s . No longer could educators ask themselves the simple question, "Should we buy a C D R O M or not?" Instead, they were asking more complex questions about which programs are available, which ones should be bought, and how these programs can be used in the curriculum (Ekhami and Brown, 1993). By 1995, the number of C D R O M players found in B C and U.S. public schools had increased a dramatic 200 to 250 percent since 1988 (British Columbia Ministry of Education, 1995; Ely, 1996). Along with this dramatic increase of C D R O M hardware came an abundance of criteria with which to assess C D R O M software (Baker, 1997). 
For example, Nordgren (1993) listed eighty three criteria for evaluating multimedia C D R O M S , Schwartz (1993) listed over ninety seven C D R O M criteria, and Richards and Robinson (1993) listed one hundred criteria for evaluating a C D R O M ' s technical features. Along with the increased number of C D R O M selection criteria came an increase in articles focused on identifying characteristics specific to C D R O M s (e.g., Berger & Kinnell, 1994a, 1994b; Graf, 1993; Nordgren, 1993; Parker, 1992; Schwartz, 1993). For example, Nordgren (1993), drawing upon work done by Stewart (1987),  23  presented C D R O M evaluation criteria as an amalgamation of characteristics from three overlapping technologies: publishing, computing, and broadcasting.  He further  suggested that C D R O M s could be evaluated with criteria drawn from traditional print formats (e.g., books), computer formats (e.g., floppy disk software), and broadcasting (e.g., video).  He also warned that much of the development, evaluation, and  selection of C D R O M s focused on "reviews [which] are for the most part cheerleading sessions emphasizing the bells and whistles rather than the content, value, and effective integration of media into products" (p. 105). In a recent and extensive review of the literature pertaining to teacher librarians' and librarians' C D R O M S selection criteria, Baker (1997) reported that these users applied criteria from both traditional print and electronic formats. She also concluded that, according to the literature, librarian's C D R O M selection criteria placed more emphasis on technical quality than on content quality. This emphasis on evaluating technical quality, Baker concluded was because librarians "needed criteria to assess the ease of use and search performance of C D R O M s in order to perform hands-on product evaluations, as these two aspects were difficult to judge from reviews which were nonevaluative and congratulatory" (1997, p. 41). A review of the literature outside of Baker's study, and/or which is not specific to teacher librarians or librarians, revealed a similar trend towards evaluating and selecting C D R O M s based on technical criteria (see Table 1 and Figure 1). Berry (1992), Graf (1996), Nordgren (1993), and Parker (1992) presented evaluation and selection criteria which focused on evaluating a C D R O M ' s technical qualities.  24  Table 1 C D R O M Selection Criteria Types as Listed in Published Checklists  Instructional  Content  Audience Authority/Accuracy/ Validity Scope/Depth/Breadth of C D R O M Currency Accomodates User's Background Knowledge & Abilities Supports Curriculum  Hackbarth (1996)  X  X  X  X  X  X  X  X X X  X X  X X X  X X X  X  X X X X X X X X  X X  X  X X  X X X X  X X  X  X X  X X X  X  X X X  X X X X X X  X  X  X  X  X  X  X  Student Interest/Motivating  25  X X  X X X X  X  User Feedback  Cost  Bakker & Piper (1994)  X  Schwartz (1993)  X  Nordgren (1993)  X  Parker (1992)  X  X X  Social Issues (e.g., racism, sexism) Other  Graf (1996)  Visual Presentation: Display & Design Networking Updates Training/Technical Support Printing & Down Loading Documentation and Installation Intuitive Format, Ease of Use Navigation Searching and Search Results  Berger & Kinnell (1994)  Technical/Design  Hardware/Software Compatability  NCET(1992)  Berry (1992)  Literature Reviewed  X  X  Social Content  Other  Technical Design 65%  Figure 1. 
Types of Criteria Listed in Current C D R O M Selection Literature  Although Berger and Kinnell (1994), Hackbarth (1996), the National Council for Educational Technology (NCET,1992), and Schwartz (1993), listed criteria that emphasized the need to evaluate both the content and technical quality of C D R O M s , the majority (55% to 66%) of the criteria still emphasized evaluating technical quality. Bakker and Piper (1994) were the only authors who listed selection criteria which emphasized evaluating a C D R O M ' s general content and social content. Bakker and Piper reviewed the software and C D R O M selection criteria for the California Software Clearing House, an organization funded by the California Department of Education. In summary, a review of the literature dealing with the development of C D R O M selection criteria indicated that evaluators have placed, and still place, more emphasis on a C D R O M ' s technical quality rather than on its content quality.  In  regard to this focus on technical features, Shade (1996) noted: vast improvements in hardware have allowed software to become more friendly, more complex, and more aesthetically pleasing. These changes have  26  nothing to do with the content of the software. The worst drill program can be just as glitzy as the most developmentally appropriate piece, (pp. 20-21) He concluded that" although improved technical features have made things possible for children's software, one still needs to look carefully beyond the bright packaging and wonderful graphics to see what the software is really doing" (p. 21). Methodology and Theoretical Frameworks for Selecting C D R O M S W h e n educators select C D R O M software for use in the intermediate school setting, they consciously, or unconsciously, follow a software selection process, or methodology, and draw upon selection criteria based on underlying theoretical frameworks, or beliefs, about the educational application of computers and computer software.  A s little has been written on the methodology and selection frameworks  specific to C D R O M software selection, the following discussion explores these topics in light of the theoretical literature written on the evaluation and selection of software in general. Methodology Of the literature written on evaluating and selecting educational C D R O M software, most either listed or referred to software selection checklists (e.g., Berger & Kinnell, 1994a, pp. 45-49; Nordgren, 1993; Parker, 1992; Richards & Robinson, 1993 Schwartz, 1993).  The form used by the British Columbia Ministry of Education  (1996b) to initially evaluate software materials, including C D R O M s , is also based on a checklist. The preference for using checklists to evaluate and select C D R O M s parallels findings by Gill, Dick, Reiser, and Zahner (1992), Reiser and Kegelmann (1994), and Squire and McDougall (1994).  27  Squire and McDougall (1994, p. 20)  summarized the situation when they noted that they "have been used extensively for software selection in every country where educational software is used. In fact we have not been able to find any other systematic method for software selection in use at all." The advantage of using checklists to evaluate computer software is clear. Checklists are easy to construct, facilitate detailed record keeping, and are efficient devices for recording observations (Begin, Hall, Jeffrey, Lukasevich, Meister, & Pronovost, 1992, section 4.1). 
Considering how precious time is to most educators, it is not surprising that checklists are the preferred and recommended approach for evaluating and selecting CDROMs. However, Squire and McDougall (1994) argue, "Checklists are unable to deal with essential issues of teaching and learning, core justifications for the use of educational software, and show inappropriate emphasis on easily assessed technical features" (p. 65). The extensive use of checklists may also explain why criteria presented in the literature focused on the technical quality of CDROMs as opposed to their content quality.

Gill, Dick, Reiser, and Zahner (1992) criticized the checklist method of evaluating software. They argued that evaluative software checklists, especially those published by software evaluation services, "typically require the reviewer to make subjective judgments about the accuracy of the content, effectiveness of the instructional techniques employed, whether the program meets curriculum objectives, and the technical quality of the program" (p. 42). They cited research conducted by Callison and Haycock (1988) and Jolicoeur and Berger (1986, 1988), which found that subjective checklists provide very weak correlations between software reviews, and between students' and teachers' software ratings, and that "teachers may not be accurate judges of how much students will learn from instructional software" (p. 40).

Hoping to overcome the shortfalls of subjective checklists, Gill et al. (1992) recommended another method for selecting educational software. Their method involved (a) teachers identifying instructional objectives and creating tests to assess students' attitudes and attainment of objectives, (b) pretesting students, (c) having students evaluate software individually and in small groups, and (d) posttesting students to determine whether the instructional objectives were reached. They stated that the advantage of this method of software evaluation is twofold. First, teachers become more confident evaluating and selecting software specific to their individual instructional objectives and their students' needs. Second, the method overcomes the evaluation weaknesses inherent in subjective checklists by providing objective data on student performance and attitudes. Gill et al. (1992, pp. 43-44) also warned that their method of software evaluation and selection is not without its weaknesses. They stated that the process may (a) consume a great deal of the teacher's time, (b) impact individual students who are pulled away from "normal" class activities, and (c) not be applicable to the evaluation of all software.

Theoretical Frameworks

As a result of an extensive review of the existing frameworks underlying educators' software selection and evaluation, Squire and McDougall (1994, pp. 53-71) concluded that educators' software selection frameworks could be generalized into four categories: (a) classification by type, (b) classification by educational role, (c) classification by educational rationale, and (d) an interactive paradigm. These categories will be used to help frame the following discussion on possible theories underlying educators' CDROM software evaluation and selection.

The first three theoretical approaches to evaluating and selecting computer software are based on classification by type, role, and rationale. Examples of an approach to developing and classifying CDROM selection criteria by type can be found in Berger and Kinnell (1994b, p.
28), Ekhami and Brown (1993, p. 41), Berry (1992, pp. 45-46), and in Table 2. In a classification by type framework, CDROM selection criteria are based on, and grouped by, predetermined categories.

Table 2
Example of CDROM Classification by Type

Title                                   Publisher                    Type
Compton's Multimedia Encyclopedia       Encyclopedia Britannica      Encyclopedia
Grolier Electronic Encyclopedia         Grolier Publishing           Encyclopedia
Encarta 98 Deluxe                       Microsoft Publishing         Encyclopedia
Paper Bag Princess                      Discus Books                 Literature
Scary Poems for Rotten Kids             Discus Books                 Literature
Stowaway                                Dorling Kindersley           Science
Whales and Dolphins                     Media Design Interactive     Science
The 1997 Canadian Encyclopedia          McClelland & Stewart Inc.    Encyclopedia
Mammals, Multimedia Encyclopedia        National Geographic          Encyclopedia

As is the case with the list in Table 2, and Berger and Kinnell's (1994, p. 28) chart listing CDROMs by curriculum area, these categories are often general and display information in a format which gives educators a quick overview of the product and the subject area in which the CDROM can be used. Checklists based on this framework usually contain overgeneralized selection criteria, but are easy to apply. They are also useful only if the criteria are recent and directly applicable to the CDROM's content.

None of the authors in the literature suggested evaluating and selecting CDROMs solely using the second framework, classification by role. However, each article or chapter did suggest some criteria that exhibited characteristics of this framework. For example, most authors listed CDROM criteria which emphasized the importance of having the computer provide the user with feedback. Nordgren (1993, p. 101) listed five user feedback criteria and concluded, "Good [CDROM] software provides feedback to the user that something is happening and the machine is not just hung and needs to be rebooted." Leu and Kinzer (1987) give an example of this framework being applied in a general software evaluation checklist of questions designed for selecting computerized reading and writing materials. For example, question six asked educators, "In what area does the software fall: (a) learning from computers; (b) learning with computers; (c) managing learning with computers; (d) other?" (Leu and Kinzer, 1987, p. 577). Such a question classifies the attributes of the software, including CDROM software, by the role it can play in an educational setting. That is, the software is viewed in terms of what it can offer or perform, rather than what the user can do or learn. This classification by role framework is reflective of the principles and criteria of software developers, and is the basis of computerized worksheets and tutorial software. It would also explain the lack of checklists focused on the framework of classification by role, as developers and educators rarely work together on educational software (Schank & Cleary, 1995).

The third evaluation framework is classification by educational rationale. This framework is based on four paradigms: instructional, revelatory, conjectural, and emancipatory. CDROMs that focus on presenting sequenced information and reinforcing the content through test feedback (e.g., drill and practice) would fit into the instructional paradigm. The revelatory paradigm includes CDROMs based on simulations and which encourage learners to search and discover in new environments.
CDROMs based in the conjectural paradigm encourage students to learn about a topic by creating hypotheses, testing those hypotheses, and being able to create and modify models representative of their hypotheses. Emancipatory CDROMs would be programs which help students to crunch large amounts of data so they can spend more time focusing on the activities and educational content in the other three paradigms.

Criteria representative of all four paradigms can be found in most of the literature on CDROM evaluation and selection criteria. For example, Schwartz (1993) recommended evaluating and selecting CDROMs based on the clarity and context sensitivity of the help and tutorials. This criterion is representative of the instructional paradigm. Criteria based on the revelatory paradigm can be found in Heinich, Molenda, Russell, and Smaldino (1996, p. 335), who list thirteen checklist items for evaluating the instructional design, technical quality, and educational content of a simulation or game. In their review of the California Clearinghouse's CDROM and computer software evaluation review procedures, Bakker and Piper (1994, p. 67) stated, "evaluators measure interest to determine the material's capacity to promote critical thinking skills and to motivate, engage, and intellectually stimulate students." By looking for software that promotes critical thinking skills, Bakker and Piper are implying that better software falls into the revelatory and conjectural paradigms, and not within the instructional. Criteria based in the emancipatory paradigm can be found in Berry (1992) and Graf (1996), who recommend selecting CDROMs that are programmed to quickly search through vast amounts of digital data.

The strength of selection criteria based in the framework of classification by educational rationale is that they address curriculum matters relevant to a variety of educational paradigms. This is in direct contrast to developing selection criteria based solely on classification and/or on the performance merits of the software. The weakness of this framework is that it does not focus on criteria specific to individual students' learning styles and approaches.

The fourth and final framework in which educators may systematically assess, evaluate, and select educational CDROMs is the interactive paradigm. This framework views the evaluation and selection of CDROM software as an interactive partnership between student (user), teacher (instructor), and developer (software designer). Consequently, Squire and McDougall (1994) state:

By considering the interactions between the perspectives of the designer, teacher, and student, the assessor adopts a comprehensive view of the design and use of the educational software, and identifies issues that are of significance in the context of the perceived use of the software being assessed. (p. 71)

This framework views the development of selection criteria as grounded in social and educational contexts, where both the technical and content qualities are assessed, and in which the instructional objectives of the teacher and the learning needs of students are paramount. CDROM selection criteria that fall in the interactive paradigm would involve evaluating the extent to which the CDROM is able to draw upon students' background knowledge, facilitate the delivery of the teacher's curriculum, and go beyond the topical appropriateness of the CDROM.
Schank and Cleary (1995) proposed eight principles, or criteria, for selecting quality educational software, three of which fall into the interactive category. These principles focused on developing software that incorporated tasks which (a) incorporated the skills or knowledge teachers wanted to teach, (b) implemented instruction which clearly and directly addressed the real needs of students, and (c) took the view that the designer's most important job was to make learning fun (pp. 175-176). Schank and Cleary also pointed out that these principles for evaluating software would require a great deal of time and effort to plan and execute, and that software designers are still too focused on their own goals to acknowledge educationally appropriate software design.

All of the CDROM selection criteria listed in the literature could be placed within one or more of Squire and McDougall's (1994) frameworks. Most criteria focused on evaluating the technical aspects specific to the CDROM and could easily be attributed to one of the classification categories. Very few CDROM criteria focused on evaluating how the CDROM could be used to enhance teaching and learning, and thus few could be attributed to the interactive paradigm. This supports Squire and McDougall's (1994, p. 65) proposal "to shift the focus in software selection away from attributes of the software itself, and towards a more comprehensive and balanced emphasis on the use of software to enhance teaching and learning."

Summary

A review of the literature found that there is a growing need for the evaluation and selection of educationally appropriate CDROM software. This need can be attributed to many factors including: (a) the increased amount of computer hardware and software being placed in elementary schools, (b) the rapid increase in the number of software titles from which teachers have to select, (c) the inability of traditional resources to provide reliable reviews, (d) a propensity for evaluations to focus on technical quality over content quality, and (e) a general need to select CDROM resource materials more closely aligned with teachers' instructional objectives.

Although the literature recommended that the evaluation and selection of CDROMs should be the responsibility of a wide range of people and organizations, it almost unanimously recommended teachers as the group best suited to evaluate and select educational CDROMs. It was also noted that the umbrella term 'teachers' included teacher librarians, technology teachers, and classroom teachers.

A review of the literature written on the development of CDROM software selection criteria showed an emphasis on evaluating and selecting CDROMs on the basis of technical quality as opposed to content quality. It also recommended incorporating evaluation processes and selection criteria derived from existing print criteria (e.g., books), broadcasting criteria (e.g., video), and computer software (e.g., floppy disks). The criteria presented in these articles were listed in Table 1.

The literature also reported that the most common method for evaluating and selecting software was the use of subjective checklists.
However, there was an increasing amount of literature focused on developing methodologies that incorporated more objective evaluation and selection criteria: criteria which more closely reflected teachers' instructional objectives, were more sensitive to students' needs, encouraged developers to work closely with educators when designing software, and placed much more emphasis on the evaluation of content qualities as well as technical qualities.

A discussion of the theoretical frameworks for studying the evaluation and selection of educational software was based on Squire and McDougall's (1994) four software selection categories, or frameworks. It was noted that these frameworks viewed and presented the evaluation and selection of computer software as (a) classification systems based on categories, (b) classification systems based on the role that software can perform, (c) classification systems based on educational rationales which stress general curriculum issues, and (d) an interactive paradigm which places emphasis on the use of software to enhance learning and teaching. It was further noted that although the CDROM selection criteria presented in the literature could be located in all four frameworks, most criteria fell into the first three framework categories and emphasized evaluating a CDROM's attributes. Very few criteria fell into the interactive paradigm, which emphasizes how the CDROM could be used to enhance teaching and learning.

CHAPTER THREE
METHODOLOGY

The purpose of this study was to examine the CDROM selection criteria and the factors affecting the development of these criteria. Since this is an area where little research had been conducted, the research was exploratory and descriptive in nature. It incorporated both qualitative and quantitative research and used the survey method. Twenty randomly selected volunteer intermediate educators (teachers, teacher librarians, technology teachers, teacher librarian/technology teachers, and teacher/technology teachers) from four lower mainland school districts completed a questionnaire and were later interviewed. During the individual interview, each subject was given two CDROMs which they were asked to view and critically evaluate. The research instruments' reliability and validity were strengthened by having them reviewed by a panel of three experts, followed by a pilot study. After the data were collected, they were analyzed using content analysis as explained by Krippendorf (1980) and Stempel (1981). The following discussions describe the development and validation of the research instruments, sample selection, instrument administration, data collection, and data analysis.

Instrument Development

Instrument development for this study took place from June to November of 1997, and occurred in three stages. In the first stage, the researcher developed the initial research instruments and refined the procedures for administering the instruments. In the second stage, the professional advice and judgements of three panel experts were called upon to add reliability and validity to the initial instruments. In the final stage, the instruments were evaluated by two teachers who were asked to critically review the questionnaire and interview questions in a pilot study.
Stage One

Since there were no appropriate research instruments available to answer the research questions posed by the researcher, the researcher developed two instruments for the study: a questionnaire and a set of interview guide questions (see Appendixes A and B). The instruments were designed to generate both quantitative and qualitative descriptive data from a sample of experienced intermediate educators in order to identify their CDROM selection criteria and the factors that shaped the development of their criteria. The questions developed for the initial instruments were derived from five sources:

1. The researcher's own experience evaluating and selecting intermediate educational CDROMs for use in several BC elementary schools.

2. A preliminary CDROM survey conducted by the researcher in a University of British Columbia Language Education children's literature course in the fall of 1996. The survey examined the number and types of CDROM equipment and software to which teachers had access.

3. A review of the current literature on CDROM selection criteria (see Chapter Two).

4. A BC Learning Resource Branch (LRB) survey of teachers' resource selection processes (British Columbia Ministry of Education, 1996a), in which the teachers surveyed reported that the LRB was not proactive at identifying and selecting technological resources.

5. Concerns raised to the researcher by experienced educators during district professional development workshops and education courses at the University of British Columbia. The researcher conducted CDROM workshops in several lower mainland districts and coordinated the University of British Columbia's Language Education Research Center. During these activities, he was often approached by experienced educators who spoke of the difficulties they experienced evaluating and selecting educationally sound CDROM software.

The questionnaire, titled CDROM Study, Part 1, consisted of two sections. Section one contained a series of questions inquiring about the subjects' backgrounds. Section two consisted of CDROM selection criteria value statements that employed a five-point Likert scale to record the subjects' degree of agreement (1 = strongly agree, 2 = agree, 3 = disagree, 4 = strongly disagree, 5 = no opinion). The interview instrument, titled CDROM Study, Part 2, consisted of three sections: section one, the Main Interview; section two, the CDROM Activity; and section three, the Follow Up Interview.

In the initial stage, the researcher developed the framework for the sequencing of the instruments' administration. The subject brought the completed questionnaire to the interview. Once the questionnaire was handed in, the researcher conducted the interview with the subject. A complete explanation of the final implementation and sequencing of the administration of these research instruments can be found later in this chapter under Instrument Administration and Data Collection.

Stage Two

The second stage of development focused on validating the questionnaire and interview questions. Since the researcher developed the instruments specifically for this study, he sought professional advice and judgement from a panel of educators considered experts in the field of intermediate educational CDROM evaluation and selection.
Once the five potential panel experts were identified, they were mailed a copy of the study's instruments and asked to rate the effectiveness and clarity of each question. They were also asked to make suggestions, comments, or changes to the questions (e.g., adding, deleting, rewording) that would help strengthen the validity of the study. Educators identified and selected as panel experts (a) had four or more years of experience evaluating and selecting educational CDROMs for use with intermediate students, and (b) were either a teacher, teacher librarian, technology teacher, or combination thereof. Five potential panel experts were contacted, either by emailing or hand delivering a copy of the instruments and a letter requesting their assistance (see Appendix C), or by personally requesting the educator's assistance. Three of the five consented to participate. The panel consisted of one teacher librarian, one technology teacher, and one district resource specialist. A brief overview of the three panel experts and their expertise in evaluating and selecting CDROM software is given below:

1. The first member was a Faculty of Education sessional instructor at the University of British Columbia who had six years of elementary teaching experience. During this time he selected CDROM software for his lower mainland intermediate classes. At the time of this study, he was instructing both undergraduate and graduate university students on the design and application of educationally sound software programs. This panel expert was chosen not only because of his CDROM selection experience with intermediate students, but also because he specialized in the development of interactive, multimedia software, two characteristics inherent to CDROM software.

2. The second member was a University of British Columbia teacher librarian who was responsible for evaluating, selecting, weeding, and maintaining a wide variety of resources, including CDROMs, for the Faculty of Education library. Although her intermediate CDROM selection experience did not occur in lower mainland schools, she did have eight years of experience selecting a large and varied number of intermediate CDROM resources for the Faculty of Education library and for other experienced lower mainland intermediate educators. She was also included because of her twenty-six years of experience as a teacher librarian, and for her extensive knowledge of BC resource selection criteria.

3. The third member was a lower mainland school district computer resource representative who coordinated the existing software selection policies and processes developed and employed in her district. This member had over sixteen years of teaching experience and five years of experience selecting CDROM software programs for use with intermediate students. She was also selected because of her experience with, and knowledge of, the variety of different computer software selection strategies employed in her district.

Once a panel expert consented to review the instruments' questions, the researcher either mailed or emailed them a package containing an overview of the study (see Appendix D), a covering letter explaining the review process (see Appendix E), and a panel expert's rating form (see Appendix F). These mail outs occurred one to three months prior to the study.
The covering letter requested panel experts to read the research outline, review the questionnaire and interview questions, use the panel expert rating form to rate the appropriateness and relevance of each question to the objectives of the study, suggest any revisions or additions they deemed necessary, and return the completed rating form to the researcher by mail or email. One panel expert returned the form by email, another by mail, and a third completed the rating form over the phone. The panel experts completed and/or returned their reviews of the instruments to the researcher at least three weeks prior to the study. All three felt that the interview would generate better insights into intermediate educators' CDROM selection criteria than the questionnaire.

Panel experts' feedback was incorporated into the final instruments using the following criteria:

1. A question was eliminated from the instruments if two or more of the panel experts rated it not relevant or appropriate to the study, or if it was rated slightly relevant by all three experts.

2. A question was retained if two or more experts rated it very relevant or relevant.

3. If two or more panel experts commented that a question was lengthy, wordy, or needed clarification, the question was reworded to incorporate their suggestions.

4. Additional questions were integrated into the instruments if more than one panel expert made the suggestion, or if the suggestion fulfilled an obvious need in the study. An example of a question which fulfilled an obvious need would be the addition of questions one and two in section four of the final interview questionnaire. These two questions were suggested as an excellent way to break the ice and relax the subjects. More importantly, question one was the question which generated the largest number of CDROM selection criteria from the subjects.

Table 3 and Table 4 summarize the changes incorporated into the final questionnaire and interview instrument (see Appendixes G and H) as recommended by the panel.

Table 3
Changes to the Questionnaire Resulting from Panel Experts' Reviews

Change to Question    Questions Changed    Appendix    Question Number
Eliminated            1                    A           1
Reworded              3                    A           10, 13, 33
Added                 1                    G           29
Unchanged             30                   G           2-9, 11, 12, 14-28, 30-34

Table 4
Changes to Interview Guide Questions Resulting from Panel Experts' Reviews

Change to Question    Questions Changed    Appendix    Question Number
Eliminated            3                    B           1, 3, 4
Reworded              6                    B           5, 6, 7, 8, 9, 16
Added                 4                    H           1, 2, 3, 8, 9, 11
Unchanged             9                    H           4-7, 10, 12-19

Stage Three

In the third and final stage of the instruments' development, the researcher conducted a pilot study. The pilot study further tested the validity and reliability of the instruments in that it verified the participants' understanding of the instruments' directions and interview questions.

Sample and recruitment requirements for the pilot study duplicated those of the main study, except that two, instead of twenty, volunteer lower mainland practicing intermediate educators were selected from a class of the Department of Language Education's summer session students. Pilot subjects were required to complete a pilot consent form (see Appendix I) prior to commencing the study.

The pilot study was conducted in three phases. In phase one, the two pilot study subjects took fifteen minutes to complete the questionnaire on their own.
In phase two, the researcher administered the interview instrument to the subjects in the University of British Columbia's Language Education Research Center. The researcher asked the subjects twelve interview questions, involved the subjects in a hands-on evaluation of two intermediate CDROMs, and finished by asking the subjects five follow-up questions (see Administration of the Instruments and Data Collection for an in-depth explanation of this phase two process). In the final phase, the researcher orally administered a panel expert rating form to the two subjects so as to further detect possible flaws in the reliability and validity of the questions. The researcher chose to administer the rating form orally for the sake of expediency (the teachers had just completed the same questions in the preceding hour and a half) and because the two teachers voiced their preference to respond collaboratively in their ratings. Their rating of each question was recorded by the researcher on a panel expert rating form and incorporated into the final instruments (see Table 3 and Table 4) using the same selection criteria as applied to the other three panel experts.

They also made three additional recommendations that were incorporated into the research design. First, they recommended that the researcher decrease the interview time on the pilot consent form (see Appendix I), as the request for two hours of a Lower Mainland educator's time might affect the number of teachers volunteering. They also recommended that the researcher take a proactive role in the research during the hands-on activity and assist the subject in his/her exploration of the new CDROM, rather than maintaining a hands-off approach. This would decrease the time needed by the subject to learn the two new CDROMs, while increasing the subject's time spent generating CDROM selection criteria data. And finally, they suggested that the questionnaire precede the interview so more time could be dedicated to the latter.

Sample Selection

The sample was selected in order to gain data on the types of CDROM selection criteria used by experienced intermediate educators. In order to obtain a list of potential subjects, a technology resource specialist from each of four school districts in the Vancouver metropolitan area was contacted by phone, sent an overview of the study and a research consent form, and asked to compile a list of the educators in the district who would meet the researcher's recruitment requirements. Through these procedures, the names of forty-two educators were obtained from the four districts. The researcher then placed each educator into one of three groups: a group of eight teachers, a group of fourteen teacher librarians, and a group of nineteen technology teachers. From these three groups, four teachers, four teacher librarians, and four technology teachers were randomly selected by the researcher as possible research subjects (illustrated in the sketch below). Next, the researcher contacted the district resource specialists a second time, asking them to contact, by phone, each of the subjects selected from their district. At this time, the resource specialist outlined the research to the participants, explained the subjects' role and rights as volunteers, reconfirmed that the subjects met the selection criteria, and asked the subjects if they would participate in the study. All randomly selected potential subjects contacted through the district resource specialists participated in the study.
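The stratified random draw described above, four names pulled without replacement from each of the three original educator groups, can be illustrated with a short sketch. This is only an illustration of the kind of selection involved: the group sizes match those reported (eight teachers, fourteen teacher librarians, nineteen technology teachers), but the placeholder names, the fixed random seed, and the use of Python are hypothetical additions and were not part of the original study.

```python
import random

# Hypothetical candidate pools compiled by the district technology resource specialists.
# Pool sizes match the study; the names themselves are invented for illustration only.
pools = {
    "teacher": [f"T-candidate-{i}" for i in range(1, 9)],
    "teacher_librarian": [f"TL-candidate-{i}" for i in range(1, 15)],
    "technology_teacher": [f"TT-candidate-{i}" for i in range(1, 20)],
}

random.seed(1998)  # fixed seed so this illustrative draw is repeatable

# Draw four potential subjects, without replacement, from each group.
selected = {group: random.sample(candidates, k=4) for group, candidates in pools.items()}

for group, subjects in selected.items():
    print(group, subjects)
```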
The district resource specialists either had the subjects phone the researcher back with their consent, or informed the researcher that the subjects were willing to participate in the study. Once the subject's oral consent was received over the phone, a CDROM research package consisting of a letter of initial contact (see Appendix J), a CDROM questionnaire (see Appendix G), two consent forms (see Appendix K), and a research study overview was mailed or hand delivered (Canada Post was on strike) to the subjects. Final confirmation of the subject's inclusion in the study came when the subject completed the consent form and either mailed it, or presented it by hand, to the researcher.

In the early stages of data collection, it became apparent that the sample did not fully represent the range of educators actively involved with selecting intermediate CDROMs. The researcher discovered two new "hybrid" groups. One group consisted of teachers who spent up to half of their day as their schools' technology teacher. The second group consisted of teacher librarians who also spent a significant portion of their day working as their schools' technology teacher. As a result of this finding, the sample was increased to include these two new groups. The final sample totalled twenty participants, and was comprised of four teachers, five teacher librarians, four technology teachers, four teacher librarian/technology teachers (TL/TT), and three teacher/technology teachers (T/TT). The new TL/TT and T/TT groups were also randomly selected from the original pool of forty-two educators, and contacted in the same manner as the subjects in the original three groups.

The sample focused on these five particular groups of educators (as opposed to principals, district resource specialists, and provincial evaluators) as software evaluators for five reasons. First, current research cites teachers as the most qualified and/or most often recommended individuals to evaluate software (e.g., Dudley-Marling & Owston, 1988; Henri, 1997; Litchfield, 1992; Tolhurst, 1992). Second, research in educational change concludes that teachers are central to the successful integration, promotion, and implementation of technology in education (Fullan 1982, 1991, 1993). Third, the accumulating literature on technological change in schools has traditionally omitted the teacher's perspective (Miller & Olson, 1995). Fourth, a recent ministerial survey of British Columbia educators (British Columbia Ministry of Education, 1996a, section 3, pp. 3-10) found that 88.6% of teachers believe recommendations from other teachers to be the most important source for evaluating and selecting resources, that "...94% of the teacher librarians felt it necessary to use practicing teachers to properly screen industry-supplied resources", and that "the majority of teachers/district staff felt that resource selection would be most effective [sic] completed by individual teachers (64%)." And finally, whether or not it is a direct result of the Ministry of Education's policy to increase the number of computers in schools, the number of teachers either changing their resources or wanting assistance evaluating and selecting new electronic resources has risen significantly (British Columbia Ministry of Education, 1996a, section 4, p. 15).

The sample also incorporated five 'types' of teachers in order to build a comparative descriptive database of user-defined CDROM selection criteria.
In addition, the final sample is characteristic of an exploratory ethnographic study in that it attempts to gather and categorize in-depth qualitative data regarding the selection criteria, and the factors affecting the development of these criteria, from five very specific groups of educators.

Administration of Final Instruments and Data Collection

Once the instruments had been developed, permission to conduct research had been obtained from the University of British Columbia's Behavioral Ethics Review Board and the four Lower Mainland school districts (see Appendix L), and the researcher had received the subject's written consent form, administration of the final instruments and data collection commenced. Administration of the final instruments and data collection occurred from December 1997 to February 1998, and took place in four stages.

Stage One

In the first stage, the organizational stage, the subject was contacted by the district resource supervisor by phone, or by the researcher by mail (see Appendix L). At this time, the objectives of the study were explained and the subjects were asked to participate. The subjects consented to participate by either contacting the researcher by phone, or asking the resource supervisor to have the researcher contact the subject by phone at school. During these phone conversations the researcher answered any questions and concerns the subjects had with regard to their participation in the study, and scheduled an interview time with the subject in a place of the subject's choosing. Most participants chose to schedule the interview in their classroom, library, or school computer room during the month of January 1998. No data were collected in the first stage.

Stage Two

In stage two, the researcher mailed or hand delivered the CDROM questionnaire to each subject. Participants were asked to complete the questionnaire prior to the scheduled interview (stage three). Subjects reported that the questionnaire took fifteen to twenty minutes to complete. After subjects completed their questionnaires, the researcher collected them and answered any of the participants' questions or cleared up misunderstandings concerning the questionnaire.

Stage Three

Stage three, the administration of the interview instrument, consisted of three activities sequenced in three phases (Figure 2). First, the researcher administered and audio taped the main interview with the subjects. Second, he observed and audio taped the subjects conducting a hands-on evaluation of the intermediate CDROM encyclopedias Encarta 98 (Microsoft Encarta 98 Deluxe, 1997) and the Ultimate Children's Encyclopedia (Ultimate Children's Encyclopedia, 1996). And third, he administered and audio taped a follow-up interview with the subjects.

Phase 1: Main Interview (1/2 hour) --> Phase 2: CDROM Exploration (1/2 hour) --> Phase 3: Follow-up Interview (15 minutes)

Figure 2. Sequencing of the Interview Instrument

The amount of time it took each subject to complete each phase varied greatly from one individual to another because no time limits were placed on the subjects' answers and activities, nor was there any control for repetition or lengthy responses. The time it took subjects to complete all three phases of this stage ranged from one to three hours. The questions in both the main and follow-up interviews (Phases One and Three) were open ended and evaluative in nature.
This encouraged subjects to reflect upon their own CDROM evaluation processes, their selection criteria, and the processes shaping the development of their intermediate CDROM selection criteria. The subjects did not see the questions prior to the interview. However, they were given a copy during the interview so they could follow along, or reread, as the researcher asked each question.

Data generated during the hands-on evaluation of the CDROMs (Phase Two) was stimulated by the researcher asking each subject the following open ended question: "Keeping in mind the intermediate students you are currently instructing, I want you to explore and critically evaluate these two CDROM encyclopedias [Encarta 98 and The Ultimate Children's Encyclopedia] for use with these students." The researcher also instructed subjects to articulate their criticisms, observations, praises, insights, parallels, comments, likes, dislikes, criteria for judging and selecting CDROMs, and any other information that came to mind while they evaluated the CDROM encyclopedias. The subjects' comments were audio taped and augmented by the researcher's anecdotal notes.

The researcher chose to use CDROM encyclopedias in this study for six reasons. First, many elementary schools have CDROM encyclopedias at their disposal (McCarthy, 1993; Safford, 1994). Second, CDROM encyclopedias have been developed and refined into some very powerful educational research tools since their inception ten years ago (Jasco, 1996). Third, price wars between vendors have made CDROM encyclopedias quite affordable for schools, especially when compared to the cost of a bound encyclopedia (Shade, 1994). Fourth, the large number of CDROM encyclopedias on the market presents educators, and the researcher, with a realistic selection situation. Fifth, many intermediate educators are very familiar and comfortable with using encyclopedias to teach research skills. And sixth, CDROM encyclopedias can be, and are being, used by many intermediate educators to integrate resource-based learning with information technology.

Although two participants had briefly explored Encarta 97 prior to this activity, both CDROMs were new to all twenty subjects (Encarta 98 was shipped only two days before this study commenced). The newness of the CDROMs to the subjects helped to minimize preconceived judgments and criteria, making the responses original and fresh. In addition, the same software and computer system was used with every subject, helping to eliminate confounding variables like computer software and hardware incompatibilities and inconsistencies.

The primary objectives of all three activities in stage three were to have the subjects verbally generate and physically demonstrate as many CDROM selection criteria as possible, and to orally reflect upon the factors shaping the development of these criteria. A secondary objective was addressed through the sequencing of the three activities. That is, the main interview was placed prior to the hands-on CDROM exploration to see if subjects' responses were consistent with what they had indicated in the questionnaire and interview, and to ascertain which of these stated selection criteria were being applied during the hands-on CDROM evaluation.
The researcher also chose to follow the hands-on CDROM exploration with an interview so as to record the subjects' reflections on any new CDROM selection criteria which might have been stimulated during the hands-on activity.

Stage Four

In the fourth and final stage, the researcher and two trained research assistants transcribed and reviewed all twenty tapes and subjects' questionnaires. Then, using the researcher's observational field notes, they verified that the raw data set for each subject was complete and clear. In cases where data needed to be clarified, the subjects were contacted by the researcher. Each subject's final raw data set contained both the individual's questionnaire and transcribed interview data.

Data Analysis

Raw data sets for all participants were contrasted and compared using the content analytic method as explained by Krippendorf (1980) and Stempel (1981). Content analysis of the data sets was administered by the researcher and two independent content data analysts (research assistants), who inductively identified and coded response categories from a copy of each subject's transcribed interview data. Since both research assistants were unfamiliar with content analysis, they were trained how to interpret subjects' speech acts from transcriptions and translate them into the closest available meaning from a list of CDROM selection criteria codes (see Appendix N). One assistant was a University of British Columbia graduate from the Department of Psychology who was familiar with research studies, assessing research data, and using CDROMs. The second assistant was experienced with assessing educational CDROMs for use with his intermediate-aged children.

The researcher initially compiled a list of thirty-eight CDROM selection criteria derived directly from participants' written responses to questions three and four of the main interview (see Appendix M). These criteria were coded with ascending numbers and categorized into five distinct content categories: (a) content, (b) instructional design, (c) technical design, (d) social content, and (e) other. These five content categories and numbers were established to help the researchers quickly code and categorize CDROM selection criteria identified within the educators' criteria lists and interview transcriptions. Except for the category other, all of these categories and associated groupings of selection criteria were taken from the British Columbia Ministry of Education Learning Resources Branch's resource selection guide, Evaluating, Selecting and Managing Learning Resources (British Columbia Ministry of Education, 1996b, pp. 24-42, Appendix A). These categories were selected because they were evident in the literature. The category other was used to group criteria which did not fall into any existing content categories.

Using this list of thirty-eight CDROM selection criteria, the researcher and one of the research assistants independently identified and numerically coded the types of content criteria contained in two of the twenty participants' transcriptions. The percentage of agreement, or intercoder reliability, between the researcher's and research assistant's coded items was then calculated and checked using a simple ratio:

Intercoder Reliability = (Number of agreements between coders / Number of possible agreements) x 100%
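This ratio amounts to a simple percentage-agreement calculation. The sketch below is illustrative only: the criterion codes are hypothetical, the study did not use a program of this kind, and the calculation assumes the two coders' codes are aligned item by item across the same transcript segments.

```python
def intercoder_reliability(coder_a, coder_b):
    """Percentage agreement between two coders' criterion codes.

    Assumes both lists code the same transcript segments in the same order,
    so every position counts as one possible agreement.
    """
    if len(coder_a) != len(coder_b) or not coder_a:
        raise ValueError("Coders must rate the same, non-empty set of items.")
    agreements = sum(1 for a, b in zip(coder_a, coder_b) if a == b)
    return agreements / len(coder_a) * 100

# Hypothetical criterion codes for ten transcript segments (not the study's data).
researcher = [3, 7, 12, 12, 25, 31, 5, 18, 40, 2]
assistant  = [3, 7, 12, 14, 25, 31, 5, 22, 40, 2]

print(f"Intercoder reliability: {intercoder_reliability(researcher, assistant):.1f}%")  # 80.0%
```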
The minimum intercoder reliability percentage considered reliable in any exploratory content analysis study is eighty percent (Barry, 1994, p. 153; Stempel, 1981). In only two instances did the intercoder reliability between the researcher and research assistant fall short of eighty percent. Each time, the criteria and coding categories were revised and tested for reliability on a different set of randomly selected data (see Table 5). The third revision of the CDROM selection criteria codes (see Appendix N) yielded a list of forty-five criteria codes and an acceptable intercoder reliability of eighty-six percent. The transcriptions for all twenty participants were then coded and found to yield a similar intercoder reliability percentage. To further strengthen the reliability, a second research assistant was asked to code the responses from all twenty data sets, and these were then checked for intercoder reliability against the researcher's and the first research assistant's criteria codes. The final average intercoder reliability between all three researchers for all twenty coded data sets was calculated at 82.5% (see Appendix O).

Table 5
Revisions to the Initial List of CDROM Selection Criteria Codes

Revision Type                    Total    Appendix    Criteria Code (or Number)
Eliminated                       2        M           26, 27
Subject Definition Added         4        N           1, 6, 23, 27
Researcher Clarified Definition  23       N           5, 7, 9, 10, 11, 12, 13, 14, 18, 20, 21, 23, 24, 27, 29, 32, 34, 35, 37, 41, 43, 44, 45
New Criteria Added               9        N           11, 13, 19, 24, 30, 36, 37, 40, 42
Unchanged                        14       N           2, 3, 4, 8, 15, 17, 22, 25, 26, 28, 31, 33, 42

CHAPTER FOUR
FINDINGS

The research findings are presented in this chapter. Except for the presentation of data representative of the sample's background, the findings are presented in the same order as the research questions were asked in chapter one. Each of the study's research questions has been stated prior to the data in order to encourage the reader to construct meaningful connections between the data and the research questions. Question six is the one exception, as the data relevant to this question are presented throughout the chapter.

In order to maintain the confidentiality of the subjects in this study, the first participant in each of the five groups of educators was identified by using the number one. The second participant of each group was identified with the number two, and so on. For example, teacher one became T1, teacher two became T2, teacher three became T3, and teacher four became T4. Therefore, when a quote is followed by the subject identifier (T/TL3), the quotation was made by the third teacher/teacher librarian.

Descriptive Data of Research Sample

Although no specific research questions were put forth with respect to determining the background of the sample, data collected from the questionnaire provided information on the subjects' prior experiences, knowledge, skills, and attitudes. This information, in turn, provided both direct and indirect insights into the research questions.
For example, one of the study's findings that surfaced in the early stages of data collection was not linked directly to a research question, but was directly linked to information collected on the research subjects. After administering the first three questionnaires and interviews, the researcher discovered that the assessment and selection of intermediate CDROMs was the responsibility of more than just teachers (Ts), teacher librarians (TLs), and technology teachers (TTs). Two other types of educators were also involved in the selection of intermediate CDROMs: (a) teacher librarians doubling as their schools' technology teachers (TL/TTs) and (b) teachers who were also their schools' technology teachers (T/TTs). Since one of the objectives of the study was to represent the different types of intermediate educators assessing and selecting CDROMs in British Columbia at the time, the researcher chose to integrate the two new groups of educators into the research sample.

Of the twenty research subjects, five were male and fifteen were female. All of the Ts, TLs, and TL/TTs were female. Five of the seven TTs and T/TTs were male. Except for the TL/TTs, the majority (71%) of educators working as the TTs in the schools for this sample were male. The average age of the sample was forty years. The youngest group of educators were the TTs, with an average age of twenty-six, and the oldest were the TLs, averaging fifty-five years of age. The TLs were also the most experienced group of educators, having worked in schools for an average of twenty-one to twenty-five years. The least experienced group of educators were the Ts, logging an average of six to ten years of teaching experience. TTs, TL/TTs, and T/TTs had an average of almost sixteen years (15.9) of teaching experience.

Fifteen of the twenty educators were full time, five were part time. The part-time educators consisted of two TLs, one TT, one TL/TT, and one T/TT. They averaged twenty-four teaching hours a week. Ts, TL/TTs, and T/TTs taught at schools with enrollment numbers near the average for the sample (four hundred and twenty-eight students). TLs were teaching at schools with smaller than average school populations (three hundred and ten students), whereas TTs were stationed at schools with larger than average student populations (five hundred and fifty-five).

Except for one T, all of the research subjects had one or more computers in the room in which they most frequently taught. On scheduled days, this one T brought her home room class to the computer laboratory when she and her students required the computer equipment. The number of computers and CDROM players that each type of educator had in their classroom varied. The average number of computers and CDROM players for each type of educator is displayed in Figure 3. TTs and TL/TTs taught using the largest average number of computers, and the TLs, T/TTs, and Ts instructed with the smallest averages. However, the TLs had the highest percentage (72%) of computers with CDROM players, followed by T/TTs (66%), Ts (50%), TL/TTs (22%), and T/TTs (8%).

When asked how old their classroom computers were, most of the sample indicated that their machines varied in age from one to six years. Several educators explained that this was because the computers were purchased gradually over the past six years. The only exception was the T/TTs, who were working with much older computers ranging from seven to nine years old.
All of the educators indicated that they had access to shared school computers. However, one T indicated that her school was in the process of dismantling its computer laboratory and placing one or more of these computers into every classroom. Of these shared computers, an average of six and a half had CDROM players. Educators also indicated that fifty-two percent of these shared computers were located in computer labs, thirty-eight percent in libraries, and ten percent in classrooms (e.g., mobile computers and computers in shared classrooms). Educators also estimated that their classes accessed the shared computers for an average of three hours per week, and that they used CDROMs with their students for an average of three hours per month. There was little difference between educators on the amount of time they and their students accessed these resources.

Figure 3. Average number of computers located in educators' classrooms, displayed by type of educator.

The average number of CDROMs in a school's collection was estimated at twenty-four. TL/TTs reported the largest school CDROM collections, averaging thirty-six disks. Ts reported the smallest school CDROM collections, averaging thirteen disks. TTs, TLs, and T/TTs reported average-sized CDROM collections. Whether for their school's or their personal collection, all of the educators indicated that they evaluated and purchased an average of two new CDROMs per year. All of the schools had a wide range of CDROMs. A breakdown of the types of CDROMs being used in the subjects' schools is presented in Figure 4.

[Figure 4 is a bar chart showing the percentage of the sample using each type of CDROM; the individual category labels and percentage values could not be fully recovered from the original layout.]

Figure 4. Types of CDROMs being used by sample.

The entire sample indicated that they had been involved, on average, for two and a half years both evaluating and selecting CDROMs and using CDROMs with their students. Except for the TL/TTs and Ts, research subjects also reported that they spent an average of two hours per month evaluating and selecting CDROMs. TL/TTs noted that they spent almost three and a half hours per month on these activities, whereas Ts spent just under one and a half hours a month. All educators also indicated that they had home computers which they had owned and used, on average, for six or more years. TLs spent an average of one and a half hours per week on their home computers, whereas Ts, TTs, and TL/TTs averaged around three and a half hours per week. T/TTs spent over four hours per week working on their home computers.

Finally, when asked to respond to five evaluative questions on the questionnaire concerning CDROMs, educators in this study were generally uniform and very opinionated. For example, all of the research subjects agreed or strongly agreed that: (a) it was important to purchase CDROM software for elementary students, (b) teachers are in the best position to evaluate and select CDROMs, and (c) teachers should be trained to select their own software.
They were also uniform in disagreeing, though not strongly disagreeing, with the statement, "Most CDROM software today is of high quality and does not need evaluating." However, on the final question, educators were less certain and less united. When asked to respond to the statement, "I feel that the multimedia capabilities of CDROMs meet a wider variety of student needs than many traditional sources," educators' answers varied from strongly agree to disagree, with two people choosing no opinion (this was the only question for which no opinions were recorded).

Findings in Relation to Research Questions

The purpose of this study was to investigate the current selection criteria and processes that experienced teachers, teacher librarians, technology teachers, teacher librarian/technology teachers, and teacher/technology teachers in the Vancouver metropolitan area use to assess and evaluate CDROM software for use with intermediate students. In order to do this, six research questions were asked. The answers to these questions will be presented in the order in which they were asked.

Research Question One

What selection criteria do experienced intermediate educators use to evaluate CDROMs to be used with intermediate students?

Forty-six different CDROM selection criteria were identified through the course of this research study (see Appendix N). Most of these criteria were identified from the content analysis of subjects' transcribed interviews (see Chapter 3, Data Analysis). Thus, almost all of the key words used to identify the selection criteria (e.g., Motivating, Accessibility) and the definitions of each selection criterion were taken exclusively from the educators' interview transcripts. In a very few cases, several of the key words were obtained from the Learning Resource Branch's (LRB) resource guide, Evaluating, Selecting and Managing Learning Resources (British Columbia Ministry of Education, 1996b). In addition, some of the definitions were added by the research team during their content analysis of the transcriptions. These additions are in italics in the criteria code listings in Appendix N. Principally, though, this list is reflective of the CDROM selection criteria employed by the intermediate educators participating in this research at the time of the study.

These selection criteria were also grouped into five larger content categories so as to facilitate discussion of the criteria's general descriptive characteristics. Four of the five content categories and associated groupings of selection criteria were taken from those defined and recommended in the LRB's software selection guide (British Columbia Ministry of Education, 1996b, pp. 65-67, Appendix A). The five content categories were: content, instructional design, technical design, social content, and other. The category other was not a category listed by the LRB or found in the literature. The bulk (87%) of the forty-six selection criteria mentioned by the sample were divided equally into three content categories: content (29%), instructional design (29%), and technical design (29%) (see Figure 5). Only four percent of the selection criteria were placed in the social content category and nine percent were placed in the other (or miscellaneous) category. Any items which the educators referred to as criteria, and which the research team could not categorize into the first four content categories, were placed in other (the fifth category).

Figure 5.
Figure 5. Categories of research subjects' CD-ROM selection criteria. Percentages are based on the number of educators to mention each of the forty-six criteria.

Figure 6. Top eleven selection criteria given the highest overall scores of importance by the sample before the hands on computer activity.

At the beginning of the interview, each subject was asked two questions to help them relax and focus on the topic of CD-ROM selection. They were then asked to list and rank order CD-ROM selection criteria in response to questions three and four of the interview (see Appendix H). Criteria ranked one were considered very important and criteria ranked ten much less important. The top three criteria listed by each educator were selected and assigned a value: criteria ranked first were assigned three points, criteria ranked second were assigned two points, and criteria ranked third were assigned one point. These values were then totalled for each selection criterion. The criteria that obtained the highest total scores were said to be more important to the sample than those with very low scores. The eleven selection criteria given the highest overall scores of importance by the sample are displayed in Figure 6. Provincial curriculum was the sample's top ranked CD-ROM selection criterion. Six of the twenty educators ranked it as their number one selection criterion, three ranked it as their second most important criterion, and two ranked it as their third most important criterion. Seven of these twelve selection criteria are categorized as content criteria (dark grey bars), with the rest being technical and instructional design criteria.

Another indication of the importance educators placed on a selection criterion was found in the percentage of the sample to mention and/or use each criterion throughout part two of the study (main interview, hands on activity, follow up interview). Selection criteria mentioned by a large percentage (more than half) of the sample during the study were considered to be popular or important criteria. Selection criteria mentioned by less than half of the sample were considered to be less popular or less important criteria. The twelve selection criteria mentioned by the largest percentages of the sample are displayed in Figure 7. A third of the criteria on this bar graph are technical design criteria (black), a third are content criteria (dark grey), and the rest are instructional design and other criteria.

Figure 7. Twelve most popular CD-ROM selection criteria as indicated by percentage of total sample throughout part two of the study.
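To illustrate the 3-2-1 weighting described above, the following minimal sketch totals importance scores from a set of ranked lists. The educator labels and criterion names are hypothetical placeholders introduced only for the example; they do not reproduce the study's data.

    # Minimal sketch of the 3-2-1 weighting used to score top-ranked criteria.
    # Educator labels and criterion names below are hypothetical, not study data.
    from collections import Counter

    POINTS = {1: 3, 2: 2, 3: 1}  # rank -> points awarded to a criterion

    ranked_lists = {
        "Educator A": ["Provincial Curriculum", "Cost", "Intuitiveness"],
        "Educator B": ["Cost", "Provincial Curriculum", "Graphic Design"],
        "Educator C": ["Intuitiveness", "Motivation", "Cost"],
    }

    totals = Counter()
    for educator, top_three in ranked_lists.items():
        for rank, criterion in enumerate(top_three, start=1):
            totals[criterion] += POINTS[rank]

    # Criteria with the highest totals are treated as most important to the sample.
    for criterion, score in totals.most_common():
        print(f"{criterion}: {score}")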
Research Question Two

Which of these selected criteria do experienced intermediate educators use when evaluating two encyclopedia CD-ROMs to be used with intermediate students?

Data representative of the sample's CD-ROM selection criteria were collected at two different stages during the interview. The first stage involved the collection of data prior to the subjects' hands on evaluation of two encyclopedia CD-ROMs. The second stage involved the collection of data during the hands on activity. Criteria from each stage were then compared and contrasted to determine whether there was consistency between the CD-ROM selection criteria educators stated before the activity and the ones they used in the evaluation of the two CD-ROM programs.

Stage One

In this stage, data representative of the sample's CD-ROM selection criteria were collected from two sources. The first set of criteria was compiled from the lists written by the subjects in response to questions three and four of the main interview (see Appendix H). The second set of criteria was collected from transcriptions of the subjects' verbal responses to questions one, two, and five through thirteen asked during the main interview (see Chapter Three). Written data were analyzed separately from verbal data to determine if there were differences across data sets. A third data set was formed by combining the written and verbal data sets; this third data set was representative of the total criteria mentioned by the research subjects prior to the hands on activity.

Using content analysis (see Chapter 3, Data Analysis), the criteria for the first data set were identified and coded from subjects' written responses to questions three and four of the main interview. These coded criteria were then grouped into the following five content categories: (a) content, (b) instructional design, (c) technical design, (d) social content, and (e) other (see Appendix N for a detailed list of the selection criteria codes and categories). The first data set consisted of one hundred and forty-four CD-ROM selection criteria and averaged 7.2 criteria per subject.

Figure 8. Average number of listed CD-ROM selection criteria displayed by type of educator.

The average number of criteria listed in the first data set varied across groups, or types, of educators (see Figure 8). The five Teacher Librarians (TL) listed the highest average number of criteria per person (8), the four Teachers (T) recorded the lowest (6), and the three remaining groups, the four Technology Teachers (TT), the four Teacher Librarians/Technology Teachers (TL/TT), and the Teachers/Technology Teachers (T/TT), fell between these extremes and very close to the average (7.2). Further analysis of the total criteria listed by the sample revealed that ninety percent of the coded responses were situated in three content categories: (a) content quality (34%), (b) technical design (33%), and (c) instructional design (23%) (see Figure 9). Almost nine percent of the subjects' listed criteria were placed in the content category other, of which eight percent were based on one criterion, cost. Only one and a half percent of the subjects' listed criteria were categorized as selection criteria based on social content (e.g., sexism, ageism, and socio-economic status).

Figure 9. Categorical breakdown of subjects' total number of written CD-ROM selection criteria.

A second data set was identified, coded, and categorized from the subjects' verbal responses to questions one, two, and five through thirteen of the interview. A total of two hundred and forty-six CD-ROM selection criteria were identified, over one hundred more than those listed by the subjects. The average number of CD-ROM selection criteria for this data set was 12.2. As with the first data set, the average number of criteria varied between types of educators (see Figure 10).
The T/TTs recorded the highest average number of selection criteria (15.3), the Ts and TTs recorded the lowest (10.8), and the two remaining groups, the TLs and TL/TTs, fell close to the average for all educators.

Figure 10. Average number of CD-ROM selection criteria mentioned during the interview displayed by type of educator.

Content analysis of the subjects' interview transcriptions generated findings similar to those found in the subjects' handwritten lists. A large percentage (86%) of the subjects' verbal selection criteria emphasized evaluating a CD-ROM's content quality (33%), technical design (27%), and instructional design (26%) (see Figure 11). Twelve percent of all criteria mentioned were categorized as other and, like the first data set, a substantial portion of these criteria (8%) were identified and coded as cost related criteria. Also similar to the first data set, only two and a half percent, or six of the two hundred and forty-six selection criteria mentioned, were categorized as social content criteria.

Figure 11. Categorical breakdown of subjects' total number of verbal CD-ROM selection criteria. Results displayed by type of educator.

Since the findings in the subjects' written and verbal CD-ROM selection criteria data sets were very similar, the two were combined to form a third data set. This third data set was representative of the total criteria mentioned by the sample prior to the hands on activity. The number of CD-ROM selection criteria in this set was three hundred and eighty-nine, an average of nineteen selection criteria per subject. Again, the average number of criteria varied between types of educators (see Figure 12). The T/TTs recorded the highest average number of criteria (22) and the Ts recorded the lowest (16). The TTs, TLs, and TL/TTs fell close to the average for all educators. Eighty-seven percent of the criteria in this third data set were categorized as content quality (32%), technical design (30%), and instructional design (25%) (see Figure 13). Only two percent of the total criteria were based on social content, with the remaining eleven percent falling into the category other. Criteria coded as cost made up five percent of the total other criteria.

Figure 12. Average number of CD-ROM selection criteria mentioned displayed by type of educator.

Figure 13. Categorical breakdown for the total number of CD-ROM criteria mentioned by educators.
Table 6

Rank Order of Percentage of Educators to Mention Specific CD-ROM Selection Criteria Prior to CD-ROM Evaluation Activity (Stage 1)

Criteria (percentage of educators), listed in descending rank order:
Cost/Price (90%)
Intuitiveness (75%); Provincial Curriculum (75%)
Graphic Design (65%); Learner Diversity (65%); Motivation (65%); Teacher's Curriculum (65%)
Educational (60%)
Interactivity (55%)
Compatibility (50%); Age Appropriate/Readability (50%)
Multimedia (45%); Edutainment (45%)
Accessibility (35%); Critical Thinking (35%)
Time/Opportunity (30%); Complimentary Resource (30%)
Constructivism (25%); Networking (25%); Technology (25%); Currency (25%)
Creativity (20%); Canadian Content (20%); Depth of Content (20%); Individual Learner (20%); Program Tempo (20%); Integration (20%)
Social Content, General (15%); Accuracy (15%); Customization (15%); Usage (15%)
Printing/Copying (10%); Dual Platform (10%); Authentic/Authority (10%); Documentation/Help (10%); Transferability (10%); Previewing (10%)
Consistency (5%); Meaningfulness (5%); Feedback (5%); Font/Text Clarity (5%); Producer's Values (5%); Assessment (5%); Patronizing (5%)
Language Use (0%); Internet (0%)

Figure 14. Percentage of types of educators to use top seven CD-ROM selection criteria (Stage 1).

Figure 15. Percentage of types of educators to use CD-ROM selection criteria ranked eighth to fifteenth (Stage 1).

This third data set was further analyzed to determine the percentage of educators using each of the forty-six CD-ROM selection criteria. Selection criteria were rank ordered from one to forty-six, with one representing the criterion used by the largest percentage of educators and forty-six representing the criterion used by the least (see Table 6). All of the criteria ranked one to fifteen were mentioned by at least thirty-five percent of the research sample; the criteria ranked fifteen to forty-six were mentioned by thirty-five percent or less of the research sample. CD-ROM selection criteria listed in the top fifteen were found to be used by the largest percentage of educators, making these the sample's most important selection criteria.

The three CD-ROM selection criteria to be mentioned by the largest percentages of educators were cost, intuitiveness, and provincial curriculum. Cost was mentioned by all but one educator, or ninety-five percent of the research subjects, making it the sample's most popular CD-ROM selection criterion. Cost was also the only criterion to be mentioned by large percentages of subjects in all five groups of educators (see Figure 14).
The importance of cost was clearly shown since the next closest selection criteria, intuitiveness and provincial curriculum, were mentioned by only seventy-five percent of this sample. The subjects' explanations for choosing cost as a criterion were varied and are described in more detail later in this chapter.

Intuitiveness and provincial curriculum were the subjects' next two most popular CD-ROM selection criteria. In both cases, the percentage of educators using these criteria varied substantially between groups of educators. For example, all of the TLs and T/TTs indicated that they looked for intuitiveness in a CD-ROM, and all of the TLs and TL/TTs wanted the CD-ROM to support the provincial curriculum. However, only fifty percent of the Ts and TL/TTs looked for intuitiveness, and only fifty percent of the Ts and TTs searched for provincial curriculum content when evaluating a CD-ROM.

Of the fifteen educators to mention intuitiveness as a CD-ROM criterion, all indicated that the CD-ROM had to be easy to use and/or user friendly for the students. In most cases, this meant that the CD-ROM required very little intervention from the educator and encouraged a high level of student independence. One TL illustrated the type of intuitiveness that many of these educators were looking for in CD-ROMs:

[Yukon Trail CD-ROM] is easy to use. I can get them started on it, and then I don't have to be there with them all the time. It's comfy enough that they can do some exploration on it, and they end up working it. It's all self-explanatory. (TL1)

Another T explained why she thought it was important to look for ease of use in a CD-ROM, especially for students in the lower elementary grades:

Since I deal with younger intermediate children, it [the CD-ROM] has to be absolutely easy for them to use, so they can use it independently. Once I have taught them one or two things then they could move on with themselves and teach the other kids. It has been my experience that once you have taught one or two kids that they are able to teach the other kids. (T4)

Provincial curriculum was also one of the top three criteria used by educators. Provincial curriculum in this study was defined as any curriculum that fell outside of the educator's own personally developed curriculum. Thus provincial curriculum included any curriculum originating from the local school district, the staff of a school, or the provincial ministry of education. All nine of the TLs and TL/TTs indicated that most of their CD-ROMs were selected to meet the curricular needs of the educators in their school. These educators indicated that this was one of the most important, if not the most important, criteria for evaluating and selecting a CD-ROM. The two following quotes highlight the importance TLs place on this criterion:

First off I would see what the teachers needs are in the terms of curriculum areas. If I found something that fit into that category then that would be part of my choice. Then I would choose what was the best of what was available at that time. Curriculum area comes first and then I tend to go towards reference CD-ROMs because this is the library. (TL/TT3)

Another TL stated, "I would be looking at our [the school's] curriculum areas. Seeing which CD-ROMs would fit into the curriculum areas, particularly our social studies and science areas" (TL5).
Two TLs and TL/TTs also mentioned that they selected CD-ROMs on the basis that they supported the implementation of provincial curriculum. On the other hand, no other group displayed unanimity when it came to selecting CD-ROMs based on implementing school, district, or provincial curriculum. Only half of the Ts and TTs, and sixty-five percent of the T/TTs, mentioned that they used provincial curriculum criteria.

An analysis of the next twelve criteria also revealed several substantial differences between groups of educators (see Figure 14 and Figure 15). For example, one hundred percent of the TL/TTs cited graphic design, whereas only thirty-five percent of the T/TTs used graphic design. Similarly, all of the TTs reported they used learner diversity as a CD-ROM selection criterion, whereas only thirty-five percent of the T/TTs reported this criterion. All of the TTs and T/TTs mentioned motivation as a criterion, whereas only one of the five TLs, or twenty percent of that group, referred to it. All of the Ts also indicated they evaluated CD-ROMs with criteria based on interactivity and critical thinking, whereas none of the TLs used interactivity and none of the TLs and TTs mentioned critical thinking. Similarly, all of the T/TTs cited compatibility (hardware and software) and multimedia as important CD-ROM selection criteria. This contrasted with the Ts, who did not mention compatibility at all, and with the Ts, TTs, and TL/TTs, where only twenty-five percent of these groups evaluated CD-ROMs' multimedia qualities.

Although there were many more differences between groups of educators with regard to the types of CD-ROM selection criteria they used (a trend also reflected in the criteria ranked sixteen and lower), there were also several similarities. For example, fairly equal percentages (50% to 75%) of educators from all five groups indicated that teacher's curriculum was an important criterion. The same was true for the criterion edutainment (a term used by two subjects to denote a CD-ROM that was educational as well as entertaining), as most of the groups fell close to the sample's forty-five percent average.

The third data set was also analyzed to determine which content categories of criteria (e.g., content, instructional design, technical design, social content, or other) were used by the highest percentages of educators. Unlike the data presented earlier in Figure 10, this analysis was not based on the number of times (frequency) a criterion was mentioned; instead it was based on the percentage of educators to draw upon the criteria in each category. The results are presented in Figure 16. Similar to the earlier frequency findings, the bulk (87%) of the criteria types educators used when evaluating and selecting CD-ROMs were distributed fairly evenly across content (33%), technical design (28%), and instructional design (26%). Also similar was the low percentage of educators using social content criteria (2%), and the large percentage of educators indicating cost as a criterion in the category other (11%).

Figure 16. Categorical breakdown of percentage of educators' CD-ROM selection criteria.
Stage Two

At the commencement of the hands on exploration of the two CD-ROM encyclopedias, subjects were asked to provide the researcher with a verbal account of their assessment and evaluation, and to cite any selection criteria they used or might have used to judge the CD-ROM for use with their students. The subjects' verbal reflections were transcribed. Then, using content analysis, CD-ROM selection criteria were identified, coded, and categorized into the five content categories. Three hundred and four selection criteria were identified, averaging 15.2 criteria per subject.

Similar to the data set collected in stage one, the average number of criteria each type of educator mentioned during the hands on activity varied slightly between types of educators (see Figure 17). The Ts and TTs recorded the highest average number of criteria, the TLs recorded the lowest, and the TL/TTs fell very close to the average. Unlike the results of the data sets collected prior to the CD-ROM activity, the Ts and TTs recorded the highest average number of CD-ROM selection criteria instead of the lowest, and the TLs recorded the lowest average number of CD-ROM selection criteria as opposed to the highest.

Of the three hundred and four coded selection criteria mentioned during the subjects' hands on evaluation of the CD-ROM encyclopedias, two hundred and ninety-two, or ninety-six percent of the criteria, were situated in categories emphasizing technical design (56%), content quality (24%), and instructional design (16%) (see Figure 18). Only three percent of the criteria were categorized as social content criteria, and only one percent were categorized as other.

The data set for the subjects' hands on CD-ROM evaluations was also analyzed to determine what percentage of the research sample mentioned each criterion and the rank order of these percentages (see Table 7). The subjects mentioned thirty-one CD-ROM selection criteria during their hands on CD-ROM exploration. The top fourteen criteria were mentioned by a low of twenty-five percent to a high of one hundred percent of the sample. The criteria ranked fifteen to thirty-one were mentioned by less than twenty-five percent of the sample.

Figure 17. Average number of listed CD-ROM selection criteria displayed by type of educator.

Figure 18. Categorical breakdown for the percentage of educators to mention CD-ROM selection criteria.
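As a minimal sketch of how the rank orderings reported in Tables 6 and 7 could be derived from coded transcripts, the fragment below computes the percentage of the sample mentioning each criterion and sorts the criteria accordingly. The coded mentions shown are hypothetical placeholders standing in for the study's actual content-analysis output.

    # Sketch of rank ordering criteria by the percentage of educators who mention them.
    # The coded mentions below are hypothetical placeholders, not the study's data.
    mentions = {
        "Educator A": {"Accessibility", "Graphic Design", "Multimedia"},
        "Educator B": {"Accessibility", "Depth of Content"},
        "Educator C": {"Graphic Design", "Accessibility", "Intuitiveness"},
        "Educator D": {"Multimedia", "Depth of Content"},
    }

    sample_size = len(mentions)
    all_criteria = set().union(*mentions.values())

    percentages = {
        criterion: 100 * sum(criterion in coded for coded in mentions.values()) / sample_size
        for criterion in all_criteria
    }

    # Rank 1 is the criterion mentioned by the largest share of the sample.
    ranked = sorted(percentages.items(), key=lambda item: item[1], reverse=True)
    for rank, (criterion, pct) in enumerate(ranked, start=1):
        print(f"{rank}. {criterion}: {pct:.0f}% of sample")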
Table 7

Rank Order of Percentage of Educators to Mention Specific CD-ROM Selection Criteria During CD-ROM Evaluation Activity (Stage 2)

Criteria (percentage of educators), listed in descending rank order:
Accessibility (85%)
Graphic Design (80%)
Depth of Content (75%); Multimedia (75%)
Intuitiveness (70%)
Font/Text Clarity (65%); Age Appropriate (65%)
Language Use (35%); Individual Learner (35%)
Program Tempo (30%); Documentation/Help (30%); Patronizing (30%)
Canadian Content (25%); Learner Diversity (25%)
Transferability (20%); Customization (20%); Educational (20%); Consistency (20%); Interactivity (20%); Edutainment (20%)
Compatibility (15%); Constructivism (15%); Teacher's Curriculum (15%)
Cost (10%); Printing (10%); Feedback (10%); Internet (10%); Producer's Values (10%); Motivation (10%)
Currency (5%); Authentic/Authority (5%)
Provincial Curriculum (0%); Complimentary Resource (0%); Meaningfulness (0%); Assessment (0%); Networking (0%); Previewing (0%); Dual Platform (0%); Critical Thinking (0%); Integration/Diversity, Subject (0%); Time/Opportunity (0%); Usage (0%); Accuracy (0%); Creativity (0%); Technology (0%); Social Content, General (0%)

Figure 19. Percentage of types of educators to use top seven CD-ROM selection criteria (Stage Two).

Figure 20. Percentage of types of educators to use CD-ROM selection criteria ranked eighth to fourteenth (Stage Two).

The educators' four most frequently mentioned criteria during the CD-ROM activity were accessibility, graphic design, depth of content, and multimedia (see Figure 19 and Figure 20). Accessibility was mentioned by eighty-five percent of the sample, the largest percentage of educators to mention any criterion during the hands on CD-ROM evaluation. The percentage of educators referring to accessibility varied between groups. For example, one hundred percent of the Ts and TTs mentioned accessibility, whereas sixty-five to seventy-five percent of the TLs, TL/TTs, and T/TTs mentioned this criterion. Eight of the seventeen educators who mentioned accessibility did so while navigating through the encyclopedia programs. For example, two Ts liked how easy it was to link to the dictionary in one of the encyclopedias. However, one TL found it hard to activate the search engine in the same resource. Another T/TT voiced his concern that the graphic table of contents in the Ultimate Children's Encyclopedia would make it hard for some students to access the information. One T/TT also noted that moving around in Encarta was a complex task and that, "The kids can't navigate the CD-ROM without a lot of teaching" (T/TT1). Six of the educators also mentioned that Encarta's two disk format was difficult for students to use in their school's single disk CD-ROM players, making the encyclopedia's information even more difficult to access.

The criterion graphic design was mentioned by the second largest percentage of educators (80%) during the hands on CD-ROM exploration.
The percentage of educators mentioning this criterion varied substantially between groups of educators. For example, all of the TLs referred to this criterion, whereas fifty to seventy-five percent of the other groups mentioned graphic design. Although there was quite a variety of reasons why educators mentioned criteria categorized as graphic design, the three most common reasons were: (a) a preference for high quality photos and graphics as opposed to hand drawn pictures, (b) an appreciation for graphics which supported the associated text, and (c) a need to have less visual clutter on the screen. Five of these sixteen educators mentioned the first two reasons, and four of these sixteen educators mentioned the third.

The third and fourth ranked criteria, depth of content and multimedia, were mentioned by seventy-five percent of the research sample. Again, there was a great deal of variance between groups of educators. In the case of depth of content, one hundred percent of the TTs and TL/TTs mentioned this criterion, whereas only forty percent of the TLs made this reference. All of the Ts and TLs referred to evaluating the multimedia aspects of CD-ROMs, yet only twenty-five percent of the TL/TTs mentioned multimedia.

Of the fifteen educators to mention depth of content, thirteen mentioned it as they explored the Ultimate Children's Encyclopedia. They all had concerns over the lack of information and detail in this encyclopedia. One TT summarized his concerns when he stated, "There is not enough detail for upper intermediate students. It is useless for research" (TT4). The TL and T who did not criticize the Ultimate Children's Encyclopedia did remark that Encarta contained a great deal more information and that, "it was useful to have an encyclopedia help you create your own research outline" (TL5).

The majority of educators making reference to the criterion multimedia did so at three different points in their explorations. First, seven educators mentioned multimedia when they explored the virtual tours in Encarta. Second, five educators indicated that the voice and actions of the animated character in the bottom right corner of the monitor were irritating and intrusive as they entered the Ultimate Children's Encyclopedia. And third, three educators mentioned the need for moderation of sound, as both of the encyclopedias' music can be, as one TT worded it, "annoying to you and the people around you" (TT1).

Analysis of the next nine criteria also found several substantial differences between groups of educators. For example, one hundred percent of the Ts mentioned that the CD-ROM should be intuitive, or easy for students to use. However, only sixty-five to seventy-five percent of the TTs, TL/TTs, and T/TTs, and only forty percent of the TLs, mentioned this criterion. Similarly, all of the TL/TTs pointed out that the CD-ROM must have clear, readable fonts and text, whereas only half of the Ts and only thirty-five percent of the T/TTs cited this as a criterion. One hundred percent of the TL/TTs spoke of the importance of ensuring that CD-ROMs have age appropriate content and reading levels; only fifty to sixty-five percent of all other educators mentioned this criterion during the hands on activity. Except for three instances, the remaining criteria (Social Content, Individual Learner, Program Tempo, Documentation, Patronizing, Canadian Content, and Learner Diversity) were mentioned by fifty percent or less of the educators in each group.
In many cases, a criterion was not mentioned by one or more groups. The three exceptions were: (a) seventy-five percent of Ts mentioned social content, (b) sixty-five percent of the TTs indicated that it was important to reflect Canadian content, and (c) sixty-five percent of the TTs indicated that they look for CD-ROMs that meet the needs of specific learners or age groups (e.g., English as a Second Language students, or grade six music students).

Comparison of CD-ROM Selection Criteria from Stages One and Two

Percentages of educators to use CD-ROM selection criteria collected from stage one and stage two were then compared to determine whether there was consistency between the CD-ROM selection criteria educators stated they used and the ones they used to evaluate the two CD-ROM programs. The comparison of the sample's content categories suggested that a substantial change took place between stages (see Figure 21). In stage one, the sample indicated that they evaluated CD-ROMs using a balance of criteria from three content categories: technical design (34%), content (28%), and instructional design (26%). However, in stage two the sample placed a much greater emphasis on technical design (56%), a similar emphasis on content (24%), and a much lesser emphasis on instructional design (16%). Furthermore, educators placed a much greater emphasis on the category other in stage one (11%) than in stage two (3%). This difference in the category other was due primarily to one criterion, cost, which was mentioned by ninety-five percent of the educators prior to the activity and only ten percent during the activity. One category exhibited little change between the two stages: very few educators mentioned social content criteria either prior to (2%) or during (1%) the CD-ROM activity.

These findings suggested that for some selection criteria categories the sample did not always use the criteria they claimed to use when evaluating CD-ROMs (stage one). This was especially evident for technical design criteria, where twenty-two percent more educators evaluated the CD-ROM encyclopedias using technical criteria than had indicated prior to the activity. Similarly, but on a much lesser scale, instructional design and other criteria were mentioned by nine to ten percent fewer educators during the hands on evaluation than before the activity.

Figure 21. Difference between the sample's emphasis on categories of criteria in stage one and stage two.

The findings also suggested that for some categories of criteria, the sample did use the same criteria they claimed to use. The examples in this case would be social content and content criteria. For both categories, very little change was found between the number of educators using the criteria prior to and during the hands on CD-ROM evaluations. However, an analysis of the criteria mentioned by individual subjects does not support all of these claims for the whole sample (see Table 8). For example, only thirty-one percent of the subjects mentioned the same criteria during the hands on activity as they had mentioned before the activity.
Similarly, only twenty-five percent of the subjects to cite instructional design in stage one mentioned it in stage two, and only ten percent of the sample mentioned other criteria in both stages. Also, sixty percent of subjects who mentioned social content in stage one mentioned it in stage two (the high percentage here may be due to the very low number of educators choosing social content). However, of the people who mentioned technical design selection criteria prior to the hands on activity, eighty-one percent also mentioned it during the activity.

Table 8

Percentage of Sample to Use the Same Criteria in Both Stages One and Two

Criteria Category: Percentage of Educators to Mention the Same Criteria in Stages One and Two
Technical Design: 81%
Social Content: 60%
Content: 31%
Instructional Design: 27%
Other: 12%

These findings indicate that both the individual educators in this study and the entire sample applied some of the same selection criteria during the hands on activity that they stated they used to evaluate CD-ROMs. The findings for the whole sample and for individuals indicated that there was a much greater emphasis on technical design selection criteria during the hands on activity, as well as a lack of selection criteria based on social content.

Research Question Three

How different or similar are the criteria selected by experienced intermediate educators to those criteria listed in the current literature?

In a recent review of librarians' and teacher librarians' CD-ROM selection criteria, Gloria Baker (1997, pp. 49-50) reported that "the literature on the evaluation of CD-ROM formats indicated that librarians closely scrutinized electronic products with long lists of detailed criteria to evaluate the intrinsic quality of technical components such as data presentation, navigation and search performance". She concluded that librarians' CD-ROM selection criteria focused on evaluating a CD-ROM's technical features and that "Content quality was less important than technical quality." In order to verify Baker's findings, a review of the literature dealing with the evaluation and selection of CD-ROMs, including literature not pertinent only to teacher librarians and librarians, was conducted for this thesis.

The findings were similar to Baker's (1997). Sixty-five percent of the criteria listed in the literature focused on evaluating the technical aspects of CD-ROMs (see Chapter Two, Figure 2, Page 25). Twenty-seven percent evaluated the CD-ROM's content, and five percent the instructional design. Three percent dealt with other CD-ROM selection criteria such as cost. Only one percent of the criteria focused on evaluating social content. Thus, the current literature on CD-ROM selection criteria emphasized the evaluation of CD-ROMs based on their technical design qualities rather than on qualities like content, instructional design, and social content.

In order to determine if educators in this study placed a similar emphasis on the same types of CD-ROM selection criteria as that indicated in the literature, a comparison was conducted between the types of CD-ROM selection criteria mentioned by the educators in stage one, the types of criteria applied by the educators in stage two, and the types of criteria mentioned by the authors of the literature reviewed. The comparisons yielded both corresponding and opposing results (see Figure 22).
Unlike the emphasis in the literature on technical design criteria (65%), educators in stage one claimed they used a balance of technical design (34%), content (33%), and instructional design (23%) CD-ROM selection criteria.

Figure 22. Comparison of the content categories of CD-ROM selection criteria mentioned by educators in stage one, applied in stage two, and the content categories of criteria mentioned in the literature.

Also unlike the literature review, educators mentioned many more instructional design criteria (23%) than the literature (5%), and a greater percentage of other criteria (9%) than the literature (1%). However, in stage two, educators closely paralleled the literature in that the majority (56%) of the selection criteria applied was categorized as technical design, while much less (16%) was based on instructional design. In all three cases, criteria based on social content never climbed above three percent. These findings indicate that intermediate educators' CD-ROM selection criteria are both different from and similar to those presented in the literature. The findings also suggest that the CD-ROM selection criteria educators claim to use (stage one) are much different from those presented in the literature, whereas the criteria they actually use (stage two) are more similar. It was also found that the criteria listed in the literature and those mentioned by the educators were similar in that a very small percentage of the total criteria were categorized as evaluating social content.

Research Question Four

What resources do intermediate educators consult when evaluating and selecting CD-ROM software for intermediate students?

The data indicated that educators consulted a variety of resources when evaluating and selecting intermediate educational CD-ROMs. Data sets representative of these resources were generated from two sources: first, from the subjects' written responses to question twenty-nine of the questionnaire (see Appendix G), and second, from subjects' verbal responses to questions asked during their interviews. Content analysis was used to identify and code resources mentioned in the subjects' interview transcriptions, as well as to build in intercoder reliability (see Chapter Three, Data Analysis). Sixteen different types of resources were identified from the educators' interviews (see Figure 23). During the interview, subjects not only duplicated the eleven resources listed by the researcher for question twenty-nine of the questionnaire, but they added five more to the list. The resources were also grouped into two descriptive categories: human information resources and nonhuman information resources. Each category contained eight different resources.

Figure 23. Resources educators consulted when evaluating and selecting intermediate educational CD-ROMs during the interview.

Human Information Resources
Subjects placed a great deal of emphasis on consulting other people when selecting CD-ROM resources. For example, in response to question twenty-nine of the questionnaire (see Appendix G), every subject chose word of mouth as an information resource that they consulted when selecting CD-ROM software for their classrooms. No other resource logged one hundred percent of the subjects' responses on this question, making people the single most important resource that educators turned to for assistance when selecting CD-ROMs. Similarly, seven of the top eight resources mentioned by the largest percentages of educators during the interview specified people (see Figure 23), and sixty-two percent of the total number of different resources mentioned during the interview referred to people. The eight different human information resources mentioned by educators during the study can be categorized into four distinct groups: (a) Other Educators, (b) Professional Development Events, (c) Retailers and Publishers, and (d) Parents and Students.

Other Educators

When asked on the questionnaire to specify which resources they would consult when selecting intermediate educational software, three of the four resources chosen by the largest percentage of the sample were Ts (teachers), TLs (teacher librarians), and TTs (technology teachers) (see Figure 24). Correspondingly, a review of the percentage of educators to mention each resource in the interviews also found that three of the top four resources mentioned by the sample were Ts, TLs, and TTs (see Figure 23). Several transcriptions provided insights into why such a large number of the research subjects said they consulted other educators when selecting CD-ROM software. One part time TL explained that consulting with other educators helped her overcome her inexperience in selecting CD-ROMs:

I tend to rely on someone who is more experienced than I am. But at least I know where to look. Or if I'm taking a course and somebody, a tecky type, is handy I will ask if they have a list of recommended CD-ROMs, and sometimes they do. I received a list of CD-ROMs from one person at a UBC library course that way. (TL2)

Figure 24. Resources sample consulted when evaluating and selecting intermediate educational CD-ROM software (Questionnaire).

But relying on another educator's knowledge was not limited to the less experienced educators in this study. One TL/TT who had over twenty years of teaching experience stated: "There is also a high school teacher librarian in our district who does a lot of research, so I tend to rely on him to get information to make my [CD-ROM] choices" (TL/TT1). Another T/TT explained that he was responsible for purchasing the CD-ROMs for his school, so consultation with other educators in the school helped him select and purchase CD-ROMs more closely aligned with their needs:
I am finding out that it is difficult for me to choose something for grade one and two. I definitely want the teachers of each grade to help me select because they know their students' vocabulary and their abilities. For me, I would get the teachers to go and get title names and software reports and then I could take it from there. So collaboration with other teachers is good. (T/TT2)

Another very experienced T explained that consultation with the school's teacher librarian helped ensure that her CD-ROM resources met the necessary curriculum guidelines:

Our librarian is very good and I trust her selection. She knows all our curriculum and she probably knows them better than I do, because all the teachers are in and out of there all the time. Often she will ask to sit down with me and we'll assess the CD-ROMs together. (T2)

One TL explained that consultation with other TLs helped them to quickly zero in on recommended educational titles: "I would look at some lists that have the CD-ROMs in their districts and talk to some teacher librarians because they are already categorized within schools as which ones are best" (TL3). Another T also pointed out that consultation with other Ts was a big time saver: "I rely on other teachers' recommendations. You never have time to sit down with a [CD-ROM] sampler and go through it" (T1).

Finally, although subjects spoke of many positive reasons for turning to other Ts, TLs, and TTs for assistance in evaluating and selecting CD-ROMs, one educator cautioned that, because another educator's selection criteria may not parallel one's own, one should be careful about buying CD-ROMs on other people's recommendations alone (or about recommending CD-ROMs to other educators):

A lot of people are doing [CD-ROM evaluation] now, by word of mouth. Even though people are suggesting a CD-ROM does not mean it is right for you. I can choose what I like, but then people are always going to ask why don't we have this or that, so I put the school first over my personal preference. (TL/TT3)

Professional Development Events

After Ts, TLs, and TTs, the next most popular people resource chosen by the largest percentage of educators on the questionnaire was professional development (pro-d) events. Often termed in-service, conferences, professional development days, pro-d days, and district workshops, pro-d events were also cited by seventy percent of the sample during the interview as a resource to rely on when seeking assistance in evaluating and selecting intermediate CD-ROMs; pro-d events were mentioned by one fewer subject than publishers. Pro-d events were categorized as a people resource because most educators spoke about the people they contacted, learned with, learned from, and talked to at the event rather than about the chance to use the computer equipment, read reviews, and interact with other nonhuman resources.

There were also substantial differences between groups of educators. During the interview, seventy-five percent of the Ts and one hundred percent of the TL/TTs and T/TTs stated they attended pro-d events in the process of evaluating and selecting CD-ROMs and/or learning how to evaluate and select CD-ROMs. Only forty percent of the TLs and fifty percent of the TTs said they attended pro-d events when assessing and selecting CD-ROMs.
Of the educators who explained why they cited professional development events as a CD-ROM selection resource, all suggested that such events were a place or activity where they could exchange lists and information with other people regarding exemplary and/or useful educational CD-ROMs. For example, one TL/TT explained that she attended TL conferences as a way to exchange information with other TLs about CD-ROM evaluations and recommended CD-ROMs:

I find that going to things like Fast Forward conferences are a good way. When we meet with other teacher librarians we talk about CD-ROMs and share ideas about what are some good ones. And I tend to take their evaluation, its important to follow that. (TL/TT2)

Another TL specified workshops as a way to learn more about the criteria which make a good CD-ROM: "Often I have access to a workshop or someone giving a demo, such as the Vancouver Children's Literature Roundtable, on what makes a good CD-ROM" (TL2).

Publishers and Retailers

The next most popular people resources to be mentioned on the questionnaire and during the interview were publishers and retailers. Seventy-five percent of the research sample indicated that they would consult publishing representatives, and fifty percent said they would consult retailers. There were no substantial differences between groups of educators. Of the educators that indicated they consulted retailers and publishers when selecting CD-ROMs, most spoke about these contacts positively:

I usually look at the educational needs of the school, the IRP's, and I usually go down to a computer store and talk to my buddy down there and ask him what he has got that would fit the grade level. (TL5)

And,

When I'm selecting CD-ROM software I use [Company Name] and they are very knowledgeable and I often just phone them up and say that this is what I'm looking for, and then they will look through the catalogue and suggest something. (T4)

At the same time, subjects consulting retailers and publishers were careful to ensure that their classroom goals were being met and that they were not being pushed into buying something they didn't want or need:

There is a lot to be said about the publisher representatives that come to the school or PAC meetings. They try to come and sell their software, such as CCT, Software Plus, Image Media. They push a certain software product with you. You can take it or leave it, and there is some pressure tactics also. One representative at Software Plus, I didn't give the time of day to him so he doesn't come around any more. (TT1)

Parents and Students

Of the eight different types of people resources the sample referred to in this study, students and parents were mentioned by the smallest percentages of subjects. During the interview, half of the subjects cited students and thirty-five percent cited parents as resources. On the questionnaire, five percent mentioned they used students, and no one indicated they used parents as a resource. Except for the TL/TTs, there was very little difference between groups of educators (20-33%) in mentioning students and parents as resources during the interview; seventy-five percent of TL/TTs cited parents as a resource when assessing and selecting CD-ROMs.
In order to further determine the extent of parental and student involvement and influence on the sample's selection of CD-ROMs, research subjects were asked if and/or how they collaborated and consulted with parents and students when selecting CD-ROMs (see Appendix H, questions seven and nine). Four of the twenty educators said they involved parents in the evaluation and selection of CD-ROMs, and three of the twenty said they involved students.

Educators who involved parents in their selection of CD-ROMs often did so indirectly, informally, and with some difficulty and reservations. One TL/TT explained that:

I get feedback from them [parents]. We have some very informed parents who do offer some suggestions. But they are good at letting me come to the decision of what the students need. If one or two come to me with a great name or CD-ROM then I check it out. But I have developed the [selection] criteria for the school. Home choices are not the same as school choices. They [parents] seem to always like the good deals, and unfortunately the good deals are not always appropriate for schools. (TL/TT3)

Another TL/TT explained that parents were only involved indirectly with the selection of CD-ROMs because computers were relatively new to the school and staff, and a few parents were involved with the school's new technology committee. She stated:

They [the parents] are just getting involved in helping us select CD-ROMs because we have a new technology committee, and at the next meeting I will be showing them Digital Chisel and Hyper Studio, and we are going to discuss whether we can get a site licence and what they feel about it. (TL/TT1)

At another school, a T noted that another way parents were indirectly involved and influential in her development of CD-ROM selection criteria was through the school's Parent Advisory Committee (PAC):

The PAC at our school is putting lots of money into a fund so they can change our computers. A couple of them come in as lab monitors and they give me their suggestions or computer magazines. (T2)

The majority of research subjects did not involve parents in their CD-ROM selection process. One explanation was that involving parents required more time and energy than was available to either teacher or parent. One T stated, "I would not like to involve the students' parents. But if I did it would be quite a task" (T4). Another teacher librarian explained:

I have a parent night where they are invited to come in. There is always a standing invitation to come in after school and work with them [CD-ROMs]. It doesn't happen a lot. Parents have lots of things to do. I think kids have a lot of things going on after school, so parents don't hang around in school like they might want to. So, no, parents are just not interested. (TL1)

One TL/TT noted that she did not involve parents in her CD-ROM selection or development of criteria because she felt that parents lacked the knowledge to determine what would fit the school's curricular needs:

One problem is that parents tend to buy what's highly motivational for their children, and it may or may not fit our school's curriculum. Or it may or may not really have enough educational content to it. I have a balance in my [CD-ROM] selection criteria, the balance between information and glitter. (TL/TT2)
Similarly, one T/TT stated, "Some parents have donated to the [CD-ROM] selection and said they would like to see this or that. They haven't given a lot of direction, they have allowed us to make the proper educational selections" (T/TT2). Although six educators in this study noted that parents had been or were currently involved with fund-raising to either purchase or upgrade the school's computer hardware, none reported parents as being directly involved with the selection and purchasing of CD-ROM software. However, four educators noted that several parents had approached them for advice on which CD-ROMs to purchase for home use.

Only one TL, one T/TT, and two Ts said they would involve students in their CD-ROM selection process. All four educators involved students by having the students preview and critically evaluate the CD-ROMs. One T worked with small groups of students and evaluated the CD-ROMs using the district software evaluation form as a guide:

I will get a program to preview and I will put it in the lab and sit with some students and just watch how they react with it. I will sit and listen and take notes on what is said. Kids are quite honest and will say, 'Oh cool look at the graphics', or 'How come I can't do this'? By watching their reactions and talking to them afterwards and going through the same checklist as we do for the district for previewing software, I will discuss it with the kids and give it [the CD-ROM] a rating. (T3)

One T/TT commented that students had been assisting him to evaluate and select CD-ROMs ever since they installed their networked computers a year earlier:

The CD-ROMs were loaded onto the network, a number of [CD-ROMs], and the children were involved. We [teaching staff] watched and asked them [students]. We actually surveyed the kids which [CD-ROMs] they would like to keep. I asked the teachers to ask the students to evaluate them because we had fifteen titles and could only afford five. So we got the children to choose their best five or six favorites. (T/TT3)

Working with other teachers in the school, one TL said she involved students in the evaluation and selection of CD-ROMs as part of her thematic units:

We actually built [CD-ROM assessment] into some of the units that we do. We are doing early humans right now and using Groliers, Comptons, and World Book. They [students] need to do a comparison of how useful and easy to use they are. We give them criteria, and if there are others, we list them. They look for how great visually they are and how hands on they are. It [CD-ROM evaluation] is built in for them. (TL/TT3)

One T not only involved her students in the evaluation and selection of CD-ROMs, she had them post their reviews on web pages on the internet:

Students are a valuable resource when selecting CD-ROMs. The kids speak their mind about their thoughts on the CD-ROMs. My kids do reviews on CD-ROMs and put them onto the internet. (T2)

The majority of educators reported that they did not involve students in the CD-ROM selection process. The most frequent reason for not involving students was stated by a TT: "Kids would tend to go more for the game type programs and most of the time they are not educational" (TT4). Another T/TT agreed: "Students suggest CD-ROMs that they would like. The problem is that at the intermediate level what they want is games. They would prefer to have games. They are not big on things that are educational" (T/TT2).
If you give them two hundred dollars to buy three or four C D R O M s they would buy all games" (TL5), and Kids really like to have games, and I refuse to have games in here, unless it's things like the Yukon Trail, which is a game format and they are learning from that. I just won't have games. I tell kids that they can have games at home, you can't do it here. (TL1) One TL/TT also doubted her students' ability to select C D R O M s which meet her curricular goals: I don't foresee them helping me select [CDROMs] for the same reason I don't ask them to help me select my textbooks. C D R O M s purchased here are curriculum related. I don't think that the students have the ability to assess the value of the information on C D R O M s . (TL/TT4) When asked if she would involve students in the C D R O M selection process in the future, a TL explained that finding time in her schedule to accomplish such a task would be difficult: "I would, except it comes down to the time needed. I would love to use students. So, I would involve students, if I had more time" (TL2). After explaining why they didn't use students in their assessment and evaluation of C D R O M s , six of the educators explained that they would like to see students involved with the evaluation and selection of C D R O M s in the future. "I would like the students to, but they don't now. But I would like that in the future" (TL/TT). "At this time they [students] haven't really shown that much interest [in CDROMs]. W e are at the beginning stages. This might be a possibility further down the line. I can't see it right at this time" (TL/TT1). 106  Finally, many educators who reported that they did not involve students in their assessment and evaluation openly struggled with the decision during the interview: Do I involve students? No. [Pause]. Should I? Y e s . Well, I do involve the students but not in a formal sense. If students say they want a certain book I will buy it. But, I haven't involved the kids formally in selecting C D R O M s . The kids in this school would be a really good source for selecting C D R O M s . A lot of them, about seventy percent, have C D R O M s at home. So they would have a good knowledge of what they like, not so much as what is useful for them.  I  would take their information with a grain of salt. (TL4) Nonhuman Information Resources Although subjects in this study indicated that they preferred to consult human information resources (especially other educators) when evaluating and selecting intermediate level educational C D R O M software, they also indicated that they consulted a variety of other nonhuman information resources (see Figure 22, and Figure 23). Eight different types of nonhuman information resources were identified in the course of the study, and findings for these will be presented under the following headings: (a) Periodicals, (b) Provincial and District Resources (c) Internet, (d) School-Based Resources. It was noted that some of these resources may have included contact with people (e.g., a resource specialist in a district resource center). If the research subjects did not specifically refer to the resource as a person, or did not indicate that the resource could not be used without any assistance from other people, it was included in this category. Only the resource periodicals were mentioned by a sufficient number of subjects to be able to highlight any substantial differences between groups of educators. 
Periodicals According to the interview and questionnaire data, educational and noneducational periodicals were the top two information resources consulted by the largest percentages of educators in this study (see Figure 23 and Figure 24). Furthermore, educational periodicals were mentioned by the second largest percentage of educators, both during the interview (75%) and on the questionnaire (70%), falling just behind the human resource, teachers. Noneducational journals were mentioned by a much lower percentage of the sample (e.g., 40% during the interview and 35% on the questionnaire), ranking these resources eighth and ninth respectively on the questionnaire and interview. It was also noted in both the questionnaire and interview data that, except for TTs, seventy-five to one hundred percent of all groups of educators consulted educational periodicals. Only fifty percent of the TTs consulted educational periodicals. However, the opposite was true in the questionnaire results for noneducational journals, where the largest percentage of users was TTs (50%) and the other groups averaged only twenty-three percent. Data generated during the interview regarding noneducational journals, though, found that the largest percentage of users (80%) was TLs. The interview data provided insights into several explanations of why such a large percentage of educators consulted educational journals when evaluating and selecting CDROMs. One of the most popular explanations was that journals provided useful lists and reviews of recommended educational CDROMs. One TL/TT remarked that she used reviews from journals to help her select and present information on a CDROM: Yes, but also the librarians that we talk to and have meetings with, such as the one that is coming up soon, we have to show up with our favorite CDROM, and discuss how we use it with our classes. The Teacher Librarians of British Columbia put out a journal and in there are informative reviews of CDROMs. (TL/TT1) Another TL stated that educational journals helped her to determine which CDROMs were less likely to be a waste of money: There are some magazines that I use also to compare annotations of CDROM books. New CDROMs that have just come out. The Computer Teacher is one that I use for reference. You don't get a chance to look over the CDROMs before you buy them so you have to look somewhere to get a perspective on the CDROM. (TL3) One T emphasized that although journal reviews were more informative and trustworthy than publishers' reviews, she never relied completely on reviews: Things that help my evaluation are perhaps reviews. I don't generally trust the ones put out by the companies, but I consult computer magazines and teaching magazines. The references to CDROM software that might appear in those magazines they might help me. I would not rely on them one hundred percent, but it would give me a place to start. (T3) Provincial and District Resources Subjects indicated that they also consulted district resource centers and provincial Ministry of Education publications when evaluating and selecting CDROMs. Forty percent of the sample indicated they made use of the Province of BC's Ministry of Education publications during the interviews and on the questionnaire. District resource centers were mentioned by thirty percent of the educators on the questionnaire and ten percent during the interview.
All of the subjects coded as using provincial documents as a resource referred to the British Columbia Ministry of Education's new Technology Integrated Resource Packages or other similar Ministry documents. These documents were used to help guide the subjects both in developing their CDROM selection criteria and in developing general technology learning outcomes. One T/TT explained how the Ministry of Education's Learning Resources Branch Software Evaluation Form helped guide his evaluation and selection of CDROM software, while ensuring it related back to the provincial IRPs: The ministry has different sections [in the LRB's evaluation form] that relates back to the content of the IRP. Is it meaningful to the student? And so on. Then there is the use of technology, if you are using a book you just cross that one out. If you are doing video or computers then you fill this one in. If you are doing computers, it's got a computer section or video or audiotape section. So there are general criteria that apply to everything, to meet ministry standards and appropriate standards. (T/TT2) Another TL explained that she and her school were using the new BC Ministry of Education technology IRP to guide them not only in their evaluation of CDROMs but in the implementation of their new technology resources and program: There was so much money put into the computer technology this year. We have started to use the IRP. Because part of the IRP's is aimed at this multi media kind of thing, they [the school] want the kids involved in that [and] if the ministry in particular wants, which they seem to want, with the IRP's, they want to have technology in every aspect of the class room and every subject level. (TL2) Six educators also mentioned using district resource centers. Of these, four indicated that they used the district resource centers as a way of previewing recommended CDROMs. They did not indicate whether they used the center regularly or not. As one TT stated: "We also preview CDROMs with the help of a place called [name of district resource center]" (T3). Two other educators spoke of using a resource center in the past, but due to budget cutbacks they no longer had access to such materials and services: A lot of it [evaluating and selecting CDROM software] is done on your own time. We use to have a distribution center but it got shut down due to lack of funding. So it is up to you to go out and find the software and evaluate it. (TL/TT3) One educator made reference to the British Columbia Teachers' Federation (BCTF). Although he did not specifically state that he used materials from this organization, he did mention that some of his content criteria had been shaped by information he had received from or about the organization: "Stereotyping would maybe be another content appropriate criteria. Sex is another, and we have to watch out for this because it is a BCTF thing. You have to have a good representation of men and women" (T/TT3). Internet Two teachers mentioned they referred to the internet when looking for CDROMs to be used with their students. One listed the internet in with the other resources she used to obtain information about potential CDROM software: "I read [journal] reviews, the software packages themselves, and also internet reviews" (T1).
The second teacher used the district's local network to obtain information from other educators in her district: I usually look to other teachers that have used them before, as a resource, like a recommendation. We have [name of district's email network], which is pretty good, which is interschool net mail. You pose a question and then you get a thousand answers. (T2) School-Based Resources Educators mentioned two resources which fit into this category: a school-based technology committee and previewing CDROMs at other schools. One TL/TT worked closely with a technology committee in helping her select CDROMs. The committee included teachers, students, and parents. A second TL/TT mentioned that she first listened to how TLs in a neighboring district used CDROMs and then, once she received her CDROM player, she borrowed CDROMs from a neighboring school to preview the software. Summary In summary, intermediate educators in this study relied heavily on word of mouth when assessing and selecting CDROMs for use with their students. Although they consulted a variety of different people, the majority sought advice from other Ts, TLs, and TTs. Retailers and publishing representatives were another popular contact for fifty to seventy percent of the educators. Few educators consulted parents and students when evaluating and selecting CDROM resources. The sample also indicated that they made use of a variety of other resources in addition to people. The second most popular resource that educators consulted when evaluating and selecting CDROMs was educational periodicals. Periodicals were consulted for reviews of CDROMs. Only forty percent or less of the educators made use of resources like the internet, district resource centers, Ministry of Education documents, and noneducational journals. Research Question Five What factors are most responsible for influencing the development of experienced intermediate educators' criteria when evaluating and selecting CDROMs for intermediate students? Data reflective of the factors most responsible for influencing the development of experienced intermediate educators' educational CDROM software selection criteria were collected from the research subjects' interviews. Six factors were identified through the content analysis of the interview descriptions: (a) Cost, (b) Time, (c) Hardware/Software Compatibility, (d) Previewing, (e) Usage, and (f) Training. Figure 25 presents these factors in descending order according to the percentage of educators who mentioned each factor. Some of these factors are also listed in Appendix N as CDROM selection criteria codes, as many educators also referred to them as selection criteria in their responses to question three of the interview. Cost Of these six factors, cost was mentioned in the interview by the largest percentage of subjects (85%) as an important influential factor in the evaluation and selection of intermediate educational CDROMs (see Figure 23). Cost was viewed by the sample as such an influential factor that fifty-five percent of the subjects indicated it as a selection criterion in response to questions three and four of the interview (see Table 6). Furthermore, in response to question eighteen of the interview, fifty-five percent mentioned cost as an important difficulty educators face when selecting CDROM software. Figure 25.
Factors most responsible for influencing the development of the research subjects' educational CDROM software selection criteria. The percentage of educators who mentioned cost during the interview also varied by group of educators. For example, in response to question eighteen of the interview, Ts were the least likely to cite cost as an important difficulty educators faced when selecting CDROMs, whereas TLs and TL/TTs were the most likely to cite cost (see Figure 26). Similarly, Ts were the least likely to mention cost as a criterion throughout the interview and CDROM activity, whereas TTs were the most likely (see Figure 27). Educators cited several reasons why they felt cost was an influential factor in the development of their CDROM selection criteria. Figure 26. Percentage of sample to mention cost as an important difficulty educators must face when selecting CDROMs, displayed by type of educator. Figure 27. Percentage of sample to mention cost prior to and during the CDROM activity, displayed by type of educator. Most of the factors cited by the educators negatively influenced their assessment and selection of CDROM software. For example, in answering whether she was satisfied with her current collection of CDROMs, one TL working in her school's new fully equipped computer lab replied that the cost of CDROMs exceeded her software budget, thus limiting the number of CDROMs she could purchase and have available to students and staff: There's not enough money. It's a matter of funding. There are many, many more [CDROMs] that I would like to get, such as the one I mentioned about immigration. It's called The Immigrants, but it is very expensive. It is about four to five hundred dollars. It's a wonderful CDROM, but the expense is way too much. It's basically money that has held us back. There are many more I would like to order but there is a limit. So if you have to pick and choose, and weigh the time for the funding to come in [pause]. If I had the money then I would have a lot more [CDROMs]. (TL2) A T/TT explained that along with the initial purchase price of the CDROM, the cost of site licences and future software upgrades restricted the expansion of the school's CDROM selection: If the program is great but costs four hundred dollars, the schools can't afford it no matter how good the CDROM is. A good example of this is All the Right Type. It is a great typing program for learning keyboarding skills but they just sent us an update that we can update to the latest version of 2.0 for a mere five hundred dollars for a school site license. We can't afford that. (T/TT2) One TL explained that the cost of hardware curtailed her ability to upgrade her machines to run CDROMs. This in turn limited the number of CDROMs she had available in the library: I need more hardware. I need more stations. I need a better printer. My printer doesn't support the graphics that I do have. I need more CDROMs. I have a hard time buying CDROMs when I only have one CDROM drive. I also don't have the [hard drive] capacity to hold all the [CDROM] information. These are low end machines that we have, and we can't afford new ones. I'm hardware curtailed.
(TL4) Another T/TT explained that the cost of purchasing CDROMs for his networked computer laboratory made him much more thorough and careful when selecting CDROMs: One thing about Encarta is that, being that it is one hundred dollars, that would mean about one thousand dollars for our network. That is one of the things that we run into sometimes. So whenever we buy CDROMs we have to be very careful. (T/TT3) Subjects cited the costs of computer hardware, CDROM software, and site and network licences as a factor in determining what CDROMs they could select and purchase. In many cases, the cost of these items exceeded the technology budget with which they were working. One TL summed up this predicament for many of the educators when she stated: "Well, the big problem in evaluating CDROMs is the cost, because of the limited funds allocated to us" (TL2). Although the majority of educators mentioned that cost had a negative influence on the development of their CDROM collections, one TL did speak of cost in a more positive light: I look for encyclopedia CDROMs, because it's the cheapest way to get updated knowledge over buying a full hard cover set of encyclopedias. But [hard covered encyclopedias] cost so much, it's way out of our budget. Whereas with eighty to ninety dollars you can get it all on CDROM. You just can't afford not to [buy CDROMs]. (TL1) Time Educators also referred to time as another factor influencing their selection of CDROMs and collection development. It was interesting to note, though, that the percentage of the sample that mentioned time as a factor changed substantially over the duration of the interview. For example, prior to the CDROM activity, only twenty percent of the subjects referred to time as a factor (see Table 6), whereas after the CDROM activity, and in response to question number eighteen, an average of seventy percent of the educators mentioned the influence of time. As with cost, groups of educators also differed with respect to time. For example, in response to question eighteen of the interview, only fifty percent of the TTs cited time as a difficulty educators face when selecting CDROMs, whereas sixty percent of the TLs, seventy-five percent of the Ts, and one hundred percent of the T/TTs and TL/TTs cited time (see Figure 28). Many educators explained that their job descriptions limited the amount of time they could spare on evaluating and selecting CDROMs. As the following T/TT explained, an evaluation of a CDROM takes a great deal of time: It takes a long time to evaluate them [CDROMs]. Sometimes it is impossible to evaluate them. You just don't have the time. You take an hour or two to evaluate software properly, even on a weekly basis, and that's a big chunk of time. Trying to do that, and get lessons planned, and marking grades, and so forth, there is not enough time in the week to do all that. (T/TT3) Figure 28. Percentages of sample to cite time as a difficulty educators face when selecting CDROMs, displayed by type of educator. One TL/TT explained that cutbacks in her district had eliminated one of the resources which would have helped reduce the amount of time she spent evaluating resources like CDROM programs: Time and opportunity. A lot of it [CDROM selection] is done on your own time. We used to have a district resource center but it got shut down due to lack of funding.
So it is up to you to go out and find the software and evaluate it. (TL/TT3) Reflecting upon her hands on evaluation of the encyclopedia C D R O M s , another T explained that not only do educators need to be given time to evaluate C D R O M s , but they need to be given time to learn how to operate the computer so that they can evaluate C D R O M s : "Time. More time to give a better evaluation. Some teachers might not be computer literate, and so they don't know what to do. Even I was  119  stumbling around looking for the back keys" (T3).  Another T/TT went one step  further and suggested that before time is spent teaching educators how to select C D R O M software, time must be put aside to teach educators how to use the existing computer hardware: Most teachers don't have the time to evaluate or select software. Most teachers don't have the knowledge of these computers, and if they're not familiar and don't know how to use them, they can't stick a disk in and say oh what do I do now. They need to first be familiar with the computers and have ease of use, which I have down here. (T/TT2) One T stated her agreement when asked what factors might hinder or help her evaluation and selection of C D R O M s .  "First off how to use the darn things  [Computer and CDROMs]. Time and energy, I think, because I don't know when I'm going to have the time, unless somebody forces me to sit down" (T4). One TL also explained that selecting C D R O M s is a time consuming process not only because educators must spend time conducting a hands on evaluation with the C D R O M , but because searching references (e.g., periodicals and other educators) in an effort to initially locate a C D R O M is also a time consuming process: "Another difficulty is the amount of time spent on evaluating the software. There is not enough time in the day to search references and then evaluate with hands on time." She then suggested a solution to the problem, "Having a well to do list of criteria is a big help in this factor because it [selecting C D R O M s ] is so new" (TL2). Finally, one T summed up many of the educators comments on time as an influential factor in their evaluation and selection of C D R O M s by paralleling the lack of time needed to thoroughly evaluate C D R O M s with the lack of time teachers are 120  given to accomplish their daily professional duties: The key is to actively evaluate. There is a difference between just looking at something for five minutes and being told it is a good tool. Having the time to think through anything, I think that this is a big problem in our profession, just having time to look at stuff. (T4) Hardware and Software Compatibility During the interview, forty-five percent of the educators indicated that hardware and software compatibility was an important influential factor, making it, on average, the third most important factor.  Hardware and software compatibility  issues were numerous, differed from school to school, and ranged from the fear of a new C D R O M freezing in older school computers, to concern that the C D R O M may not be compatible with the school's network and existing software. Different groups of educators also differed considerably in the number of educators reporting this as a factor. For example, none of the Ts, and only twenty-five percent of the TTs indicated that hardware software compatibility issues were a factor to be considered when selecting C D R O M s . 
However, fifty percent of the TL/TTs, sixty percent of the TLs, and all of the T/TTs suggested this was an influential factor when evaluating and selecting CDROM software (see Figure 29). Nine of the twenty educators mentioned hardware and software compatibility as a factor influencing the development of their CDROM collection and their CDROM selection criteria; four mentioned that finding CDROM software to run on their old computer hardware was a concern. One T/TT explained that "We need new computers because the equipment is old and will not support the new CDROM software. I must always make sure it can run on our computers before I buy it" (T/TT1). A TT had similar problems: "We have a Mac Plus, two IBM's that are new, and two Macs. We are having a hard time finding software to run on the machines, noncomplex programs" (TL3). Another TT simply stated, "I want CDROMs that won't freeze on my computers" (TT3). Figure 29. Percentages of sample to cite hardware and software compatibility as a factor which influences their evaluation and selection of educational intermediate CDROM software. Two of the subjects reported that the problem was that their computers lacked the hardware necessary to run the CDROMs. One TL reported, I need more CDROMs. I have a hard time buying more CDROMs when I only have one CDROM player. I also don't have the [hard disk] capacity to hold the information. These are low end machines that we have. I am hardware curtailed. There is a good CDROM that takes the kids up and down the Nile which is great for the grade seven curriculum but I don't have the hardware to run it. (TL4) Another TL/TT also stated, The problem is not so much the CDROMs as it is the fact that I only have one machine at a time to use it with. I would like to have quite a few more than that. But in the sense that the computer itself will only handle one [CDROM] at a time, I don't know if there is a whole lot of point in expanding the [CDROM] collection too greatly. (TL/TT2) In summing up his hardware and software situation, one T/TT reflected on a recent hardware/software incompatibility which he and many other educators in his district were experiencing at the time. His anecdote also helps to summarize the hardware and software difficulties that almost half of the subjects in this study were experiencing: You have to be aware of the CDROMs that are coming out with versions that are requiring a certain type of processor or Windows ninety-five only. And if your hardware doesn't match it, then you are out of luck. The ministry [B.C. Ministry of Education] has sent out a CDROM disk on the IRP's that needs Windows three point one, which we don't have, or a Macintosh, with a higher processor than we have. The criteria that they want for the disk makes it unusable for seventy percent or more of BC schools! (T/TT2) Previewing Previewing was the fourth factor which educators indicated influenced their evaluation and selection of CDROMs. Previewing was often defined by the research subjects as the act of taking time to review or test the CDROM before purchasing it. Thus, in many cases, previewing was mentioned in context with time and cost.
When asked about the most important difficulties that educators faced when selecting CDROMs, thirty percent of the educators responded that there were no opportunities to preview the CDROMs before purchasing them. Except for the T/TTs and TTs, thirty percent of the subjects remarked on this issue of previewing CDROMs. Almost seventy percent of the three T/TTs mentioned previewing as a factor, whereas none of the TTs mentioned previewing (see Figure 30). Figure 30. Percentages of educators to cite previewing as an important difficulty which educators face when selecting CDROMs. Of the six educators to refer to the importance of previewing CDROMs, four viewed this factor as a method of minimizing costs. One TL explained that many software vendors did not allow returns or exchanges once the plastic shrink wrap was broken, making previewing CDROMs in this manner an expensive activity: Very few places will let you try the CDROM. If you don't like it then you take it back. You can't tell just from the cover. It would be really nice to preview them [CDROMs] before because they are expensive. (TL5) However, one TT indicated that although it is not the norm, there are some retailers that allow teachers some CDROM software previewing and exchanges: Computer City is very good and they will let me try it out and if it doesn't work because of our hardware or if it is inappropriate, they will let me take it back and exchange it. A lot of times you don't have a lot of access to try them out. (TT4) Another TL/TT also noted that previewing CDROMs allowed her to test them in the environment in which they would be used (her school). She felt that this made the whole evaluation and selection process easier: "Having the availability of these things [CDROMs]. A lot of places won't let you just take them on spec and evaluate them at school, which is the easiest way to do it" (TL/TT1). One TL emphasized the close tie between previewing and time: I have holes in the collection. For example, I don't have ancient history, and I need that quite desperately. I have been given some recommendations, but when I go to order it they are no longer available. I now have a one eight hundred number to phone to be able to get a preview of the software. I haven't done that because I haven't had the time yet. (TL3) Another TL suggested that preview CDROMs were limited because educational CDROMs were relatively new resources and not yet popular in the home market. This in turn did not make it popular and/or profitable to have educators previewing CDROMs: I don't think that there is a big difference between an educational CDROM and what is seen out there as a commercial market and what I would use here. There is a definite overlap. We haven't gotten to the point with the traveling salesman coming around with preview encyclopedias for home use in CDROMs. The ones that are more curriculum based such as social studies or science based have not made it to a popular kind of place in the market yet. (TL4) One TT disagreed, indicating that publishing representatives visited his school and allowed him to preview a variety of different CDROMs. When asked if there was an advantage to having publisher representatives come to him, he responded, "Yes, you get to preview the software without having to buy" (TT1).
Usage and Training The two remaining factors, usage and training, were mentioned sporadically by fifteen percent or less of the educators during the interview. Due to the low percentages of educators, there were no substantial differences between groups of educators. These factors were often mentioned by educators as they spoke about the cost of the CDROM or about the amount of time it took to locate and evaluate the appropriate CDROM. For example, one TL discussed the usage of a CDROM in terms of the CDROM's worth, both in terms of time and cost, The CDROMs don't get used as much as I would like, but I have to constantly remind myself that only one to two kids can be on it [the CDROM capable computer] at the same time. But if not, then, only one can be on it. That is where it is hard to justify it. It's not just the money, it's the time. (TL1) Another T/TT spoke about training on software, like CDROMs, also in terms of time and money, CDROMs is one thing, getting them [other educators in the school] familiar with the internet so they can teach the kids, using the software, repairing the machines. But even just using software, the teachers need in-services on how to use word processors, the spreadsheets and the data bases, and other things... What they [the BC Ministry of Education] want and what we can do are two different things. Time, money and in-services. I have brought this up in computer meetings and the [technology] curriculum is going nowhere if teachers don't know how to use it. (T/TT3) Summary In summary, six different factors were identified as being responsible for influencing this study's research subjects' evaluation, selection, and purchasing of intermediate educational CDROMs. The two factors mentioned by the largest percentage of the sample were not having enough time in the day and not having enough money in the budget. It was also noted that the lack of time and money were at the root of several other factors as well. For instance, as one TL explained, without an adequate budget, she couldn't update her hardware. Without new hardware, students and staff could not run the newer CDROM software. And since it was hard to find old CDROM software, she was severely limited in the number of CDROMs she could preview, evaluate, use, select, and purchase. As this TL summarized: I am being held back due to the lack of resources and operating systems. The hardware that we have I'm not happy with either. They are only four eighty-sixes and you can't run Windows ninety-five on them. I am limited because I don't have Windows ninety-five, and I can't buy what [the CDROMs] I want until I get Windows ninety-five. (TL5) Finally, although there were trends for the sample, it was found that different groups of educators do not emphasize the same factors equally. For example, it was found that only twenty-five percent of the Ts cited cost as an influencing factor, compared to eighty percent of the TLs. Similarly, only half of the Ts cited time as a factor, whereas all of the TL/TTs and T/TTs found they didn't have enough time to adequately evaluate and select CDROMs. Research Question Six Do different groups of educators (teachers, teacher librarians, technology teachers, teacher librarians/technology teachers, and teachers/technology teachers) use similar CDROM selection criteria and are they influenced by similar factors when selecting intermediate educational CDROM software?
Data from the previous research questions indicated that different groups of educators both do and do not use similar selection criteria when evaluating and selecting intermediate educational software. For example, only one educator (25% of the group) in each of the groups of Ts, TTs, and TL/TTs indicated multimedia as a CDROM selection criterion. This suggested that multimedia was not an extremely important criterion for these three groups, and, in this manner, these groups were similar. On the other hand, if we compare these groups with the TL and T/TT groups we notice a substantial difference. For example, sixty percent of the TLs mentioned multimedia as a criterion, thirty-five percent more educators than the three previous groups. Furthermore, all of the T/TTs mentioned multimedia, seventy percent more than the Ts, TTs, and TL/TTs. These results also indicate that CDROM selection criteria differ between groups. Similar to these findings on CDROM selection criteria, an analysis of the data representative of the factors influencing research subjects in their evaluation and selection of intermediate educational CDROM software also found both similarities and differences between groups of educators. For example, time was reported as an influential factor by half of the Ts, and by all of the TL/TTs and T/TTs. This suggested that when it came to time as an influencing factor, TL/TTs were more likely to be similar to T/TTs than to Ts. CHAPTER FIVE DISCUSSION, RESEARCH LIMITATIONS, and RECOMMENDATIONS Discussion This study was exploratory and descriptive in nature and was intended to provide an initial step towards identifying experienced intermediate educators' (teachers, teacher librarians, technology teachers, teacher librarian/technology teachers, teacher/technology teachers) current CDROM selection criteria, and the factors affecting the development and application of these selection criteria. The data collected to meet this main objective are discussed in reference to the study's six research questions. Research Question One What selection criteria do experienced intermediate British Columbia educators use to evaluate CDROMs to be used with intermediate students? The research team identified forty-six CDROM selection criteria which subjects used when evaluating and selecting intermediate educational CDROMs (see Appendix N). These were further grouped into five general categories (e.g., content, instructional design, technical design, social content, and other) based on categories outlined in the BC Learning Resources Branch's resource selection guide, Evaluating, Selecting, and Managing Learning Resources (Ministry of Education, 1996b), and as indicated in the current literature. The bulk (87%) of the forty-six criteria were categorized as content (29%), instructional design (29%), and technical design (29%). It was noted that the sample mentioned very few social content criteria (2%), and that nine percent of the sample's criteria fell into the category, other. The criteria categorized as other consisted of previewing, usage, cost, and time (see Appendix N). Prior to the subjects' hands on evaluation of two CDROMs, subjects were asked to list and rank order their CDROM selection criteria.
The sample's top eleven most  important general C D R O M selection criteria were (a) provincial curriculum, (b) age appropriateness, (c) hardware/software compatibility, (d) intuitiveness, (e) complimentary resource, (f) educational, (g) learner diversity, (h) interactivity, (i) teacher curriculum, (j) Canadian content, and (k) critical thinking. Subjects listed provincial curriculum (including district and school curriculum) as the most important general C D R O M selection criteria. It was also noted that six of these top eleven criteria were classified as technical design criteria, four were classified as content criteria, and the rest were classified as instructional design and other criteria. Research Question Two Which of these selection criteria do experienced intermediate educators use when evaluating two encyclopedia C D R O M s to be used with intermediate students? It was found that very low percentages of research subjects used all of the same C D R O M selection criteria during the hands on activity as they had stated that they used in their selection of C D R O M s . Prior to the activity, subjects were speaking of C D R O M s in general terms and therefore listed criteria which suggested an equal evaluation of a C D R O M ' s content, technical design, and instructional design. During their evaluation of two C D R O M encyclopedias, subjects emphasized the evaluation of the C D R O M ' s technical design. The differences between the subjects' general C D R O M selection criteria and the criteria they used while evaluating the two C D R O M s were not surprising. Subjects  131  were well aware that there was not one set of criteria which could be applied to all C D R O M s . For example, when subjects were asked if they agreed with the statement, "Selection criteria from other media formats (e.g., books, video, radio) could be applied to C D R O M software.", all but one of the subjects answered both yes and no. All subjects felt that there were general criteria which applied to all resources, including C D R O M s , but that there were also specific criteria which applied to particular types of resources. One TL/TT suggested that content criteria could span all resources, but the interactivity of C D R O M s was particular to C D R O M s and Laser Disks: Yes, I think that there is quite a bit of carry over. There are [criteria] specific to C D - R O M s . The interactive nature is the main one, I think.  That is what  C D - R O M s do best, interactivness....The content questions would be pretty similar to content questions you would answer in selecting a book. But the format questions [of a C D R O M ] are quite different. W h e n you buy a book you look at the binding and the way it is laid out, the white space, provision of index and all those kinds of things. Whereas for C D - R O M s the requirements of formatting are quite specific and different. The quality of graphics would apply to a lot of mediated formats. You look in a book and see how clear and good the pictures are. What about the color printing? Are the pictures appropriate or are they just decoration? That kind of thing. Of course if you are selecting a picture book, that's a different set up too. Y o u want different criteria slightly. There are some carry overs, sort of like a Venn diagram. I mean there are some common, and also some specifics. 
Thus, it was not surprising to find that educators adapted their general C D R O M  132  selection criteria and only used the criteria specific to the two C D R O M encyclopedias. This would explain why subjects placed a much greater emphasis on evaluating the encyclopedias' technical design during the hands on evaluation. Therefore, it could be concluded from this study that each C D R O M has its own specific characteristics and that educators will adapt their criteria to evaluate these characteristics. It can also be argued that the difference between the subjects' general selection criteria and the specific criteria they used during their evaluation of the encyclopedias is not just a design factor inherent to these two C D R O M encyclopedias. Instead, this difference could be seen as a factor of C D R O M s in general. That is, because many C D R O M s make use of advanced dynamic multimedia, hypertext, sound, graphics, and other slick technical features, teachers may be drawn towards evaluating these features and tend to ignore or place less emphasis on criteria like instructional design, content, and social content. Shade (1996) conducted a study in which he found that the majority of award winning software programs were not developmentally appropriate. In the conclusion of his article, he warned that educators must "look carefully beyond the bright packaging and wonderful graphics to see what the software is really doing" (p. 21). It was clear that there were differences between the general C D R O M selection criteria the subjects listed prior to the C D R O M activity, and the criteria they listed specifically for the two C D R O M s . For example, only thirty one percent of the sample to mention content prior to the hands on C D R O M evaluations, mentioned it during the evaluations. And twenty two percent more educators mentioned selection criteria based on technical design during their hands on evaluation than they had mentioned  133  prior to the C D R O M exploration. The differences may be due in part to the type of C D R O M s evaluated or the educators expectations of C D R O M s . Why these differences existed appears unexplainable at this time. Further studies should be conducted to determine if educators consistently place a heavy emphasis on evaluating the technical design for a variety of C D R O M s from a variety of subjects. At the same time, different groups of educators (e.g., teachers, teacher librarians, technology teachers, teacher librarians/technology teachers, teachers/technology teachers) can be asked to conduct their evaluation and selection of C D R O M software using a variety of different types of software. For example, a teacher librarian and a teacher can be asked to evaluate a C D R O M book or a social studies simulation program. Such a study would help to determine which selection criteria can best assist educators in the effective selection of a variety of C D R O M s for use with intermediate students. Research Question Three How different or similar are the criteria selected by experienced intermediate educators to those listed in the current literature? Subjects' C D R O M selection criteria were found to be both similar and different to that uncovered in the literature review. The criteria were different in that prior to the hands on activity subjects placed less emphasis on technical design than was indicated in the literature review. 
The criteria was similar in that subjects' emphasized evaluating the technical design of two C D R O M encyclopedias during their hands on evaluation of the resources. The differences between the selection criteria mentioned in the literature and by  134  the sample prior to their hands on C D R O M evaluation, may be due to the fact that the literature did not focus on the intermediate educators' C D R O M selection criteria, whereas this study's focus was specific to the criteria intermediate educators use to select C D R O M s for use with intermediate students. More research needs to be done on C D R O M selection criteria for this and other age groups (e.g., primary and secondary educators and students). Research Question Four What resources do intermediate educators consult when evaluating and selecting C D R O M software for intermediate students? Subjects consulted sixteen different types of human and nonhuman information resources when evaluating and selecting intermediate C D R O M s (see Figure 23). The variety and number of resources they consulted was high and may be due to the fact that they felt they needed additional support and training to develop confidence in selecting C D R O M programs for their own use. Word of mouth was the most common method of acquiring information when evaluating and selecting C D R O M s . The sample's four most popular resources (e.g., used by the largest percentages of educators) were teachers, educational periodicals, teacher librarians, and technology teachers. It was also found that the number and types of resources used by some of the five groups of educators varied. According to Russell (1994), in the absence of proper guidance and training (an advice vacuum), teachers are more likely to uncritically select educational software. According to a recent survey on the B C Ministry of Education's Learning Resources Branch (LRB), educators experienced a similar advice vacuum. Fifty two percent of  135  456 B C teachers surveyed claimed that the L R B was not a proactive promoter of technical resources in B C schools, and eighty-four percent felt that the L R B should sponsor more training on how such technologies can be used in the classroom (Ministry of Education, 1996a). Without proper training on how to select computer software, like C D R O M s , B C educators are left to evaluate and develop selection processes and criteria on their own. This would help to explain why the educators in this study placed such a heavy emphasis on consulting each other and educational journals when evaluating and selecting C D R O M s , and that none of the subjects mentioned that they received training from the B C Ministry of Education. , With the number of computers increasing in elementary schools, it is important that educators be trained how to evaluate and select their own C D R O M s .  Without  adequate training and proactive leadership from the Ministry of Education, educators are left to struggle with the development of their own selection criteria and selection processes. This in turn can lead to frustration and the purchase of uneducationally sound C D R O M software. Training is especially important for new educators learning to integrate computer technology and C D R O M software into their curriculum. For without the knowledge and skills to select their own computer software, their experiences with these new technologies can be so discouraging that they end up ignoring the tools altogether. 
Research Question Five What factors are most responsible for influencing the development of experienced intermediate educators' criteria when evaluating and selecting CDROM software for intermediate students? Lack of money (85%) and lack of time (70%) were cited by the largest percentages of educators as factors influencing their evaluation and selection of educational CDROMs. Hardware and software compatibility (45%) and the opportunity to preview the CDROMs (30%) were the next two most popular factors. Usage (15%) and training (15%) were the least common factors. It was noted that hardware and software compatibility was closely tied into the cost factor, and that educators often spoke of previewing in relation to the lack of time they already had in their busy schedules. It was also noted that groups of educators were both similar and different to each other in the numbers of educators mentioning each factor. Examples of these differences and similarities are explored further in response to the following question. Research Question Six Do different groups of educators (teachers, teacher librarians, technology teachers, teacher librarians/technology teachers, and teachers/technology teachers) use similar CDROM selection criteria and are they influenced by similar factors when selecting intermediate educational CDROM software? The study's findings showed that groups of educators were both different and similar in the percentages of educators specifying (a) types of CDROM selection criteria, and (b) factors which influenced their evaluation and selection of CDROMs. The following discussion highlights a few of these representative differences and similarities. A complete description of these differences can be found in chapter four. Provincial curriculum was the CDROM selection criterion given the highest ranking of importance by the sample. However, there were differences between groups of educators with regard to the amount of importance they placed on this criterion. For example, when asked to list and rank order their selection criteria, at least fifty percent of the TLs, TTs, TL/TTs, and T/TTs mentioned provincial curriculum as one of their top three most important CDROM selection criteria. The one exception was the group of teachers. Only one of their members listed provincial curriculum in their top three selection criteria, and she ranked it second. Similarly, prior to the hands on activity, it was noted that provincial curriculum was mentioned by the third largest percentage of the sample. Again, there were differences between groups of educators. For example, only half of the Ts and TTs and sixty-five percent of the T/TTs mentioned that they selected CDROMs using provincial curriculum criteria, whereas all of the TLs and TL/TTs mentioned this criterion. In both cases, TLs and TL/TTs indicated not only that they selected CDROMs on the basis that they matched the provincial, district, and school curriculum, but also that they ranked this criterion as highly important. Teachers, on the other hand, did not indicate that they used this particular criterion as extensively, nor did they place as high an importance on this criterion as did the TLs. Although the relatively small size of the two groups makes reliable comparisons difficult, the researcher believes that a tentative explanation can be made to begin to justify the differences between these two groups.
One of the most likely reasons for the difference between TLs and Ts is that TLs were much more concerned than Ts with the evaluation and selection of CDROMs based on provincial, district, and school curriculum. During the interview, all of the TLs and TL/TTs indicated that they not only selected CDROMs for their library but also selected CDROMs to meet the curricular objectives and needs of other educators in the school. One TL/TT highlighted this aspect of a TL's job when the researcher asked her to describe how she would choose CDROMs to be used in her resource center: First off, I would see what the teachers' needs are in the terms of curriculum areas. Whatever teachers are doing is a direct link to [the library]. All curriculum is planned around that. We deal with what teachers' needs are and then carry on from there. Each teacher submits each year a yearly plan in September with what they expect to cover and I base it on that. During that year I will be actively looking for materials. Conversely, teachers focused more on their own students' needs. For example, Ts indicated that eleven of their twelve top three most important CDROM selection criteria were based on selecting CDROMs that would meet the needs of their own classrooms and students. As one teacher concluded, "I try to choose [CDROMs] that will complement the class," and another remarked, "There has to be some interaction between what the CDROM is offering and what the student is getting out of it." Although there were many differences between the five groups' selection criteria, there were also similarities. For example, none of the groups placed emphasis on social content selection criteria. Only three of the forty-six criteria identified were classified as social content, and none of the educators' top three criteria focused on evaluating social content. Only fifteen percent of the sample indicated social content in their lists of general selection criteria, and only thirty-five percent of the sample spoke of social content during their hands on evaluation of two CDROM encyclopedias. (In the last example, thirty-five percent of the sample thought the use of slang in the Ultimate Children's Encyclopedia was not socially appropriate.) This overall lack of social content criteria in this study (6.5% of the total number of criteria mentioned) contrasted sharply with the number of social content criteria listed in the LRB's resource selection guide (Ministry of Education, 1996b). Of the fifty software selection criteria listed in the LRB's computer software checklist, thirteen (or 27%) emphasized the evaluation of social content. Subjects' relatively low emphasis on social content may have been due to a variety of factors. For example, it may be suggested that educators in this study were less inclined to evaluate social content in CDROMs because (a) subjects were focusing on the technical aspects of the CDROM, (b) the two CDROM encyclopedias did not have serious social content flaws, or (c) the subjects were not expecting to focus on evaluating social content in an encyclopedia (as compared to a CDROM on sexism). Data also indicated that there were both differences and similarities between groups of educators with regard to the types of factors influencing their selection of CDROMs. An example of a difference could be found in the issue of hardware and software compatibility.
All of the T/TTs indicated they were concerned with hardware and software compatibility issues, whereas only one quarter of the Ts and TTs were concerned with this factor. At first glance these differences seem unusual, especially since these three groups of educators have a great deal in common. However, a review of the background information for all groups found that the T/TTs were working with the oldest computers (six years and older) and Ts were working with the newest computers (one to three years old). T/TTs in this study cited hardware and software compatibility as a primary factor because their computers needed to be upgraded or replaced.  A s one T/TT indicated, "We need to replace most of the machines. W e  140  have ordered five more, with C D R O M s , and they are on their way. W e need at least thirty more." It was noted that groups also shared some influential factors. For example, a large percentage of educators within all groups indicated that cost and lack of funds influenced their selection of C D R O M s . Cost was so prevalent a factor in subjects' minds that it was even mentioned as a selection criteria by ninety percent of the sample. Whether it was the cost of hardware, the price of a single C D R O M , or the money required to set the C D R O M up on a network, almost all educators explained that the price tag of the C D R O M and the lack of funds limited the number of C D R O M s they could evaluate and purchase. In conclusion, data showed that groups of educators used both different and similar types of C D R O M selection criteria and indicated that there were different and similar types of factors affecting their selection of C D R O M s .  However, the scope and  size of this study did not allow for an in depth analysis of the reasons why these similarities and differences exist. Similar studies dealing with larger samples and more diverse sets of C D R O M s could generate a much more detailed and reliable set of data, and produce a more comprehensive representation of the complexity of these differences and similarities. Recommendations for Administrators and Practitioners Groups of educators in this study were both different and similar to each other in the number and type of C D R O M selection criteria that they (a) indicated they used when selecting C D R O M s , and (b) used while evaluating two C D R O M encyclopedias. In light of these findings, it is recommended that educators be trained to select  141  C D R O M software in groups of educators which are both similar and different to themselves. For example, it is important that TLs meet with each other to share information with regards to C D R O M selection criteria, selection procedures, and recommended C D R O M s .  Such a meeting, or training session, helps these TLs share  similar knowledge, skills, and experiences and needs specific to their professional duties. However, because it was also found that there were many similarities and differences between groups of educator's C D R O M selection criteria and factors affecting the development of these criteria, educators must also be encouraged to attend in-services and training sessions in which they are exposed to a variety of other types of educators' knowledge, experiences, and skills in the area of C D R O M software selection. Such in-services or training sessions would help to build a team approach as well as help educators develop more effective, inclusive, and comprehensive selection criteria. 
In a training session like this, Ts would become more aware of CDROM selection criteria based on school, district, and provincial curriculum, and TLs would become more sensitive to the specific needs and CDROM selection criteria of teachers.

A large percentage of educators noted that cost and time were two factors which hindered their evaluation and selection of CDROM software. Without adequate funding for computer hardware, the number of CDROM-capable computers will remain limited. Schools in this study averaged seven CDROM-capable computers. All of the educators indicated they were not satisfied with their current hardware and that they needed more CDROM-capable computers. One TL/TT also indicated that this lack of hardware affected the number of CDROMs she evaluated and selected:

I don't have enough CDROM players. I would use more [CDROM programs] if I had more hardware. I have held off purchasing because we don't have the hardware yet. So once they have the hardware then I will be looking very closely at specific curriculum areas. Without adequate hardware, the number of students able to access the CDROMs becomes limited as does the demand for software. (TL/TT4)

It is recommended that current provincial policies maintain or even increase funding for computer hardware, software, and the training of educators in the use of this hardware and software.

Time was another factor, mentioned by seventy-five percent of the sample. The need for time to preview and evaluate CDROMs, time to be trained how to use the computers, and time to learn how to select CDROM software were foremost among the subjects' time concerns. Without being given release time, many of the educators felt that they could not adequately preview, evaluate, and select CDROMs. The importance of having time to evaluate CDROMs was highlighted by two educators in the interview. A TL/TT was thankful to be a subject in the study because it gave her the time to evaluate a CDROM she had been wanting to examine: "I thank you for showing me the Encarta, I was wanting to take a look at it for evaluation." A TT found that being given the hands-on time helped him to clarify the difference between the influence of cost and time on his selection process:

This has been a valuable learning experience for me. I thought that cost was the biggest factor but now looking at the difference between them [time and cost] perhaps cost isn't the biggest factor. (TT4)

Without appropriate release time to learn how to evaluate CDROMs and to actively evaluate them, many educators in this study felt that they could not do an adequate job of selecting CDROMs to meet their own needs and their students' needs. Consequently, it is recommended that provincial technology specialists and district resource specialists approach the educators in their schools and districts and create a professional development plan which gives educators the opportunity to evaluate, select, and apply CDROM software.

When searching for CDROMs to add to their collections, several educators noted the limitations of consulting resources such as journals and other educators. At the same time, the advantages of previewing CDROMs were noted, especially if the CDROM could be tested with the intended students. Unfortunately, educators also indicated that they had difficulties obtaining CDROMs to preview.
Retailers and publishers were unwilling to offer preview copies, some district resource centers had been closed or were closing, and public libraries did not always have copies available. Many educators indicated that they wished it were much easier and less costly to preview CDROMs. Since eighteen of the twenty educators in this study noted that their schools were connected to the internet (the other two stated that their schools were slated to be connected in the next year), it is recommended that the provincial government, or district computer specialists, work with schools, districts, and Ministry of Education agencies to create a web site for previewing CDROMs on-line. Not only could the site give educators a forum to test-drive the CDROMs they are interested in, but it could also provide a forum for educators to record their own annotations, suggestions, and lesson plans for using the software in the classroom. Such a resource would help educators preview CDROMs more easily, and feel much more comfortable knowing that the resources have already been identified as educationally appropriate by other educators. Such a project could also encourage collaboration between retailers, publishers, educators, and existing educational institutions. For the schools not connected to the internet, it is recommended that educators be encouraged to share their CDROMs with other educators from nearby schools, and that the district computer specialist (a) help circulate a newsletter and computer journals containing annotated lists of recommended CDROMs and other relevant information, (b) set up CDROM workshops in his/her district, and (c) circulate CDROMs for educators to preview, evaluate, and test with their students.

The researcher also found that several educators were using their district's or another district's software evaluation checklists. One T/TT indicated that his students also evaluated new CDROMs using the district software checklist. In fact, the researcher, like Squire and McDougall (1994), could not find evidence of any other type of evaluation tool being used by the subjects in this study. Of the two evaluation forms presented to the researcher by subjects, neither went beyond a simple one- or two-page checklist which encouraged the evaluator to rate the software using a five-point Likert scale. The checklists did not allow for differences in content, teaching styles, and objectives. They also did not provide any opportunity for formative evaluation. These checklists may be adequate for educators to evaluate CDROMs on a general level, but they may not provide adequate evaluation for educators wishing to evaluate CDROMs on a specific level. In addition, in light of the different selection needs and selection criteria of individual educators and groups of educators, it is recommended that educators be made more aware of the limitations of checklists and of the fact that not all items on a checklist may be applicable to all CDROMs or to the specific needs of all educators. It is also recommended that educators adopt a more formative approach to evaluating the CDROMs they are thinking of purchasing.

Limitations of the Study

The study's design made use of a small and specific sample randomly selected from four different school districts. Since five different types of educators were addressed in this study, the numbers in each group had to be small.
Consequently, the small sample size makes it difficult to generalize the results. For example, the study focused on grade four to seven teachers, so the results may not be applicable to primary or secondary teachers. Also, the study did not involve a large enough sample to be applicable to all districts in the Lower Mainland.

Research subjects' interview data were analyzed using content analysis. The study's validity was strengthened because the researcher and two research assistants (reviewers) were able to identify, compare, categorize, and generalize CDROM selection criteria and selection trends from large quantities of the sample's verbal transcriptions. Unfortunately, this strength was also a weakness. That is, the reliability of the data was somewhat compromised because the originality, complexity, and multiplicity of meaning of each subject's speech acts were interpreted by the reviewers and reduced to only one of the forty-six available CDROM selection criteria. In addition, these forty-six criteria were further categorized into one of five content categories. Such interpretation and redefinition of subjects' oral communication was confining and distorting in that it ignored (a) combinations and permutations of the existing forty-six CDROM selection criteria and five categories, and (b) CDROM selection criteria or categories not defined in this study.

The findings in this study also indicated that educators focused on evaluating the technical design of a CDROM while actively assessing two CDROM encyclopedias. This finding may not be applicable to all CDROM programs. For example, educators in this study may have placed a greater emphasis on technical design during the hands-on evaluation because that is what they expected from a CDROM encyclopedia. If a math, social studies, or drama CDROM had been used for the CDROM exploration, the educators may have emphasized much different types of criteria (e.g., content or instructional design). If a CDROM on racism had been used instead of the encyclopedias, educators may have placed a much greater emphasis on social content than on technical design. Similarly, the criteria educators applied during their evaluation of the two CDROMs may not be consistent for all educators and across all grade levels. For example, educators may have different expectations of a technology CDROM as compared to a CDROM on dance, and grade four and grade seven educators may have quite different expectations and selection criteria when evaluating, for example, the same science CDROM.

Recommendations for Further Research

This study focused on exploring the existing CDROM selection criteria and current influential factors for experienced intermediate educators. Further studies could investigate the development of CDROM selection criteria for inexperienced educators, or those just beginning to integrate technology into their classrooms and/or schools. For example, a case study of inexperienced educators just beginning to select CDROMs for themselves, their library, computer laboratory, or school could determine how long it takes these educators to develop their selection criteria, and whether they develop criteria similar to, and are influenced by factors similar to, those of more experienced educators.
Similarly, other studies could involve the random selection of a larger group of educators, both technophiles (people who enjoy working with computer hardware and software) and technophobes (people afraid of working with computer hardware and software), so as to include the views of those both positive about and more doubtful of the roles CDROMs can play in intermediate education. Such a study could also investigate whether one group of educators (e.g., teacher librarians) is more involved with the evaluation and assessment of CDROMs than another, and whether there are differences in the types of CDROMs each educator evaluates and the selection criteria they have developed.

Further studies could also involve larger samples to determine if the trends and findings in this study are applicable to a larger group of educators (e.g., Lower Mainland, BC, Canada). Studies should also be conducted over longer periods of time to determine if the rapid change in software and hardware is a factor in the development of educators' CDROM selection criteria.

According to the findings in this research, subjects emphasized the evaluation of the technical qualities of CDROMs when they evaluated two CDROM encyclopedias. Research should also be conducted with educators using a variety of different types of CDROMs to determine if this finding is applicable to all CDROMs. This exploration should also be expanded to include a variety of genres, software formats, and media. Two examples of research questions could be: (a) How different or similar are the criteria for selecting a CDROM as compared to a book, audio, or video? and (b) Are the selection criteria and evaluative processes the same for the same resource in different formats (e.g., a print book and the same book in CDROM format)?

This study also revealed that some educators used parents, students, other educators, and school committees to assist them in evaluating and selecting their CDROMs. Another important study would be to look at the types of criteria and methods of selection these different groups develop to determine (a) any strengths and weaknesses, (b) differences and similarities in the selection criteria, (c) the processes by which they evaluate and select CDROMs, and (d) the manner in which they deal with influential factors like time and cost.

A large percentage of the subjects indicated that they consulted educational journals when selecting educationally sound CDROMs for their students. Another study could focus on this resource and determine which journals are being used, how influential these resources are in developing an educator's selection criteria, and whether the CDROM selection criteria of journal editors and contributors match those of their readers.

As was pointed out in the literature review, the majority of educators currently assessing CDROMs used software selection checklists. Squire and McDougall (1990) argued that these checklists are much too restrictive to assist educators in determining whether the software they are evaluating meets their goals. Another study could have one or more educators implement a formative model of CDROM software selection, like those designed by Squire and McDougall (1990) or Flagg (1990), while one or more educators continued to select CDROMs using checklists or in the manner in which they have either been trained or have developed on their own.
An evaluation of the two methods over a period of time could determine how effective each set of selection criteria might be, and whether there was a difference in each educator's integration of CDROMs into their curriculum. Such a study might also indicate that a combination of the two approaches is a more effective way to evaluate and select educational CDROMs.

It was also noted that some groups of educators varied by age. For example, the youngest group of educators were the TTs, with an average age of twenty-six, and the oldest were the TLs, averaging fifty-five years of age. A study involving a larger sample could determine if these trends exist in a larger population. At the same time, researchers could study age groups in general to determine if age is a factor in an educator's enthusiasm for, or resistance to, the general integration of technology into their curriculum, especially with respect to their evaluation and selection of computer software programs like CDROMs.

Finally, the researcher noted some distinct gender differences between several of the groups. (Recall that all of the teachers (Ts), teacher librarians (TLs), and teacher librarian/technology teachers (TL/TTs) were female, whereas all but one of the technology teachers (TTs) and teacher/technology teachers (T/TTs) were male.) Another study could look into these differences on a much larger scale to determine whether they persist and whether gender plays a significant role in the development of CDROM selection criteria and in the types of CDROMs each sex would choose for their classrooms or purchase for the school.

In conclusion, many of the intermediate teachers in this study were teaching in elementary schools which had either recently acquired computers or were planning on acquiring computers in the near future. Along with the increasing amount of computer hardware comes the need to (a) adequately prepare preservice and inservice educators to use computer software and to develop and apply educationally sound criteria for selecting computer software such as CDROMs, and (b) give educators adequate release time to evaluate and become familiar with the CDROMs for use in their classrooms. At the present time, and as was indicated by this study, this does not seem to be the case. As one teacher concluded:

There is not enough information being shared among the teachers. There is not a forum where we can discuss software. You are still dependent on your context to see what is a good CD-ROM. Without a forum then it is like going to Future Shop or a stranger off the street and buying software from either. And the choice you make affects a lot of kids and your budget. We need more communication to be able to select and choose software wisely. (T1)

If educators are expected to make good use of their new computer hardware, they must be trained in how to use and select educationally sound CDROM software, and the lines of communication must be kept open amongst the various types of educators across the district and the province so they can share CDROM resources and CDROM selection information and skills.

BIBLIOGRAPHY

Alberta Education. (1996). Technology integration funding for computer upgrading. [On-line]. Available URL: http://ednet.edc.gov.ab.ca/technology/default.html.

Arnold, S. (1987). A baker's dozen of CD-ROM myths. Electronic and Optical Publishing Review 7(2), 58-63.

Baker, G. J. (1997).
An investigation into the criteria for evaluating world wide web documents. Unpublished major paper for master of education, University of British Columbia, Vancouver, British Columbia, Canada.  Bakker, H. E., & Piper, J . B. (1994). California provides technology evaluations to teachers. Educational Leadership 51(7), 67-68.  Barry, C.L. (1994). User-defined relevance criteria: An exploratory study. Journal of American Society for Information Science, 45(3), 149-159.  Begin, M., Hall, J . , Jeffrey, S., Lukasevich, A., Meister, S., & Pronovost, J . (1992). Evaluation  techniques and resources: Book II. B C : British Columbia  Primary Teachers' Association.  Berger, P., & Kinnell, S. (1994a). C D R O M for schools: A directory and practical handbook for media specialists. Wilton, CT: Eight Bit Books.  Berger, P., & Kinnell, S. (1994b). A progress report for the disc-interested. School Library Journal 40(10). 26-31.  Berry, J . (1992). C D - R O M : The medium of the moment. Library Journal 117(2). 45-47. 152  British Columbia Ministry of Education. (1987). Report of the provincial advisory committee on computers. Victoria, B C : Queen's Printer for British Columbia.  British Columbia Ministry of Education. (1990). Primary program: Foundations document. Victoria, B C : Queen's Printer for British Columbia.  British Columbia Ministry of Edcuation. (1991). Developing independent learners: The role of the school library resource centre. Victoria, B C : Queen's Printer for British Columbia.  British Columbia Ministry of Education. (1992). The intermediate program: Foundations (draft). Victoria, B C : Queen's Printer for British Columbia.  British Columbia Ministry of Education. (1995). Technology in British Columbia public schools: Report and action plan 1995 to 2000. Victoria, B C : Queen's Printer for British Columbia.  British Columbia Ministry of Education. (1996a). Summary report of the learning resources branch needs assessment project. [On-line]. Available URL: http: www. est. gov. be. ca/acctblty/eval.  British Columbia Ministry of Education. (1996b). Evaluating, selecting, and managing learning resources: A guide. Victoria, B C : Queen's Printer for British Columbia.  British Columbia Ministry of Education. (1998). Learning resource branch: Recommended learning resources. [On-line]. Available U R L : http://www.est. gov.be. ca/curriculum/lr/resource/res_main.htm.  Bryson, M. (1993). New technologies/new practices? Teachers, machines, and the cultures of primary schooling. Vancouver, B C : University of British Columbia 153  Research Team.  Callison, D., & Haycock, G. (1988). A methodology for student evaluation of educational microcomputer software. Educational Technology 28(1). 25-32.  Clyde, A. (1996). Info tech: What C D - R O M s are other schools using? Emergency Librarian 23(4), 51-53.  Collura, K. (1995). Shopping for software. Momentum (26)4. 33-35.  Cuban, L. (1986). Teachers and machines. NewYork, NY: Teachers' College Press.  Dudley-Marling, C , & Owston, R. (1987). The state of educational software: A criterion-based evaluation. Educational Technology 27(6). 15-19.  Ekhami, L , & Brown, C. (1993). C D - R O M and the curriculum. School Library Media Activities Monthly 10(3). 41-44.  Ely, D. (Ed.). (1996). Trends in educational technology. Syracuse, NY: ERIC Clearinghouse on Information and Technology.  Flagg, B. (1990). Formative Evaluation for Educational Technologies. Hillsdale, N J : Lawrence Erlbaum Associates, Publishers.  Fullan, M. (1982). 
The meaning of educational change. Toronto, O N : Teachers College Press.  Fullan, M. (1991). The new meaning of educational change. Toronto, O N : OISE Press.  154  Fullan, M. (1993). Change forces: Probing the depths of educational reform. New York, N Y : Falmer Press.  Gill, B. J . , Dick, W., Reiser, R. A., &Zahner, J . E . (1992). Educational technology research section: A new model for evaluating instructional software. Educational Technology 32(3) 39-48.  Government of Canada (1998). Computers for Schools: Donations to Date. [On-line]. Available U R L : http://www.schoolnet.ca/cfsope/english/Donation_to_date.html  Graf, N. (1996). Making the C D - R O M choices. The Book Report 15(2). 13-23.  Gibson, S. E. (1997). Teacher and librarian partnering to integrate computer technology in the classroom. School Libraries Worldwide 3(1), 47-54.  Hackbarth, S. (1996). The educational technology handbook: A comprehensive guide. Englewood Cliffs, N J : Educational Technology Publications.  Hayes, J . (1995). Multimedia in schools. Educational IRM Quarterly 3(3/4). 4648.  Heinich, R., Molenda, M., Russell, J.D., & Smaldino, S . E . (1996), Instructional media and technologies for learning. Upper Saddle River, N J : Prentice Hall, Inc.  Heller, R. S. (1991). Evaluating software: A review of the options. Computers in Education 17(4). 285-291.  Henri, J . (1997). Information technology literacy in schools: Let's look after the teachers. School Libraries Worldwide 3(1). 31-38.  155  Herther, N.K. (1989). Those lingering doubts about C D - R O M . Database 12(1). 102-104.  Jasco, P. (1996). State-of-the-art multimedia in 1996: The "big four" general encyclopedias on C D - R O M . Computers in Libraries 16(4). 26-32.  Jolicoeur, K., & Berger, P. (1986). Do we really know what makes educational software effective? A call for empirical research on effectiveness. Educational Technology 28(9). 7-13.  Jolicoeur, K., & Berger, P. (1988). Implementing educational software and evaluating its academic effectiveness: Part 1. Educational Technology 28(9). 7-13.  Krippendorf, K. (1980). Content analysis: A n introduction to its methodology. Beverly Hills, C A : Sage.  Lamy, G. (1994). Discis books and living books: Electronic books for practicing reading. Early Childhood Education 27(1). 47-52.  Large, A., Beheshti, J , Breuleux, A., & Renaud, A. (1995). Multimedia and comprehension: The relationship among text, animation, and captions. Journal of the American Society for Information Science 46(5), 340-347.  Litchfield, B.C. (1992). Evaluation of inquiry-based science software and interactive multimedia programs. The ComputingTeacher 6. 41-43.  Lue, D., & Kinzer, C. (1987). Effective reading instruction in the elementary grades. Columbus, O H : Merrill Publishing Company.  Manitoba Education and Training. (1997). Manitoba government news release: Improved technology resources for students. [On-line], http://www.edu.gov.mb. 156  ca/mepubl/publnews/publnews.html.  McCade, J . (1995). Educational reform and technology education. Technology Teacher 54(8V 31-39.  McCarthy, R. (1993, October). C D - R O M spins into schools. Electronic Learning, Special Edition, 10-13.  Microsoft Encarta Deluxe 98 [Computer software]. (1997). Redmond, WA:Microsoft Corporation.  Miller, L., & Olson, J . (1994). Putting the computer in its place: A study of teaching with technology. Journal of Curriculum Studies 26(2). 121-141.  Miller, L., & Olson, J . (1995). How computers live in schools. Educational Leadership 53(2). 74-77.  
Moursund, D. (1991). Restructuring education for the information age. The Computing Teacher 19(4).  N C E T . (1992). C D - R O M in Schools Scheme Evaluation Report. Coventry, England: National Council for Educational Technology.  New Brunswick Department of Education. (1997). New Brunswick news release: Improved technology for schools (97/04/23). [On-line],  http://www.gov.nb.ca  /cnb/news/edu/7e0523ed.htm.  Newfoundland Ministry of Education and Training. (1997). Education news release: Government invests $2.5 million dollars in computers for schools. [On-line]. http://www.gov.nf. ca/releases/1997/edu_n97.htm.  157  Nordgren, L. (1993). Evaluating multimedia C D - R O M discware: Of bells, whistles and value. C D - R O M Professional 6(1). 99-105.  Nova Scotia Department of Education and Culture. (1996). Government news releases: Computers in schools program gets big boost. [On-line]. http://www.EDnet.ns. ca/ educ/govnews.htm.  Ontario Ministry of Education and Training (1995). Ministry of education news release: Bob Rae announces initiatives to prepare Ontario schools for the information age. [On-line], http://www.edu.gov.on.ca/eng/document/nr/95.03/ initiat1.html.  Parker, D. J . (1992). The dirty dozen: Twelve ways to build a better C D - R O M . C D - R O M Professional 5(9). 77-81.  Pearce, K.J. (1988). C D - R O M : Caveat emptor. Library Journal 113(2). 37-38.  Pearlman, L. (1992). School's out: Hyperlearning. the new technology, and the end of education. New York, NY: William Morrow.  Prince Edward Island Department of Education. (1997). Vision statement: Information technology in education. [On-line], http://www.gov.pe.ca/educ/resources/ tech/milestones.asp.  Ragsdale, R. (1988). Permissible computing in education: Values, assumptions, and needs. New York, NY: Praeger.  Reese, J., & Steffey, M. (1988). The seven deadly sins of C D - R O M . Laserdisk Professional 1 (4). 19-24.  Reiser, R., & Kegelmann, H. (1994). Evaluating instructional software: A 158  review and critique of current methods. Educational Technology Research & Development 42(3). 63-69.  Richards, T., & Robinson, C. (1993). Evaluating C D - R O M software: A model. C D - R O M Professional 6(9). 92-101.  Russell, G. (1994). Implications of context-based software development for education. Australian Journal of Education 38(2). 157-173.  Safford, B. (1994). Encyclopedias: The first step in the information seeking process. School Library Media Activities Monthly 11(4). 30-32.  Schwartz, C. (1993). Evaluating C D - R O M products: Yet another checklist. C D R O M Professional 6(1), 87-91.  Shade, D.D. (1990). Computer in early education: Issues put to rest, theoretical links to sound practice, and the potential contribution to microworlds. Journal of Educational Computing Research 6(4), 375-384.  Shade, D.D. (1992). A developmental look at award winning software. Day Care and Earlv Education 20(1). 34-36.  Shade, D.D. (1994). Here we go again: Compact disc technology for young children. Dav Care and Earlv Education 22(1). 44-46.  Shade, D.D. (1996). Software evaluation. Young Children 51(6). 17-21.  Shank, R., & Cleary, C. (1995). Engines for education. Hillsdale, N J : Lawrence Erlbaum Associates Publishers.  Shuell, T., & Shueckler, L. (1989). Towards evaluating software according to 159  principles of learning and teaching. Journal of Educational Computing Research 5. 135-149.  Squire, D., & McDougall, A. (1994). Choosing and using educational software: A teachers' guide. Washington, DC: The Falmer Press.  
Stempel, G.H. (1981). Content analysis. In G.H. Stempel III & B. Westley (Eds.), Research methods in mass communications (pp. 119-143). Englewood Cliffs, NJ: Prentice-Hall.  Stewart, B. (1987). The media lab: Inventing the future at MIT. New York, NY: Viking Press.  Sullivan, B.M. (1988). A legacy for learners: The report of the royal commission on education. Victoria, B C : Queen's Printer for British Columbia.  Tolhurst, D. (1992). A Checklist for evaluating content-based hypertext computer software. Educational Technology 32(3). 17-21.  Ultimate Children's Encyclopedia [Computer software]. (1996). Fremont, CA: Softkey Multimedia Inc.  U.S. Congress Office of Technology Assessment. (1988). Power on! New tools for teaching and learning. O T A Set-379, Washington DC.  Zinc, S. D., (1991). Toward more critical reviewing and analysis of C D - R O M user interfaces. C D - R O M Professional 4(1). 16-22.  160  Appendixes A  Draft Questionnaire  C D R O M Study, Part 1 Section 1 - Background Information: 1) Name (optional): 2) Gender:  • Female • Male  3) Age:  •  • • • •  25 years or younger 26-35 36-45 46-55 56-65  4) Teaching position: • • • •  Teacher Teacher Librarian Computer Teacher Other  5) Years of teaching experience: • • • • • •  5 or less 6-10 11-15 16-20 21-25 26 or more  6) Position Classification: • •  Full time (35 or more hrs/wk) Part time (Under 35 hrs/wk) Part time, please specify work hours/wk:  7) Number of students in your school  .  8) Do you currently use a computer in your classroom? If no, skip to question 12. 161  •  Yes,  •  9) How many computers are located in your classroom? 10) How old is/are your classroom computer/s? • • • •  14710  3 years 6 years 9 years or more years  11) How many of your classroom computers are equipped with a C D R O M player? 12) Does your school have a computer room (or any other computers) that you and your students are able to use? Y e s • No • . If yes, how many of these "shared" school computers have C D R O M players installed? 13) In what room/s are these C D R O M players located?  .  14) Approximately how many hours a week does your class have access to these computers? • Less • 1-3 • 4 -6 • 7-9 • 10 or  than 1 hour a week hours hours hours more hours  15) Approximately how many hours per month do you use C D R O M s with your class? • Less • 1-3 • 4-6 • 7-9 • 10 or  than 1 hour a week hours hours hours more hours  16) How many educational C D R O M software programs does your school have? 17) How many years have you evaluated/selected educational C D R O M software? • • • •  Less than a year 1 - 3 years 4 - 6 years more than 6 years  18) How many years have you been using C D R O M software with your students? •  Less than a year  • 162  4 - 6 years  •  1 - 3 years  •  more than 6 years  19) Approximately how many hours a month do you spend evaluating and selecting C D R O M software for use with your students? • • • •  Less than an hour 1 - 3 hours 4 - 6 hours more than 6 hours per month  20) Approximately how many new elementary educational C D R O M s do you evaluate each year? 1 -5 6 - 10 11 - 1 5 16 - 2 0 21 or more  • • • • •  21) How many new educational elementary C D R O M s does your school purchase in a year?  • • • • •  1 -5 6 - 10 11 - 1 5 16 - 2 0 21 or more  22) Does your school have a computer specialist? • Yes,  • No.  23) Does your school have a computer and software selection committee? • Y e s • No. 24) Do you own a computer at home?  Yes • ,  No • .  (If no, go to question no. 28).  
25) How long have you owned your home computer? • • •  1 - 3 years 4 - 6 years 6 or more years  26) Do you have a C D R O M player in your home computer? Y e s • , No • . approximately how many C D R O M s do you own? • • • •  1-5 6-10 11-15 16-20 163  If yes,  •  21-25  •  more than 26  27) How many hours per week do you spend using your home computer? •  Less than 1 hour a week  • 1 - 3 hours • 4 - 6 hours • 7 - 9 hours • 10 or more hours 28) Which of the following C D R O M s have you evaluated and selected for use in your classroom and/or school: • • • • • • • • • • • • • • •  Encyclopedias (e.g., Encarta, Comptons, Canadian) Children's Literature (e.g., Discus Books, DK Multimedia) Word Processing (e.g., Microsoft Word, Claris) Publishing (e.g., Kids Works) Language Education Science Social Studies Learning for Living Math Music Drama and Dance Physical Education Technology Professional Development Other, please specify:  29) Do you consult other information sources when selecting C D R O M software for your classroom? • Yes, • No. If yes, please use the list on the next page to indicate these other information sources: • Word of Mouth - please specify below: a) • other teachers, b) • teacher librarians, c) • computer teachers, d) • other: ; • Professional Days, Inservices • District Resource Center • Ministry of Education Learning Resource Branch • Publishers • Software reviews in Educational Journals and Periodicals • Software reviews in non-educational Journals and Periodicals 164  .  • Computer Stores • other:  Section 2 - Evaluation Statements  SA A D SD NO  = = = = =  Strongly Agree Agree Disagree Strongly Disagree No Opinion  Please circle your level of agreement with following statements: 30. It is important to purchase C D R O M software for elementary students.  SA  A  D  SD  NO  31. Most C D R O M software today is of high quality and does not need evaluating.  SA  A  D  SD  NO  32. Teachers are in the best position to adequately evaluate and select C D R O M software for use in the elementary classroom  SA  A  D  SD  NO  33. Teachers should be trained to select their own C D R O M software.  SA  A  D  SD  NO  34. I feel that the multimedia capabilities of C D R O M s meet a wider variety of student needs than many traditional resources.  SA  A  D  SD  NO  165  B  Draft Interview Guide Questions  C D R O M Study, Part 2 Section 1 - Main Interview 1) List the selection criteria that you use to evaluate elementary C D R O M software. 2) Numerically rank your selection criteria by order of importance. most important, and ten (10) being least important.  One (1) being  3) Look back over your list of selection criteria. From where do you feel these selection criteria might have originated? That is, how have you developed your selection criteria? 4) If someone asked you How do you decide which criteria to use when selecting elementary C D R O M software ? What would you answer? 5) In your opinion, could you apply all the selection criteria on your list to all CDROMs? 6) Do you believe your current selection criteria is adequate for evaluating and selecting elementary C D R O M resources. 7) Do you use students and student developed selection criteria to help evaluate and select C D R O M software? 8) Do you feel you select and purchase an appropriate number of C D R O M software for a) your student's needs, and b) your school's needs? 
9) Do you agree with the following statement: Selection criteria from other media formats (e.g., video, radio, audio, laser disks, television) should not be applied to C D R O M software . 10) Have you ever used C D R O M software that you have not personally evaluated or selected? 11) Are there any other factors that might influence (hinder or help) your evaluation and selection of C D R O M resources that we may not have discussed yet? 12) Would you like to make any other comments with regards to selecting C D R O M s for elementary students?  Section 2 - C D R O M Activity 166  13) Question: Keeping in mind the intermediate students you are currently instructing, I want you to explore and critically evaluate these two C D R O M s [Encarta 98 and Children's Encyclopedia] for use with these students  Section 3 - Follow Up Interview 14) Look back over the C D R O M selection criteria that you listed prior to your hands on evaluation of the C D R O M s . Would you like to add/remove any criteria to/from this list, or change the ranking? 15) Do you think it important that teachers be given time to actively evaluate and select C D R O M software? 16) Recalling your recent hands on evaluation of the C D R O M s , what factors influenced your decision the most when selecting one educational C D R O M over another? 17) In your opinion, what are the most important difficulties that educators face when evaluating and selecting C D R O M software? 18) Would you like to provide any final comments, suggestions, or insights about how you select C D R O M s for elementary students?  167  C  Letter Requesting Panel Experts' Assistance Principal Researcher: Dr. Ann Lukasevich Ph: 822-2102; email: ann.lukasevich@ubc.ca  Director of the U B C Office of Research Services and Administration: Dr. Richard Spratley, Ph: 822-8598; email: rds@orsil.ubc.ca Student Researcher: Keith McPherson Language Education Research Center Department of Language Education, Faculty of Education 2125 Main Mall, Vancouver, B.C. V6T 1Z4 June 16, 1997  Mrs./Ms./Mr. Panel Expert 196 West 4 8 Avenue Institution Vancouver, B.C. V 5 Y 2Y7 th  Dear Ms./Mrs./Mr. (Name of Panel Expert): As a graduate student in U B C ' s Faculty of Education, Department of Curriculum and Instruction, I am conducting a Master of Arts research study titled: "Investigating intermediate teachers' C D R O M software selection criteria" The purpose of this study is to investigate the current selection processes and criteria that teachers, teacher librarians, and computer specialists in British Columbia use to evaluate and select C D R O M software for use in the intermediate classroom. The study will improve our understanding of how educators form and apply computer software selection criteria, and provide information that may assist curriculum developers and educators in creating and selecting educational C D R O M software. The validity of this study rests upon the ability of the enclosed research questions to stimulate responses with content representativeness and appropriateness. That is, for the questions to generate responses from the research participant which would give the most representative set of selection criteria data that educators would use when selecting C D R O M software, and of the factors shaping this criteria. 
The questionnaire has been developed from my own experience in using and evaluating educational software, suggestions and criteria generated from a pilot study of this 168  D  C D R O M Research Study Overview  Research Study Overview Investigating intermediate educator's C D R O M selection criteria Keith R. McPherson June 21, 1997 A) Purpose: The purpose of this M.A. thesis research is to investigate the factors shaping the current selection processes and criteria that British Columbia intermediate educators (Grades 4-7 teachers, teacher librarians, and computer teachers) use when evaluating and assessing educational C D R O M software. The questions that guide this study are: 1) What processes of evaluation and selection criteria do intermediate British Columbian educators employ when selecting C D R O M software for intermediate classes? And, 2) What factors are most responsible for influencing the development of these intermediate educators' criteria when selecting C D R O M software? B) Hypothesis: Selecting C D R O M software resources is a relatively new and complex process which intermediate teachers are becoming more involved with as schools purchase (or plan to purchase) more C D R O M hardware and software. Unfortunately, the criteria for selecting educational C D R O M s has not been made explicit in practice or in the research literature. The hypothesis of this study is that "Teachers' C D R O M selection criteria can be identified and made more explicit by encouraging teachers' to reflect upon their own C D R O M selection criteria (questionnaire and interview) and by observing and discussing teachers' application of selection criteria in realistic C D R O M selection situations." C) Sample: The research Sample will be comprised of 5 teachers, 5 teacher librarians, and 5 computer teachers. Subjects will be grade 4-7 teachers, randomly selected from four Lower Mainland school districts, and who have either had a years experience selecting intermediate educational C D R O M S , or who have a year or more experience using educational C D R O M software with intermediate students.  D) Time Expectations: 170  Teachers are expected to attend a one hour session during which they will answer a short survey, respond to questions in an interview, and be observed while they actively select a C D R O M . The interview and the C D R O M selection activity will be audio taped.  E) Participant Protection: There are no risks involved with the study, and the participants will neither be rewarded or penalized for their involvement or lack of involvement in the study. Participants, school affiliations, and any other identifying information will remain anonymous. Pseudonyms will be used in all reports and documents. F) Data Analysis: Survey, interview and observational data sets for all participants will be contrasted and compared using the content analytic method as explained by Krippendorf (1980) and Stempel (1981). Content analysis of the data sets will be used by the researcher and one or more independent content data analysts to inductively identify and code response categories from each educator's raw interview data. The reliability of each category is then expressed in terms of a percentage calculated by dividing the number of coded agreements between researcher and independent coder by the number of possible category agreements. The minimum percentage considered reliable in this and any exploratory content analysis study is 80%. 
If reliability of the coded categories does reach an acceptable percentage, the categories will be revised and recalculated until an acceptable level of reliability is obtained. Coded categories of teachers' user-defined selection criteria will be expressed in tables and bar graphs, and compared to coded categories of criteria drawn from current literature. G) Possible implications of study: I expect analysis of data will reveal that educators' C D R O M selection criteria and the factors influencing the development of this criteria, will support hypothesis proposed in current literature. That is, C D R O M selection criteria will incorporate some criteria from other media, but it will also will be different enough to warrant identifying and developing selection criteria specific to the tool. Also, I expect that a comparative analysis of the three groups of educators' C D R O M selection criteria will find similarities and differences, both between groups and between individuals in these groups. Similarities will probably be reflected in more broad criteria, and differences will probably arise due to each group's specific teaching objectives and expertise. Although each of the educators will benefit from their participation in this study (e.g., they will be exposed to new C D R O M s and they will gain insights into their own and other educators' software selection criteria), it is my hope that this research data 171  will help decrease the 'mystery' associated with the selection of sound educational C D R O M software, will increase our understanding of the selection processes and criteria that educators apply when selecting these new electronic resources, and may well develop into a set of C D R O M selection criteria/processes that would assist district educators and resource specialists select educational C D R O M s for use with intermediate-aged students.  H) Selected Bibliography: Fullan, M. (1993). Change forces: Probing the depths of educational reform. New York, NY: Falmer Press. Heller, R. S. (1991). Evaluating software: A review of the options. Computers in Education. 17(4),285-291. Henri, J . (1997). Information technology literacy in schools: Let's look after the teachers. School Libraries Worldwide 3(1 V 31-38. Krippendorf, K. (1980). Content analysis: A n introduction to its methodology. Beverly Hills, C A : Sage. McCade, J . (1995). Educational reform and technology education. Technology Teacher 54(8). 31-39. Miller, L., & Olson, J . (1995). How computers live in schools. Educational Leadership 53(2), 74-77. Nordgen, L. (1993). Evaluating multimedia cd-rom discware: Of bells, whistles and value. C D - R O M Professional 6(1). 99-105. Stempel, G . H . (1981). Content analysis. In G.H. Stempel & B. Westley (Eds.), Research methods in mass communications (pp. 1 1 9-143). Englewood Cliffs, N J : Prentice-Hall.  172  E  Panel Expert's Covering Letter  Keith McPherson Language Education Research Center Department of Language Education, Faculty of Education 2125 Main Mall, Vancouver, B.C. V6T 1Z4 Telephone (604) 822-5368 Email: keitnr@unixg.ubc.ca October 5th, 1997  Dear Mr./Ms./Mrs. (Name of Panel Expert), Thank you for consenting to assist in the development of my research questions. Please find enclosed an overview of the study and a questionnaire rating form. Please read the overview and become familiar with the study's objectives. 
Then, 1) critically review each question ensuring that it elicits response from the subject that will help inform the stated research objectives, 2) use the attached rating form to record your opinion of the appropriateness and/or relevance of each question in meeting the research objectives, and, 3) include your suggestions for improving the instrument's validity and reliability. I must remind you that your participation in this study is entirely voluntary, and that your identity will remain anonymous. If you need any assistance in completing the rating form, or need more information, please do not hesitate to contact me. Again, I thank you for your assistance.  Sincerely,  Keith McPherson end.  173  F  Panel Expert's Rating Form  C D R O M Questionnaire Rating Form Instructions: Please rate each question in terms of how relevant and appropriate you feel the elicited response would be at fulfilling the research study's objectives. Indicate your response by circling the most appropriate number to the right of the question. One (1) being the most relevant/appropriate and four (4) being the least relevant/appropriate. 1 = very relevant / appropriate 2 = relevant / appropriate 3 = slightly relevant / appropriate 4 = not relevant / appropriate If you would like to make any suggestions, comments, or changes to the questions, please indicate these responses on the lines just below the question. Example: Very Relevant  i)  Not Relevant  Do you ski, ice skate, or participate in winter sports? Comment: Keith, this question will not generate any data relevant to your research  C D R O M Study, Part 1: Questionnaire Rating  Section 1 - Background Information Very Relevant  1)  Name (optional):  .  174  9  Not Relevant  „  A  Very Relevant  2)  Gender:  • Female • Male  Comment:  3)  Age:  • 25 or younger • 26-35 • 36-45 • 46-55 • 56-65  Comment:  4)  Teaching position: • • • •  Teacher Teacher Librarian Computer Teacher Other  Comment:  175  Not Relevant  Very Relevant  5)  Not Relevant  Years of teaching experience: • • • • • •  5 or less 6-10 11-15 16-20 21-25 26 or more 1  2  3  4  1  2  3  4  Comment:  6)  Position Classification: •  Full time (35 or more hrs/wk)  •  Part time (Under 35 hrs/wk) Part time, please specify work hours/wk:  Comment:  7)  Number of students in your school Comment:  176  Very Relevant  8)  Do you currently use a computer in your classroom? •  Y e s , • No.  If no, skip to question 12.  Comment:  9)  1  How many computers are located in your classroom?  .  Comment:  10)  Not Relevant  How old is/are your classroom computer/s? •  1 - 3 years  •  4 - 6 years  •  7 - 9 years  •  10 or more years  Comment:  177  2  3  4  Very Relevant  11)  Not Relevant  How many of your classroom computers are equipped with a C D R O M player? Comment:  12)  Does your school have a computer room (or any other computers) that you and your students are able to use? Yes •  No • .  If yes, how many of these  "shared" school computers have C D R O M players installed?  . 1  Comment:  13)  In what room/s are these C D R O M players located? Please list room/s.  Comment:  178  2  3  4  Very Relevant  14)  Not Relevant  Approximately how many hours a week does your class have access to these computers? •  Less than 1 hour a week  •  1 - 3 hours  •  4 - 6 hours  •  7 - 9 hours  • 10 or more hours  1  Comment:  15)  Approximately how many hours per month do you use C D R O M s with your class? 
•  Less than 1 hour a week  •  1 - 3 hours  •  4 - 6 hours  •  7 - 9 hours  • 10 or more hours  Comment:  179  2  3  4  Very Relevant  16)  How many educational C D R O M software programs does your school have?  .  Comment:  17)  .  How many years have you been evaluating/selecting educational C D R O M software?  •  Less than a year  •  1 - 3 years  •  4 - 6 years  •  more than 6 years  Comment:  18)  How many years have you been using C D R O M software with your students? •  Less than a year  •  1 - 3 years  •  4 - 6 years  •  more than 6 years  Comment:  180  Not Relevant  Very Relevant  19)  Not Relevant  How many hours a month do you spend evaluating and selecting C D R O M software for use with your students? •  Less than an hour  •  1 - 3 hours  •  4 - 6 hours  •  more than 6 hours per month  Comment:  20)  How many new educational elementary C D R O M s do you evaluate in a year? •  1-5  •  6-10  •  11-15  •  16-20  •  21 or more  1  Comment:  181  2  3  4  Very Relevant  21)  Not Relevant  How many new educational elementary C D R O M s does your school purchase in a year? •  1 -5  •  6-10  •  11-15  •  16-20  •  21 or more  1  2  3  4  1  2  3  4  Comment:  22)  Does your school have a computer specialist?  •  Yes, • No Comment:  23)  Does your school have a computer and software selection committee?  • Yes,  • No.  Comment:  182  Very Relevant  24)  Do you own a computer at home?  Yes • ,  Not Relevant  No • .  (If no, go to question no. 28). Comment:  25)  How long have you owned your home computer? •  1 - 3 years  •  4 - 6 years  •  6 or more years  Comment:  26)  1  2  3  4  Do you have a C D R O M player in your home computer? Y e s • , No • . own?  If yes, how many C D R O M s do you •  1-5  •  6-10  •  11-15  •  16-20  •  21-25  •  more than 26  1  Comment:  183  2  3  4  Very Relevant  27)  Not Relevant  How many hours per week do you use your home computer? •  Less than 1 hour a week  •  1 - 3 hours  •  4 - 6 hours  • •  7-9hours 10 or more hours  1  Comment:  28)  Which of the following C D R O M s have you evaluated and selected for use in your classroom/school: •  Encyclopedias (e.g., Encarta)  •  Children's Literature (e.g., Discus)  •  Word Processing (e.g., Word 5.0)  •  Publishing (e.g., Kids Works)  •  Language Education  •  Science  •  Social Studies  •  Learning for Living  •  Math  •  Music  •  Drama & Dance  •  Physical Education  •  Technology  •  Professional Development  •  Other, please specify:  Comment:  184  2  3  4  Very Relevant  29)  Not Relevant  Do you consult other information sources when selecting C D R O M software for your classroom? • Yes, • No.  If yes, please use the following list to  indicate these other information sources: • Word of Mouth - specify below: a) •  other teachers,  b) •  teacher librarians,  c) •  computer teachers,  d) • other:  .  • Professional Days, Inservices • District Resource Center • Ministry Learning Resource Branch • Publishers 1 • Software reviews in Educational Journals • Software reviews in noneducational Journals • Computer Stores • other:  .  Comment:  185  2  3  4  Very Relevant  Not Relevant  Section 2 - Evaluation Statements (Rating note: In the actual study, research subjects will be asked to read the following evaluation statements, and to circle one of these five responses: SA=strongly agree, A=agree, D=disagree, SD=strongly disagree, and NO=no opinion. You do not need to circle SA, A, D, S D , or NO on this form. 
Please continue to use the rating numbers, 1 through 4, and the comment lines to rate and comment on the relevance and appropriateness of the following statements to the research objectives).  30)  It is important to purchase C D R O M software for elementary students.  SA  A  D  SD  NO  Comment: 1  31)  Most C D R O M software today is of high quality and does not need evaluating.  SA  A  Comment:  186  D  SD  NO  2  3  4  Very Relevant  32)  Teachers are in the best position to adequately evaluate and select C D R O M software for use in the elementary classroom SA  A  D  SD  NO  Comment:  33)  Teachers should be trained to select their own C D R O M software.  SA  A  D  SD  NO  Comment:  34)  I feel that the multimedia capabilities of C D R O M s meet a wider variety of student needs than many traditional resources.  SA  A  D  Comment:  187  SD  NO  Not Relevant  Very Relevant  Not Relevant  C D R O M Study, Part 2: Interview Guide Questions Rating Section 1 - Main Interview (Rating note: Some of the following interview questions are followed by italicized information. This information is not seen by the subjects and is used by the researcher to either clarify answers, encourage discussion, or to help impart procedural knowledge to the subject).  1)  List the selection criteria that you use to evaluate elementary C D R O M software. Comment:  2)  1  2  3  4  1  2  3  4  Numerically rank your selection criteria by order of importance.  One (1) being most important, and ten  (10) being least important. Comment:  3)  Look back over your list of selection criteria. From where do you feel these selection criteria might have originated? That is, how have you developed your selection criteria? Comment:  188  Very Relevant  4)  Not Relevant  If someone asked you "How do you decide which criteria to use when selecting elementary C D R O M software?", What would you answer? Comment:  5)  1  2  3  4  1  2  3  4  1  2  3  4  1  2  3  4  In your opinion, could you apply all the selection criteria on your list to all C D R O M s ? Comment:  6)  Do you believe your current selection criteria is adequate for evaluating and selecting elementary C D R O M resources. Comment:  7)  Do you use students, and student developed selection criteria, to help evaluate and select C D R O M software? Comment:  189  Very Relevant  8)  Not Relevant  Do you feel you select and purchase an appropriate number of C D R O M software for a) your students' needs, and b) your school's needs? Comment:  9)  Do you agree with the following statement: "Selection criteria from other media formats (e.g., video, radio, audio, laser disks, television) should not be applied to C D R O M software". Comment:  10)  Have you ever used C D R O M software that you have not personally evaluated or selected? Comment:  11)  Are there any other factors that might influence (hinder/help) your evaluation/selection of C D R O M resources that we may not have discussed yet? 1 Comment:  190  2  3  4  12)  Would you like to make any other comments with regards to selecting C D R O M s for elementary students? Comment:  Section 2 - C D R O M Activity  13)  Question: "Keeping in mind the intermediate students you are currently instructing, I want you to explore and critically evaluate these two C D R O M s [Encarta 98 and Children's Encyclopedia] for use with these students." 
Comment:  191  Very Relevant  Not Relevant  Section 3 - Follow Up Interview  (Rating Note: These questions are asked after subjects have had some time to actively evaluate several C D R O M s . Subjects are video taped during this exploration).  14)  Look back over the C D R O M selection criteria that you listed prior to your hands on evaluation of the CDROMs.  Would you like to add/remove any  criteria to/from this list, or change the ranking? Comment:  15)  1 2  Do you feel it important that teachers be given time to actively evaluate and select C D R O M software? Comment:  192  3  4  Not Relevant  Very Relevant  16)  Recalling your recent hands on evaluation of the C D R O M s , what factors influenced your decision the most when selecting one educational C D R O M over another? Comment:  17)  1  2  3  4  1  2  3  4  1  2  3  4  In your opinion, what are the most important difficulties that educators face when evaluating and selecting C D R O M software? Comment:  18)  Would you like to make any final comments, suggestions, or insights into the topic of elementary educational C D R O M selection criteria?  Comment:  193  G  Final Questionnaire  C D R O M Study, Part 1 Section 1 - Background Information: 1) Gender:  • Female • Male  2) Age:  • • • • •  25 years or younger 26-35 36-45 46-55 56-65  3) Teaching position: • • • •  Teacher Teacher Librarian Computer Teacher Other  4) Years of teaching experience: • 5 or less • 6-10 • 11-15 • 16-20 • 21-25 • 26 or more 5) Position Classification: • •  Full time (35 or more hrs/wk) Part time (Under 35 hrs/wk), please specify work hours/week:  6) Number of students in your school  .  7) Do you currently use a computer in your classroom? If no, skip to question 12. 8) How many computers are located in your classroom?  •  Y e s , • No.  .  9) How old is/are your classroom computer/s? • • • • •  1 - 3 years 4 - 6 years 7 - 9 years 10 or more years I do not know  10) How many of your classroom computers are equipped with a C D R O M player? 194  11) Does your school have a computer room (or any other computers) that you and your students are able to use? Y e s • No • . If yes, how many of these shared school computers have C D R O M players installed? 12) In what room/s are these C D R O M players located? • • • •  Library Computer Lab Classroom Other  13) Approximately how many hours a week does your class have access to these computers? • Less • 1-3 • 4 -6 • 7-9 • 10 or  than 1 hour a week hours hours hours more hours  14) Approximately how many hours per month do you use C D R O M s with your class? • Less • 1-3 • 4 -6 • 7-9 • 10 or  than 1 hour a week hours hours hours more hours  15) How many educational C D R O M software programs does your school have? 16) How many years have you been evaluating/selecting educational C D R O M software? • • • •  Less than a year 1 - 3 years 4 - 6 years more than 6 years  17) How many years have you been using C D R O M software with your students? • • • •  Less than a year 1 - 3 years 4 - 6 years more than 6 years  18) Approximately how many hours a month do you spend evaluating and selecting C D R O M software for use with your students? 195  • • • •  Less than an hour 1 - 3 hours 4 - 6 hours more than 6 hours per month  19) Approximately how many new elementary educational C D R O M s do you evaluate each year? • • • • •  1-5 6-10 11-15 16-20 21 or more  20) How many new educational elementary C D R O M s does your school purchase in a year? 
• 1-5 • 6-10 • 11-15 • 16-20 • 21 or more 21) Does your school have a computer specialist? • Yes,  • No  22) Does your school have a computer and software selection committee? • Y e s • No. 23) Do you own a computer at home?  Yes • ,  No • .  (If no, go to question 28).  24) How long have you owned your home computer? • • •  1 - 3 years 4 - 6 years 6 or more years  25) Do you have a C D R O M player in your home computer? Y e s • , No • . approximately how many C D R O M s do you own? • 1-5 • 6-10 • 11-15 • 16-20 • 21-25 • more than 26 26) How many hours per week do you spend using your home computer? • •  Less than 1 hour a week 1 - 3 hours  • • 196  7 - 9 hours 10 or more hours  If yes,  •  4 - 6 hours  27) Which of the following C D R O M s have you evaluated and selected for use in your classroom and/or school: • • • • • • • • • • • • • • •  Encyclopedias (e.g., Encarta, Comptons, Canadian) Children's Literature (e.g., Discus Books, DK Multimedia) Word Processing (e.g., Microsoft Word, Claris) Publishing (e.g., Kids Works) Language Education Science Social Studies Learning for Living Math Music Drama and Dance Physical Education Technology Professional Development Other, please specify:  .  28) Using the list above, do the following: a) underline the C D R O M s which are used the most frequently by your students, b) circle the C D R O M s that seem to be used the most frequently by the school's teaching staff, and c) put an asterisk in front of the C D R O M s that you use the most frequently with your students. 29) Do you consult other information sources when selecting C D R O M software for your classroom? • Yes, • No. If yes, please use the list below to indicate these other information sources: • Word of Mouth - please specify below: a) • other teachers, b) • teacher librarians, c) • computer teachers, d) • other: • Professional Days, Inservices • District Resource Center • Ministry of Education Learning Resource Branch • Publishers • Software reviews in Educational Journals and Periodicals • Software reviews in non-educational Journals and Periodicals • Retailers • other: Part 2 - Evaluation Statements  S A - Strongly Agree 197  .  A D SD NO  -  Agree Disagree Strongly Disagree No Opinion  Please circle your level of agreement with following statements: 30. It is important to purchase C D R O M software for elementary students.  SA  A  D  SD  NO  31. Most C D R O M software today is of high quality and does not need evaluating.  SA  A  D  SD  NO  32. Teachers are in the best position to adequately evaluate and select C D R O M software for use in the elementary classroom (as opposed to the district selection committee, and Ministry of Education Learning Resource Branch).  SA  A  D  SD  NO  33. Teachers should be trained to select their own C D R O M software.  SA  A  D  SD  NO  34. I feel that the multimedia capabilities of C D R O M s meet a wider variety of student needs than many traditional resources.  SA  A  D  SD  NO  198  H  Final Interview Guide Questions  C D R O M Study, Part 2 Section 1 - Main Interview 1) What is your favorite educational C D R O M , and why is it your favorite? 2) If you were given $500.00 to buy C D R O M s for your class (library/school), how would you go about choosing these C D - R O M s ? 3) What characteristics, features, strengths/weaknesses (selection criteria) do you look for when selecting C D R O M s to be used with your students? List these criteria in the first column of the table printed on the back of this page. 
4) Numerically rank your selection criteria by order of importance. most important, and ten (10) being least important.  One (1) being  5) When choosing C D R O M s for different subjects (e.g., math, science, language arts) would you use the same selection criteria? 6) Are you satisfied with the collection of C D R O M s that you currently use with your students? 7) Do students help you choose C D - R O M programs? How do you involve students in selecting? W h y ? 8) Do you feel you have enough C D R O M resources (hardware and software) to meet your students' interests, needs and abilities? 9) Do your students' parents assist in selecting C D R O M s for the school? How? Do you get any feedback from parents on you selection of C D R O M s ? 10) Do you agree with the following statement: "Selection criteria from other media formats (e.g., video, radio, audio, laser disks, television) can be applied to C D R O M software." 11) Have you ever used C D R O M software that you have not personally evaluated or selected? 12) Are there any other factors that might influence (hinder or help) your evaluation, and selection of C D R O M resources that we may not have discussed yet? 13) Would you like to make any other comments with regards to selecting C D R O M s for elementary students? 199  Section 2 - C D R O M Activity 14) Question: "Keeping in mind the intermediate students you are currently instructing, I want you to explore and critically evaluate these two C D R O M s [Encarta 98 and Children's Encyclopedia] for use with these students"  Section 3 - Follow Up Interview  15) Look back over the C D R O M selection criteria that you listed prior to your hands on evaluation of the C D R O M s . Would you like to add/remove any criteria to/from this list, or change the ranking? 16) Why should teachers be given time to actively evaluate and select C D R O M software? 17) Recall your recent hands on evaluation of the two C D R O M s . What factors influenced your decision the most when you were asked to select one educational C D R O M over another? 18) In your opinion, what are the most important difficulties that educators face when evaluating and selecting C D R O M software? 19) Would you like to provide any final comments, suggestions, or insights about how you select C D R O M s for elementary students?  200  I  Pilot Study Research Consent Form  Principal Researcher: Dr. Ann Lukasevich Ph: 822-2102; email: ann.lukasevich@ubc.ca) Student Researcher: Keith McPherson Ph: (O) 822-5368, (H) 321-1505; email: keithr@unixg.ubc.ca Director of the U B C Office of Research Services and Administration: Dr. Richard Spratley, Ph: 822-8598; email: rds@orsil.ubc.ca  October 5, 1997 Pilot Study Research Consent Form  Dear (name of Potential Pilot Project Participant), As a graduate student in U B C ' s Faculty of Education, Department of Curriculum and Instruction, I am seeking volunteers to participate in a study titled: Investigating the factors shaping elementary teachers' C D R O M software selection criteria About a decade ago, the electronic information market was quite new. There were very few C D R O M players, and even fewer educational C D R O M s , available to teachers. Often teachers' selection criteria for educational C D R O M software could be summed up in one question, Should we buy it or not? Following the radical growth of the information market over the last decade comes the proliferation of C D R O M hardware and an explosion of educational C D R O M software. 
A s a consequence, the selection process and evaluative criteria that teachers employ when selecting educational C D R O M s are very new and complex. This study seeks to investigate the selection processes currently employed by elementary educators when selecting educational C D R O M s for their students. It also seeks to gain insights into some of the more important factors influencing and shaping an elementary educator's C D R O M selection criteria. The findings in this study will be published in my Master's thesis and will provide you, and educators like yourself, with insights into these complex selection processes and criteria. If you choose to volunteer as a participant in this pilot study, you will be asked to attend one, 2 hour long session. The session will be divided up into two activities. In the first activity, you will be asked to fill in a short background survey and discuss, with the primary researcher, the selection processes and criteria you follow when 201  I consent to participate in the research pilot project Investigating the factors shaping elementary teachers' C D R O M software selection criteria . I have been made fully aware of my role in this study, and of the time required of me. I understand that my confidentiality will be maintained. I acknowledge receipt of a copy of this consent form*, and I understand that I may withdraw my consent at any time without explanation.  (name)  (date)  (signature)  (telephone number)  Please complete and return this consent form to the student researcher  I consent to participate in the research pilot project Investigating the factors shaping elementary teachers' C D R O M software selection criteria . I have been made fully aware of my role in this study, and of the time required of me. I understand that my confidentiality will be maintained. I acknowledge receipt of a copy of this consent form**, and I understand that I may withdraw my consent at any time without explanation.  (name)  (date)  (signature)  (telephone number)  ' Please keep this consent form for your records  203  K  Research Consent Form  Principal Researcher: Dr. Ann Lukasevich Ph: 822-2102; email: ann.lukasevich@ubc.ca Student Researcher: Keith McPherson Ph: (0) 822-5368, (H) 321-1505; email: keithr@unixg.ubc.ca Director of the U B C Office of Research Services and Administration: Dr. Richard Spratley, Ph: 822-8598; email: rds@orsil.ubc.ca  Research Consent Form  As a graduate student in U B C ' s Faculty of Education, Department of Curriculum and Instruction, I am seeking volunteers to participate in a Master of Arts research study titled: Investigating intermediate teachers' C D R O M software selection criteria About a decade ago, the electronic information market was quite new. Although computer sales were burgeoning, C D R O M hardware and software were scarce. Essentially, a teacher's C D R O M software selection criteria could be summed up in one question, Should we buy it or not? Since then, there has been an exponential increase of C D R O M hardware and an overwhelming proliferation of educational C D R O M software. A s a consequence, the selection process and evaluative criteria that teachers employ when selecting educational C D R O M s has become very complex. This study seeks to investigate the selection processes currently employed by intermediate educators when selecting C D R O M s to use with their students. 
It also seeks to gain insights into some of the more important factors influencing and shaping an intermediate educator's C D R O M selection criteria. The findings in this study will be published in my Master's thesis and will provide you, and educators like yourself, with insights into these relatively new and complex selection processes and criteria. If you choose to volunteer as a participant in this study, you will be asked to complete a survey and attend a one hour and fifteen minute long interview session. The survey will be mailed to you and takes fifteen to twenty minutes to complete. In the first half hour of the interview session, you will be interviewed by the student researcher with regards to your current C D R O M selection processes and criteria. In the remaining time, you will be asked to actively evaluate two new educational C D R O M s . All that is required for you to fully participate in this study is that you a) be a Lower Mainland intermediate (Grades 4-7) teacher, teacher librarian, or computer teacher, and b) have at least one years experience using educational C D R O M s with 205  I consent to participate in the research project, Investigating intermediate teachers' C D R O M software selection criteria . I have been made fully aware of my role in this study, and of the time required of me. I understand that my confidentiality will be maintained. I acknowledge receipt of a copy of this consent form*, and I understand that I may withdraw my consent at any time without explanation.  (name)  (date)  (signature)  (telephone number)  * Please complete this consent form and submit it to the student researcher  I consent to participate in the research project, Investigating intermediate teachers' C D R O M software selection criteria . I have been made fully aware of my role in this study, and of the time required of me. I understand that my confidentiality will be maintained. I acknowledge receipt of a copy of this consent form**, and I understand that I may withdraw my consent at any time without explanation.  (name)  (date)  (signature)  (telephone number)  ** Please keep this consent form for your records  207  M  Initial Set of C D R O M Selection Criteria Codes CDROM Selection Criteria Codes  A CONTENT: 1) Educational: Educationally appropriate/valuable; Educationally sound; Arcade-shoot-em ups are kept minimal. 2) Edutainment: Educational content but fun/enjoyable as well; Balance of information and glitz. 3) Teacher's Curriculum: Compliments educator's program goals/resources; Content is relevant to class work; Reinforces classroom themes. 4) Provincial Curriculum: Compliments B.C. Curriculum; Does it meet learning outcomes for Info/Tech IRP; Corresponds to IRPs. 5) Complimentary Resource: Compliments school's existing resources; Fills a resource gap in library; Meets school's needs. 6) Age Appropriate/Readability; Age appropriate content; Appropriate reading level for elementary students; Appropriate vocabulary level; 7) Meaningfulness: Is the C D R O M meaningful to student? 8) Integration/Diversity, Subject: Can it [the C D R O M ] be used/applied/integrated across a range of subjects? 9) Canadian Content: Canadian content; Canadian vocabulary/terms (e.g., metric terminology); Not lacking Canadian content 10) Authenticity/Authority: Authenticity of content 11) Currency: Recency. B  INSTRUCTIONAL DESIGN  12) Transferability: Are skills transferable to other existing programs? 13) Consistency: Format/sequencing/pathways are consistent. 
14) Interactivity: User interaction/interactivity encouraged; Interactive format; High level of interactivity; Involves the students to a high degree. 15) Motivation: Student would want to use program again; Format attractive to user; Motivating; Not boring. 16) Feedback: Student gets appropriate feedback. 17) Constructivism: Student is in control; Program encourages students to make choices and control (construct) their learning experiences 18) Technology: Makes appropriate use of technology; Helps instruct students on skills specific to computer technology. 19) Assessment: Keeps records of student progress/performance. 20) Accommodates Learners: Accommodates a variety of learners; Pathways for a variety of student abilities; Able to configure program/data for individuals; Accommodates a range of users; Is suitable for multi-ages; Open Ended tasks; useful to a lot of students.  210  21) Critical Thinking: Promotes the development of critical thinking skills; Thought provoking; Developes higher level thinking skills; Encourages problem solving. 22) Creativity: Encourages creativity. C  TECHNICAL DESIGN  23) Compatibility, Hardware/Software: Is it [CDROM] compatible with our computers?; Is/are software/files transferable to/between other existing programs; No bugs to freeze program; C D wont freeze; Is the C D R O M compatible with existing security software?; How much memory does it use? 24) Dual Platform: C a n the C D R O M be run on our Macintosh and IBM computers? 25) Sound: Clear and age appropriate pronunciation/speech/voices; Verbal instructions clearly support text; Good sound 26) Animation: Prudent use of animation/sound; 27) Multimedia: Makes good use of multimedia features, Virtual Tours; Allows children to use and make multimedia presentations 28) Graphic Design: Good use of color & graphics; High quality graphics; Snappy clear graphics; Not busy or cluttered screens; Good/great graphics; Visually clear/interesting; Modern visual appearance; Clear Icons; Graphics and text support each other. 29) Program Tempo: Does not waste time; C D moves along quickly; Does not have a long introduction. 30) Customization: Teacher can modify/change parameters/settings; Sound can be turned on/off. 31) Documentation/Help: Good documentation; Clear online help features. 32) Intuitiveness: Easy to use; User friendly; User friendly interface; Easy to use, especially for young children. 33) Printing/Copying: C a n you control printing?; Able to print graphics?; Able to copy graphics? D  SOCIAL CONTENT  34) Social Content: Content appropriate; Appropriate values portrayed; Appropriate I language used.  E OTHER 35) Previewing: Is it [CDROM] available to preview?; Can I review/test the C D R O M before having to purchase? 36) Usage: Projected use. 37) Cost/Price: Purchase Price/Cost? 38) Time: Have the time and opportunity to evaluate C D R O M .  211  N  Final Set of C D R O M Selection Criteria Codes CDROM Selection Criteria Codes  (Criteria definitions are those of the participants. Words in italics are the researcher's clarifications.) A CONTENT: 1) Educational: Educationally appropriate/valuable; Educationally sound; Arcadeshoot-em ups are kept minimal; Not just electronic worksheets. 2) Edutainment: Educational content but fun/enjoyable as well; Balance of information and glitz. 3) Teacher's Curriculum: Compliments educator's program goals/resources; Content is relevant to class work; Reinforces classroom themes. 4) Provincial Curriculum (Also District & School): Compliments B.C. 
Curriculum; Does it meet learning outcomes for Info/Tech IRP; Corresponds to IRPs. 5) Complimentary Resource: Compliments school's existing resources (does it duplicate print resources, etc?; Is the CDROM an improvement over existing resources?); Fills a resource gap in library; Meets school's needs. 6) Age Appropriate/Readability; A g e appropriate content; Appropriate reading level for elementary students; Appropriate vocabulary level; A g e appropriate spoken/written language; Not too much text on screen; Good amount of text on screen. 7) Meaningfulness: Is the C D R O M (content and activities) meaningful to student? 8) Integration/Diversity, Subject: Can it [the CDROM] be used/applied/integrated across a range of subjects? 9) Canadian Content: Canadian content (versus American); Canadian vocabulary/terms (e.g., metric terminology); Not lacking Canadian content 10) Authenticity/Authority: Authenticity of content; Is the person/group an expenV(s) on the subject? 11) Accuracy: Accuracy of content; Is the content believable and true? 12) Currency: Recency; Is the information/content updated with recent content? 13) Depth of Content: Depth of information; C D R O M should not be lacking content Information must go beyond surface facts.  B  INSTRUCTIONAL DESIGN 14) Transferability: Are skills (e.g., hypertext linking methods) transferable to other existing programs?; Are navigational features the same as those in existing programs so students don't have to learn a new set of skills. 15) Consistency:Format/sequencing/pathways are consistent 16) Interactivity: User interaction/interactivity encouraged; Interactive format; High level of interactivity; Involves the students to a high degree. 212  17) Motivation: Student would want to use program again; Format attractive to user; Motivating; Not boring. 18) Feedback: Student gets appropriate feedback (e.g., Tells you through text, sound or animation that your answer is correct or not). 19) Patronizing: The program must not talk down to the student; C D R O M must not underestimate the child; Shouldn't be too cute or cutesy; Program must not put on an air of superiority. 20) Constructivism Student is in control; Program design is based on simulation or on the child discovering information; Program encourages students to make choices and control (construct) their learning experiences 21) Technology: Makes appropriate use of technology (e.g., can the same objective(s) be reached as quickly/effectively or more quickly/effectively with another low tech resource); Helps instruct students on skills specific to computer technology (e.g. helps to develop mouse skills). 22) Assessment: Keeps records of student progress/performance 23) Learner Diversity: Accommodates a variety of learners; Pathways for a variety of student abilities; Able to configure program/data for individuals; Accommodates a range of users; Is suitable for multi-ages; Open Ended tasks; useful to a lot of students; Can be used in 3 or more grade levels. 24) Individual, learner; Design focuses on instructing a specific grade, age, or small group. Limited to instruction in two or less grade levels. Is good for the E S L students in my class? 25) Critical Thinking: Promotes the development of critical thinking skills; Thought provoking; Developes higher level thinking skills; Encourages problem solving. 26) Creativity: Encourages creativity.  C TECHNICAL DESIGN 27) Compatibility, Hardware/Software: Is it [CDROM] compatible with our computers (or with older computers)?; Is software [e.g. 
.graphics] transferable to/between other existing programs; No bugs to freeze program; C D wont freeze; Is the C D R O M compatible with existing security software?; How much memory does it use?; Upgradeable? 28) Dual Platform: Can the C D R O M be run on our Macintosh and IBM computers? 29) Multimedia Prudent use of animation/sound; Clear and age appropriate pronunciation/speech/voices; Verbal instructions clearly support text; Good sound; Makes good use of multimedia features, Virtual Tours; Allows children to use and make multimedia presentations. 30) Font/Text Clarity: Clear readable font/labels/captions; Text is clear. 31) Graphic Design: Good use of color & graphics; High quality graphics; Snappy clear graphics; Not busy or cluttered screens; Good/great graphics; Visually clear/interesting; Modern visual appearance; Clear Icons; Graphics and text support each other. 213  32) Program Tempo: Does not waste time (i.e. quickly involves user with fun/educationally sound material); C D moves along quickly; Does not have a long introduction. 33) Customization: Teacher can modify/change parameters/settings; Sound can be turned on/off. 34) Documentation/Help: Good documentation; Clear online help features (whether the help is directly off the CDROM or off the internet via the CDROM, like Encarta). 35) Intuitiveness: Easy to use; User friendly; User friendly interface; Easy to use, especially for young children; logical organization of material. This is a general intuitiveness, and not just limited to navigation. 36) Accessibility: Easy to run; Easy to start up; Easy to move around in program; Easy to get into and out; Easy access; Is it obvious how to proceed; Easy to escape/exit from program; Clear, easy instructions [navigation] Do not have to struggle with two disks; Search engine is easy to access and use. 37) Printing/Copying: Can you control printing (e.g., can you print selected material)!; Able to print graphics?; Able to copy graphics? 38) Networking (Also Site License): Can it [CDROM] be used on a Network?; Can I get a site license? 39) Internet: Advantageous to have C D R O M connect to the internet.  D  SOCIAL CONTENT 40) Producer's Values: Does not have commercial content; Production values parallel school's. 41) Social Content (General): Content appropriate (e.g., no racism, violence, sex, stereotyping); Appropriate values portrayed; 42) Language Use: Appropriate language used (e.g. no slang like the slang expression forget it").  E  OTHER 43) Previewing: Is it [CDROM] available to preview?; C a n I review/test the C D R O M before having to purchase? 44) Usage: Projected use (How much will the CDROM be used?). 45) Cost/Price: Purchase Price/Cost?; Is it expensive Update/Upgrade & Site/Package pricing? 46) Time: Have the time and opportunity to evaluate C D R O M (e.g. 
release time)

O  Example of Content Analysis

(Note: The spreadsheet in this appendix is an example of the content analysis conducted in this study. The analysis is for only one of the three stages of the interview, and for only one of the five groups of educators.)

[Coding matrices: for each of the five teacher librarians, the CDROM selection criteria codes assigned by Researchers 1, 2, and 3 (see Appendix N for definitions), together with the code matches for each researcher pair (1 & 2, 2 & 3, and 1 & 3), tabulated across criteria 1-17, 19-36, and 37-45.]

Summary of code matches and percentage agreement, Teacher Librarian group:

                                   Teacher      Teacher      Teacher      Teacher      Teacher
                                   Librarian 1  Librarian 2  Librarian 3  Librarian 4  Librarian 5
Codes identified                        16           17           11           14           11
Code matches, Researchers 1 & 2         14           14            8           12            9
Code matches, Researchers 2 & 3         10           13            8           13           10
Code matches, Researchers 1 & 3         11           13           11           13            8
Mean code matches                    11.67        13.33         9.00        12.67         9.00
Agreement, Researchers 1 & 2         87.5%        82.4%        72.7%        85.7%        81.8%
Agreement, Researchers 2 & 3         62.5%        76.5%        72.7%        92.9%        90.9%
Agreement, Researchers 1 & 3         68.8%        76.5%       100.0%        92.9%        72.7%
Mean agreement                       72.9%        78.4%        81.8%        90.5%        81.8%
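The percentage agreement figures in this summary follow directly from the match counts: a pair's agreement is the number of matching codes divided by the total codes identified for that educator, and the educator's mean agreement is the average across the three researcher pairs. The short sketch below illustrates the calculation under that reading of the totals; the function and variable names are illustrative only and are not drawn from the study.

```python
# Illustrative sketch (assumed reading of the summary table above):
# a pair's percentage agreement = matching codes / total codes identified for the educator.

def percent_agreement(matches: int, total_codes: int) -> float:
    """Share of the identified selection-criteria codes on which a researcher pair agreed."""
    return 100.0 * matches / total_codes

# Worked example using the first teacher librarian column:
# 16 codes identified; pairwise matches of 14 (R1 & R2), 10 (R2 & R3) and 11 (R1 & R3).
pair_matches = {"Researchers 1 & 2": 14, "Researchers 2 & 3": 10, "Researchers 1 & 3": 11}
total_codes = 16

agreements = {pair: percent_agreement(m, total_codes) for pair, m in pair_matches.items()}
for pair, value in agreements.items():
    print(f"{pair}: {value:.1f}%")                 # 87.5%, 62.5%, 68.8%

mean_agreement = sum(agreements.values()) / len(agreements)
print(f"Mean agreement: {mean_agreement:.1f}%")    # 72.9%
```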
