A Study of an Educational Game for Learning Programming

by

Jamie McKee-Scott

B.Sc., The University of British Columbia, 2012

A THESIS SUBMITTED IN PARTIAL FULFILLMENT OF THE REQUIREMENTS FOR THE DEGREE OF
MASTER OF SCIENCE
in
THE COLLEGE OF GRADUATE STUDIES
(Interdisciplinary Studies)
THE UNIVERSITY OF BRITISH COLUMBIA
(Okanagan)
June 2015

© Jamie McKee-Scott, 2015

Abstract

Introductory, first-year programming classes are challenging for many students. Unlike other first-year science courses, computer science is not required for entry to most first-year science programs, and students struggle with bug hunting and code planning, especially if they have never programmed before. This thesis introduces the development of a web-based educational game, called Code Age, as a tool for supporting the learning of basic programming concepts. Code Age is a mini-game style game designed to be used at the students' convenience, and currently consists of three mini-games related to introductory programming concepts encountered throughout the process of learning programming. A pilot study was designed to evaluate the prototype in terms of using mini-games for learning, finding the correct measures for evaluating engagement, and continuing the project. The pilot study consisted of a heuristic evaluation and a usability study. The heuristic evaluation found few usability errors in the software, and those errors did not impact the results and analysis of the usability study. The usability study contained three questionnaires: the Intrinsic Motivation Inventory, an engagement pre/post-test, and a usability questionnaire. The IMI showed that students found value and enjoyment in the game. The engagement questionnaire was adapted from the NSSE, and its results showed promise, despite the fact that the study did not have enough participants and the adapted questions were too course-oriented. Suggestions for modified questions are proposed. The usability questionnaire demonstrated that student opinion of the mini-games was diverse, that a wide variety of games is necessary to maintain engagement, and that the balance between conceptual and physical challenge must be given close attention throughout these games. The pilot study ultimately concluded that the engagement questionnaire needed revision, that the mini-games were a good choice, and that the system was intuitive and easy to use despite the usability errors. Finally, the study showed that even the students who did not care for Code Age itself felt that games are useful for supporting learning in general and learning programming in particular.

Preface

The study detailed in this thesis was conducted with the approval of the UBC Okanagan Behavioral Research Ethics Board (BREB) under protocol H14-01105. This work has been supported by a university graduate fellowship, a teaching assistantship, and a research grant.

Table of Contents

Abstract
Preface
Table of Contents
List of Tables
List of Figures
Acknowledgements
Dedication
Chapter 1: Introduction
Chapter 2: Background
  2.1 Digital Game-Based Learning
  2.2 Education Theory
  2.3 Critique of Educational Gaming Research
  2.4 Usability
Chapter 3: Game Development
  3.1 Game Design Process
  3.2 Development
    3.2.1 Prensky's Six Structural Elements
    3.2.2 Game Mechanic 1: Buttons
    3.2.3 Game Mechanic 2: Q&A
    3.2.4 Game Mechanic 3: Point&Shoot
    3.2.5 Application of Prensky's Six
    3.2.6 Good Computer Game Design
Chapter 4: Methodology and Game Analytics
  4.1 Usability Study
    4.1.1 Demographics and Background Questionnaire Pre-Test
    4.1.2 Engagement Pre/Post-Test
    4.1.3 Intrinsic Motivation Inventory Post-Test
    4.1.4 Usability Questionnaire Post-Test
  4.2 Heuristic Evaluation
    4.2.1 Demographics and Background Questionnaire
    4.2.2 Heuristic Evaluation Questionnaire
    4.2.3 Heuristic Follow Up
  4.3 Game Data Collection
Chapter 5: Usability Study Analysis and Results
  5.1 Demographics and Background Questionnaire Analysis
  5.2 Engagement Pre/Post-Test Analysis
    5.2.1 Pearson's Chi-Square Test
  5.3 Intrinsic Motivation Inventory Analysis
    5.3.1 Activity Perception Questionnaire
    5.3.2 Task Evaluation Questionnaire
  5.4 Usability Questionnaire Analysis
    5.4.1 General System Questions
    5.4.2 Aesthetics of the General System
    5.4.3 The Caveman Island
    5.4.4 Game 2
    5.4.5 Game 3
    5.4.6 Game 4
    5.4.7 Game 5
    5.4.8 Final Questions
Chapter 6: Heuristic Evaluation Analysis and Results
  6.1 Demographics and Background Questionnaire Analysis
  6.2 Heuristic Evaluation Analysis
  6.3 Heuristic Follow Up Results
  6.4 Plot Line
Chapter 7: Conclusion and Future Work
  7.1 Conclusions of the Game
  7.2 Conclusions of the Study
Bibliography
Appendix A: Supplementary Data
Appendix B: Screenshots
Appendix C: Game Content
Appendix D: Introductory Emails
  D.1 Instructor Consent Email
  D.2 Usability Study - Student Email
  D.3 Heuristic Evaluation - Evaluator Email
Appendix E: Consent Forms
  E.1 Usability Study Consent Form
  E.2 Heuristic Evaluation Consent Form
Appendix F: Questionnaires
  F.1 Usability Study
    F.1.1 Demographics and Background
    F.1.2 Engagement Pre-test
    F.1.3 Engagement Post-test
    F.1.4 Intrinsic Motivation Inventory
    F.1.5 Usability
  F.2 Heuristic Evaluation
    F.2.1 Demographics and Background
    F.2.2 Heuristic Evaluation
    F.2.3 Heuristic Follow Up

List of Tables

Table 4.1 Logins and High Score Page Views Summary
Table 4.2 Game Score Statistics
Table 4.3 Game Play Statistics
Table 4.4 Game Time Statistics
Table 5.1 Demographic Summary
Table 5.2 Memory Games Agreement
Table 5.3 Reflex Games Agreement
Table 5.4 Self-Rated Programming Ability
Table 5.5 Engagement Questionnaire Part 1
Table 5.6 Engagement Questionnaire Part 2
Table 5.7 Engagement Questionnaire Part 3
Table 5.8 Engagement Questionnaire Part 4
Table 5.9 Engagement Questionnaire Part 5
Table 5.10 Activity Perception Results Part 1
Table 5.11 Activity Perception Results Part 2
Table 5.12 Task Evaluation Results Part 1
Table 5.13 Task Evaluation Results Part 2
Table 5.14 General System Questions
Table 5.15 General System Aesthetic
Table 5.16 Caveman Island Questions
Table 5.17 First Game Questions
Table 5.18 Second Game Questions
Table 5.19 Third Game Questions
Table 5.20 Fourth Game Questions
Table 5.21 Fifth Game Questions
Table 5.22 Final Usability Questions
Table 6.1 Game Preferences
Table 6.2 Error Severity Data
Table A.1 The Chi-Square Data of the 29 Engagement Items
Table C.1 Correct Reserved Words Used in Game
Table C.2 Correct Method Names Used in Game
Table C.3 Correct Class Names Used in Game
Table C.4 Incorrect Words Used in Game
List of Figures

Figure 3.1 Button Mechanic figure
Figure 3.2 Q&A Mechanic figure
Figure 3.3 Point&Shoot Mechanic figure
Figure 5.1 Participant Age by Gender figure
Figure 5.2 Participant Self-Ratings of English and Math Skills
Figure 5.3 Participant Previous COSC Experience
Figure B.1 The Login Screen
Figure B.2 The World Map
Figure B.3 The Caveman Island
Figure B.4 The entrance of the Cave
Figure B.5 The Sabretooth Kitten
Figure B.6 The Caveman Guide
Figure B.7 The Greek Island
Figure B.8 Inside the Greek Temple
Figure B.9 The Greek Kitten
Figure B.10 The Greek Guide
Figure B.11 The High Score Page

Acknowledgements

There are a great many people who have contributed to the work of this thesis, and without these hard-working and dedicated people, this project would never have gotten as far as it has.

First, thank you to Dr. Patricia Lasserre. Your inspiration and guidance saw this project grow enormously. Without your support this project probably would have floundered like a fish out of water.

Second, we really could not have done this project without Brayden Pellegrini. Your hard work, perseverance, and patience are incredible. The number of times you had to go back and modify minuscule properties of transitions and animations could probably be likened to some form of torture. Thank you for all of the time you put in to tinkering, developing, and testing.

Third, Kate Eggleston's art is lovely. Your work brought a whole new dimension of interest and legitimacy to the project. Without your art, the game would not be a game.

Fourth, Tony Culos' dedication, ideas, and enthusiasm helped the creative juices flow during development. You always brought a sense of cheer and optimism, and it was greatly appreciated on many occasions.

Finally, thank you to my supervisory committee, Dr. Ramon Lawrence and Dr. Susan Crichton. I sincerely appreciate your support, and thank you for dedicating your time to my work. Your ideas and guidance were invaluable to me.

Dedication

To everyone who saw the potential in Code Age, and helped to see it come to fruition.

Chapter 1: Introduction

Introductory computer science courses are often described as quite challenging and as having a large drop-out rate. Unlike other introductory science courses, many students have never been introduced to programming prior to university. Learning how to program is a process of many steps, and students often struggle with it. One study examined science students, including 459 who succeeded in introductory programming and 119 who dropped the course, and determined that students dropped the course for several reasons including, but not limited to, course scheduling and course difficulty [KM08]. This same study identified that students struggled most with bug hunting and planning original code.
In general, some of the key identifiers of successful versus unsuccessful students can be described as follows: "Students who showed academic interest were likely to pass, whereas students lacking self-direction or work efficiency were likely to drop out" [LI13].

A method of helping students learn that has been gaining in popularity is educational gaming, or digital game-based learning. In light of the struggle that many students have with learning how to program, a prototype of an educational programming game, called Code Age, was developed. This work revisited the Java-based prototype described in [LK10, MSL11] and created an online system containing web-based implementations of the games described. This thesis work discarded the Java-based system and began the development of a purely web-based system. The back-end, server-side development, as well as the initial game implementation, was done by the author, while the aesthetics, transitions, and image-loading speed-ups were implemented by Brayden Pellegrini. All artwork used in the system and games was created by Kate Eggleston.

Given the popularity of game-based learning, it is clear that Code Age is not the first prototype of its kind; in fact, there are several others in the field [BPCL08, LW11]. However, Code Age appears to be unique in its use of the mini-game style system and its focus on basic introductory programming concepts, such as reserved words. Most other games tend to focus on more complex concepts, such as recursion and iteration, and use more complex gaming systems, such as 3D role-playing game worlds.

A prototype such as this raises many academic questions. Will a game like Code Age engage or motivate students to succeed academically in a notoriously challenging subject area [GM10]? Will students even like a game for learning programming? Can an educational game for learning programming be useful to students? Is Code Age user friendly? Most importantly, this thesis will attempt to evaluate Code Age with respect to motivation and engagement. Other programming games have been analyzed only anecdotally with respect to engagement and motivation, so this thesis will attempt these analyses using a qualitative method. To answer these questions, inspiration and direction were pulled from three distinct fields: Game Design, Education, and Human-Computer Interaction. The questions this research will address specifically are: are games for supporting learning feasible, are mini-games a valid choice for learning games, what is the correct set of measures for evaluating engagement, and should this project be continued?

Overall, this thesis work looks at the development and evaluation of an educational game for learning programming. Chapter 2 discusses the literature around educational game development, while Chapter 3 details the design and development of the educational game, Code Age. Chapter 4 contains a brief discussion of the procedures and measures used in the two-part research study of Code Age, and Chapters 5 and 6 present a thorough analysis of the results obtained. Finally, the conclusions of the thesis and the potential future work are discussed in Chapter 7.

Chapter 2: Background

Tackling the academic aspects of game development introduces a variety of reviews and analyses of the current state of games, and incorporates topics like best practices, challenges met, and design patterns.
To delve directly into the most practical source of information for Code Age, one needs only look at the burgeoning subject of educational game design. Educational game design highlights the most important aspects of creating a game for learning by describing frameworks, constructing definitions, and drawing connections between education theory and game design practice.

One of the most important aspects of designing a game for learning can be discussed through the lens of education and education research. Education research has collected ideas and principles for fostering learning. While there is no absolute "best practice" for getting students to learn, drawing on information from this area during both the development and the study of an educational game is necessary for success. One of the several crucial tenets of modern-day education theory revolves around learner engagement and, subsequently, motivation, which are consequently two key elements of this thesis.

Drawing information from both Game Design and Education is critical to create the underpinnings of Code Age, but to create a well-rounded learning game, it is necessary to also involve principles of Human-Computer Interaction. HCI provides the principles and frameworks for good user interface design, which is essential in determining student satisfaction with Code Age. These ideas of HCI, combined with the ideas of engagement and motivation from education, form the basis of the research study performed on Code Age. In this chapter, the current state of Educational Game Design, engagement and motivation in Education, and user interface design in HCI will be discussed.

2.1 Digital Game-Based Learning

A game has been defined to be "a voluntary activity structured by rules, with a defined outcome (e.g. winning/losing) or other quantifiable feedback (e.g. points) that facilitates reliable comparisons of in-player performances" [YSC+12, p. 63]. This definition has been elaborated upon to define educational games, also known as learning games, as games that "target the acquisition of knowledge as its own end and foster habits of mind and understanding that are generally useful or useful within an academic context. Learning Games may be associated with formal educational environments (schools and universities, online or off), places of informal learning (e.g., museums), or self-learners interested in acquiring new knowledge or understanding" [YSC+12, p. 64]. It should be noted that this formal definition of an educational game clearly rules out models, simulations, and knowledge visualization tools. In fact, Prensky goes so far as to categorize visualization and simulations as forms of play, closer to toys than actual tools or games, because they lack the "play-to-win" mentality that is so key to the definition of a game [Pre01a].

There is a fair amount of terminology associated with educational games research, beginning with the concept of game-based learning, sometimes called digital game-based learning (DGBL). Game-based learning can be described as a set of three paradigms: author-based, play-based, and visualization-based. In author-based learning, the student does large amounts of programming in order to create or complete a game, and the student's reward is a fully functioning game at the end.
In visualization-based learning, the student interacts with a 3-dimensional world in which they watch the execution of code and its resulting actions; the students do not focus on coding or game play, they focus on problem solving. Finally, play-based gaming places less stress on coding and focuses more on concept comprehension and mastery through the medium of playing games [LW11]. It is interesting to note the similarities and differences between educational games and DGBL as a whole. There is a clear intersection between play-based DGBL and educational gaming, yet by its own definition educational gaming explicitly excludes visualization-based DGBL. A term often used synonymously with DGBL is serious games. Serious games are described as "games that do not have entertainment, enjoyment or fun as their primary purpose" [SLL+12, CBM+12]. It is interesting to note the subtle difference between this definition and the definition of DGBL. Where DGBL explicitly states a definition of a game as a premise and then builds a definition of DGBL from this premise, the definition of serious games relies upon an implied definition of a game to define itself. This reliance seems to lend more credibility to the definition of DGBL when the two are compared side by side, as the definition provided by DGBL is more explicit and precise. Code Age clearly falls into both the category of play-based learning and the category of educational gaming.

The taxonomy of digital game-based learning carries on to include categories of games, sometimes called genres, which include Action, Adventure, Fighting, Puzzle, Role-Playing, Simulation, Sports, and Strategy [Pre01a, CBM+12]. This system of classification allows for distinguishing similarities and differences between games. However, it has been argued that classifying educational games by genre is unnecessary, or even irrelevant, as the link between genre and learning is unclear [CBM+12]. If classification by genre were necessary, Code Age would fall under the category of Adventure game, because the player must solve puzzles to find a path through an unknown digital world. Along with categorizing games themselves, the assessment of games can also be categorized. Assessment of gaming is described in two main ways: game-centred assessment and player/learner-centred assessment. Game-centric assessment is the evaluation of the game and its environment with respect to ease of use, fun, educational value, and quality, while user-centric assessment is the evaluation of the user with respect to knowledge, skills, and motivation [GM12]. These evaluations can be done in two ways: summative assessment and formative assessment. Summative assessment is used to assess learning in order to determine whether the attempted learning goals or outcomes of the game were achieved. Summative assessment is typically performed upon completion of an important learning event [GM12, May12]. Formative assessment is used to evaluate the learning process with the hope of improving it. One type of formative assessment is called stealth assessment. Stealth assessment is assessment that is embedded in the educational game in such a way that it goes unnoticed by the learners [GM12]. This type of assessment is typically done continuously throughout the use of the game.

As an educational game, Code Age clearly falls into the category of an adventure game. In this study, Code Age is assessed in both a game-centric and a user-centric fashion.
Frameworks have been created for measuring usability, fun, game design, and game-centred assessment. In order to focus more on user-centred assessment, specifically in regard to motivation, more discussion of the evaluation of educational aspects is required. Prensky believes that "the principal roles of fun in the learning process are to create relaxation and motivation. Relaxation enables a learner to take things in more easily, and motivation enables them to put forth effort without resentment" [Pre01a]. Code Age has attempted to embody Prensky's belief by allowing students to interact with the concepts they are learning in a fun and risk-free environment.

2.2 Education Theory

Student engagement is a subject of intense scrutiny in modern-day Education research. Engagement is researched at an international level by universities around the world, and in fact is one of the key constructs used in university rankings [EK14]. Engagement has been described as essential for learning: an engaged learner is open and ready to receive and process knowledge. Engaged learners set goals and anticipate positive outcomes in their learning; they value learning, and have productive work habits. Engaged learners also tend to persevere and work harder when they encounter challenges [SM12]. Indeed, engagement has emerged as a multi-faceted concept for comprehending and improving outcomes of struggling and successful students alike [FZ12].

Engagement consists of both psychological and behavioral components, and in modern research engagement has been subcategorized into four categories: academic engagement, social engagement, cognitive engagement, and affective engagement [FZ12, Ree12, Ain12]. Academic engagement incorporates the embodiment of behaviors that affect learning, such as paying attention, participating in extracurricular academics, and assignment completion both at school and at home. These types of behaviors are necessary for learning to occur [FZ12, Ree12]. Social engagement encompasses student behaviors that adhere to the constructs of proper classroom conduct or etiquette, for example, punctuality, appropriate social interactions (with both students and instructors), in-class participation, and the ability to not disrupt classmates and lectures. It has been noted that low social engagement inhibits learning [FZ12]. Cognitive engagement occurs when students go above and beyond the normal methods of comprehension and begin to demonstrate inquisitive actions, confirmation of clarity, perseverance, and the search for and incorporation of outside sources of knowledge (e.g., researching class topics on the internet, in the library, or from a colleague/expert). This type of engagement is where higher-level learning and the learning of complex information occur [FZ12]. Other literature also includes the constructs of interest, effort, and use of strategy as key determinants of cognitive engagement [Ain12]. Finally, affective engagement is the emotional investment of being involved, whether in the classroom or the school as a whole; the feeling that school is a worthwhile use and investment of time and energy. Affective engagement is characterized by feelings of belonging and community, and is the source of the motivation required for students to exist in the academically, cognitively, and socially engaged states. In other words, without affective engagement, students would not have the energy or drive to be engaged academically, cognitively, or socially [FZ12].
Other literature [Ain12, Ree12] also recognizes emotional engagement and agentic engagement. While each paper's groupings of the main factors that comprise each type of engagement are similar, each differs slightly or encompasses slightly different constructs, meaning that where one author may categorize interest as a subcomponent of cognitive engagement, another paper may categorize interest as a subcomponent of emotional engagement. While these differences do appear, most literature groupings share common, standardized traits or constructs, in fact very similar to those discussed above.

It is worthwhile to note that it is difficult to create a taxonomy of engagement that categorizes all constructs of engagement without overlap [Ain12]. When discussing engagement, the underlying process of motivation must also be addressed. Where engagement encompasses a series of openly observed behaviors, motivation is a psychological process comprised of unobservable neural and biological parts [Ain12, Mar12]. In essence, engagement can be seen as "a descriptor for the level of involvement or connection between a person and activity" [Ain12, p. 285], while motivation is "necessary but not sufficient for engagement" [Ain12, p. 286].

Reeve [Ree12, p. 165] has shown that changes in engagement affect motivation in a causal way. Motivation, or the forces that drive or power behaviors and actions, can be generated from many sources, and is necessary to fuel student engagement [Ree12, p. 150]. One way to distinguish motivation from engagement is to view engagement as the student truly connecting with the task at hand, instead of simply, robotically carrying out the task, while motivation is the force that pushes or energizes the student to make this connection with the task at hand [Ain12, p. 285]. The key indicator in measuring intrinsic motivation happens to be interest [The99]; however, other scales are often used to measure motivation in more general ways. In particular, self-efficacy scales are often used to analyze motivation in DGBL research [GM12].

Self-efficacy is a person's belief, or self-perception, of his or her own ability to learn or perform at a specific level of competency [SM12, p. 219]. Self-efficacy has been demonstrated to positively affect learning, achievement, and motivational aspects such as interest, effort, and perseverance. Students who feel self-efficacious have been shown to be engaged learners, and as discussed, engaged learners have more skills and tools for success in their behaviors [SM12].

It is clear that there are many factors involved in research on engagement and motivation, and that engagement is a critical component of effective learning. To embody these components in Code Age, and to develop it as a tool for successful learning, Code Age will be evaluated on its ability to motivate learners in introductory programming. This evaluation will be a way to determine whether this specific instance of game-based learning can engage and motivate students in learning programming.

It is because of the belief that, without instructor guidance, games tend not to reach their full potential for encouraging student learning that Code Age has been designed as a learning tool to integrate with an introductory programming course.
This belief in instructor guidance is one that Greeno agrees upon, with respect to the importance of interaction in learning, commenting that "Students' learning can be influenced by the ways they are positioned for participation in interaction. Students can learn to apply procedures and to explain meanings of concepts, but they learn differently if they participate by generating, questioning, and evaluating ideas" [Gre06, p. 813]. Instructor-student interaction is valuable in all circumstances of learning, and instructors can foster student discussion and influence how students learn by scaffolding the student learning through educational gaming. Instructors can also engage in learner scaffolding while addressing the concern of Young et al. that learning through games alone is insufficient to master knowledge of subject matter, and that there must also be a bridge between in-game conceptions of the subject matter and real-world experience and application of the subject matter [YSC+12, p. 73].

Educational games may try to bridge this gap alone, but not all games succeed. This deficiency is where student-instructor interaction is critical in guiding student learning. Of equal importance to learner-teacher interaction in the learning environment is learner-learner interaction. As games tend to be intrinsically social in nature, educational games have the potential to be beneficial to all who are engaged with them, not only those who are playing them. These benefits occur because the learners who watch their fellows play these educational games have more time to think and reflect on the tasks and challenges than the learners who are actively processing how to complete and solve these tasks and challenges [YSC+12, p. 75]. This time for reflection tends to promote discussion and engagement, to the benefit of the learning process.

Motivation has been shown to be an integral part of students, their achievements, attributions, and beliefs about themselves [Gre06, p. 805]. Motivation has also been intrinsically linked to educational games and their systems of reward based on the successes and failures of the game player. Howard-Jones et al. discuss motivation, reinforcement learning, and educational gaming through the lens of psychology; through this lens they posit that simplicity and repetition, like that required of reinforcement learning, may shed light on the successes and failures of educational games [HJ11, p. 33].

2.3 Critique of Educational Gaming Research

One agreed-upon commonality of educational gaming research is that it is diverse, fractured, inconsistent, and inherently individual, therefore making it difficult to quantify research on game-based learning [CBM+12, SLL+12, May12, YSC+12]. While there is no definitive cause for this inadequacy, many possible causes and solutions have been posited.

One solution described by Young et al. asserts that successful educational games need to be well designed, with game play based on current learning and education theories, so that the game operates in a way that considers the learner and his or her capabilities and learning environment, in order to encourage the student to reflect on the knowledge presented by the game. Furthermore, many learning games only present information to the learner topic by topic, with no sense of cohesive general knowledge or curriculum, and this lack of cohesion is unfavorable to the learner's overall learning, comprehension, and even conceptual understanding of the subject matter.
As a result, it can be stated that educational games should be designed based on "systematic learning objectives", and in ways that "move beyond the lowest tiers of Bloom's taxonomy" [LI10], in order to build upon student knowledge so that instead of learners experiencing reinforcement learning, they are actually developing conceptual understandings of the subject matter. The goals of the games should align with the curricular learning outcomes of the subject matter, and the lack of cohesion of concepts presented to learners in educational games could be remedied by creating educational games that utilize relevant, well-thought-out narratives and backstories in order to engage the learners and scaffold their knowledge bases [YSC+12, p. 68-72].

Essentially, what educational gaming research is lacking is a general "science of game-based learning" [May12]. DGBL research is a field that is highly fragmented, with no comprehensive or cohesive multi-use frameworks of evaluation. There are few, if any, well-known or widely accepted theories upon which to postulate and test hypotheses. There are few widely accepted, valid questionnaires, frameworks, scales, or constructs for serious games. One of the most easily remedied problems is the lack of a generalized, standardized tool for stealth formative data-gathering. This inadequacy could be attributed to the lack of an overarching methodology for DGBL research [May12]. This movement towards a science of GBL is important because education researchers are obligated to be accountable and responsible to their users. Researchers are held accountable insofar as the learners who acquire these learning games should know what they are using and what outcomes are achievable from these products. Researchers are held responsible because they are providing education tools to vulnerable members of society, such as children, and therefore researchers must reflect seriously and critically on the effects and values of the games they are promoting [May12].

This fragmentation is a challenge that must be overcome by researchers of educational gaming, and as such it has been addressed by many researchers. For example, Suttie et al. have developed a theoretical framework for the categorization of learning games, for the ultimate purpose of identifying relationships between game mechanics and educational outcomes in order to develop pedagogically sound and effective learning games [SLL+12]. Mayer has proposed a very extensive methodology, or framework, for the evaluation of educational games, including a thorough list of all of the elements, models, variables, and research designs that he deems most important in the research of serious games [May12]. Connolly reviews 129 papers that report evidence of outcomes attributed to games and learning games, and categorizes the research methods used by each type of study as a way of identifying the claims that educational games research makes [CBM+12]. Ghergulescu identifies a set of measures that are commonly used by researchers analyzing the motivation of learners playing educational games [GM12]. These solutions indicate that the fragmented research in DGBL is apparent to the community, and many researchers are working towards solutions; however, until a formal standard is introduced and accepted, solutions to the fragmentation problem are fragmented themselves.

One final issue often raised is that educational games alone are not enough to immerse students in learning.
Howard-Jones discusses this problem, in particular how there are many challenges in mixing learning and gaming, and therefore many possible explanations for the failures of educational games [HJ11, p. 33]. Howard-Jones et al. describe the ideal interaction of educational games with the classroom and teacher as using "techniques that exploit the engaging properties of uncertain reward conjoined with the teacher's scaffolding of learning, feedback to students, and the general dynamic properties of teacher-learner interaction" [HJ11]. It can be stated that in order for educational gaming research to move forward, there must be a concentrated effort to cohesively organize the knowledge and tools required for the mastery of this domain. The way educational games are created and collectively examined should be improved through a combination of conducting "longitudinal studies that examine the impact of educational video games", "creating an educational video game repository", and encouraging "collaborative partnerships among commercial game companies, educational researchers, teachers, administrators, policy-makers, and parents" [YSC+12, p. 82]. This sentiment of partnership is clearly shared by Greeno, who stresses the importance of the social aspects of learning and discourse for concept comprehension and mastery [Gre06].

The engaging quality of games has become obvious by now, given the massive success of the gaming industry, but what is less understood is the idea of enjoyment in games [CBM+12]. While many models for the evaluation of game-based learning have been proposed, some research suggests that there are two main opposing views; namely, that research on either general media enjoyment should take priority or game usability should take priority [CBM+12]. In order to attempt to further the discussion of engagement in digital game-based learning, the concepts of usability and Human-Computer Interaction must be explored.

2.4 Usability

Usability, along with educational value, enjoyment, and quality of experience, has been considered one of the main methods of assessing the game environment of an educational game [GM12]. In the interest of performing game-centered assessment of the usability of Code Age, the field of Human-Computer Interaction can be consulted. Human-Computer Interaction is the study of interactions between users and user interfaces, and the design and use of these interfaces [MW04]. Usability is a key component of HCI that involves the study of user-ease and apparent efficiency, specifically paying attention to usefulness [MW04]. HCI provides the framework for a usability study, in particular how to develop a mixed-methods approach to evaluating user interface design shortcomings, failures, and successes, as well as ease of use, learnability, likability, aesthetics, and more. The first step of the project involved creating a usability study to analyze every aspect of the interface design of Code Age. This analysis should be performed by a student population enrolled in a first-year programming course in order to represent the target audience of the educational game, as this population will provide data that will allow the game to be more accurately designed to engage and educate first-year programmers.

Also of particular interest for the Code Age project is the work of Nielsen in developing a framework of Heuristic Evaluation. Heuristic Evaluation is an expert analysis of a system for the determination of usability errors and their severity [Nie95, Nie92].
The heuristic evaluation involves the use of 3-5 expert evaluators who follow a set list of steps to become familiar with a user interface. As they complete each step, the experts report the level of success at which the system performed on 10 major usability criteria, also called the heuristics. The experts are also asked to elaborate upon any usability failures that the system encountered during this process. Once all users have completed the heuristic evaluation, a list of all usability errors found is created. This list is then distributed back to the experts, who are asked to evaluate the severity of each error found, allowing the researcher to create a prioritized list of errors to address in the next iteration of development [NM90]. The efficiency of the heuristic evaluation can be attributed to the fact that 3-5 expert evaluators are all that is required to find the majority of the usability errors [Nie95]. This combination of usability and heuristic evaluation should create a thorough analysis of the interface in its current state, and provide a detailed list of design and usability aspects to be improved upon as the project moves forward.

The field of educational gaming research is diverse and lacks recognized structure; however, it is clear that the development of an educational game is truly the intersection of three main areas of interest: human-computer interaction user interface design, education theory, and game design. This thesis integrates each of these subjects in turn, and explores the aspects of student motivation related to educational gaming.

Chapter 3: Game Development

A person's attraction to games, both digital and not, is based on several different factors, including culture, ethnicity, religion, age, and even the era a person was raised in [Pre01a]. Knowledge of a game's target audience enables a game developer to design games that cater to the preferences and needs of the audience, including the factors mentioned above. During the development of Code Age, the target audience was clearly defined as introductory programming students; namely, male and female students aged 17-25 (but possibly older), with little to no programming knowledge. People tend to appreciate experiences and interactions that they enjoyed at earlier impressionable ages, say pre-teen or teen years, and given this appreciation, as a game designer it is beneficial to select game styles that the target audience is familiar with and can appreciate. Since mini-game style games have been popular for several generations of gaming consoles, it seemed a logical step in the design of Code Age to focus on creating mini-game style games that the target audience would be familiar with. Consider Mario Party, the longest-running mini-game series, originally released for the Nintendo 64 in 1998. The 11-year-olds who started playing Mario Party at its release would be 28 years old now, and anyone younger than that would have been born after the release of Mario Party, meaning that the students who fall into our target age range of 17-25 should be quite familiar with mini-games. Relatedly, designing shorter, less intense games to introduce students to programming concepts seemed to be the best approach, because it was postulated that students would find these activities accessible, unintimidating, and easy to engage with.
The games were designed to be web-friendly, so that any student with access to a computer could quickly and easily begin to play a game at any time and in any place. The games were designed to be short in length, so that students can easily start playing a game, stop, and pick it up again later; this allows students to play, without major time commitment, whenever they have a moment or two to spare. In this mini-game play style, the students are able to practice syntax and repetitive programming concepts easily. These traits aim to make the game accessible and appealing to students.

3.1 Game Design Process

Many other programming games and tools exist, and they have many different purposes. There are visualization tools and simulations, such as SIMPLE, an environment developed for helping students create game simulations [JCS09]. There are development environments, such as BlueJ [BK03], that are designed to be simpler than traditional IDEs like Eclipse and NetBeans. Some of these development environments have aspects of gamification or visualization, like Greenfoot and Alice [Kö10, Gad11], which use drag-and-drop visualization to solve programming problems. Ultimately, all three of these environments are not games by Prensky's definition of what a game is, and in fact he classifies these as visualization-based DGBL. BeanGrinder (formerly JavaGrinder) is a web-based, problem-solving style tool for learning programming [FP10]. The students are presented with a problem to solve, and they must write the code required to solve the problem. This style of game encourages the students to practice programming. There are also tools that allow students to build their own games using the programming language they are learning, or using a drag-and-drop editor. An example of this type of system is GameSalad [DeQ12], which is a popular tool for mobile game development. However, these types of tools are not truly games; they actually fall into the author-based paradigm of DGBL. When looking outside the realm of academically based programming game development, a much larger field is encountered:

− Collections of games

https://www.codingame.com/ is a collection of puzzles that require the player to write code to solve the puzzles.

http://www.kongregate.com/programming-games is a collection of primarily 2D puzzles that have the player solve logic problems or functions. This site also contains a puzzle game where the player is "hacking" into a system and uses real Unix shell commands to do so.

Both of these game collections are clear examples of play-based DGBL; however, they do not have an engaging story or plot line, they are just collections of games.

− Story-based games

https://codecombat.com/ is a role-playing style game for learning scripting languages. The player completes quests by writing functions to move the player through the dungeon. Code Combat is a larger, more involved world of games, and less of a collection of smaller games, but it clearly uses play-based DGBL.

− Author-based games

Robocode (http://robocode.sourceforge.net/) is "a programming game, where the goal is to develop a robot battle tank to battle against other tanks in Java or .NET. The robot battles are running in real-time and on-screen." Robocode clearly uses author-based DGBL with a strong element of live competition against other robot creators.

Code Age is different from all of these games. It combines aspects from some of these different games, and creates some of its own unique ideas as well.
Where Code Age uses the ideas of practice, like BeanGrinder, and collections of games, like Kongregate and CodinGame, it uses a collection of mini-game style games. Code Age has, however, an involved world or story line, similar to Code Combat. It currently has a lesser level of competition than Robocode, as Code Age uses only the high-score boards. Where CodinGame, Robocode, and the development environments force the student to focus on writing their own code, Code Age does not focus on mini-games that require the player to write code. The design of Code Age attempts to combine the best aspects of play-based DGBL into a unique tool for supporting the learning of Java programming.

The central theme of having lost all knowledge of programming was the inspiration for the name "Code Age". This central theme was the main idea that drove the design of the game, and this story or metaphor was intrinsically integrated with the game as the game was developed. The fact that a university is a place where knowledge is gathered and built influenced the choice of using the idea of slowly regaining lost knowledge, and having society advance as you do so. This theme can be seen clearly in the island progression from the caves of the caveman island to the temples of the Greek island, the long houses of the Viking island, and so on. The design of the world map and the choice of the game name came from the idea that the game should be about gaining knowledge. While this story was used as the theme of the game, a direct plot line was not selected and implemented. Several plot lines were developed with the intention of having the participants of the pilot study select a preferred plot line, the results of which can be seen in section 6.4.

Code Age will require more mini-games in order to maintain its educational value in later parts of the course. The mini-games will need to encompass the subjects covered further into the semester, such as conditionals and loops, arrays, classes, and object usage. The conclusion of the game section of this thesis covers these details more in depth.

3.2 Development

Code Age was designed as a tool for scaffolding, to be used by students to bridge knowledge gaps and apply the theoretical knowledge learned in the course. In this way the mini-games were designed to complement the lecture materials. Concepts that students often struggle with were selected as key subjects to focus on during game development. These concepts included recalling reserved words, proper capitalization of methods and classes, understanding code block statements, and understanding parameter passing; a few lines of Java suffice to illustrate all four, as sketched below. All of these concepts are fundamentals that are covered early on in the course, but students struggle with them, and they cannot advance without proper understanding of these concepts. By focusing on these concepts, the games have educational value in the scope of the first two weeks or so of an introductory programming course. Mini-games were selected as the method of delivery for these programming concepts because mini-games are well-known and well-liked and require a low commitment of time. Mini-games allow the students to interact and get practice with the game whenever they have time to spare, and they do not overwhelm the students or force too much information on the student at one time. The mini-game style allows the student to progress at their own speed.
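To make the four concepts concrete, consider the following short snippet. It is purely illustrative (it is not drawn from Code Age's own word banks or screens) and shows two statements as they might appear inside a method, with comments marking where each concept appears:

    int x = Math.max(3, 7);      // "int" is a reserved word; "Math" is a class
                                 // name, capitalized by convention; "max" is a
                                 // method name, starting with a lowercase letter;
                                 // the values 3 and 7 are actual parameters passed
                                 // to max's formal parameters
    if (x > 5) {                 // "if" is a reserved word; the block statement
        System.out.println(x);   // it controls runs from the "{" to the "}"
    }

Each of the three mini-games described below targets one or more of these elements through direct instruction and repetition.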
By allowing the students to spend time interacting with a tool that gives them extra practice with the concepts they are learning, Code Age is used to support the learning outcomes of the curriculum.

In the early stages of the development of Code Age, a very simple process was followed for coming up with games to create. First, a common or popular game mechanic was selected. In this thesis, a game mechanic is understood to be a typical method of player interaction with the game, such as using the mouse or certain sets of key combinations to interact with the elements of the game. One well-known example of this type of game mechanic is using the WASD keys for walking around a 3D world. Second, this game mechanic was then applied to a programming concept. For example, when the point-and-shoot game mechanic was selected, the next step was to determine what programming concept(s) could be taught using this mechanic. Often the game mechanic could be used as a metaphor for more than one programming concept. At this point it was important to decide which metaphors to prioritize and focus on, and to begin developing the game. The process of game design was based on Prensky's principles of good game design [Pre01b].

Each game was designed to help reinforce the students' current concept map, mental model, or knowledge of each concept. For example, the programming concept of passing parameters is conceptualized in the game as a cannon shooting things into different baskets. This visualization is one example of a mental model of how parameters are passed. If this mental model is similar to the one the student currently has, it can reinforce that model. If it is different from the mental model the student currently has, then it may cause the student to reflect upon their own model and reevaluate it for correctness, and in this way reinforce the student's own mental model. The games were not designed to be educationally challenging, because the course material is challenging enough as it is. However, the games were designed to use Prensky's definition of challenge in games, by physically challenging the students, either through timing the game or aiming the cannon, and causing the student to feel pressure to succeed.

3.2.1 Prensky's Six Structural Elements

Prensky describes the following six elements as key structures of games:

1. Rules
2. Goals and objectives
3. Outcomes and feedback
4. Conflict/competition/challenge/opposition
5. Interaction
6. Representation or story

These structures are important because they are crucial to a game's ability to engage a student. Structure, along with fun and play, are the main components of engagement in gaming [Pre01a].

Rules are what turn play into organized play, or games, by setting boundaries and limitations on play, or imposing order on how achievements and goals can be reached by all players. Rules impose order on how goals may be achieved, and on the strategies used to accomplish goals. Goals are a clearly defined measure of a player's performance or success, and because humans are goal-oriented by nature, achieving goals is a natural source of motivation. Outcomes are how players measure their progress towards goals through receiving feedback. Feedback can take many forms, numerical or otherwise, but feedback is always an immediate response from the game that instructs the player that his or her actions were positive or negative. Feedback also tends to inform players on how they compare to other players, and even when the rules of the game have been broken.
Feedback is where learning takes place in-game; whether the player is merely learning the rules of the game, or actually learning concepts via direct instruction, some form of continuous learning occurs through feedback [Pre01a]. Erhel et al. have demonstrated that deep learning and digital learning games are compatible due to the use of feedback providing correct answers, which stimulates the cognitive processes of learners [EJ13].

Conflict refers to the challenges or obstacles that players are solving or overcoming. These challenges may be competitive or cooperative, but in both cases the players' creativity or excitement is stimulated [Pre01a]. Connolly et al. note that feedback is a more effective tool than competition in game-based learning [CBM+12]. While some people tend to avoid conflict, most people appreciate the stimulation provided by a challenging task, and in game design, the balance of matching the difficulty of the challenge to the skill of the player is necessary to prevent the player from becoming bored or discouraged.

The two key elements of interaction are feedback, which has been discussed, and the social nature of games. Multi-player games are commonplace and demonstrate the social properties that games tend towards. Finally, representation, or story, is the narrative, or the meaning, of the game [Pre01a]. The representation of a game engages the player, and can range from elements as varied and imaginary as fantasy (e.g., role-playing and action-adventure games) to elements as logical and basic as conflict and strategy (e.g., chess) [Pre01a]. Without a story to follow or a point to make, games tend to feel empty, and leave the player wondering why he or she bothers to continue playing. Keeping Prensky's six structural factors in mind, the mini-games designed for Code Age can be discussed.

3.2.2 Game Mechanic 1: Buttons

The first mini-game, pictured in figure 3.1, is a screen containing an assortment of buttons, each of which contains a word, and it uses the mechanic of clicking on buttons to receive feedback. The player is told to determine which buttons contain the correct words and click on them, without clicking on the incorrect words. Every time the player selects a correct word, the score increases and the color of the button changes to green. Every time the player selects an incorrect word, the score decreases and the color of the button becomes red. The player is given 30 seconds to select all of the correct buttons. Once the 30 seconds have elapsed, the player is presented with their final score percentage and an image of gold stars, where the number of gold stars displayed corresponds to the percentage they achieved (i.e., 100% = 5 stars). See Appendix C for the list of correct and incorrect words used in Code Age.

Figure 3.1: One of the Code Age mini-games using the button game mechanic

Three stages have been developed for this game mechanic. In the first stage, every five seconds a button containing an incorrect word is removed from the screen. If the player has already selected this button and it has turned red, the button's removal from the screen does not affect the player's score, as the player has already made the mistake of clicking on the incorrect word. In the second stage, the buttons remain statically on the screen, and no buttons are removed. In the final stage, again, none of the buttons are removed from the screen, but all of the buttons move randomly around the screen.

Currently in Code Age, this three-stage button mechanic is used four times.
First, the player is asked to pick out the correct Java reserved words from a collection that mixes real Java reserved words with random words that merely sound like they could be reserved words. Second, the player is asked to determine the correct Java class names from a collection of random words and correct Java class names. Third, the player is asked to determine the correct Java method names from a collection of random words and correct Java method names. Finally, the player is asked to find something different at each stage of the game: in the first stage, where incorrect buttons are being removed, the player is asked to pick out the Java reserved words from the class names and method names; in the second stage, where no buttons are removed from the screen, the player is asked to pick out the Java class names from the reserved words and method names; and in the third stage, where the buttons are moving, the player is asked to pick out the Java method names from the reserved words and class names. The purpose of this game is to help students learn, through direct instruction, three main concepts. First, students should learn which words are Java reserved words, so that they know these words cannot be used for anything but their designated purpose. Second, students should learn the convention that Java class names begin with a capital letter, as well as the names of some common Java classes. Finally, students should learn the capitalization conventions for method names, as well as some common Java method names.

3.2.3 Game Mechanic 2: Q&A

The second mini-game, pictured in figure 3.2, provides the player with a screen containing the code for a basic Java class and uses a question-and-answer style game mechanic to engage the player. At the top of the screen the player is presented with a timer counting upwards and a question pertaining to the code snippet on the bottom of the screen. The player is able to click on any line of code on the screen to answer the question. Upon selecting the piece of code corresponding to the correct answer to the question, the selected line of code turns green, the score increases, the timer resets to zero, and the question disappears from the screen. Upon selecting an incorrect answer, the selected line of code turns red, the score remains unchanged, the timer resets to zero, and the question disappears from the screen. In both situations, after the question has been removed from the screen, a new question is placed at the top of the screen and the color of all of the pieces of code returns to white. Upon completing all of the questions, the player is presented with their final score percentage and an image of gold stars, where the number of gold stars displayed corresponds to the percentage they achieved (i.e., 100% = 5 stars).

[Figure 3.2: One of the Code Age mini-games using the Q&A game mechanic]

Each time the player plays the game, they are asked 5 questions which are randomly selected from a bank of questions about the code snippet. This increases the replay value of the game because the players are not asked the same questions each time they play. A sample question is, "Where does the class begin?" The player is expected to select the opening curly bracket of the Java class that follows the class header. Currently there is only one stage of this game, containing basic Java class code, and the questions for this game apply to constructors, accessors, mutators, and instance variables.
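The class the game presents is of roughly the following shape. This listing is illustrative only (the exact class used in Code Age appears in the appendix), and the identifiers here, such as BankAccount, are hypothetical; it also demonstrates the conventions the button game targets, since reserved words are lower case, the class name is capitalized, and method names use lower camel case:

    // Illustrative sketch, not the exact class used in Code Age.
    public class BankAccount {

        private int balance;                 // instance variable

        public BankAccount(int balance) {    // constructor
            this.balance = balance;
        }

        public int getBalance() {            // accessor
            return balance;
        }

        public void setBalance(int balance) { // mutator
            this.balance = balance;
        }
    }

A question such as "Where does the class begin?" would be answered by clicking the opening curly bracket on the first line, while questions about accessors and mutators map to the getBalance and setBalance blocks.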
See the appendix for a complete list of questions used in Code Age. The purpose of this game is to teach students, through direct instruction, about block statements and class structure, specifically where different elements of Java classes begin and end, and where convention dictates they should be placed in the class. This game could easily be repurposed to teach students about the form of if statements, loops, and parameter lists.

3.2.4 Game Mechanic 3: Point&Shoot

The third mini-game, pictured in figure 3.3, shows a set of baskets containing functions or methods and cannon balls containing inputs or parameters, and the game uses a point-and-shoot game mechanic to pique the player's interest. At the bottom of the screen the player is presented with several baskets, and each basket contains a method. Each basket's method contains a unique set of formal parameters, which makes each method itself unique. At the top of the screen there is a cannon, and next to the cannon is a display area indicating the actual parameter that is currently loaded into the cannon. The player aims the cannon using the left and right arrow keys, and shoots the cannon using the space bar. The point of the game is to shoot the cannon ball containing the actual parameter into the correct basket, namely the one containing the method whose formal parameter(s) match the actual parameter.

[Figure 3.3: One of the Code Age mini-games using the Point&Shoot game mechanic]

When the player successfully shoots the parameter into the correct basket, the score in the top right corner turns green and increases. When the player shoots the parameter into an incorrect basket, the score turns red and does not increase. The misfired parameter is then reloaded into the back of the queue of cannon balls, to give the player another chance to put it in the correct basket. When the player has successfully shot all of the cannon balls into the baskets, the game is over and the player is presented with their final score percentage, the time taken, and an image of gold stars, where the number of gold stars displayed corresponds to the percentage they achieved (i.e., 100% = 5 stars).

An example of a correct pairing in this game would be the player shooting the actual parameter "Hello" into a basket containing the function myFunc(String s). Each game, four pairs of actual parameters and basket methods are randomly selected from a limited bank of pairs. This increases the replay value of the game, but only slightly; ideally the bank of pairs would be larger and more diverse, to significantly increase the replay value of the game. See Appendix C for a complete list of method and parameter pairs used in Code Age. The purpose of this game is to help students develop a visual understanding of the relationship between formal and actual parameters, in order to help students develop their own mental models. More game screenshots can be found in Appendix B.
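The relationship the game visualizes can also be written down directly in code. In the sketch below, myFunc(String s) and the literal "Hello" come from the example above, while the other overloads and values are hypothetical additions for illustration: each overload plays the role of one basket, and each argument the role of a cannon ball that fits exactly one of them.

    // Illustrative sketch of the formal/actual parameter matching the game visualizes.
    public class ParameterDemo {

        // Each overload is one "basket"; its formal parameter list
        // determines which arguments (cannon balls) it can accept.
        static void myFunc(String s) { System.out.println("String basket: " + s); }
        static void myFunc(int n)    { System.out.println("int basket: " + n); }
        static void myFunc(double d) { System.out.println("double basket: " + d); }

        public static void main(String[] args) {
            myFunc("Hello"); // the actual parameter "Hello" only fits myFunc(String s)
            myFunc(42);      // 42 fits the int basket
            myFunc(3.14);    // 3.14 fits the double basket
        }
    }

Java's compile-time overload resolution performs the same matching the player performs by hand when aiming the cannon.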
3.2.5 Application of Prensky's Six

The mini-games described above clearly apply Prensky's structural factors of good game design.

The application of Prensky's first factor, rules, can be seen in all three game mechanics. In the Button game mechanic there are clearly defined rules for selecting correct buttons rather than incorrect buttons, with corresponding score updates. The rules of the Q&A game mechanic are clearly defined by what the player can and cannot select, especially with respect to correct and incorrect answers. The Point&Shoot game mechanic's rules are clearly dictated by the fact that the player may only tilt and shoot the cannon using the appropriate keys, as nothing in the game is clickable and no other keys control the game. It is clear that correct shots increase the score.

The three game mechanics used in Code Age all utilize Prensky's second factor, goals. The goal of both the Button game mechanic and the Q&A game mechanic is obvious and straightforward: select all of the correct buttons or answers and none of the incorrect ones to achieve a 100% score. The goal of the Point&Shoot game is to put all of the parameters in the correct baskets to achieve 100% in as short a time as possible.

Feedback may be the most obvious structural factor implemented in the games. The different occurrences of the Button game provide the player with feedback in four specific ways. First, upon selecting a word, the button changes color, which tells the player which of their actions are positive and which are negative. Second, in the first stage of the game, incorrect buttons are removed from the screen, providing the player with feedback about which buttons would be incorrect or negative actions. Third, at the end of the game, the player is provided with a score, allowing the player to measure his or her progress towards achieving a goal of 100%. Fourth, as the timer decreases from 30 seconds to zero, the color of the timer changes to warn the player that he or she is running out of time. The Q&A game provides the player with feedback in three ways. First, upon selecting a piece of code, the piece of code changes color, indicating to the player which of their actions were positive and which were negative. Second, at the end of the game the player is presented with their final score, which allows the player to check his or her progress towards completing the game with a score of 100%. Third, as the timer increases it changes color to warn the player that they are taking too long to answer the question; this color change indicates to the player that they are doing well if they are answering all of the questions correctly while the timer color is still green. The Point&Shoot game provides feedback in three distinct ways. First, the score changes color, green for a correct shot and red for an incorrect one, indicating to the player which of his or her actions were positive and negative. Second, at the end of the game, the player is presented with a percentage score and a number of stars, allowing the player to measure progress towards the objective of obtaining 100%. Finally, at the end of the game the player is also presented with the time it took them to complete the game. This time-to-completion feedback allows the player to measure how successful s/he has been, and to work on improving his/her performance.

Challenge and conflict make up Prensky's fourth factor, which is readily apparent in all three game mechanics. The Button games have a very obvious problem for the player to solve: which buttons are the correct buttons? The game integrates a new challenge at each new stage, first by ceasing the removal of incorrect buttons at the second stage, and again by moving the buttons around the screen in the third stage.
Both of these challenges are integrated into the game in order to increase the difficulty of the game without increasing the difficulty of the underlying programming problem the student must solve, thereby retaining the balance between the difficulty of the challenge and the skill of the player. The problem for the player to solve in the Q&A game is evident: find the correct answers to the questions. However, this game does not integrate different levels of challenge to address different levels of player skill. One way it could have integrated more challenge would have been to incorporate time restrictions, offering the player less and less time to answer each question in order to make the game more difficult without modifying the difficulty of the underlying programming task. The problem the player must solve in the Point&Shoot game is clearly defined: which parameters go in which baskets? The game integrates the challenge of moving the baskets while the player is aiming and shooting, which increases the difficulty of the game without increasing the difficulty of the underlying programming concept.

One component of Prensky's structural factors that has not yet been touched on is competition. As discussed in the introduction of Prensky's six structural factors, competition is not as strong a tool as feedback, so the mini-games themselves prioritize providing players with feedback. However, Code Age also has a competitive element integrated into it in the form of a high-score leaderboard. The player may access this leaderboard at any time and see who holds the high score for each stage of each game. In this fashion, Code Age has an opt-in style of competitiveness that allows players who thrive on competition to engage in it, while still leaving the mini-games attractive for less competitive players.

Interaction, the second-to-last of Prensky's structural factors, has also been incorporated. The interaction between the player and the computer has been thoroughly detailed, but the social element of Code Age is less apparent. Currently Code Age is not concurrently multi-player, meaning that while players can all play Code Age mini-games at the same time, they cannot directly play together. Players can compete through the leaderboard scores, which is indirect multi-player meta-gaming. It can be postulated that Code Age in its current form promotes face-to-face discussion between introductory programming students playing the game.

Prensky's final factor, representation or story, has not been addressed in the earlier discussion of the three current game mechanics because each individual mini-game is too short to have its own story. As a result, the game has been designed such that each mini-game has its own purpose and goal, while Code Age as a whole has one larger overarching storyline that each game contributes to. It is this overall storyline that engages the player in meaningful interaction with the game, because the storyline exists to demonstrate that the small mini-games the player is completing have a purpose and mean something. Four major storylines were drafted for possible use in Code Age, each sharing the characteristic of having to rebuild society or relearn programming, which is the general synopsis behind Code Age.
This synopsis parallels the purpose of Code Age itself: the game is to be used by students to learn the basics of programming, just as the characters in the game must do to win.

3.2.6 Good Computer Game Design

It is anticipated that a storyline of the style described previously will help the player become emotionally invested through the depth and richness of the game, creating an enjoyable and memorable experience for the player. Prensky describes this feature of game design as character, and character is a trait he claims is a principle of good computer game design [Pre01a]. These principles of good game design also include balance, creativity, focus, tension, and energy. It is clear that Code Age has balance, as the mini-games were designed with this factor in mind. The mini-games were created to challenge the player physically in the task they must carry out, while keeping the underlying programming problem simple and understandable, making the games fair and achievable, yet challenging.

By creative, Prensky means games that are not produced stereotypically or by following a formula [Pre01a]. While Code Age's mini-games followed a design process, applying the programming concepts to the game mechanics was truly creative and innovative. The principle of focus ensures that the design of the game concentrates on providing the player mostly with what is fun about the game, which Code Age does by primarily focusing on the mini-games and running the storyline in the background so it is not a distraction.

Tension is the principle of good game design that causes players to be invested in the game's objective even though that objective is probably quite difficult to achieve [Pre01a]. Each mini-game in Code Age handles tension differently. Two of the games use timers to apply pressure to the player, while some of the games use in-game motion to achieve this end (i.e., the moving buttons and baskets). As a whole, Code Age's overall storyline will hopefully create emotional investment in the players. Finally, well-designed games have energy, pacing, and flow that keep the player going many hours after they should have gone to bed. It is uncertain whether Code Age has this energy, as its storyline is currently unimplemented and there are so few games. Looking to the future, it is possible for Code Age to develop its own unique energy that keeps students interested in learning programming through gaming.

Chapter 4: Methodology and Game Analytics

A pilot study of the game was developed to analyze the game in terms of usability and value. The objectives of the study were to evaluate whether using games to support learning was feasible, to determine whether mini-games were a valid choice for learning games, to determine a correct set of measures for evaluating engagement, and to decide whether the project should continue. Thus, a two-part research plan was developed. The first part of the study was a usability study, which consisted of recruiting beginner programmers to interact with Code Age and respond to several questionnaires. A heuristic evaluation made up the second part of the study; it consisted of recruiting expert evaluators to interact with Code Age and discover any usability problems the system contained. Two primary methods of data collection were chosen for this study: questionnaires and automated, in-game statistics collection.

4.1 Usability Study

The purpose of the usability study was to evaluate Code Age and its potential to engage and motivate students in an introductory programming class.
The target population for this study was students who were taking, or were likely to take, an introductory computer science/programming-related course. The students registered in the 2014 summer session of COSC 111 at UBC Okanagan met this criterion, as they were first-year university students with little programming experience prior to entering the course. These students corresponded to the target group for the game.

After completing a consent form, shown in Appendix E.1, the participants were given an initial pre-test on demographics and prior computer knowledge, and an engagement questionnaire to fill out. Upon completion of these questionnaires the participants were given access to the game to use whenever they wished. The participants had access to the game for a 3-week period, during which the system monitored user statistics such as time spent using the system, time spent playing each game, the number of times each game was played, the scores the user achieved each time they played each game, the number of times the user requested high scores from the system, and when users gave up playing the game. At the end of the 3-week intervention, the participants were asked to complete final post-tests on motivation, engagement, and usability.

4.1.1 Demographics and Background Questionnaire Pre-Test

This measure consists of a series of questions concerning basic demographic information (e.g., age, sex) and academic interests and experiences relevant to the focus of this study. Participants are also asked to assess their academic level in Math and English, as these factors are known to be the best predictors of success in a programming class. Additionally, the participants are asked to provide a self-assessment of their level of programming knowledge, as well as a few questions about their video game preferences. The demographic/background information is intended to help characterize the population (see Appendix F.1.1).

4.1.2 Engagement Pre/Post-Test

During the development of the research project, no existing measures were found that assess student engagement with game-based learning. The National Survey of Student Engagement (NSSE), used at universities across Canada and the US to measure collegiate quality based on the two factors of student engagement and institutional resource use, was therefore modified to create a measure specific to computer science and game engagement. The ideas outlined by Mark Berends in his discussion of survey use and development in education research [Ber06] were consulted in the modification of the NSSE for this study. The new measure was formed by removing questions concerning institutional resource use and adapting questions of student engagement into questions of student engagement with the game. Examples of questions removed from the measure are "About how many of your courses at this institution have included a community-based project (service-learning)?" and "Indicate the quality of your interactions with the academic advisors at your institution." Two specific examples of questions regarding computer science and game engagement are "In this summer semester of COSC 111, about how often have you considered a career in Computer Science?"
and "In this summer semester of COSC 111, about how often has the game challenged you to do your best work?" The questions were intended to provide valuable information regarding student interactions and experiences with programming and the game, student reflection on the game, and student reasoning with programming and the game. This modified measure can be seen in Appendices F.1.2 and F.1.3 [EK14, KfPRP01]. The measure was administered as a pre-test/post-test; the post-test version was modified slightly from the pre-test to include questions about the students' interactions with the game.

4.1.3 Intrinsic Motivation Inventory Post-Test

The Intrinsic Motivation Inventory (IMI) is a free-to-use multidimensional measurement device whose validity has been demonstrated successfully [The99]. This measure consists of a series of questions concerning participants' subjective opinions related to an activity in an experiment concerning intrinsic motivation and self-regulation. Participants are asked to answer questions related to interest/enjoyment, perceived competence, effort, and other subscales related to motivation with respect to the educational games. This measure can be seen in Appendix F.1.4 [MDT89, GM12].

4.1.4 Usability Questionnaire Post-Test

The Usability Questionnaire is used to gather feedback on the appeal, ease of use, and design aspects of the game. This measure consists of a series of Likert-scale and open-ended questions addressing the mechanics of the game, the comfort of the participant, the effectiveness of instructions, and the aesthetics and art. The measure also has the participant address several questions on improving the game's ability to engage, including questions about the plot, style, and length of the game. This measure can be seen in Appendix F.1.5.

4.2 Heuristic Evaluation

The purpose of a heuristic evaluation is to have recognized experts identify any major usability errors in the software. Typically, a heuristic evaluation is performed by a small group of experts (3-5).

Participants for this study were recruited from upper-level computer science students who had taken the Human Computer Interaction course, i.e., who were considered experts, and were contacted via email. These students were English-speaking undergraduate or recently graduated computer science students who were over the age of consent in British Columbia. Upon completion of a consent form (see Appendix E.2), the participants were asked to complete a demographics questionnaire and spend some time with the system performing tasks. These tasks included, but were not limited to, activities like logging in and using the different pages of the game. As the evaluators interacted with the game, they addressed questions in the heuristic evaluation regarding the usability of the interface. As they interacted with the game, the system tracked the following statistics about each user's experience: how much time was spent in total with the system, and how long the user spent on each task assigned to them.
After completing the heuristic evaluation of the system, each user was asked to complete a follow-up questionnaire, in which they provided opinions, feedback, and a severity rating for each usability problem that was found.

4.2.1 Demographics and Background Questionnaire

This questionnaire is similar to the Demographics and Background Questionnaire used in the usability study, with the exception that it does not ask the participant to self-rate their programming ability, because the participants recruited for this portion of the study had been identified as experts in the field of usability. This measure can be seen in Appendix F.2.1.

4.2.2 Heuristic Evaluation Questionnaire

A heuristic evaluation is a series of basic tasks performed by a small set of evaluators. These tasks guide the evaluators through a thorough examination of the interface and allow the evaluators to judge the interface's compliance with recognized usability principles, or heuristics [NM90]. As the evaluators perform each task, they are asked to rate how well the task fulfills each heuristic/usability principle. This measure determines where the system's usability is flawed. This information allows the researchers to demonstrate the feasibility and success of the system and to determine how to improve it. This measure can be seen in Appendix F.2.2.

Heuristic evaluations are typically performed before and during user testing. They are used prior to user testing in order to identify grievous usability flaws, which can then be corrected before the user testing so that users are not exposed to errors that may negatively impact their experience with the software. Performing heuristic evaluations prior to user testing also allows the robustness of the evaluation itself to be examined: if more usability errors are found during the user testing, the heuristic evaluation was not as robust as it should have been [Nie92]. Heuristic evaluations can also be performed during user testing, as usability is an iterative process through which usability errors are constantly identified and removed. In this study, due to the time constraints of the summer semester and the availability of participants, the heuristic evaluation was performed in conjunction with the usability testing.

4.2.3 Heuristic Follow-Up

This measure consists of a collection of all of the usability problems found by all of the evaluators during the heuristic evaluation. The measure asks each evaluator to rate the severity of each error, including the errors found by other evaluators, on a Likert scale. The evaluators determine whether each error is a minor usability problem or a major usability catastrophe. This information allows the researchers to determine whether the system is ready for widespread use, and to prioritize the correction of any usability errors that were found [Nie95]. This measure also contains several follow-up questions regarding the aesthetics, appeal, and plot lines of the game. This measure can be seen in Appendix F.2.3.

4.3 Game Data Collection

While the participants were playing these games, the application tracked specific information about each player in each game they played. Two general pieces of data collected for each user were the number of times each participant logged into the game and the number of times they accessed their user statistics, known as the "high score board".
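The thesis does not reproduce the tracking code itself, but conceptually each play session reduces to a small record per game played. The sketch below is only an assumption about how such a record might be structured; the class and field names are hypothetical, not Code Age's actual schema:

    import java.time.Duration;
    import java.time.Instant;

    // Minimal sketch of a per-play analytics record (hypothetical schema).
    public class PlayRecord {
        final String playerId;
        final String gameId;       // e.g., "1-3" for stage 3 of the first button game
        final int scorePercent;    // 0-100, as reported on the star screen
        final Instant startedAt;   // when the play began
        final Duration timeSpent;  // how long the play lasted

        public PlayRecord(String playerId, String gameId, int scorePercent,
                          Instant startedAt, Duration timeSpent) {
            this.playerId = playerId;
            this.gameId = gameId;
            this.scorePercent = scorePercent;
            this.startedAt = startedAt;
            this.timeSpent = timeSpent;
        }
    }

Aggregating such records per player and per game yields exactly the login counts, play counts, scores, and session durations summarized in the tables that follow.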
Table 4.1 shows a summary of the number of times the participants logged in to the system and the number of times they accessed the high score page. This table reports the interquartile ranges, i.e., the division of a data set into four equal parts, where the second quartile is the median. It should be noted that the two participants who never played the game were not included in the study. This data shows that the participants did not log in or access the statistics as many times as we would have liked. However, some participants performed these actions as many as four times, which indicates that these students had an above-average interest in interacting with the game.

Table 4.1: Logins and High Score Page Views Summary

              Min   1st Qu.   Med   3rd Qu.   Max   Mean   S.D.
Logins         1       1       1       2       4    1.56   0.89
Stats views    0       0       1       2       4    1.31   1.25
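For reference, the five-number summaries reported in this chapter's tables can be reproduced with a short computation. The sketch below uses linear interpolation between order statistics, one of several common quartile conventions, so its output may differ slightly from whatever statistics package was actually used; the sample data in main is hypothetical:

    import java.util.Arrays;

    // Sketch of a five-number summary using linearly interpolated quantiles.
    public class QuartileSummary {

        static double quantile(double[] sorted, double p) {
            double pos = p * (sorted.length - 1);   // fractional index into the sorted data
            int lo = (int) Math.floor(pos);
            int hi = (int) Math.ceil(pos);
            return sorted[lo] + (pos - lo) * (sorted[hi] - sorted[lo]);
        }

        public static void main(String[] args) {
            // Hypothetical login counts, not the study's raw data.
            double[] logins = {1, 1, 1, 1, 1, 1, 1, 1, 1, 2, 2, 2, 2, 3, 4, 4};
            Arrays.sort(logins);
            System.out.printf("Min %.0f  Q1 %.2f  Med %.2f  Q3 %.2f  Max %.0f%n",
                    logins[0], quantile(logins, 0.25), quantile(logins, 0.50),
                    quantile(logins, 0.75), logins[logins.length - 1]);
        }
    }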
The application also recorded the number of times each player played each game and the score he or she achieved on each try. Tables 4.2 and 4.3 display a summary of this data, including the standard deviation and the number of participants who played each game. A gradual decrease in the number of participants playing each level can be seen as the game levels progress. It should be noted that the mean and standard deviation are calculated based on the number of participants who actually played the game, i.e., participants who ceased playing the game were removed from the calculations at their time of cessation, so as not to treat their quitting as a number of times played of 0.

It is interesting to note that the participants tended to spend more time playing the third level of each game, which was the level where the buttons moved across the screen. This tendency could be explained by the fact that, as the participants explained in the usability study, the moving buttons made the game more challenging; being more challenging, the game may have had to be played several times before the participants succeeded. Participants also had a relatively large average number of times played for game 5, the cannon game. The reason for this high number could be that both the cannon and the baskets move, making the game more physically difficult for the students to complete successfully; this added difficulty may have caused some of the students to play the game more times before they were successful. Also of interest is that the Q&A game, game 4, had the highest overall average score of all of the games. Finally, a trend can be seen in the average scores for each game: they tend to hover around 60%, which may be because the game's design requires the player to achieve a minimum score of 60% before they can move on to the next game. This suggests that many students were just aiming to move on to the next level, and were not trying to achieve the 100% high score.

Table 4.2: Game Score Statistics. Games 1, 2, 3, and 6 use the button mechanic, game 4 the Q&A mechanic, and game 5 the Point&Shoot mechanic.

Game   Players   Min   1st Qu.   Med    3rd Qu.   Max   Mean   S.D.
1-1      16       20     60       67     83       100   67.9   22.5
1-2      15        1     47       56     72.5     100   58.3   24.7
1-3      14       25     50       67     75       100   64.5   18.6
2-1      14       33     54.5     67     79.25     88   66.0   16.7
2-2      14       33     47       57     68.5      86   68.5   14.5
2-3      14       30     50       67     70       100   63.0   17.8
3-1      14        0     47       83     87       100   69.9   27.5
3-2      14       40     58.5     67    100       100   75.0   22.0
3-3      13        0     57       78    100       100   74.8   25.2
4        12       20     65       80    100       100   81.1   22.2
5        12        4     38       57     80       100   60.9   29.7
6-1      12       38     56.25    72.5   78       100   67.4   17.9
6-2      11       30     50       56.5   77.25    100   62.8   24.2
6-3      11        0     51.5     67     82.25    100   66.1   22.4

Table 4.3: Game Play Statistics. Games 1, 2, 3, and 6 use the button mechanic, game 4 the Q&A mechanic, and game 5 the Point&Shoot mechanic.

Game   Players   Min times   1st Qu.   Med   3rd Qu.   Max times
1-1      16         1          1        1     1.25        3
1-2      15         0          1        1     1.25        2
1-3      14         0          1.75     2     3.25        8
2-1      14         0          1        1     1           2
2-2      14         0          1        1     2           2
2-3      14         0          1        2     4.25       17
3-1      14         0          1        1     1.25        3
3-2      14         0          1        1     1           2
3-3      13         0          1        1     2           5
4        12         0          0.75     1     1           5
5        12         0          0.75     2     3           5
6-1      12         0          0.75     1     1          24
6-2      11         0          0        1     1          11
6-3      11         0          0        2     4.5        34

Another statistic the system collected was how long each participant spent on the game each time they logged in; this data is shown in Table 4.4. The median time spent overall per login was 11.23 minutes. The median time participants spent on the first login was 14.72 minutes, and the median time spent on the second login was 6.92 minutes. Only two participants logged in a third time: one spent 11.2 minutes and the other 15.9 minutes interacting with the game. Finally, one participant logged in a fourth time and spent 9.5 minutes with the system. Ideally, the participants would have spent more time playing the games, but given that there were so few games and that the majority of the participants had some prior programming experience, the participants may not have had enough incentive to play the games more. The amount of time spent by the participants is nevertheless a positive result: they spent a short amount of time on the games, 10-20 minutes, which is what Code Age is intended for. One of the goals of this project is to make Code Age easy for students to play at any time and in any place, for example on the bus to school, and this goal appears to have been reached.

Table 4.4: Game Time Statistics

Time Spent        Min      1st Qu.   Med     3rd Qu.   Max     Num. Players
Overall (min)     56 sec   5.02      11.23   16.9      97.88   NA
1st Login (min)   56 sec   5.04      14.72   20.6      97.88   16
2nd Login (min)   57 sec   2.25      6.92    15.19     26.9    10

Chapter 5: Usability Study Analysis and Results

5.1 Demographics and Background Questionnaire Analysis

[Figure 5.1: Participants' Ages and Genders. Bar chart of the number of participants by age range (17-19, 20-24), grouped by gender]

The usability study consisted of 16 participants, 2 female and 14 male, ranging in age from 17 to 24; females thus made up 12.5% of the population. Studies run by the National Center for Women and Information Technology demonstrate that 18% of CIS graduates were female, while the Computing Research Association has shown that less than 12% of baccalaureate degrees in Computer Science at North American universities in 2012 were obtained by women [RES13]. Thus this study obtained a gender distribution comparable to the typical gender distribution in computer science, as can be seen in figure 5.1. All participants owned a computer, reported daily computer use, and reported an average of 2.3 hours spent gaming per week. It is interesting to note that both female participants reported playing 0 hours of video games per week. The participants were asked to complete a series of questions about their gender, ethnicity, intended degree, and major.
Table 5.1 summarizes the demographic makeup of the sample.

Table 5.1: Demographic Summary

                                               Count   Percent
Total Participants                               16    100%
Gender      Male                                 14    87.5%
            Female                                2    12.5%
Age         17-19                                 8    50%
            20-24                                 8    50%
Ethnicity   Caucasian                             9    56.25%
            Asian                                 4    25%
            South Asian                           2    12.5%
            African                               1    6.25%
Major       Math                                  5    31.25%
            Engineering and Applied Science       6    37.5%
            Computer Science                      2    12.5%
            Unknown                               1    6.25%
            Unspecified                           2    12.5%

The participants were also asked to self-rate their Math and English skills on a scale from 1 to 10, where 1 indicated poor and 10 indicated excellent. Figure 5.2 shows the participants' self-ratings of their English and Math skills. These figures indicate that the population, on average, was more comfortable in Math than in English, but believed themselves to be competent in both skill sets. A higher comfort level in Math would appear to be typical given that the population is heavily composed of Engineering and Mathematics students.

[Figure 5.2: Participant Self-Ratings of English and Math Skills. Two bar charts of the number of participants by self-rated ability: (a) Participants' English Skill; (b) Participants' Math Skill]

The participants were asked to rate their agreement with the statement "I like memory games"; table 5.2 shows participant agreement with the statement and further breaks agreement down by the defining characteristics of the population. Participant agreement fell primarily between agree and no opinion, indicating that students felt somewhere between positive and neutral about memory games.

Table 5.2: Memory Games Agreement. Likert scale from 1 to 7, with 1 being strongly agree, 2 agree, 3 somewhat agree, 4 no opinion, 5 somewhat disagree, 6 disagree, and 7 strongly disagree.

               Count   S.A.   A.   SW.A.   N.O.   SW.D.   D.   S.D.
Total            16      2     4     3       6      1      0     0
Sex   Male       14      2     4     2       5      1      0     0
      Female      2      0     0     1       1      0      0     0
Age   17-19       8      0     3     2       3      0      0     0
      20-24       8      2     1     1       3      1      0     0

Comparatively, the participants were also asked to rate their agreement with the statement "Given the choice, I prefer playing games where I have time to reflect rather than games that challenge my reflexes"; table 5.3 shows the participants' results. Participant agreement again fell primarily between agree and no opinion, indicating that students felt somewhere between positive and neutral about reflective games, though slightly less so than about memory games. Also of interest is that, even with the small sample size, the female students tended to prefer games with time for reflection, although they did not feel strongly about either game type.

Table 5.3: Reflex Games Agreement. Likert scale from 1 to 7, with 1 being strongly agree, 2 agree, 3 somewhat agree, 4 no opinion, 5 somewhat disagree, 6 disagree, and 7 strongly disagree.

               Count   S.A.   A.   SW.A.   N.O.   SW.D.   D.   S.D.
Total            16      2     2     3       7      1      1     0
Sex   Male       14      2     1     2       7      1      1     0
      Female      2      0     1     1       0      0      0     0
Age   17-19       8      1     1     1       3      1      1     0
      20-24       8      1     1     2       4      0      0     0

Finally, the participants were asked to self-rate their programming ability and to list whether they had taken a Computer Science course before. Overall, about half of the class had some form of previous programming experience, a proportion much higher than in a typical introductory programming class. The participants rated themselves on a scale from 0 (I cannot program) to 4 (I am a competent programmer). The results of this self-rating can be seen in table 5.4.
It should be noted that no participants self-rated their programming ability at 4, and only one participant reported a self-rating of 3. One participant provided unusable data for this rating, which was excluded from the calculations. It is interesting to see that nearly half of the population self-identified at a programming level of two; in a typical introductory programming course a larger number of students would be expected to rate themselves at a zero or one.

Table 5.4: Participant Self-Rating of Programming Ability

Programming ability     0       1       2      3      NA
% of participants     12.5    31.25   43.75   6.25   6.25

The participants also listed any Computer Science courses they had taken before. The majority of the participants reported that they had not taken any Computer Science courses prior to the COSC 111 course in which they were currently enrolled, which is expected for the population of a COSC 111 class. However, there are not usually as many engineering and applied science students in the regular fall session of COSC 111, so this population is not fully representative of the norm. More than 40% of the participants had taken a programming course prior to the study, well above the average in a typical fall semester of COSC 111. The excess of engineering students in the population could be attributed to the fact that this was a summer session course, which is when engineering students have time to fit the course into their rigorous schedules. Figure 5.3 shows which Computer Science courses the participants had previously taken (APSC 177, COSC 111, and COSC 121). It should be noted that in order to have taken COSC 121, a student must have taken COSC 111 and passed with a grade of 60% or higher. APSC 177 is an Engineering or Applied Sciences introductory programming course; it is described in the UBC Okanagan course calendar as "computer systems, software development, operating systems, compilers, programming in a high-level language, selection and loop structures, functions, arrays, pointers, files, data acquisition, solving engineering problems with computer programs". While this course is not equivalent to COSC 111, their contents are closely related.

[Figure 5.3: Participants' Previous Experience in COSC. Bar chart of the number of participants by previous programming experience (None, High School, APSC 177, COSC 111, COSC 121)]

5.2 Engagement Pre/Post-Test Analysis

The engagement questionnaire was adapted from the National Survey of Student Engagement, used by colleges and universities across North America to measure students' engagement at their post-secondary institution. The literature does not currently contain a survey that measures student engagement with an educational game. To create this type of measure, some of the items in the NSSE were removed, some were modified, and some were added. Items that revolved mostly around the institution, or that could not be modified to relate to a programming game, were removed; this includes items such as "About how often have you attended an art exhibit, play, or other arts performance?" and "How much does your institution emphasize providing opportunities to be involved socially?". Items like "About how many hours do you spend in a typical week participating in co-curricular activities?" were modified to refer to a student's involvement in their introductory programming course, i.e., "About how many hours do you spend in a typical week practicing programming?".
Examples of items that were added are "About how many hours do you spend in a typical week playing the game?" and "About how many hours do you spend in a typical week playing other video games?". An example of an item that remained unchanged is "About how many hours do you spend in a typical week preparing for class (studying, reading, assignments)?"

The engagement questionnaire was designed to be delivered as a pre/post-test, where the post-test is distributed after the students have interacted with the game. It should be noted that there were 29 questions on the pre-test and 33 questions on the post-test. The four questions that appeared only on the post-test related directly to student interaction with the game, because it would not make sense to ask students questions about their experiences with a game they had not yet played. The full engagement pre/post-tests can be seen in Appendices F.1.2 and F.1.3, and the results obtained from the 16 participants can be seen in tables 5.5 through 5.9. What the results reveal is that these items were generic items, not game-specific items, which indicates that future iterations of this measure should contain more game-specific items. However, it is difficult to ask game-specific questions in a pre-test where the participants have not yet used the game, so in this case it may be worthwhile to word the questions conditionally, i.e., using "would".

Table 5.5: Engagement Questionnaire Part 1 Percentage Agreement. Likert scale from 1 to 4, where 1 = very often, 2 = often, 3 = sometimes, 4 = never. Each question began with "About how often do you do each of the following:"

Question                                           1       2       3       4
1. Asked questions or contributed     pre        6.25    31.25   56.25    6.25
   to class discussions               post      18.75    12.5    62.5     6.25
2. Prepared several solutions         pre       12.5     37.5    37.5    12.5
   for a lab or assignment            post       6.25    18.75   68.75    6.25
3. Had the professor or TA            pre        6.25    31.25   37.5    25
   look over your solution            post       0       25      37.5    37.5
4. Asked another student to           pre       50       37.5    12.5     0
   help you understand material       post      50       43.75    6.25    0
5. Prepared for an exam by            pre       43.75    50       6.25    0
   working with other students        post      43.75    43.75    6.25    6.25
6. Worked with others on              pre       62.5     31.25    6.25    0
   course projects/assignments        post      62.5     25      12.5     0
7. Given a course presentation        pre        6.25    12.5    37.5    43.75
                                      post       0       12.5    12.5    75
8. Combined ideas from                pre       37.5     12.5    37.5    12.5
   courses in completing work         post      12.5     31.25   43.75   12.5
9. Connected your learning to         pre        0       43.75   31.25   25
   societal problems or issues        post       6.25    25      50      18.75
10. Examined the strengths/           pre        6.25    56.25   31.25    6.25
    weaknesses of your own views      post      18.75    43.75   31.25    6.25

From questions 1-10 it is clear that questions 3, 5, and 7 should be removed from the questionnaire. These three questions are very course-oriented and add no value with respect to measuring engagement with the game. Question 10 could be left in the questionnaire: it is not course-specific, and merely asks the participants to reflect on the validity of their own views throughout the study, so it could still have value with respect to measuring engagement with the game. The remaining six questions can easily be modified to be more game-related. The modified versions of these questions are presented here:

1. About how often would/have you ask[ed] questions about or contributed to discussion of the game in class?
2. About how often would/have you play[ed] the game several times in one day?
4. About how often would/have you ask[ed] another student for help with the game?
6. About how often would/have you work[ed] with others while playing the game?
8. About how often have you combined ideas from different courses when completing the game?
9. About how often have you connected your game play experience to societal problems or issues?

Table 5.6: Engagement Questionnaire Part 2 Percentage Agreement. Likert scale from 1 to 4, where 1 = very often, 2 = often, 3 = sometimes, 4 = never. Each question began with "About how often do you do each of the following:"

Question                                           1       2       3       4
11. Learned something that            pre       31.25    50      12.5     6.25
    changed your understanding        post      37.5     31.25   25       6.25
12. Connected course ideas            pre       25       37.5    31.25    6.25
    to your prior experiences         post      25       25      37.5    12.5
13. Talked about career plans         pre       18.75    37.5    25      18.75
    with a faculty member             post      12.5     18.75   37.5    31.25
14. Considered a career in CS         pre       18.75    18.75   31.25   31.25
                                      post      31.25    18.75   25      25
15. Worked with a prof on things      pre       25       12.5    12.5    50
    other than coursework             post       6.25     6.25   18.75   68.75
16. Discussed course concepts         pre       18.75    43.75   31.25    6.25
    with a prof outside of class      post       6.25    18.75   43.75   31.25
17. Discussed your academic           pre       12.5     37.5    43.75    6.25
    performance with a prof           post       6.25    31.25   43.75   18.75
18. Reached conclusions using         pre        6.25    25      56.25   12.5
    analysis of algorithms            post      25       37.5    37.5     0
19. Examined real-world issues        pre        6.25    25      18.75   50
    using programming                 post      18.75    12.5    43.75   25
20. Evaluated others' conclusions     pre        6.25    12.5    43.75   37.5
    using algorithms                  post      12.5     37.5    31.25   18.75

Questions 11-20 contain no questions that immediately stand out for removal. Questions 13 and 14 should remain in the questionnaire as they are. Question 14 asks how often the students have considered a career in Computer Science, and 50% of students said often or very often in the post-test, compared to 37.5% in the pre-test. Questions 18, 19, and 20 should also remain in the questionnaire in their current form. Question 18 showed an increase from the pre-test to the post-test in students who felt they had often used their own analysis of programming and algorithms to reach conclusions; this increase is anticipated, given that students are learning how to program and design algorithms. These three questions could also have alternate versions of themselves included in the questionnaire. These questions would look like:

− About how often would/have you reach[ed] conclusions based on your own experience of playing the game?
− About how often would/have you use[d] knowledge from the game to examine a real-world problem or issue?
− About how often would/have you evaluate[d] what others have concluded from your experience with the game?

The remaining four questions should be modified to be more game-specific, as their current instances are too course-oriented. The modified versions of these questions should look like:

11. About how often has knowledge from the game changed the way you understand a concept or an issue?
12. About how often would/have you connect[ed] ideas from the game to your prior experiences and knowledge?
15. About how often would/have you discuss[ed] the game with a faculty member outside of class?
16. About how often would/have you discuss[ed] the game with peers outside of class?

Question 21, seen in table 5.7, was a very polarizing question. The students were clearly diverse in the value they placed on the textbook for the course, which indicates that this question is not at all relevant to the game and should be removed.
Question 22 could be kept and used to characterize the dedication of the students, but it would be more beneficial to modify this question to ask "About how often would/have you used the game to review after class?" or even "How often has the game motivated you to review course content?" Like the previous question, question 23 is also very course-oriented, and responses to it were diverse. This question should either be removed from the questionnaire, or it could be modified to read "About how often would/have you analyze[d] information from the game?" Question 24 asks the participants about identifying key information from the game, but it was intended as a post-test-only question. It should be modified to also be in the pre-test, by asking "About how often would you expect to identify key information from the game?"

Table 5.7: Engagement Questionnaire Part 3 Percentage Agreement. Likert scale from 1 to 4, where 1 = very often, 2 = often, 3 = sometimes, 4 = never. Each question began with "About how often do you do each of the following:"

Question                                           1       2       3       4
21. Identified key information        pre       25       56.25   18.75    0
    from reading assignments          post      31.25    50       6.25   12.5
22. Reviewed your notes               pre       18.75    43.75   12.5    25
    after class                       post      12.5     37.5    37.5    12.5
23. Summarized what you learned       pre       12.5     50      25      12.5
    in class or from course
    materials                         post      18.75    31.25   37.5    12.5
24. Identified key info               post
    from the game                     (only)    12.5     37.5    43.75    6.25

It is clear that questions 25 and 26, seen in table 5.8, are strictly about the course. A clear increase in the challenge of the labs can be seen in the participant responses to question 26: in the pre-test, 68.75% of participants rated a 5 or higher (sometimes to very much), as opposed to 93.75% in the post-test, which demonstrates that the students found the labs challenged them to do their best work. Given that question 27 is the game-centric modified version of questions 25 and 26, questions 25 and 26 should be removed entirely. Question 27 was created as a post-test-only question, and the results were diverse, but over half of the participants rated between a 4 and a 6. This rating indicates that over half of the participants felt that the game challenged them to do their best at least sometimes. This question needs to be modified for inclusion in the pre-test as well; the pre-test version would ask "About how often do you think the game would challenge you to do your best work?" Question 33 was included in the post-test to evaluate how much time the students felt they were investing in playing the game, and this question should remain in the questionnaire. The students reported spending between very little and about half of their class preparation time on playing the game. Given that the students tended to spend between 10 and 20 minutes playing the game, these responses indicate that either some students do not spend much time preparing for class, or some students over-estimated the amount of time they spent playing the game. This question should be modified to be in the pre-test as well, by asking "Of the time you spend preparing for class in a typical 7-day week, about how much time do you think you would spend playing the game?"
Table 5.8: Engagement Questionnaire Part 4 Percentage Agreement. Likert scale from 1 to 7, where 1 = not at all and 7 = very much.

25. How often has the course challenged you to do your best work?
26. How often have the labs challenged you to do your best work?
27. How often has the game challenged you to do your best work?
33. Of the time spent preparing for class in a typical 7-day week, about how much time is spent on playing the game?

Question           1       2       3       4       5       6       7
25.  pre           0       0     12.5    18.75   31.25    6.25   31.25
     post          0     6.25     6.25   18.75   25      18.75   25
26.  pre           0     6.25    12.5    12.5    18.75   25      25
     post          0       0       0      6.25    6.25   31.25   56.25
27.  post only     0     6.25    25      18.75   31.25   18.75    0
33.  post only    50    43.75     6.25    0       0       NA      NA

All of questions 28 through 32 should be kept, except for question 30. Question 30 demonstrated an increase in student time spent programming for the lab, which is expected as the labs become more challenging; this question should be removed from the survey as it is too course-oriented. Questions 28 and 29 should remain in the survey in order to characterize how much time students spend preparing for class and practicing programming, so that when they are asked how much of their preparation time is spent on the game, the answers can be compared. Question 31 asks the students how much time they spent playing the game, and the question is in the post-test only. This question should be modified to be in the pre-test as well, perhaps asking how much time the students think they will spend playing the game; alternatively, this question could be removed, given its similarity to question 33. The results of this question are interesting because it is known that the students spent around 10 to 20 minutes playing the game, yet one student reported playing 6 or more hours, which is clearly impossible. Question 32 is included in the questionnaire to compare how much time students spend playing other video games with how much time they spend on the game. The results show that students spent much more time playing regular video games than they did playing Code Age, which is anticipated given how few games Code Age has.

Table 5.9: Engagement Questionnaire Part 5 Percentage Agreement. "About how many hours do you spend in a typical week doing each of the following?" Response bins range from 0 hours through 30 or more hours.

28. Preparing for class
29. Practicing programming
30. Programming for the lab
31. Playing the game
32. Playing other video games

Question           0      1-5    6-10   11-15   16-20   21-25   26-30    30+
28.  pre           0     37.5   31.25   18.75    6.25    6.25     0       0
     post          0     31.25  25      25      18.75    0        0       0
29.  pre          6.25   75     12.5     6.25    0       0        0       0
     post        12.5    56.25   6.25    6.25   12.5     0        6.25    0
30.  pre           0     56.25  37.5     0       6.25    0        0       0
     post          0     12.5   62.5    25       0       0        0       0
31.  post only   18.75   75      6.25    0       0       0        0       0
32.  pre         18.75   50     18.75    6.25    6.25    0        0       0
     post        18.75   62.5    0       0       0       0        6.25   12.5

5.2.1 Pearson's Chi-Square Test

Finally, it is interesting to see whether there is a significant difference between the participants' pre- and post-test responses to the different items. In order to determine whether there was in fact a difference, Pearson's chi-square test was used to analyze each pair of pre-test and post-test questions. It should be noted that there were 29 questions on the pre-test and 33 questions on the post-test, so the questions that were only on the post-test cannot be included in the chi-square test, as they have no question to be paired with for the analysis; these four questions are discussed separately below. The test is used to see if there is a relationship between two sets of categorical variables, here the pre-test questions and the post-test questions. Pearson's chi-square works by comparing the observed frequencies in each category to the frequencies expected by chance. The test was computed on all pairs of pre- and post-test questions.
Conventionally, if the p-value is less than 0.05, the chi-square value is significant: the null hypothesis that the variables are independent is rejected, and the variables are instead taken to be somehow related. If the p-value is less than 0.01, the chi-square value is highly significant. When the chi-square value is significant, the difference between the pre- and post-test had a significant effect on the students' responses to the question.

In the analysis of the 29 item pairs, only one chi-square test came back as significant. Question 28 in the pre-test asked the participants "In usual courses you are registered in, about how many hours do you spend in a typical week preparing for class (studying, reading, assignments)?", while the post-test asked "During this summer semester of Computer Science 111, about how many hours do you spend in a typical week preparing for class (studying, reading, assignments)?" This item appeared to be significant, with a chi-square value of 10.45 at 3 degrees of freedom and a p-value of 0.01. This p-value is significant because it is less than 0.05, and the chi-square value is greater than the associated critical value of 7.81. The apparent significance of this question could be due to introductory programming being a challenging class, for which students are more likely to spend time preparing than for a typical class. The summary of the chi-square statistics for the remaining 28 items can be found in Appendix A.1.

When performing hypothesis tests, the case where the null hypothesis is true but the test rejects it is called a type I error. In the case of the engagement questionnaire, if a standard p-value cutoff of 0.05 is used, then across hypothesis tests of 29 items, 1.45 items would be expected to be declared significant by chance. Clearly, the probability of obtaining at least one false positive increases as the number of hypothesis tests conducted increases. To control for type I error, a multiple-testing correction must be performed. To protect against any false positives, Holm's method was selected and applied to the p-values of all items. The corrected p-value for item 28 is p = 0.4371005 and, along with all other items, is not significant.
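The two computations described above can be sketched compactly. The code below is illustrative, not the analysis code used for the study: the contingency counts in main are hypothetical, and the critical value 7.81 corresponds to alpha = 0.05 at 3 degrees of freedom, as noted above for a table with two rows (pre/post) and four response categories.

    // Sketch of Pearson's chi-square statistic and Holm's correction.
    public class ChiSquareSketch {

        // chi^2 = sum over cells of (observed - expected)^2 / expected,
        // where expected = rowTotal * colTotal / grandTotal.
        static double chiSquare(int[][] observed) {
            int rows = observed.length, cols = observed[0].length;
            double[] rowTot = new double[rows], colTot = new double[cols];
            double total = 0;
            for (int r = 0; r < rows; r++)
                for (int c = 0; c < cols; c++) {
                    rowTot[r] += observed[r][c];
                    colTot[c] += observed[r][c];
                    total += observed[r][c];
                }
            double chi2 = 0;
            for (int r = 0; r < rows; r++)
                for (int c = 0; c < cols; c++) {
                    double expected = rowTot[r] * colTot[c] / total;
                    chi2 += Math.pow(observed[r][c] - expected, 2) / expected;
                }
            return chi2; // significant at alpha = 0.05 if > 7.81 when df = 3
        }

        // Holm's step-down method: sort the m p-values ascending and compare
        // the i-th smallest (0-indexed) against alpha / (m - i); reject
        // hypotheses until the first failure, then stop.
        static boolean[] holm(double[] p, double alpha) {
            int m = p.length;
            Integer[] order = new Integer[m];
            for (int i = 0; i < m; i++) order[i] = i;
            java.util.Arrays.sort(order, (a, b) -> Double.compare(p[a], p[b]));
            boolean[] reject = new boolean[m];
            for (int i = 0; i < m; i++) {
                if (p[order[i]] > alpha / (m - i)) break; // first non-rejection ends the procedure
                reject[order[i]] = true;
            }
            return reject;
        }

        public static void main(String[] args) {
            int[][] counts = { {0, 6, 5, 5}, {0, 5, 4, 7} }; // hypothetical pre/post counts
            System.out.println("chi-square = " + chiSquare(counts));
        }
    }

With 29 tests, Holm's method compares the smallest p-value against 0.05/29, the next against 0.05/28, and so on, which is why item 28's raw p-value of 0.01 no longer survives correction.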
However, the participants did seem to believe that they gained some knowledge from the games.
The third question that appeared in the post-test but not in the pre-test was "About how many hours a week do you spend playing the game?" The scale of this question went from 0-7, where 0 indicated 0 hours, and 7 indicated 30 or more hours. The minimum response value was, in fact, 0, and the maximum was 2. A value of 1 indicated 1-5 hours and 2 indicated 6-10 hours. The median value was 1, which indicates that the participants self-reported playing between 1 and 5 hours.
The final question that was only in the post-test was "Of the time you spend preparing for class in a typical 7 day week, about how much time is spent playing the game?" This question was rated on a scale from 0 (very little) to 4 (almost all). The minimum response value was 0 and the maximum was 2. The median response value was 0.5, which indicates that on average, participants spent "very little" to "some" time playing the game in preparation for class.
The amount of time spent playing the game may seem small, but it could be attributed to the fact that the games are mini-games and there are not very many games to play. In fact, just under half of the participants had some form of programming knowledge prior to taking the course, and given that the games are designed for students with no knowledge, it is reasonable to believe that students with programming knowledge might find the games too easy and consequently spend very little time playing.

5.3 Intrinsic Motivation Inventory Analysis
The Intrinsic Motivation Inventory is used in different fields to measure the subjective experience of participants during a target activity or laboratory experiment [The99]. The IMI assesses seven specific subscales using specific sets of questions, referred to as factors or variables. The seven subscales are: interest/enjoyment, perceived competence, effort/importance, pressure/tension, perceived choice, value/usefulness, and relatedness. In this study, two versions of the IMI were administered to the participants. The IMI is used to evaluate questions pertaining specifically to intrinsic motivation, as most of its factors load on the interest/enjoyment subscale, which is considered the self-report of intrinsic motivation [MDT89].
The first version administered to the participants was the task evaluation questionnaire, which consists of a series of factors measuring the interest/enjoyment, perceived competence, perceived choice, and pressure/tension subscales. The TEQ was administered in order to determine if participant interaction with Code Age could be related to intrinsic motivation. The factors of the TEQ were reworded to refer to "the game" instead of "the task" to indicate to participants that they were specifically evaluating the game and their interactions with it. The second version of the IMI that was administered to the participants was the activity perception questionnaire, which consists of a series of factors that measure the interest/enjoyment, value/usefulness, and perceived choice subscales [DEPL94]. The activity perception questionnaire was administered to evaluate the participants' perceptions about the value or usefulness of Code Age.

5.3.1 Activity Perception Questionnaire
The results of the first 13 questions of the APQ are shown in table 5.10. The items of the APQ that load on the interest/enjoyment factor are questions 3, 5, 7, 11, and 12.
The participants were presented with statements and asked to rate them on a scale from 1, not at all true, to 7, very true, where 4 is somewhat true.

Table 5.10: Activity Perception Results Part 1 Percentage Agreement - Likert scale from 1 to 7, where 1 represented not at all true, 4 represented somewhat true, and 7 represented very true.
1. I believe that playing the game could be of some value to me
2. I believe that I had some choice about playing the game
3. While playing the game I was thinking about how much I enjoyed it
4. I believe that playing the game is useful for improved concentration
5. Playing the game was fun to do
6. I think playing the game is important for my improvement
7. I enjoyed playing the game very much
8. I really did not have a choice about playing the game
9. I played the game because I wanted to
10. I think playing the game is an important activity
11. I felt like I was enjoying the game while I was playing it
12. I thought playing the game was a very boring activity
13. It is possible that playing the game could improve my studying habits
Question 1 2 3 4 5 6 7
1. 0 6.25 0 18.75 31.25 18.75 25
2. 0 0 6.25 6.25 18.75 43.75 25
3. 6.25 6.25 43.75 25 12.5 0 6.25
4. 6.25 6.25 18.75 25 6.25 25 12.5
5. 6.25 18.75 6.25 18.75 25 18.75 6.25
6. 12.5 6.25 18.75 18.75 18.75 18.75 6.25
7. 12.5 12.5 6.25 18.75 25 12.5 12.5
8. 50 18.75 0 12.5 18.75 0 0
9. 6.25 6.25 6.25 6.25 18.75 31.25 25
10. 12.5 6.25 18.75 12.5 31.25 12.5 6.25
11. 12.5 6.25 18.75 0 25 18.75 18.75
12. 25 25 12.5 6.25 6.25 25 0
13. 12.5 0 6.25 31.25 18.75 12.5 18.75

Over half of the participants rated a 3 or less for question 3, indicating that the students were not really thinking about how much they enjoyed the game while they were playing it. In contrast, 50% of the participants rated a 5 or higher on the statement "playing the game was fun to do". "I enjoyed playing the game very much" received a similar rating of 5 or higher by 50% of the participants as well. More than 50% of the participants rated a 5 or higher on the statement "I felt like I was enjoying the game while I was playing it". Conversely, the statement "I thought playing the game was a very boring activity" was rated a 1 or a 2 by 50% of the participants. What this data tells us is that while the students didn't think of their enjoyment while playing, in hindsight, nearly half of the students seemed to enjoy the game.
Items 1, 4, 6, 10, and 13 all belong to the subscale of value/usefulness. The statement "I believe playing the game could be of some value to me" earned a score of 4 or higher from 93.75% of participants, indicating that many students found the game valuable. Items 6 and 10 showed that more students than not felt that playing the game was both useful for improved concentration and an important activity. Over half of the participants also felt that playing the game could somewhat improve their studying habits.
The results of items 14-25 are shown in table 5.11. Items 16, 19, 21, and 25 all belong to the subscale of value/usefulness, while 15, 17, and 23 load on the interest/enjoyment scale. Items 16 and 19 were rated almost identically, and item 21 was rated similarly as well. All four items on the value/usefulness scale were rated a 5 or higher by over 50% of participants, showing that the students found use, benefit, and value in Code Age.
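Percentage-agreement tables such as 5.10 and 5.11 are simple frequency summaries of the raw ratings. The following is a minimal sketch of how such a table can be computed with pandas; the participant responses and item names here are invented for illustration.

```python
import pandas as pd

# Hypothetical raw APQ ratings: one row per participant, one column per
# item, each value on the 1-7 scale (the real study had 16 participants).
responses = pd.DataFrame({
    "item_3": [3, 3, 1, 4, 3, 5, 2, 4],
    "item_5": [5, 6, 2, 4, 5, 2, 7, 6],
})

# Share of participants selecting each scale point, per item, matching
# the "percentage agreement" summaries reported in the tables above.
pct_agreement = (
    responses.apply(lambda col: col.value_counts(normalize=True))
    .reindex(range(1, 8))  # make sure all seven scale points appear
    .fillna(0.0)
    .mul(100)
)
print(pct_agreement.round(2))
```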
On the interest/enjoyment scale, exactly 50% of participants felt the game was more than somewhat interesting to play, and would describe the game as more than somewhat enjoyable. Finally, the statement "I would describe playing the game as very fun" was very polarizing because it received no neutral ratings: 43.75% of participants rated a 3 or less, and the remaining 56.25% of participants rated a 5 or higher. This data shows that students either solidly enjoyed the game or did not care for it much; only one participant actually rated this statement as "not at all true", yet the results of this same questionnaire show that all students would at least be willing to play the game again (shown in question 25).

Table 5.11: Activity Perception Results Part 2 Percentage Agreement - Likert scale from 1 to 7, where 1 represented not at all true, 4 represented somewhat true, and 7 represented very true.
14. I felt like I had no choice but to play the game
15. I thought the game was a very interesting activity
16. I am willing to play again because I think it is somewhat useful
17. I would describe the game as very enjoyable
18. I felt like I had to play the game
19. I believe playing the game could be somewhat beneficial for me
20. I played the game because I had to
21. I believe playing the game could help me do better in school
22. While playing the game I felt like I had a choice
23. I would describe playing the game as very fun
24. I felt like it was not my own choice to play the game
25. I would be willing to play the game again because it has some value to me
Question 1 2 3 4 5 6 7
14. 43.75 18.75 18.75 6.25 6.25 6.25 0
15. 6.25 6.25 12.5 25 31.25 6.25 12.5
16. 0 6.25 12.5 12.5 31.25 12.5 25
17. 0 6.25 25 18.75 37.5 0 12.5
18. 37.5 12.5 12.5 18.75 12.5 6.25 0
19. 0 6.25 12.5 12.5 25 12.5 31.25
20. 43.75 31.25 6.25 0 12.5 6.25 0
21. 6.25 6.25 6.25 18.75 31.25 12.5 18.75
22. 0 18.75 6.25 12.5 12.5 25 25
23. 6.25 12.5 25 0 31.25 12.5 12.5
24. 43.75 25 6.25 12.5 0 12.5 0
25. 0 6.25 18.75 6.25 25 25 18.75

5.3.2 Task Evaluation Questionnaire
The first 14 items of the TEQ are described and their results are shown in table 5.12. Items 15-22 of the TEQ are shown and their results are summarized in table 5.13.
The items in the TEQ that load on the interest/enjoyment factor are 1, 5, 8, 10, 14, 17, and 20. Remembering that the IMI scale is from 1, not at all true, to 7, very true, where 4 is somewhat true, it is interesting to see that items 8 and 10 achieved similar ratings, with 50% of the participants rating a 5 or higher. This score indicates that around half the students thought they did at least somewhat well at the game, and somewhat enjoyed playing it. While most students seemed to somewhat enjoy playing the game, they enjoyed it more than they were interested in it, given the ratings of questions 5 and 17, where only 43.75% rated a 5 or higher on the statements "I found [playing] the game very interesting". The reverse-scored item "I thought playing the game was very boring" received a resounding vote of 3 or less from 68.75% of participants. This data strongly suggests that very few students found the game boring. The statement "I would describe the game as very enjoyable" achieved exactly 50% of participants rating a 5 or higher, and overall this data shows that the students at least somewhat enjoyed the game.
Another interesting scale of the TEQ is the perceived competence scale, which consists of items 4, 7, 12, 16, and 22.
The results of questions 4 and 12 show that 50% of the students felt they were good at the game, and 43.75% of students were satisfied with their performance. While the statement "After playing the game for a while, I felt pretty competent" received a rating of 5 or higher from 43.75% of students and a 3 or less from 50%, "I felt pretty skilled at playing the game" only received this same rating from 31.25%. And yet, "Playing the game was fun" received a rating of 5 or higher from 37.5% of students. These results show relatively diverse levels of perceived competency and fun. It makes sense to posit that the more competent a player feels, the more fun they have, but the results from such a small sample size cannot confirm this idea. Finally, the results from both the APQ and the TEQ show that overall the interest/enjoyment factor received mostly positive ratings, or ratings of 4 or higher, which indicates that the students may have in fact been somewhat motivated by the game. Perhaps more definitive results on motivation could be obtained with a larger sample size.

Table 5.12: Task Evaluation Results Part 1 Percentage Agreement - Likert scale from 1 to 7, 1 - not at all true, 4 - somewhat true, and 7 - very true.
1. While I was playing I was thinking about how much I enjoyed it
2. I did not feel at all nervous about playing the game
3. I felt that it was my choice to play the game
4. I think I am pretty good at the game
5. I found the game very interesting
6. I felt tense while playing the game
7. Playing the game was fun
8. I think I did pretty well at playing the game compared to other students
9. I felt relaxed while playing the game
10. I enjoyed playing the game very much
11. I didn't really have a choice about playing the game
12. I am satisfied with my performance at playing the game
13. I was anxious while playing the game
14. I thought playing the game was very boring
Question 1 2 3 4 5 6 7
1. 6.25 6.25 43.75 25 12.5 0 6.25
2. 6.25 0 12.5 6.25 0 25 50
3. 0 0 6.25 6.25 18.75 31.25 37.5
4. 0 12.5 6.25 31.25 18.75 18.75 12.5
5. 6.25 6.25 12.5 31.25 18.75 18.75 6.25
6. 43.75 6.25 18.75 6.25 12.5 12.5 0
7. 12.5 6.25 12.5 31.25 25 0 12.5
8. 6.25 18.75 6.25 18.75 25 18.75 6.25
9. 6.25 18.75 6.25 0 31.25 31.25 6.25
10. 12.5 12.5 6.25 18.75 25 12.5 12.5
11. 50 18.75 0 12.5 18.75 0 0
12. 0 12.5 18.75 25 18.75 12.5 12.5
13. 43.75 25 12.5 0 18.75 0 0
14. 31.25 12.5 25 0 18.75 6.25 6.25

Table 5.13: Task Evaluation Results Part 2 Percentage Agreement - Likert scale from 1 to 7, where 1 - not at all true, 4 - somewhat true, 7 - very true.
15. I felt like I was doing what I wanted to do while I was playing the game
16. I felt pretty skilled at playing the game
17. I thought playing the game was very interesting
18. I felt pressured while playing the game
19. I felt like I had to play the game
20. I would describe the game as very enjoyable
21. I played the game because I had no choice
22. After playing the game for a while, I felt pretty competent
Question 1 2 3 4 5 6 7
15. 6.25 0 12.5 37.5 12.5 12.5 18.75
16. 6.25 6.25 12.5 43.75 6.25 6.25 18.75
17. 6.25 6.25 12.5 31.25 25 12.5 6.25
18. 37.5 18.75 12.5 18.75 6.25 0 6.25
19. 37.5 12.5 12.5 18.75 12.5 6.25 0
20. 0 6.25 25 18.75 37.5 0 12.5
21. 43.75 31.25 6.25 0 12.5 6.25 0
22. 18.75 0 31.25 6.25 25 6.25 12.5
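For reference, the IMI's standard scoring convention averages each subscale's items after reverse-scoring negatively worded ones as 8 minus the rating. Below is a minimal sketch in Python applying this to the TEQ's interest/enjoyment items listed above; the single participant's ratings are invented for illustration.

```python
# TEQ interest/enjoyment items per the discussion above; item 14
# ("I thought playing the game was very boring") is reverse-scored.
INTEREST_ENJOYMENT = [1, 5, 8, 10, 14, 17, 20]
REVERSE_SCORED = {14}

def subscale_score(responses, items, reverse):
    """Average the 1-7 ratings for one subscale, reverse-scoring
    negatively worded items as (8 - rating), per IMI convention."""
    scores = [8 - responses[i] if i in reverse else responses[i]
              for i in items]
    return sum(scores) / len(scores)

# One hypothetical participant's ratings (item number -> 1-7 rating).
participant = {1: 3, 5: 4, 8: 5, 10: 5, 14: 2, 17: 4, 20: 5}
print(round(subscale_score(participant, INTEREST_ENJOYMENT, REVERSE_SCORED), 2))
```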
5.4 Usability Questionnaire Analysis
The participants were presented with a usability questionnaire containing questions regarding the system's usability, aesthetics, and playability. The participants were given a mix of open-ended and Likert scale questions. All Likert scale questions used a scale of 1, being strongly disagree, to 7, being strongly agree.

5.4.1 General System Questions
The general system section consisted of five open-ended questions and 15 Likert scale questions, which are listed along with a summary of their results in table 5.14. From the data presented in this table, it is clear that the participants somewhat agreed with how simple, comfortable, and easy to use the system was. The participants somewhat felt like the games reinforced programming concepts and were easy to learn. The participants were less content with the instructions and organization of the games, but still rated these aspects between no opinion and somewhat agree.
The participants appeared to somewhat like the mechanics and interface of the game. Most interestingly, the participants highly agreed that they liked to be able to see other players' high scores, which implies that incorporating the high score board for fostering competition was a positive choice. That said, 5 participants never viewed the page at all, while one participant accessed the page 4 times in two visits to the site.
The participants were also asked 5 qualitative questions. When asked if there were any functions or capabilities that the game should have had, but did not have, two participants provided noteworthy answers. One idea was that students who achieved the high scores on the leaderboards should be given a special ability. The second answer was that the instructions should be more detailed, possibly even giving an example of the game in play.
The next set of questions related to mobile device usage. Participants were asked if they owned a mobile device and all 16 participants responded that they did. When asked if they had tried to use the game on their mobile device, one participant indicated he had, 13 replied they hadn't, and two did not respond, although it should be noted that the participants did not know that they could access the game on their mobile devices. The participants were also asked if they would have liked to have been able to use the game on a mobile device, and 10 responded in the affirmative, one in the negative, and five did not respond.

Table 5.14: General System Questions Percentage Agreement - Likert scale from 1 to 7, where 1 represented strongly disagree and 7 represented strongly agree.
1. Overall, I am satisfied with how easy it is to use the system
2. It was simple to use this system
3. I feel comfortable using this system
4. I feel like the games reinforced programming concepts
5. It was easy to learn to use the game
6. The instructions provided for the games are effective in helping me
7. The organization of information on the different screens is clear
8. The interface of the game is pleasant
9. I like using the interface of the game
10. The game has all the functions and capabilities I expect it to have
11. The system was intuitive
12. Overall, I am satisfied with the game system
13. I like the game mechanic of traveling through an island filled with caves, where each cave is a game to complete
14. I like the game mechanic of traveling through an island filled with temples, where each temple is a game to complete
15. I like being able to see other players' high scores
Question 1 2 3 4 5 6 7
1. 0 0 6.25 18.75 31.25 12.5 31.25
2. 0 0 12.5 12.5 37.5 12.5 25
3. 0 6.25 6.25 6.25 37.5 12.5 31.25
4. 0 6.25 12.5 6.25 31.25 25 18.75
5. 6.25 6.25 0 6.25 12.5 37.5 31.25
6. 6.25 6.25 12.5 18.75 25 25 6.25
7. 0 6.25 12.5 25 25 12.5 18.75
8. 0 0 12.5 12.5 25 25 25
9. 0 0 12.5 6.25 31.25 18.75 31.25
10. 0 0 6.25 25 18.75 25 25
11. 0 6.67 26.67 6.67 26.67 13.33 20
12. 0 12.5 0 25 31.25 25 6.25
13. 6.25 0 12.5 12.5 12.5 25 31.25
14. 6.25 0 6.25 18.75 6.25 25 37.5
15. 0 0 0 0 12.5 18.75 68.75

Finally, the participants were asked why they spent as much or as little time playing the game as they did. Common responses to this question were that the participants had forgotten, prioritized other commitments, or were not interested. These reasons could account for the low play times shown by the data collected from the game system itself.

5.4.2 Aesthetics of the General System
The participants were asked 14 Likert scale questions and two open-ended questions relating to the appearance and feel of the system in general. Table 5.15 shows a summary of the results of the (abbreviated) Likert scale questions.

Table 5.15: General System Aesthetic Questions Percentage Agreement - Likert scale from 1 to 7, where 1 represented strongly disagree and 7 represented strongly agree. Each question began with: "I like the look and feel of the:"
Question 1 2 3 4 5 6 7
1. Login screen 0 0 12.5 18.75 18.75 18.75 31.25
2. World map 0 0 6.25 18.75 31.25 31.25 12.5
3. Stats page 0 0 6.25 18.75 37.5 18.75 18.75
4. Cave island 0 0 6.25 18.75 43.75 25 6.25
5. Temples 0 0 14.3 14.3 35.7 21.4 14.3
6. Nav. bar 0 0 13.33 26.67 33.33 20 6.67
7. Logo place 0 0 12.5 18.75 12.5 31.25 25
8. Color theme 0 0 6.25 18.75 25 25 25
9. Art style 0 6.25 6.25 25 25 6.25 31.25
10. Rounded buttons 0 0 0 18.75 37.5 12.5 31.25
11. Square buttons 18.75 12.5 18.75 37.5 0 0 12.5
12. Red/Green highlighting 6.25 6.25 18.75 18.75 18.75 12.5 18.75
13. Load time too long 18.75 6.25 6.25 25 37.5 6.25 0
14. Hover effect 0 6.25 6.25 18.75 25 18.75 25

In general, the participants were positive about the look and feel of the game's interface, rating the login, world map, statistics screen, and islands around a 5, while the navigation bar was rated closer to a 4. One question the researchers were particularly interested in answering regarded the shape of the buttons. Clearly the users preferred a button with rounded edges over a button with sharper edges. The red and green highlighting of the available islands was clearly a divisive issue, given that the percentage agreement was somewhat evenly distributed. However, the majority of participants fell somewhere between no opinion and somewhat agree, so the highlighting could be left in the game. The users felt that the load time between screens was not too long, and the hover effect to highlight the current cave was good. The color theme and art style were similarly well rated. The two open-ended questions asked the participants if they would change anything about the look of the world map or the look of the system. Over half of the participants said there was nothing they would change in either the world map or the system.
Some participants did suggest changes to the world map, including:
− Show where to begin on the world map
− Use clickable regions instead of islands
− Use more animation
− Change the perspective away from bird's eye (bird's eye shows you have a lot to do), make it more like a maze or a path in first person
− Make it look sharper
Some participants also had suggestions for the general system, including:
− Make the statistics page more organized
− Make the moving buttons textured
− Make it look more modern and less childish
− Make it clear what to do next
Overall, it appears the participants were positive about the aesthetics of the system, with the exception of a few small visual adjustments and some further organization and clarification.

5.4.3 The Caveman Island
The results of the (abbreviated) Likert scale questions regarding the caveman island are shown in table 5.16. Overall, the participants seemed positive about this island and its features. The majority of participants rated a 4 or higher on all of the statements about liking the different elements of the Caveman Island.

Table 5.16: Caveman Island Questions Percentage Agreement - Likert scale from 1 to 7, where 1 represented strongly disagree and 7 represented strongly agree. The Likert scale questions began with the statement "I like the:"
Question 1 2 3 4 5 6 7
1. Path 0 6.25 6.25 6.25 37.5 25 18.75
2. Hover effect 0 6.25 6.25 18.75 18.75 18.75 31.25
3. Island transition 0 0 12.5 18.75 25 12.5 31.25
4. Cave transition 0 6.25 12.5 18.75 25 18.75 18.75
5. No. games per island 6.25 6.25 12.5 18.75 6.25 18.75 31.25

The four open-ended questions asked of the participants provided interesting information. When asked how high scores/progress should be displayed at each cave, the most common responses were to change the color of the cave, to display a percentage over the cave, to place a glow/light inside the cave, and to display the score at the end of the level. Currently, the score is already displayed upon completion of the level. When asked what they liked about the island, the most common responses were the art, the cannon game, the sabre-tooth kitten, and the games. The participants were also asked what they disliked about the island, and the most common complaints were that the instructions were unclear or vague and, as a result, unlocking caves was confusing. Finally, the participants were given the opportunity to describe any changes they would make to the island, and 7 interesting suggestions were made:
1. Add a back button
2. Make the map bigger
3. Make the island less repetitive
4. Make the scoring system more dynamic, so there can't be ties
5. Add different non-playable characters
6. Make it clear how to pass a level
7. Create a mandatory tutorial level

Game 1
In this section the participants were asked a series of questions relating to the first stage of the button game, where the incorrect buttons were removed. Table 5.17 shows the results of the 7 Likert scale questions regarding this game on the caveman island.

Table 5.17: First Game Questions Percentage Agreement - Likert scale from 1 to 7, where 1 represented strongly disagree and 7 represented strongly agree.
1. The buttons were removed too quickly
2. I liked that the incorrect buttons were removed
3. I did not find it overwhelming to have all of the instructions at once
4. The amount of time given to complete this game was sufficient
5. I liked the animation of the buttons that were removed
6. It was helpful to have incorrect buttons removed from the screen
7. I thought the game was fun
Question 1 2 3 4 5 6 7
1. 13.33 33.33 6.67 26.67 6.67 6.67 6.67
2. 0 0 13.33 46.67 13.33 20 6.67
3. 0 6.67 0 26.67 40 13.33 13.33
4. 0 0 13.33 20 13.33 33.33 20
5. 0 0 6.67 26.67 6.67 53.33 6.67
6. 0 0 0 26.67 0 46.67 26.67
7. 6.67 13.33 6.67 26.67 13.33 26.67 6.67

One interesting result is that the participants did not seem to find the presentation of the instructions overwhelming; however, in the previous section, one of the largest complaints about the system was that the instructions were unclear or confusing. This disparity indicates that there may need to be two sets of instructions: one set explaining each mini-game, and a second set explaining how to progress through the game. Also worth noting is that the participants readily agreed that it was helpful to have the incorrect buttons removed from the screen.
In the open-ended questions the participants were given the opportunity to describe how they would make the game more attractive or engaging. One participant felt that there should be more focus on teaching concepts than testing them; however, the participant did not elaborate on how this should be accomplished. Other participants had the following 7 more concrete ideas on how to improve the game:
1. Add the ability to choose your own difficulty
2. Incorporate more keyboard usage
3. Give the buttons textures
4. Incorporate more diversity
5. Add a results screen, showing the correct and incorrect words to help learning
6. Add a negative consequence for clicking too many incorrect buttons
7. Add background music
Overall, the participants liked the use of a progress bar as a timer, and they also liked the colors currently used in the progress bar. When asked what other colors could be used on a timer or progress bar, the most common response was red and green, or red, yellow and green.

5.4.4 Game 2
The results of the following three Likert scale questions for this game are shown in table 5.18.

Table 5.18: Second Game Questions Percentage Agreement - Likert scale from 1 to 7, where 1 represented strongly disagree and 7 represented strongly agree.
1. I did not find it overwhelming to have all of the instructions at once
2. The amount of time given to complete this game was sufficient
3. I thought this game was fun
Question 1 2 3 4 5 6 7
1. 0 0 6.67 26.67 33.33 13.33 20
2. 0 0 0 26.67 26.67 26.67 20
3. 6.67 13.33 6.67 20 20 13.33 20

These questions related to the second stage of the button game, where no buttons were removed, and no buttons moved on the screen. The percentage agreement for finding the instructions overwhelming increased from the previous section, which could be explained by the fact that the first and second games are similar enough that the users learned from the first game, and the instructions for the second game were irrelevant. It is interesting to note that the agreement reported for "I thought this game was fun" increased slightly from the first game to the second game.
The participants were again asked how the game could be made more attractive or engaging, and many of the ideas presented were the same as described for the first cave. This repetition is to be expected because of how similar the first and second caves are. Two new ideas were to integrate a voice that reads the instructions to you, and to create a different background for the game.
Some participants felt that the game would benefit from being more challenging and less repetitive.

5.4.5 Game 3
The third game differs from the first and second because the buttons on the screen move. Table 5.19 shows the participants' percentage agreement for the 8 Likert scale questions.
Two participants felt that the moving buttons were unnecessary. Overall, the participants found that the moving buttons made the game more challenging, but they did not like that the buttons moved. Also, this game had a significantly lower percentage of students who felt the game was fun than the first two games did. Similarly to the previous games, some participants felt that the game could use clearer instructions and more instructional aspects. In terms of the mechanics of the game, the users seemed satisfied with how smoothly the buttons moved and how easy they were to click.

Table 5.19: Third Game Questions Percentage Agreement - Likert scale from 1 to 7, where 1 represented strongly disagree and 7 represented strongly agree.
1. The buttons moved too quickly
2. I liked that the buttons moved
3. I did not find it overwhelming to have all of the instructions at once
4. The amount of time given to complete this game was sufficient
5. The buttons moved smoothly
6. The buttons were easy to click
7. The moving buttons made the game more challenging
8. I thought this game was fun
Question 1 2 3 4 5 6 7
1. 0 0 43.75 31.25 12.5 0 12.5
2. 18.75 0 18.75 31.25 12.5 12.5 6.25
3. 6.25 0 12.5 25 31.25 12.5 12.5
4. 6.25 0 18.75 25 12.5 18.75 18.75
5. 0 6.25 12.5 18.75 18.75 37.5 6.25
6. 0 25 18.75 12.5 25 12.5 6.25
7. 0 6.67 0 13.33 40 26.67 13.33
8. 13.33 13.33 13.33 26.67 13.33 13.33 6.67

5.4.6 Game 4
This section of the usability study moved away from the button games and on to the question and answer game. Table 5.20 contains the results of the Likert scale portion of these questions. The open-ended questions did not provide a large amount of feedback. The participants were asked what they would change about the game and how they would make the game more attractive and engaging. One participant felt that this game was the most useful out of all of the games, and the majority of the participants found the game useful. One participant felt that the game should explain why the answer selected was incorrect and what the correct answer was. The participants found this game to be more fun than two of the variations of the button game.

Table 5.20: Fourth Game Questions Percentage Agreement - Likert scale from 1 to 7, where 1 represented strongly disagree and 7 represented strongly agree.
1. I did not find it overwhelming to have all of the instructions at once
2. The amount of time given to complete this game was sufficient
3. I thought this game was useful
4. I thought this game was fun
Question 1 2 3 4 5 6 7
1. 6.25 0 18.75 18.75 18.75 18.75 18.75
2. 6.25 0 12.5 18.75 25 18.75 18.75
3. 6.25 0 12.5 6.25 31.25 18.75 25
4. 6.25 12.5 12.5 25 12.5 12.5 18.75

5.4.7 Game 5
This penultimate section of the usability study asked the participants a series of questions regarding the cannon game. 10 participants felt that this game should have been timed and 4 felt that it should not have been timed. The participants were again asked if they would change anything, and two participants gave interesting responses. One participant thought the game mechanic could be altered so that the cannon was on the bottom of the screen, and the baskets were on the top.
This participant thought it would be better if the baskets didn't move and the cannon automatically aligned with the baskets, with the player using the arrow keys to select which basket to shoot into. Another participant thought that the baskets should be wider.
The final open-ended question asked if there was anything the participants would do to make the game more attractive and engaging. The following 6 items were suggested:
− Add sound effects/explosions
− Add an avatar icon
− Have the cannon shoot in an arc (instead of a straight line)
− Make the shooting smoother
− Add background music
− Add more difficulty levels
Table 5.21 shows the results of the cannon game Likert scale questions.

Table 5.21: Fifth Game Questions Percentage Agreement - Likert scale from 1 to 7, where 1 represented strongly disagree and 7 represented strongly agree.
1. The baskets moved too quickly
2. The cannon movement was smooth and easy to use
3. It was too hard to shoot the cannon balls into the baskets
4. The amount of time given to complete this game was sufficient
5. The cannon balls moved smoothly
6. I did not find it overwhelming to have all of the instructions at once
7. I found the game challenging
8. This game was easy to use
9. I thought this game was useful
10. I thought this game was fun
Question 1 2 3 4 5 6 7
1. 6.25 12.5 37.5 18.75 0 25 0
2. 6.25 12.5 6.25 18.75 25 18.75 12.5
3. 18.75 0 12.5 31.25 6.25 12.5 18.75
4. 6.25 0 6.25 25 12.5 31.25 18.75
5. 6.67 6.67 13.33 20 26.67 13.33 13.33
6. 0 12.5 6.25 18.75 12.5 25 25
7. 6.25 12.5 6.25 25 25 12.5 12.5
8. 25 0 12.5 12.5 12.5 25 12.5
9. 6.25 12.5 6.25 25 25 6.25 18.75
10. 6.25 6.25 12.5 12.5 25 25 12.5

The participants did not feel that the baskets moved too quickly, and they felt that the cannon was smooth and easy to use. The participants were neutral about the difficulty of shooting the cannon balls into the baskets and the smoothness of the cannon ball motion. It is most interesting to note that the cannon game received the highest participant agreement of all the games on "I thought this game was fun", but it came second to the question and answer game on the question "I thought this game was useful".

5.4.8 Final Questions
The last section of questions on the usability study asked the participants to select options for in-game motivation. The participants were offered a list of features and were asked to select the options that would motivate them to spend time with the game. Table 5.22 shows the summary of the participants' selections.

Table 5.22: Final Usability Questions
Option Count
Gain experience points as you progress 12
A stamp or reward when you have been successful 8
A world in which a character moves to play different games 13
A professor avatar that follows your progress 3
A computer geek avatar that follows your progress 4
A student avatar that follows your progress 4

15 out of the 16 participants agreed that the idea of using games to help learning is useful. 14 of the 16 participants agreed that using games to learn programming is useful, and not intimidating. The participants were asked if they would prefer small games or longer games to complement their learning. 8 participants chose shorter games, 5 participants chose longer games, and 3 participants chose not to respond. Even the students who did not care for Code Age thought that games are useful to help learning. The following comments left by the participants are highly indicative of the diverse nature of the responses received about Code Age:
− "Upload this app online. It can help others interested in Computer Science"
− "If the games were actually exploring things thoroughly it would be more useful"
− "The games are more fun than reading a textbook or listening to a lecture"

Chapter 6
Heuristic Evaluation Analysis and Results

6.1 Demographics and Background Questionnaire Analysis
Five participants were recruited for the heuristic evaluation. Of these participants, three were males and two were females, and all had previously taken the COSC 341 Human-Computer Interaction course at UBC Okanagan. All participants reported being in the age range of 20-24. Similar to the demographic survey from the usability study, the participants were asked to rate their agreement with two statements, on the scale where 1 = strongly disagree, and 7 = strongly agree. Table 6.1 shows the number of participants that selected each category.

Table 6.1: Game Preferences
Question 1-3 4 5 6 7
I like memory games 0 1 3 1 0
Given the choice, I prefer games where I have time to reflect 0 3 2 0 0

When asked on average how many hours a week the participants spent gaming, the results were varied. Two participants reported playing less than an hour per week and two participants reported playing 3-4 hours per week. The final participant reported playing 5 or more hours per week. All participants reported owning an Xbox, a Wii, or both. Over half of the participants reported owning multiple gaming devices. Unsurprisingly, all participants indicated that they use the computer daily for a broad variety of activities, including, but not limited to, internet, web design, and work. Interestingly, only one participant did not indicate programming as one of his/her activities; all other participants did.

6.2 Heuristic Evaluation Analysis
Upon completion of the demographics survey, the participants were instructed to log into the game using their secure research identification number and password. They were instructed to complete a series of tasks and to rate each of the tasks on a set of heuristics.
The tasks that the users were asked to perform were as follows:
1. Log into the system
2. Navigate around the home screen
3. Navigate to and around each of the 6 caves
4. Explore the user information and high score page
5. Navigate around the system using the navigation bar
6. Log out
The set of heuristics that were used to evaluate each task is a collection of Jakob Nielsen's ten heuristics for user interface design, applied to the Code Age user interface [Nie92, NM90]. The participants were asked to rate each task on its fulfillment of each heuristic, using a scale from 1 to 7, where 1 is succeeds completely, and 7 is fails completely. The following is a list of the ten heuristics:
1. Visibility of system status: the user is informed about what is going on through appropriate feedback within reasonable time
2. Match between system and the real world: the system speaks the user's language with words, phrases, conventions and concepts familiar to the user
3. User control and freedom: there is a clearly marked exit that would allow the user to leave without going through an extended dialogue (ex. exit, back, undo)
4. Consistency and standards: the user does not have to wonder whether different words, situations, or actions mean the same thing
5. Error prevention: there are no error-prone conditions, or if there are, the user is aware of them and provided a way to deal with them
6. Recognition over recall: the user does not have to remember information from one part of the system to another, i.e., objects, actions and options are always visible
7. Flexibility and efficiency of use: there are options that allow the advanced user to operate the system more efficiently
8. Aesthetic and minimalist design: the user is not presented with irrelevant or unneeded information
9. Errors: users are helped to recognize, diagnose and recover from errors, which are expressed in plain language; they explain the problem and offer a solution
10. Help and documentation: any documentation or help offered to the user is brief, easy to search, focused on the user's task, and lists steps or solutions
The full questionnaire can be seen in Appendix F.2.2. After completing the tasks and filling out the questionnaire, the participants were then asked to discuss any ratings that were greater than or equal to 5, i.e., any ratings that indicate the task does not succeed at fulfilling a specific usability heuristic. Collectively the participants found 8 major usability errors.
The 8 usability errors found were:
− When the login process fails there is no error message
− There is no indication of what to do/where to go when you reach the home screen
− There is no obvious way to return to the home screen from the island screen
− It is difficult to tell which cave to go to on the island screen
− There is no available exit from the games
− When exiting the game, the page scrolls back up to the top of the island instead of focusing on the active cave
− There are no in-game instructions
− In the third level of the first game, the moving buttons were hard to click

6.3 Heuristic Follow Up Results
The eight errors that were discovered in the heuristic evaluation, i.e., the tasks that failed a specific heuristic, were used to develop the heuristic follow-up questionnaire. The participants were sent the list of the eight errors and asked to evaluate how severely these errors impacted the usability of the game. The scale for evaluating the severity of the errors was:
0 - I do not agree this is a usability problem at all
1 - Cosmetic problem only: need not be fixed unless extra time is available
2 - Minor usability problem: fixing this should be given low priority
3 - Major usability problem: important to fix, should be given high priority
4 - Usability catastrophe: imperative to fix this before product is used
The severity ratings selected for each error by the participants are shown in table 6.2.

Table 6.2: Error Severity Data - Participant Agreement
Error 1 2 3 4
When the login process fails there is no error message 0 0 4 0
There was no indication of what to do/where to go when you reach the home screen 0 3 1 0
There was no obvious way to return to the home screen from the island screen 0 0 2 2
It was difficult to tell which cave to go to on the island 1 1 2 0
There was no available exit from the games 0 1 2 1
When exiting the game, the page scrolled back up to the top of the island 2 2 0 0
There were no in-game instructions 0 2 2 0
In the third level of the first game, the moving buttons were hard to click 1 1 1 1
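Severity counts like those in table 6.2 are commonly condensed into a median or mean rating per error. A minimal sketch in Python, expanding two of the table's rows back into the individual ratings they represent:

```python
from statistics import mean, median

# Ratings reconstructed from table 6.2: "no obvious way back to the home
# screen" drew two 3s and two 4s; "no error message on failed login"
# drew four 3s (on the 0-4 severity scale described above).
ratings = {
    "No obvious way back to the home screen": [3, 3, 4, 4],
    "No error message when the login fails": [3, 3, 3, 3],
}

for error, scores in ratings.items():
    print(f"{error}: median {median(scores)}, mean {mean(scores):.2f}")
```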
Overall, the data shows that there were very few catastrophic errors. The worst usability error found was that there was no obvious way to return to the home screen from the island screen. The ratings indicate that this error is somewhere between a major and a catastrophic usability error, and it must be fixed before the game is used again. Two other errors were rated around a 3, or major usability problem, while all other errors were rated around a 2.5 or less.
Upon completing the severity ratings, the participants were also asked several general questions regarding their likes, dislikes, the game's appeal, and the story line. Two common likes described by the participants were the simplicity of the game(s) and the art style. Common dislikes reported by the participants were that there was no error message upon an unsuccessful login attempt, and that it was sometimes unclear where to start on the main map and islands.

6.4 Plot Line
The participants of both the heuristic evaluation and the usability study were asked to choose a preferred plot to integrate with the game from the four following plot ideas:
1. World War 3 has occurred between humans and robots. In order to defeat the robots we had to destroy all technology and knowledge of programming, causing humans to revert to cave life. Your quest is to travel the world gathering and recreating this knowledge, and each island you visit will become more and more advanced as you progress.
2. Humans have created an artificial intelligence that has taken over the world. In order to fight back, programming was destroyed so the AI could no longer recode itself. Now to defeat the AI, you must secretly relearn programming in order to write the code to finally destroy the AI.
3. Cyber terrorists have launched an attack to take over the world. In order to fight back, programming was destroyed so the cyber terrorists could no longer wreak havoc. Now to defeat the cyber terrorists, you must secretly relearn programming in order to write the code to take back the world from the cyber terrorists.
4. Aliens believe humanity has advanced too much and is quickly heading down a path of self-destruction. To preserve humanity and Earth, the aliens travel backwards and forwards in time, destroying programming and technology as they go. This rampant destruction sets off a chain of events that change modern life as we know it. You have followed the aliens back, and are now pursuing them through time, saving programming as you move forward.
In the responses from the heuristic evaluation, two participants selected plot 1 as their preferred plot line, and two participants selected plot 4. The final participant selected plot 2 as his/her favourite. The results from the usability study showed that plot 1 received 6 votes, plot 2 received 3, plot 3 received 2, and plot 4 received 3 votes. Given that plot 1 received the most votes overall, it seems to be the plot line that should be introduced to the game.

Chapter 7
Conclusion and Future Work

7.1 Conclusions of the Game
This thesis has presented a game that implements three different game mechanics: buttons, question-and-answer, and point-and-shoot. The results of the study show that Code Age requires some modification before it can be deployed again. The implementation of the storyline is crucial. Having a storyline included in the game brings cohesion and meaning to the game, which provides the user with a sense of where they are in the game and what direction the game is going in. Inclusion of a storyline may help to increase student investment, interest, engagement, and overall retention. The inclusion of a storyline can also help by using story animations to indicate where to begin playing the game (i.e., what island or what cave to start playing at), which was a complaint of some students.
Five of the eight usability errors discovered by the heuristic evaluation need to be corrected. Three of these, the lack of an obvious way to return to the home screen from the island screen, the lack of an available exit from the games, and the missing in-game instructions, have already been corrected. Future iterations of the game should have more games and be mobile-friendly. Other ideas for future games include:
− A game that contains scrambled code that the user must unscramble
− A guessing game where the user is presented with a problem and must decide whether to use an if or a while statement to solve it
− A game where the user is given an expected output and some skeleton code, and they must fill in the blanks to produce the correct output
− A side-scrolling, choose-the-right-path style of game, where each path contains a different piece of code, and the user must select the right path to complete the code to solve a problem
− A simplistic tile-matching game where the user must flip over tiles and match objects to methods
As discussed further in the conclusions of the study, the implementation of a coordinated release style and a refinement of the engagement survey would be positive steps forward, beginning with the removal of the three-level system. The user statistics/high score page needs to be reorganized. One idea would be to use a miniature world map where the user can hover over each island to see their statistics for each game on the island. Then, next to the map, have an all-time high score table that shows the highest combined total scores (across all games). The addition of a special ability or bonus for the leaders on the high score table was suggested by a user, and may be a good way to increase competition between users. One idea is to have students earn badges as they complete quests/achievements (i.e., a badge for playing the longest amount of time). Finally, the cannon game should be updated to behave in the way suggested by one student, so that the baskets do not move and the cannon automatically aligns with each basket, with the player using the arrow keys to select which basket to shoot into. This modified mechanic would allow the student to focus more on choosing the correct basket and less on aiming for it. The mechanic is an improvement over the previous version because, with no aiming required, the addition of a timer would no longer pose an unreasonable challenge for the student.

7.2 Conclusions of the Study
The findings of this research are able to address the questions posed in this thesis, in some cases anecdotally, and in other cases more directly. This thesis set out to determine if Code Age is user friendly and educationally valuable. The results of this study indicate that the system is intuitive and easy to use; however, in some instances the game play was confusing or vague and required further instruction. It is anticipated that, with a few minor adjustments, further study would show that Code Age is very user friendly. Minor adjustments include the addition of a back button to the game screens, clear indications of starting points, and more thorough instructions for each game. Code Age has already been modified to include the back button, and the instructions have been modified to add clarity by remaining, discreetly, on the page while the user is playing the game.
Ideally, the usability study should have been run in a fall semester class of COSC 111.
This change would have yielded a more representative population for study, i.e., fewer engineering students, a higher percentage of participants with no programming experience, and a younger overall average age. The fact that the population was not totally representative has come up in this thesis several times as affecting participant engagement (time played) and motivation. It would be worthwhile to continue the study of Code Age using a larger target audience more typical of introductory programming in order to collect more meaningful data. Despite not having this proper audience, the data collected in this study showed that students felt there was value in Code Age, and even more so in learning programming through games.
From the population that was recruited, some important information was determined. Clearly, more mini-games are necessary; more importantly, the games should be more in-depth in content/information delivery; the three levels/cave system was confusing and should be removed; and finally, more diversity is required via creating games using different game mechanics. The three-level system should be replaced with a coordinated release style of game. Each cave should only contain one game. Each island should contain games relating to a chapter or two's worth of course content, and as the content is covered in the course, the island is made available to the students. In this way, all games on an island reinforce knowledge of concepts currently being covered in the course. This strategic release by the instructor will enable the students to use the game in the context of a study tool, and the students will not be looking to the game to provide more in-depth knowledge, but to complement the course. Ideally, students should have to complete an island before the next island is unlocked for play, even if the island has been released, and students who complete an island before the next island is available should have access to a form of bonus game. Ultimately, it has been shown that the mini-game style used in Code Age is successful at getting students to play the game for short periods of time.
Testing the game in a fall semester would require the development of more games, and the inclusion of more games should increase the students' interest in playing and motivate them to play longer than the participants in this study did. It would be interesting to see, in these circumstances, how long each participant would play for and what would lead to the students leaving the game. This knowledge would allow future iterations of the game to be refined to increase player retention.
As well as modifying the timeline of the study, the measures of the study should be revised. The questionnaires should be smaller and should be released incrementally. After completing each island, or prior to the release of the next island in the course, the participants should complete the usability study of the current island and the IMI. These measures should be repeated for each island in the study, ensuring a larger number of data points for analysis. During this study, several students wished they could have done the usability questionnaire while they were playing the game, and several students complained about the length of the post-test. Altering the usability study in this way mitigates both of these student complaints. The task evaluation questionnaire showed a relationship between the students' perceived competence and enjoyment of the game.
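In a follow-up study with more participants, this relationship could be quantified directly by correlating the two subscale scores; Spearman's rank correlation is a reasonable choice for ordinal, Likert-derived scores. A minimal sketch with invented per-participant values:

```python
from scipy.stats import spearmanr

# Hypothetical per-participant subscale scores (1-7 scale): perceived
# competence and interest/enjoyment, e.g., as produced by IMI scoring.
competence = [4.2, 3.0, 5.6, 4.8, 2.4, 5.0, 3.8, 4.4]
enjoyment = [4.5, 2.8, 5.9, 4.6, 3.1, 5.2, 3.5, 4.9]

rho, p_value = spearmanr(competence, enjoyment)
print(f"Spearman rho = {rho:.2f}, p = {p_value:.3f}")
```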
Given that the questions are related to playing an educational game, this relationship is anticipated because, generally speaking, the better you are at playing a game, the more you enjoy it.
The engagement test ultimately did not produce the intended results. The one reliable item did show an increase in time spent on COSC 111 class preparation from the beginning of term to four weeks into term; however, this increase could not be attributed to the game, but rather to the increasing difficulty of learning to program. The engagement test has shown interesting results in terms of meaning-making, but ultimately needs revision and refinement to measure engagement. The addition of items more related to time spent and reasons for playing, as well as having many more participants respond than in this study, closer to 100 [FMF12], would be an excellent starting place. More participant responses would create more data points for analysis, and a more reliable factor analysis. A more reliable factor analysis will be key in determining the best items for measuring engagement.
Ideally, in this study the heuristic evaluation should have preceded the usability study, in order to have given the researchers a chance to modify the game prior to the usability study. However, given the timing of the study, having them run concurrently gave rise to the opportunity to double-check the competency of the heuristic evaluators. The usability participants did not find any issues with the game that the heuristic evaluators had not already discovered, so the evaluators were successful and robust in their evaluation of Code Age. Ultimately, the heuristic evaluation was a beneficial addition to the study and the two-part research plan was sound.
The thesis posed two final questions regarding the value of a game for learning. Would students even like a game for learning programming? Could an educational game for learning programming be useful to students? The final question on the usability study asked the participants if using games to help learning is useful, and 15 out of the 16 participants agreed; furthermore, 14 of the 16 participants agreed that using games to learn programming would be useful. Programming games are clearly an option students would like to have for learning, and providing this option is reason enough to continue the advancement of Code Age. If it can be demonstrated through further study that these games are of educational and/or motivational value, then a massive step forward will have been taken towards improving computer science education in introductory programming. The best way to try and take that step is by continuing the study of Code Age.

Bibliography
[Ain12] Mary Ainley. Handbook of Research on Student Engagement, chapter 13 Student's Interest and Engagement in Classroom Activities. In Christenson et al. [CRW12], 2012. → pages 6, 7
[Ber06] Mark Berends. Handbook of complementary methods in education research, chapter 37 Survey Methods in Educational Research. American Educational Research Association, Washington, D.C; Mahwah, N.J, 2006. → pages 28
[BK03] D. Barnes and Michael Kolling. Objects first with Java: a practical introduction using BlueJ. Pearson Education, New York, 2003. → pages 14
[BPCL08] Tiffany Barnes, Eve Powell, Amanda Chaffin, and Heather Lipford. Game2learn: improving the motivation of cs1 students. pages 1–5. ACM, 2008. → pages 1
[CBM+12] Thomas M. Connolly, Elizabeth A. Boyle, Ewan MacArthur, Thomas Hainey, and James M. Boyle.
Bibliography

[Ain12] Mary Ainley. Handbook of Research on Student Engagement, chapter 13: Students' Interest and Engagement in Classroom Activities. In Christenson et al. [CRW12], 2012. → pages 6, 7

[Ber06] Mark Berends. Handbook of Complementary Methods in Education Research, chapter 37: Survey Methods in Educational Research. American Educational Research Association, Washington, DC; Mahwah, NJ, 2006. → pages 28

[BK03] D. Barnes and Michael Kölling. Objects First with Java: A Practical Introduction Using BlueJ. Pearson Education, New York, 2003. → pages 14

[BPCL08] Tiffany Barnes, Eve Powell, Amanda Chaffin, and Heather Lipford. Game2Learn: Improving the motivation of CS1 students, pages 1–5. ACM, 2008. → pages 1

[CBM+12] Thomas M. Connolly, Elizabeth A. Boyle, Ewan MacArthur, Thomas Hainey, and James M. Boyle. A systematic literature review of empirical evidence on computer games and serious games. Computers and Education, 59:661–686, 2012. → pages 4, 5, 9, 10, 11, 18

[CRW12] Sandra L. Christenson, Amy L. Reschly, and Cathy Wylie, editors. Handbook of Research on Student Engagement. Springer, New York, 2012. → pages 81, 82, 83, 84, 85

[DEPL94] Edward L. Deci, Haleh Eghrari, Brian C. Patrick, and Dean R. Leone. Facilitating internalization: The self-determination theory perspective. Journal of Personality, 62(1):119–142, 1994. → pages 50

[DeQ12] Miguel DeQuadros. GameSalad. Packt Publishing, 2012. → pages 14

[EJ13] Séverine Erhel and Eric Jamet. Digital game-based learning: Impact of instructions and feedback on motivation and learning effectiveness. Computers and Education, 67:156–167, 2013. → pages 18

[EK14] Peter Ewell and George Kuh. National Survey of Student Engagement, 2014. → pages 6, 29

[FMF12] A. Field, J. Miles, and Z. Field. Discovering Statistics Using R. SAGE Publications, 2012. → pages 79

[FP10] Joseph Flieger and James Dean Palmer. Supporting pair programming with JavaGrinder. J. Comput. Sci. Coll., 26(2):63–70, December 2010. → pages 14

[FZ12] Jeremy D. Finn and Kayla S. Zimmer. Handbook of Research on Student Engagement, chapter 5: Student Engagement: What Is It? Why Does It Matter? In Christenson et al. [CRW12], 2012. → pages 6, 7

[Gad11] Tony Gaddis. Starting Out with Alice: A Visual Introduction to Programming. Addison-Wesley, Boston, 2011. → pages 14

[GM10] Anabela Jesus Gomes and António José Mendes. A study on student performance in first year CS courses. In Proceedings of the Fifteenth Annual Conference on Innovation and Technology in Computer Science Education, ITiCSE '10, pages 113–117, New York, NY, USA, 2010. ACM. → pages 2

[GM12] Ioana Ghergulescu and Cristina Hava Muntean. Assessment in Game-Based Learning: Foundations, Innovations, and Perspectives, chapter 18: Measurement and Analysis of Learner's Motivation in Game-Based E-Learning. Springer Verlag, 2012. → pages 5, 7, 10, 11, 29

[Gre06] James G. Greeno. Handbook of Complementary Methods in Education Research, chapter 45: Theoretical and Practical Advances Through Research on Learning. American Educational Research Association, Washington, DC; Mahwah, NJ, 2006. → pages 8, 11

[HJ11] Paul Howard-Jones. Toward a science of learning games. Mind, Brain, and Education, 5(1):33–41, 2011. → pages 9, 10

[JCS09] H. C. Jiau, J. C. Chen, and Kuo-Feng Ssu. Enhancing self-motivation in learning programming using game-based simulation and metrics. IEEE Transactions on Education, 52(4):555–562, 2009. → pages 14

[Köl10] Michael Kölling. The Greenfoot programming environment. Trans. Comput. Educ., 10(4):14:1–14:21, November 2010. → pages 14

[KfPRP01] George D. Kuh and Indiana University Center for Postsecondary Research and Planning. National Survey of Student Engagement: The College Student Report: NSSE technical and norms report. Indiana University Center for Postsecondary Research and Planning, Bloomington, IN, 2001. → pages 29

[KM08] Päivi Kinnunen and Lauri Malmi. CS minors in a CS1 course, pages 79–90. ACM, 2008. → pages 1

[LI10] Thomas J. Lasley II. Encyclopedia of Educational Reform and Dissent, chapter: Bloom's Taxonomy. SAGE Publications, Inc., 2010. → pages 9

[LI13] Antti-Jussi Lakanen and Ville Isomöttönen. High school students' perspective to university CS1, pages 261–266. ACM, 2013. → pages 1

[LK10] Patricia Lasserre and Kyle Kotowick. Proposal for a new strategy to practice programming. In Proceedings of the 15th Western Canadian Conference on Computing Education, WCCCE '10, pages 9:1–9:3, New York, NY, USA, 2010. ACM. → pages 1

[LW11] Frederick W.B. Li and Christopher Watson. Game-based concept visualization for learning programming. In Proceedings of the Third International ACM Workshop on Multimedia Technologies for Distance Learning, MTDL '11, pages 37–42, New York, NY, USA, 2011. ACM. → pages 1, 4

[Mar12] Andrew J. Martin. Handbook of Research on Student Engagement, chapter 14, Part II Commentary: Motivation and Engagement: Conceptual, Operational, and Empirical Clarity. In Christenson et al. [CRW12], 2012. → pages 7

[May12] Igor Mayer. Towards a comprehensive methodology for the research and evaluation of serious games. Procedia Computer Science, 15:233–247, 2012. → pages 5, 9, 10

[MDT89] E. McAuley, T. Duncan, and V. V. Tammen. Psychometric properties of the Intrinsic Motivation Inventory in a competitive sport setting: A confirmatory factor analysis. Research Quarterly for Exercise and Sport, 60(1):48, 1989. → pages 29, 50

[MSL11] Jamie McKee-Scott and Patricia Lasserre. Educational games for CS1: Raised questions. In Proceedings of the 16th Western Canadian Conference on Computing Education, WCCCE '11, pages 46–46, New York, NY, USA, 2011. ACM. → pages 1

[MW04] Daniel D. McCracken and R. J. Wolfe. User-Centered Web Site Development: A Human Computer Interaction Approach. Prentice Hall, Upper Saddle River, NJ, 2004. → pages 11

[Nie92] Jakob Nielsen. Finding usability problems through heuristic evaluation. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, CHI '92, pages 373–380, New York, NY, USA, 1992. ACM. → pages 12, 31, 71

[Nie95] Jakob Nielsen. Usability inspection methods. In Conference Companion on Human Factors in Computing Systems, CHI '95, pages 377–378, New York, NY, USA, 1995. ACM. → pages 12, 31

[NM90] Jakob Nielsen and Rolf Molich. Heuristic evaluation of user interfaces. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, CHI '90, pages 249–256, New York, NY, USA, 1990. ACM. → pages 12, 30, 71

[Pre01a] Marc Prensky. Digital Game-Based Learning, chapter 5: Fun, Play and Games: What Makes Games Engaging. In [Pre01b], 2001. → pages 4, 5, 13, 17, 18, 26

[Pre01b] Marc Prensky. Digital Game-Based Learning. McGraw-Hill, 2001. → pages 17, 84

[Ree12] Johnmarshall Reeve. Handbook of Research on Student Engagement, chapter 7: A Self-Determination Theory Perspective on Student Engagement. In Christenson et al. [CRW12], 2012. → pages 6, 7

[RES13] Katie Redmond, Sarah Evans, and Mehran Sahami. A large-scale quantitative study of women in computer science at Stanford University. In Proceeding of the 44th ACM Technical Symposium on Computer Science Education, SIGCSE '13, pages 439–444, New York, NY, USA, 2013. ACM. → pages 36

[SLL+12] Neil Suttie, Sandy Louchart, Theodore Lim, Andrew Macvean, Wim Westera, Damien Djaouti, and Damian Brown. In persuit of a 'serious games mechanics': A theoretical framework to analyse relationships between 'game' and 'pedagogical aspects' of serious games. Procedia Computer Science, 15:314–315, 2012. → pages 4, 9, 10

[SM12] Dale H. Schunk and Carol A. Mullen. Handbook of Research on Student Engagement, chapter 10: Self-Efficacy as an Engaged Learner. In Christenson et al. [CRW12], 2012. → pages 6, 7

[The99] Self-Determination Theory. Intrinsic Motivation Inventory (IMI), 1999. → pages 7, 29, 50

[YSC+12] Michael F. Young, Stephen Slota, Andrew B. Cutter, Gerard Jalette, Greg Mullin, Benedict Lai, Zeus Simeoni, Matthew Tran, and Mariya Yukhymenko. Our princess is in another castle: A review of trends in serious gaming for education. Review of Educational Research, 82(1):61–89, 2012. → pages 4, 8, 9, 11
Appendices

Appendix A: Supplementary Data

Table A.1: The Chi-Square Data of the 29 Engagement Items

Question   Chi-Square   D.F.   p-value   Significant
1          2.34         3      0.50      No
2          3.14         3      0.37      No
3          1.51         3      0.68      No
4          0.41         2      0.81      No
5          1.07         3      0.79      No
6          0.44         2      0.80      No
7          4.32         3      0.23      No
8          3.36         3      0.34      No
9          2.65         3      0.44      No
10         1.25         3      0.74      No
11         1.44         3      0.69      No
12         0.82         3      0.84      No
13         2.10         3      0.55      No
14         0.72         3      0.87      No
15         2.81         3      0.42      No
16         5.60         3      0.13      No
17         1.42         3      0.70      No
18         4.84         3      0.19      No
19         4.60         3      0.20      No
20         3.67         3      0.30      No
21         3.16         3      0.37      No
22         2.94         3      0.40      No
23         1.29         3      0.73      No
24         2.55         5      0.77      No
25         6.37         5      0.27      No
26         2.34         4      0.67      No
27         4.09         5      0.54      No
28         10.45        3      0.01      Yes
29         8.22         6      0.22      No

Appendix B: Screenshots

(Screenshots of the game interface; only the figure captions are reproduced here.)

Figure B.1: The Login Screen
Figure B.2: The World Map
Figure B.3: The Caveman Island
Figure B.4: The entrance of the cave
Figure B.5: The Sabretooth kitten
Figure B.6: The Caveman Guide
Figure B.7: The Greek Island
Figure B.8: Inside the Greek Temple
Figure B.9: The Greek Kitten
Figure B.10: The Greek Guide
Figure B.11: The High Score Page

Appendix C: Game Content

Table C.1: Correct Reserved Words Used in Game

abstract     assert        boolean
break        byte          case
catch        char          class
continue     default       do
double       else          enum
extends      false         final
finally      float         for
if           implements    import
instanceof   int           interface
long         native        new
null         package       private
protected    public        return
short        static        strictfp
super        switch        synchronized
this         throw         throws
transient    true          try
void         volatile      while

Table C.2: Correct Method Names Used in Game

charAt     applyPattern   compareTo     equalsIgnoreCase
nextInt    toLowerCase    doubleValue   toUpperCase
nextLine   parseInt       intValue      getPercentInstance
hasNext    sqrt           substring     getCurrencyInstance
concat     pow            length        random
equals     exp            printf        format
print      abs            println       nextFloat

Table C.3: Correct Class Names Used in Game

ArrayList       Number      NumberFormat
Color           Character   Object
DecimalFormat   Float       Polygon
Double          String      Random
Integer         Scanner     Math

Table C.4: Incorrect Words Used in Game

algorithm   applet   Package
Database    Debug    Main
Eclipse     file     string
first       GUI      Module
Java        JUnit    reserved
Library     micro    System
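The "correct" entries in Tables C.2 and C.3 are genuine identifiers from the standard Java API, while Table C.4 collects plausible-looking distractors. As a brief illustration, the hypothetical snippet below (not code from Code Age itself) uses several of the listed names together; note that "string" and "Main" from Table C.4 name nothing in the standard library.

    import java.text.NumberFormat;
    import java.util.Scanner;

    public class CorrectNamesDemo {
        public static void main(String[] args) {
            // Scanner, String, Integer, Math and NumberFormat are class names
            // from Table C.3; nextLine, charAt, substring, parseInt, sqrt,
            // getCurrencyInstance and format are method names from Table C.2.
            Scanner in = new Scanner(System.in);
            System.out.print("Enter a price such as $16.00: ");
            String line = in.nextLine();
            char symbol = line.charAt(0);      // the leading currency symbol
            int dollars = Integer.parseInt(line.substring(1, line.indexOf('.')));
            double root = Math.sqrt(dollars);  // e.g. 16 -> 4.0
            NumberFormat money = NumberFormat.getCurrencyInstance();
            System.out.println("Square root of the price: " + money.format(root));
        }
    }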
Appendix D: Introductory Emails

D.1 Instructor Consent Email

From: james yu <james.yu@shaw.ca>
To: Jamie McKee <jamieleemckee@gmail.com>
Date: Fri, May 2, 2014 at 5:43 PM
Subject: Recruting COSC111/121 students for your usability study

Hello Jamie,

It was great talking with you. It sounds like an interesting project for your MS thesis. I agree that you can recruit the COSC 111/121 students to participate in your usability study during the course. Please come during the 1st or 2nd lecture (May 12 or May 15) and spend 10-15 minutes to explain the details to the students. We will also offer 1% of the marks for the students as an incentive for your study.

I look forward to work with you this Summer.

James Yu, Ph.D.

(I hope this time, the mail can go through. Please let me know if you get this email)

Reply from Jamie McKee <jamieleemckee@gmail.com>, Fri, May 2, 2014 at 9:25 PM:

Hi James,

Thank you so much for sending this along.

Cheers,
Jamie

D.2 Usability Study - Student Email

INTRODUCTORY EMAIL TO STUDENTS

Hello!

My name is Jamie McKee-Scott and I am emailing you to introduce my research and myself. I work with Dr. Patricia Lasserre of the Computer Science department, and we are creating an educational game to assist students learning introductory programming. We have completed several pieces of the game, and we are looking for some feedback from students who could potentially benefit from a game like this.

We will be coming to your COSC 111 lecture on May 15th, 2014 to introduce ourselves. We will be bringing consent forms and asking you to participate in our study. The study consists of using our game and filling in questionnaires prior to using the game and after you have finished using the game. While you are using the game, the software will track information about your game use, including time spent, score achieved, etc.

If you choose to participate in our study, you may withdraw at any time. If you participate in our study by playing our game, you will receive an incentive of a bonus 0.5% of your course grade. If you further participate by completing all questionnaires, you will receive an incentive of a bonus 0.5% of your course grade.

Attached please find the consent form we will be bringing to class. If you have any questions/concerns, please feel free to contact Dr. Lasserre at patricia.lasserre@ubc.ca or 250-807-9229, or myself at jamiem@alumni.ubc.ca.

Thank you!
Jamie McKee-Scott

D.3 Heuristic Evaluation - Evaluator Email

INTRODUCTORY EMAIL TO STUDENTS

Hello!

My name is Jamie McKee-Scott and I am emailing you on behalf of Dr. Patricia Lasserre to introduce Dr. Lasserre's research project. We are creating an educational game to assist students learning introductory programming. We have completed several pieces of the game, and we are looking for some feedback from people with experience in user interface design.

You are being contacted because you have completed a course on Human Computer Interaction (COSC 341) with Dr. Lasserre and have experience with user interface design. We are hoping you will consider participating in our study by performing a heuristic evaluation of the system. This involves completing a series of tasks and scoring each task on its success on different aspects of usability. We anticipate that this questionnaire will need to be completed and returned in approximately 2-3 weeks. Upon completion of the heuristic evaluation, participants will be sent a follow-up questionnaire that rates the severity of any usability errors that were found during the heuristic evaluation.

If you choose to participate in our study, you may withdraw at any time. You will receive some compensation for your participation.
If you complete all questionnaires, you will be offered a $25 gift card to the restaurant/coffee shop of your choosing. These gift cards will be distributed to you within two weeks of completing all questionnaires. If you wish to participate in this study, please sign the attached consent form and return it to me via email by next week, at which time you will be emailed the next questionnaire and a timeline in which to complete it and return it via email.

If you have any questions/concerns, please feel free to contact myself at jamiem@alumni.ubc.ca or Dr. Lasserre at patricia.lasserre@ubc.ca or 250-807-9229.

Thank you!
Jamie McKee-Scott

Appendix E: Consent Forms

E.1 Usability Study Consent Form

THE UNIVERSITY OF BRITISH COLUMBIA (OKANAGAN)
Irving K. Barber School of Arts and Sciences
Unit 5 - Computer Science, 3333 University Way, Kelowna, BC, Canada V1V 1V7

Consent Form: A Study of an Educational Game to Learn Programming Syntax and Concepts

Principal Investigator: Dr. Patricia Lasserre, Associate Professor of Computer Science, UBC Okanagan, 250-807-9229
Co-Investigator: Jamie McKee-Scott, BSc. Major in Computer Science, UBC Okanagan, jamiem@alumni.ubc.ca

Purpose: Computer science, and particularly programming, is often challenging for students in university. This is often aggravated by the fact that very few students had an opportunity to learn programming prior to entering university. To help students with no prior experience, a prototype game platform has been developed. This game platform follows a similar model to the popular game Brain Age, and several mini-games have been developed to test the most fundamental elements of programming. The objective of the present study is to test the games and determine the most useful ones in terms of learning. It is anticipated that the results of this study will be published in a peer-reviewed professional journal. Furthermore, this study is a part of Jamie McKee-Scott's Master's Thesis, which will be available online after December 31, 2014. A summary of the results will be available from Dr. Patricia Lasserre after December 31, 2014.

Study Procedures: You are being invited to participate in this research study because you are fluent in English, registered in a first year course, and could potentially take such an introductory programming course in university.

As a volunteer participant in Phase 1, you will be asked to complete a series of mini-games as well as some questionnaires. The questionnaires will be administered prior to and after playing the mini-games. Prior to playing the game, you will be asked to complete two questionnaires: the first will include questions about your personal demographics (such as gender, age, and interests), which might impact the result of the study, and the second will include questions about your engagement as a student in an introductory computer science class. Upon completion of these questionnaires, you will be given access to the educational games for a period of 3 weeks. At the end of this 3-week period, you will be asked to complete three final questionnaires: one about the usability and aesthetics of the games, one about motivation, and one about engagement. These questionnaires will include questions on your current motivation and engagement as a student in introductory computer science.
Your feedback is important to ensure the mini-games are attractive and adequately built for students with no programming experience. It is anticipated that the two pretests will take 20 minutes to complete and the three posttests will take 90 minutes to complete, with the whole process taking no more than 120 minutes, not including the time you spend on your own playing the games.

You will receive some compensation for your participation in Phase 1. If you participate in the study by playing the game, you will be given a bonus 0.5% in the course. If you fully participate by both playing the game and completing all of the questionnaires, you will be given a bonus 1% in the course. These bonus percentages will be given upon completion of the study (Dr. Lasserre will forward the names of the participants to your instructor). If you choose to withdraw from the study, you will only be given the bonus 0.5% for playing the game.

Potential Risks: There are no known risks associated with participating in this study.

Potential Benefits: There are no direct benefits associated with participating in this study. However, it is anticipated that there will be benefits to the computer science educational community and to future computer science students with no prior programming experience. If proven a useful process, its benefit could extend to other disciplines.

Confidentiality: Your participation and all information you provide will be kept confidential. However, the information that you provide will not be anonymous. That is, the researchers will know who provided what information. Only the researchers will have access to the information that you provide. The COSC 111 instructor, Dr. James Yu, will know who has participated in the study. Dr. Lasserre will send Dr. James Yu the list of participants at the end of the study in order to award the bonus percentage.

All information will be stored in password-protected computer files. All information obtained about you will be identified only by a research number in these files. Documentation of consents will be stored separately in a locked file cabinet located in Patricia Lasserre's office.

Please be aware that we will only identify you as a participant in this study to Dr. James Yu. You will not be identified to anyone else not directly involved in the project. Furthermore, we will not connect your name with your questionnaire responses to anyone other than Dr. Patricia Lasserre. Moreover, in all publications and presentations of the research findings, no information that would allow someone to identify specific participants will be released.

Contact for information about the study: If you have any questions or desire further information with respect to this study, you may contact Dr. Patricia Lasserre (telephone: 250-807-9229; email: Patricia.Lasserre@ubc.ca) or Jamie McKee-Scott (email: jamiem@alumni.ubc.ca).

Contact for concerns about the rights of research subjects: If you have any concerns or complaints about your rights as a research participant and/or your experiences while participating in this study, contact the Research Participant Complaint Line in the UBC Office of Research Services at 1-877-822-8598 or the UBC Okanagan Research Services Office at 250-807-8832. It is also possible to contact the Research Participant Complaint Line by email (RSIL@ors.ubc.ca).

Consent: Your participation in this study is strictly voluntary.
At any time during the study, you are free to stop your participation. If you wish to stop your participation after you have submitted your questionnaires, please email Jamie McKee-Scott (jamiem@alumni.ubc.ca) or Dr. Patricia Lasserre (patricia.lasserre@ubc.ca) and indicate that you would like to withdraw from the Educational Game study. All data that pertains to you will then be destroyed.

If you agree to participate, check the box "I consent". Please provide us with a valid email address with which we may contact you to set up your participation in the study. This will indicate that you have read and understood the above information and have consented to your participation in this study. If you do not wish to participate, check the box "I do NOT consent".

Please detach and complete this page and return the slip below to Jamie McKee-Scott (ASC 304) or Patricia Lasserre (ASC 403) by Tuesday, May 20th, 2014 (4pm). Please keep the above description and explanation of the study for your records.

---------------------------------------------------------------------------
Consent Form: A Study of an Educational Game to Learn Programming Syntax and Concepts

( ) I consent.   ( ) I do NOT consent.

Name: ______________________   Date: ______________________
Email: ______________________  Signature: ______________________

E.2 Heuristic Evaluation Consent Form

THE UNIVERSITY OF BRITISH COLUMBIA (OKANAGAN)
Irving K. Barber School of Arts and Sciences
Unit 5 - Computer Science, 3333 University Way, Kelowna, BC, Canada V1V 1V7

Consent Form: A Study of an Educational Game to Learn Programming Syntax and Concepts

Principal Investigator: Dr. Patricia Lasserre, Associate Professor of Computer Science, UBC Okanagan, 250-807-9229
Co-Investigator: Jamie McKee-Scott, BSc. Major in Computer Science, UBC Okanagan, jamiem@alumni.ubc.ca

Purpose: Computer science, and particularly programming, is often challenging for students in university. This is often aggravated by the fact that very few students had an opportunity to learn programming prior to entering university. To help students with no prior experience, a prototype game platform has been developed. This game platform follows a similar model to the popular game Brain Age, and several mini-games have been developed to test the most fundamental elements of programming. The objective of the present study is to test several levels of the games and determine the most useful ones in terms of usability, engagement, and motivation. It is anticipated that the results of this study will be published in a peer-reviewed professional journal. Furthermore, this study is a part of Jamie McKee-Scott's Master's Thesis, which will be available online after December 31, 2014. A summary of the results will be available from Dr. Patricia Lasserre after December 31, 2014.

Study Procedures: You are being invited to participate in this research study because you have completed COSC 341 at UBC Okanagan and are considered an 'expert' in this area.

As an expert participant in Phase 2, you will be asked to complete an evaluation of the usability of the interface of several games, as well as some questionnaires. The questionnaires will be administered prior to, during, and following your interaction with the mini-games.
These questionnaires will include questions about your personal demographics (such as gender, age, and interests), which might impact the result of the study, as well as questions asking you to perform tasks and evaluate the usability aspects of these tasks. Your feedback is important to ensure the mini-games are attractive and adequately built for students with no programming experience. It is anticipated that it will take a maximum of 2 hours of your time. The questionnaires will be provided to you via email for you to fill out and then return via email.

You will receive some compensation for your participation in Phase 2. If you complete all questionnaires, you will be offered a $25 gift card to the restaurant/coffee shop of your choosing. These gift cards will be distributed to you within two weeks of completing all questionnaires. If you choose to withdraw from the study prior to its completion, you will still receive the gift card.

Potential Risks: There are no known risks associated with participating in this study.

Potential Benefits: There are no direct benefits associated with participating in this study. However, it is anticipated that there will be benefits to the computer science educational community and to future computer science students with no prior programming experience. If proven useful, this research could benefit other disciplines.

Confidentiality: Your participation and all information you provide will be kept confidential. However, the information that you provide will not be anonymous. That is, the researchers will know who provided what information. Only the researchers will have access to the information that you provide.

All information will be stored in password-protected computer files. All information obtained about you will be identified only by a research number in these files. Documentation of consents will be stored separately in a locked file cabinet located in Patricia Lasserre's office.

Please be aware that we will not identify you, or connect your name with your responses, to anyone not directly involved in the project. Moreover, in all publications and presentations of the research findings, no information that would allow someone to identify specific participants will be released.

Contact for information about the study: If you have any questions or desire further information with respect to this study, you may contact Dr. Patricia Lasserre (telephone: 250-807-9229; email: Patricia.Lasserre@ubc.ca) or Jamie McKee-Scott (email: jamiem@alumni.ubc.ca).

Contact for concerns about the rights of research subjects: If you have any concerns or complaints about your rights as a research participant and/or your experiences while participating in this study, contact the Research Participant Complaint Line in the UBC Office of Research Services at 1-877-822-8598 or the UBC Okanagan Research Services Office at 250-807-8832. It is also possible to contact the Research Participant Complaint Line by email (RSIL@ors.ubc.ca).

Consent: Your participation in this study is strictly voluntary. At any time during the study, you are free to stop your participation. If you wish to stop your participation after you have submitted your questionnaires, please email Jamie McKee-Scott (jamiem@alumni.ubc.ca) and indicate that you would like to withdraw from the Educational Game study. All data that pertains to you will then be destroyed.
If you agree to participate, check the box "I consent". Please provide us with a valid email address with which we may contact you to set up your participation in the study. This will indicate that you have read and understood the above information and have consented to your participation in this study. If you do not wish to participate, check the box "I do NOT consent".

Please return this consent form via email to jamiem@alumni.ubc.ca within 7-10 business days. Please keep the above description and explanation of the study for your records.

---------------------------------------------------------------------------
Consent Form: A Study of an Educational Game to Learn Programming Syntax and Concepts

( ) I consent.   ( ) I do NOT consent.

Name: ______________________   Date: ______________________
Email: ______________________  Signature: ______________________

Appendix F: Questionnaires

F.1 Usability Study

F.1.1 Demographics and Background

Please answer the following questions regarding your personal demographics and background in the space provided.

1. Sex: Male / Female
2. Age: 17-19 / 20-24 / 25-29 / 30 or older
3. Ethnicity: Caucasian / Aboriginal / Asian / Other (please specify)
4. Degree and Major (if known): ______
5. Select among the following areas/disciplines the two most attractive to you: Writing; Mathematics; Experimental sciences (such as physics, chemistry or biology); Social sciences and humanities; Medical field (nursing, medicine); Fine arts; Business/Management; Other.
6. Rate your skill level in each of the following disciplines (1 = poor and 10 = excellent):
   a. English skills
   b. Math skills
7. Please indicate the extent to which you agree or disagree with the statements listed below using the following 7-point scale, where 1 = strongly disagree and 7 = strongly agree:
   a. I like memory games.
   b. Given the choice, I prefer playing games where I have time to reflect rather than games that challenge my reflexes.
8. On average, how many hours a week do you spend gaming? 0 / less than an hour / 1-2 hours / 3-4 hours / 5 or more hours
9. Please list the video gaming devices or consoles that you own, if any.
10. Please list the types of video games you enjoy playing, if any.
11. Do you have access to a computer at home? Yes / No
12. How often do you use a computer? Once a month or less / Once a week / 2 to 3 times a week / Daily
13. What do you use a computer for? (Check all that apply): Internet; Writing essays and reports (e.g. school work); Doing computations (e.g. school work); Graphics design; Web design; Programming; Other.

If web design/programming is selected:

14. Which software do you use for web design?
15. What technologies/languages are you aware of: HTML5, CSS, JavaScript, etc.?
16. What technologies/languages have you worked with?
17. Please rate your programming ability on the following scale from 0 (I cannot program) to 4 (I am a competent programmer).
18. Have you ever taken a computer science course(s) before? If yes, please list specifically the course(s) (both high school and post-secondary, if applicable).
19. Can you give me the name of some software you have used in the past three months?
20. Do you know what a programming language is for? Can you give me the names of some programming languages?
21. Have you ever programmed? If yes, could you give me examples of your work?

Please provide a nickname by which we can refer to you during this study: ______

F.1.2 Engagement Pre-test

Engagement Survey

In usual courses you are registered in, about how often have you done each of the following? (Very Often / Often / Sometimes / Never)

1. Asked questions or contributed to class discussions in other ways
2. Prepared several solutions for a lab or assignment before turning it in
3. Had the professor or teaching assistant look over your solution before turning it in
4. Asked another student to help you understand course material
5. Prepared for an exam by discussing or working through course material with other students
6. Worked with other students on course projects or assignments
7. Given a course presentation
8. Combined ideas from different courses when completing assignments
9. Connected your learning to societal problems or issues
10. Examined the strengths and weaknesses of your own views on a topic or issue
11. Learned something that changed the way you understand a concept or an issue
12. Connected ideas from your courses to your prior experiences and knowledge
13. Talked about career plans with a faculty member
14. Considered a career in computer science
15. Worked with a faculty member on activities other than coursework
16. Discussed course topics, ideas, or concepts with a faculty member outside of class
17. Discussed your academic performance with a faculty member
18. Reached conclusions based on your own analysis of programming and algorithms
19. Used programming to examine a real-world problem or issue
20. Evaluated what others have concluded from programming and algorithms
21. Identified key information from reading assignments
22. Reviewed your notes after class
23. Summarized what you learned in class or from course materials

In usual courses you are registered in, how often have each of the following items challenged you to do your best work? (1 = not at all, 7 = very much): The course; The labs; The game.

In usual courses you are registered in, about how many hours do you spend in a typical week doing each of the following? (0 / 1-5 / 6-10 / 11-15 / 16-20 / 21-25 / 25-30 / 30+): Preparing for class (studying, reading, assignments); Practicing programming (not including time spent in lab and on the lab); Programming for the lab assignments; Playing the game; Playing other video games.
F.1.3 Engagement Post-test

Engagement Survey

The post-test asks the same 23 items as the pre-test (F.1.2), verbatim, under the stem "In this summer semester of Computer Science 111, about how often have you done each of the following?" (Very Often / Often / Sometimes / Never), with one additional item:

24. Identified key information from the game

During this summer semester of Computer Science 111, how often have each of the following items challenged you to do your best work? (1 = not at all, 7 = very much): The course; The labs; The game.

During this summer semester of Computer Science 111, about how many hours do you spend in a typical week doing each of the following? (0 / 1-5 / 6-10 / 11-15 / 16-20 / 21-25 / 25-30 / 30+): Preparing for class (studying, reading, assignments); Practicing programming (not including time spent in lab and on the lab); Programming for the lab assignments; Playing the game; Playing other video games.

Of the time you spend preparing for class in a typical 7-day week, about how much time is spent on playing the game? (Very little / Some / About half / Most / Almost all)

F.1.4 Intrinsic Motivation Inventory

Task Evaluation Questionnaire

For each of the following statements, please indicate how true it is for you, using the scale provided, where 1 is not at all true and 7 is very true. Please answer all items.
1. While I was playing the game I was thinking about how much I enjoyed it
2. I did not feel at all nervous about playing the game
3. I felt that it was my choice to play the game
4. I think I am pretty good at the game
5. I found the game very interesting
6. I felt tense while playing the game
7. I think I did pretty well at playing the game, compared to other students
8. Playing the game was fun
9. I felt relaxed while playing the game
10. I enjoyed playing the game very much
11. I didn't really have a choice about playing the game
12. I am satisfied with my performance at playing the game
13. I was anxious while playing the game
14. I thought playing the game was very boring
15. I felt like I was doing what I wanted to do while I was playing the game
16. I felt pretty skilled at playing the game
17. I thought playing the game was very interesting
18. I felt pressured while playing the game
19. I felt like I had to play the game
20. I would describe the game as very enjoyable
21. I played the game because I had no choice
22. After playing the game for a while, I felt pretty competent
23. I believe that playing the game could be of some value for me
24. I believe I had some choice about playing the game
25. I believe that playing this game is useful for improved concentration
26. I think playing the game is important for my improvement
27. I played the game because I wanted to
28. I think the game is an important activity
29. I felt like I was enjoying the game while I was playing it
30. I thought the game was a very boring activity
31. It is possible that the game could improve my studying habits
32. I felt like I had no choice but to play the game
33. I thought the game was a very interesting activity
34. I am willing to play the game again because I think it is somewhat useful
35. I believe that playing the game could be somewhat beneficial to me
36. I believe playing the game could help me do better in school
37. While playing the game I felt like I had a choice
38. I would describe playing the game as very fun
39. I felt like it was not my own choice to play the game
40. I would be willing to play the game again because it has some value to me
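IMI items group into subscales such as interest/enjoyment, perceived choice, and perceived competence, and negatively worded items (for example, item 14 above) are conventionally reverse-scored as 8 minus the response on the 7-point scale. The sketch below shows one way a subscale mean could be computed; the item grouping is a plausible reading of the item wording, not the study's actual mapping, and the responses are fabricated.

    import java.util.Arrays;
    import java.util.Set;

    public class ImiSubscaleSketch {
        // Mean of one subscale; reverse-worded items score as 8 - response.
        static double subscaleMean(int[] responses, int[] items, Set<Integer> reversed) {
            double sum = 0;
            for (int item : items) {
                int raw = responses[item - 1];              // items are 1-indexed
                sum += reversed.contains(item) ? 8 - raw : raw;
            }
            return sum / items.length;
        }

        public static void main(String[] args) {
            int[] responses = new int[40];
            Arrays.fill(responses, 5);   // fabricated answers, all 5 out of 7
            responses[13] = 2;           // item 14 ("very boring"): low agreement
            // Items 1, 5, 8, 10 and 14 all concern enjoyment or interest,
            // with item 14 negatively worded (hypothetical grouping).
            int[] interestItems = {1, 5, 8, 10, 14};
            System.out.printf("interest/enjoyment = %.2f%n",
                    subscaleMean(responses, interestItems, Set.of(14)));
        }
    }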
F.1.5 Usability

Please answer the following questions regarding the interface of the game.

Part 1. General System Questions

Now that you have played the games and used the system, please indicate the extent to which you agree or disagree with the statements listed below using the following 7-point scale, where 1 = strongly disagree and 7 = strongly agree.

1. Overall, I am satisfied with how easy it is to use the system
2. It was simple to use this system
3. I feel comfortable using this system
4. I feel like the games reinforced programming concepts
5. It was easy to learn to use the game
6. The instructions provided for the games are effective in helping me complete the games
7. The organization of information on the different screens is clear
8. The interface of the game is pleasant
9. I like using the interface of the game
10. The game has all the functions and capabilities I expect it to have
11. The system was intuitive to use
12. Overall, I am satisfied with the game system
13. I like the game mechanic of travelling through an island filled with caves, where each cave is a game to complete
14. I like the game mechanic of travelling through an island filled with temples, where each temple is a game to complete
15. I like being able to see other players' high scores

Open-ended questions:
- If the game does not have all the functions and capabilities you expect it to have, please list the ones you would expect to see here.
- Do you like the name "Code Age" for this game? If not, what would you call this game?
- Do you have a mobile device? If so, please list which one(s).
- Did you try to use the game on the mobile device? Would you have liked to have been able to use the game on your mobile device?
- Why did you spend as much (or as little) time as you did playing the games?

Part 2. Aesthetics of the General System

(Same 7-point scale, where 1 = strongly disagree and 7 = strongly agree.)

1. I like the look and feel of the login screen
2. I like the look of the world map
3. I like the look of the user statistics page
4. I like the look of the cave/caveman island
5. I like the look of the temple island
6. I like the look of the navigation bar
7. I like the placement of the logo on the world map
8. I like the color theme of the game
9. I like the art style of the game
10. I like buttons with rounded corners more than buttons with square corners
11. I think the corners of the buttons are too rounded
12. I liked the red/green highlighting of which islands are locked/unlocked
13. I thought the load time between screens was too long
14. I liked the hover effect that indicated the next cave

Open-ended questions:
- Is there anything you would change about the look of the world map? If yes, what?
- Is there anything you would change about the look of the system? If yes, what?

Part 3. Island Specific Questions

A. Island 1 - The Cave Island

Please indicate the extent to which you agree or disagree with the statements listed below using the following 7-point scale, where 1 = strongly disagree and 7 = strongly agree.

1. I like the way the path between the caves is highlighted
2. I like the hover animation that indicates which cave you are on
3. I liked the transition animation between the world map and this island
4. I like the transition animation between this island and its caves
5. I think there is just the right amount of games/caves on this island

Open-ended questions:
- How would you like to see your high score/progress displayed at each cave?
  (e.g., change cave color, or change a light/glow inside the cave depending on what score you achieved)
- What did you like about this island?
- What did you dislike about this island?
- Is there anything you would change about this island? If yes, what?

A. The Cave Island - The first cave game

The first cave you visit on this island is a select-the-correct-button game, and every few seconds an incorrect button is removed from the screen. (Same 7-point agreement scale.)

1. The buttons were removed too quickly
2. I liked that the incorrect buttons were removed
3. I did not find it overwhelming to have all of the instructions for the game presented to me on one screen at one time
4. The amount of time given to complete this game was sufficient
5. I liked the animation of the buttons that were removed
6. It was helpful to have incorrect buttons removed from the screen
7. I thought this game was fun

Open-ended questions:
- What would you do to make this game more attractive and engaging?
- Do you like the use of a progress bar as a timer? Why or why not?
- Do you like the color of the progress bar?
- What colors would you like to see on a timer or progress bar?
- Is there anything you would change about this game? If yes, what?

B. The Cave Island - The second cave game

The second cave you visit on this island is a select-the-correct-button game, and no buttons are removed from the screen. (Same 7-point agreement scale.)

1. I did not find it overwhelming to have all of the instructions for the game presented to me on one screen at one time
2. The amount of time given to complete this game was sufficient
3. I thought this game was fun

Open-ended questions:
- What would you do to make this game more attractive and engaging?
- Is there anything you would change about this game? If yes, what?

C. The Cave Island - The third cave game

The third cave you visit on this island is a select-the-correct-button game, and the buttons move around on the screen. (Same 7-point agreement scale.)

1. The buttons moved too quickly
2. I liked that the buttons moved
3. I did not find it overwhelming to have all of the instructions for the game presented to me on one screen at one time
4. The amount of time given to complete this game was sufficient
5. The buttons moved smoothly
6. The buttons were easy to click
7. The moving buttons made the game more challenging
8. I thought this game was fun

Open-ended questions:
- What would you do to make this game more attractive and engaging?
- Is there anything you would change about this game? If yes, what?

D. The Cave Island - The fourth cave game

The fourth cave you visit on this island is a select-the-correct-piece-of-code game. (Same 7-point agreement scale.)

1. I did not find it overwhelming to have all of the instructions for the game presented to me on one screen at one time
2. The amount of time given to complete this game was sufficient
3. I thought this game was useful
4. I thought this game was fun

Open-ended questions:
- What would you do to make this game more attractive and engaging?
- Is there anything you would change about this game? If yes, what?

E. The Cave Island - The fifth cave game

The final cave you visit on this island is a point-and-shoot, parameter-passing game. (Same 7-point agreement scale.)

1. The baskets moved too quickly
2. The cannon movement was smooth and easy to use
3. It was too hard to shoot the cannon balls into the baskets
4. The amount of time given to complete this game was sufficient
5. The cannon balls moved smoothly
6. I did not find it overwhelming to have all of the instructions for the game presented to me on one screen at one time
7. I found the game challenging
8. This game was easy to use
9. I thought this game was useful
10. I thought this game was fun

Open-ended questions:
- What would you do to make this game more attractive and engaging?
- Is there anything you would change about this game? If yes, what?
- Do you think this game would be fun if it were timed?

Part 4. Final Usability Questions

Select the features that would motivate you to spend time to learn with this game. Check all that apply.
- Experience points as you progress
- A stamp or reward when you have been successful
- A world in which a character moves to play different games
- An avatar that follows your progress
  If so, what type of character would you find most attractive:
  - A typical professor
  - A typical computer geek
  - A typical student (like someone who would have done the course)
  - Someone else: ______
- Other suggestions: ______
We would like to incorporate a storyline or storyboard into this game. Please read the following four plots:

Plot 1: World War 3 has occurred between humans and robots. In order to defeat the robots, we had to destroy all technology and knowledge of programming, causing humans to revert to cave life. Your quest is to travel the world gathering and recreating this knowledge, and each island you visit will become more and more advanced as you progress.

Plot 2: Humans have created an artificial intelligence that has taken over the world. In order to fight back, programming was destroyed so the AI could no longer recode itself. Now, to defeat the AI, you must secretly relearn programming in order to write the code to finally destroy the AI.

Plot 3: The same plot as Plot 2, but fighting cyber-terrorists instead of an AI.

Plot 4: Aliens believe humanity has advanced too much and is quickly heading down a path of self-destruction. To preserve humanity and Earth, the aliens travel backwards and forwards in time, destroying programming and technology as they go. This rampant destruction sets off a chain of events that changes modern life as we know it. You have followed the aliens back, and are following them through time, saving programming as you move forward through time.

- Which of the above four plot lines do you like best and why?
- Do you have any plot suggestions of your own? If yes, what is it that appeals to you about your plot line?
- Describe how it felt playing the games without a developed plot line to follow.
- Do you think that the idea to use game(s) to help learning is useful? Why or why not?
- Do you think that using games to learn programming is intimidating or useful? Why?
- Do you enjoy the idea of small games to complement your learning, or would you prefer a longer game?
- Is there anything else you would like to tell us?

Thank you for taking the time to participate in this study!

F.2 Heuristic Evaluation

F.2.1 Demographics and Background

Please answer the following questions regarding your personal demographics and background in the space provided.

1. Sex: Male / Female
2. Age: 17-19 / 20-24 / 25-29 / 30 or older
3. Please indicate the extent to which you agree or disagree with the statements listed below using the following 7-point scale, where 1 = strongly disagree and 7 = strongly agree:
   a.
      I like memory games.
   b. Given the choice, I prefer playing games where I have time to reflect rather than games that challenge my reflexes.
4. On average, how many hours a week do you spend gaming? 0 / less than an hour / 1-2 hours / 3-4 hours / 5 or more hours
5. Please list the video gaming devices or consoles that you own, if any.
6. Please list the types of video games you enjoy playing, if any.
7. Do you have access to a computer at home? Yes / No
8. How often do you use a computer? Once a month or less / Once a week / 2 to 3 times a week / Daily
9. What do you use a computer for? (Check all that apply): Internet; Writing essays and reports (e.g. school work); Doing computations (e.g. school work); Graphics design; Web design; Programming; Other.

If web design/programming is selected:

10. Which software do you use for web design?
11. What technologies/languages have you worked with?

Please provide a nickname by which we can refer to you in the study: ______

F.2.2 Heuristic Evaluation

Please answer the following questions regarding the gaming system interface in front of you. This is not to test your skills, but to determine if there are any usability flaws in the interface. Please answer honestly, as there are no right or wrong answers. Please rate each of the tasks below on the scale provided. After rating each item, if an item scores less than a 5, please briefly describe why the item has received that score. This description can be written in the section below the scale labeled "Discuss".

Thank you for participating in the heuristic evaluation. You will be contacted shortly and asked to fill in a follow-up questionnaire regarding all of the usability errors (if any) that were found during the heuristic evaluation process. If you have any questions about the study, the heuristic evaluation, or questions in general, please feel free to contact Jamie McKee-Scott at jamiem@alumni.ubc.ca at any time.

Task 1 - The log in process

Please go to s72.ok.ubc.ca/games and log in using the RID/password combination provided to you. Then stop and answer the following questions. Please rate the level of success the log in process achieves on each of the following principles, using the following 7-point scale, where 1 = the login process fails entirely and 7 = the login process completely succeeds.

1. Visibility of system status: the user is informed about what is going on through appropriate feedback within reasonable time
2. Match between system and the real world: the system speaks the user's language, with words, phrases, conventions and concepts familiar to the user
3. User control and freedom: there is a clearly marked exit that would allow the user to leave without going through an extended dialogue (ex.
exit, back, undo) 1 2 3   4 5 6 7 Consistency and standards: the user does not have to wonder whether different words, situations, or actions mean the same thing 1 2 3 4 5 6 7 Error prevention: there are no error-prone conditions, or if there are the user is aware of them and provided a way to deal with them 1 2 3 4 5 6 7 Recognition over recall: the user does not have to remember info from one part of the system to another, i.e., objects, actions and options are always visible 1 2 3 4 5 6 7 Flexibility and efficiency of use: there are options that allow the advanced user to operate the system more efficiently 1 2 3 4 5 6 7 Aesthetic and minimalist design: the user is not presented with irrelevant or unneeded information 1 2 3 4 5 6 7 Errors: users are helped to recognize, diagnose and recover from errors which are expressed in plain language; they explain the problem and offer a solution 1 2 3 4 5 6 7 Help and documentation: any documentation or help offered to the user is brief, easy to search, focused on the user’s task, and lists steps or solutions 1 2 3 4 5 6 7 144  Protocol H14-01105: Heuristic Evaluation (March 5, 2014), Page 3 of 21 RID:     !Discuss any ratings less than or equal to 5: 145  Protocol H14-01105: Heuristic Evaluation (March 5, 2014), Page 4 of 21 RID:     !Task 2 – The home screen/world map Please take some time to navigate around the home screen/world map. Now please rate the level of success the home screen page achieves on each of the following principles, using the following 7 point scale, where 1 = the page fails entirely and 7= the page completely succeeds.     Fails Completely    Succeeds Completely Visibility of system status: the user is informed about what is going on through appropriate feedback within reasonable time 1 2 3 4 5 6 7 Match between system and the real world: the system speaks the user’s language with words, phrases, conventions and concepts familiar to the user 1 2 3 4 5 6 7 User control and freedom: there is a clearly marked exit that would allow the user to leave without going through an extended dialogue (ex. exit, back, undo) 1 2 3   4 5 6 7 Consistency and standards: the user does not have to wonder whether different words, situations, or actions mean the same thing 1 2 3 4 5 6 7 Error prevention: there are no error-prone conditions, or if there are the user is aware of them and provided a way to deal with them 1 2 3 4 5 6 7 Recognition over recall: the user does not have to remember info from one part of the system to another, i.e., objects, actions and options are always visible 1 2 3 4 5 6 7 Flexibility and efficiency of use: there are options that allow the advanced user to operate the system more efficiently 1 2 3 4 5 6 7 Aesthetic and minimalist design: the user is not presented with irrelevant or unneeded information 1 2 3 4 5 6 7 Errors: users are helped to recognize, diagnose and recover from errors which are expressed in plain language; they explain the problem and offer a solution 1 2 3 4 5 6 7 Help and documentation: any documentation or help offered to the user is brief, easy to search, focused on the user’s task, and lists steps or solutions 1 2 3 4 5 6 7 146  Protocol H14-01105: Heuristic Evaluation (March 5, 2014), Page 5 of 21 RID:     !Discuss any ratings less than or equal to 5:  147  Protocol H14-01105: Heuristic Evaluation (March 5, 2014), Page 6 of 21 RID:     !Task 3 – The first island Please click on the (only) clickable island, i.e., the one highlighted in green. 
Now please explore this island, and then rate the level of success the island page achieves on each of the ten principles listed under Task 1, using the same 7-point scale, where 1 = the page fails completely and 7 = the page succeeds completely.

Discuss any ratings less than or equal to 5:

Task 4 – The first cave/game
Please click on the first clickable cave, i.e., the one pulsing up and down. Now please play this game, and then rate the level of success this game achieves on each of the ten principles listed under Task 1, using the same 7-point scale.
Discuss any ratings less than or equal to 5:

Task 5 – The second cave/game
Please click on the next clickable cave, i.e., the one at the end of the path, pulsing up and down. Now please play this game, and then rate the level of success this game achieves on each of the ten principles listed under Task 1, using the same 7-point scale.
Discuss any ratings less than or equal to 5:

Task 6 – The third cave/game
Please click on the next clickable cave, i.e., the one at the end of the path, pulsing up and down. Now please play this game, and then rate the level of success this game achieves on each of the ten principles listed under Task 1, using the same 7-point scale.
Discuss any ratings less than or equal to 5:

Task 7 – The fourth cave/game
Please click on the next clickable cave, i.e., the one at the end of the path, pulsing up and down. Now please play this game, and then rate the level of success this game achieves on each of the ten principles listed under Task 1, using the same 7-point scale.

Discuss any ratings less than or equal to 5:

Task 8 – The user information page
Please click on your research id number in the top right corner. Now take some time to navigate around this user information page, and then rate the level of success the user information page achieves on each of the ten principles listed under Task 1, using the same 7-point scale.
Discuss any ratings less than or equal to 5:

Task 9 – The navigation bar
Please take some time exploring the navigation bar, but do not click on the log out option. Now please rate the level of success the navigation bar achieves on each of the ten principles listed under Task 1, using the same 7-point scale.

Discuss any ratings less than or equal to 5:

Task 10 – The log out process
Please click on the log out option in the top right corner.
Now please rate the level of success the log out process achieves on each of the ten principles listed under Task 1, using the same 7-point scale.

Discuss any ratings less than or equal to 5:

F.2.3 Heuristic Follow Up

Part A. Error Severity
Please answer the following questions regarding the usability errors that were found in the gaming system interface you evaluated in the last session. Please answer honestly, as there are no right or wrong answers. Please rate the severity of each of the usability errors listed below on the scale provided.
Each error is rated on the following five-point severity scale: 0 = I do not agree that this is a usability problem at all; 1 = Cosmetic problem only: need not be fixed unless extra time is available; 2 = Minor usability problem: fixing this should be given low priority; 3 = Major usability problem: important to fix, should be given high priority; 4 = Usability catastrophe: imperative to fix this before the product is used.

Error 1 – The log in process: when the log in process fails, there is no error message. 0 1 2 3 4
Error 2 – The home screen: no indication of what to do or where to go when you reach the main island page. 0 1 2 3 4
Error 3 – Greek (second) island: no obvious way to return to the home screen. 0 1 2 3 4
Error 4 – Cave island: it was difficult to tell which cave to go to in order to play the mini-games. 0 1 2 3 4
Error 5 – Cave island game: no available exit from the games. 0 1 2 3 4
Error 6 – Cave island game: when you exit the game, the page has scrolled back up to the top of the island. 0 1 2 3 4
Error 7 – Cave island game: there are no in-game instructions in case you forget what you are supposed to be doing. 0 1 2 3 4
Error 8 – Cave island game: in the third level of the first game, the moving buttons were hard to click. 0 1 2 3 4
Part B. General Questions
1. Please list the things you disliked about the interface.
2. Please list the things you liked about the interface.
3. Did you find the overall look and feel of the interface to be appealing and pleasing? If yes, what made it appealing? If not, what would you change about it?
4. We would like to incorporate a storyline or storyboard into this game. Please read the following plot scenarios:
Plot 1: World War 3 has occurred between humans and robots. In order to defeat the robots, we had to destroy all technology and knowledge of programming, causing humans to revert to cave life. Your quest is to travel the world gathering and recreating this knowledge, and each island you visit will become more and more advanced as you progress.
Plot 2: Humans have created an artificial intelligence that has taken over the world. In order to fight back, programming was destroyed so the AI could no longer recode itself. Now, to defeat the AI, you must secretly relearn programming in order to write the code that will finally destroy it.
Plot 3: The same plot as Plot 2, but fighting cyber-terrorists instead of an AI.
Plot 4: Aliens believe humanity has advanced too much and is quickly heading down a path of self-destruction. To preserve humanity and Earth, the aliens travel backwards and forwards in time, destroying programming and technology as they go. This rampant destruction sets off a chain of events that changes modern life as we know it. You have followed the aliens back and are pursuing them through time, saving programming as you go.
Which of the above four plots do you like best, and why?
5. Do you have any plot suggestions of your own? If yes, what is it that appeals to you about your plot line?
6. Thank you for participating in this study. What restaurant/store would you like your $25.00 gift card to come from? How may we get this gift card to you?
