UBC Theses and Dissertations
Social network analysis as a progressive tool for learning analytics: A sample of teacher education students' course interactions with Threadz on the Canvas platform. Froehlich, Fabian, 2020.



Full Text

SOCIAL NETWORK ANALYSIS AS A PROGRESSIVE TOOL FOR LEARNING ANALYTICS: A SAMPLE OF TEACHER EDUCATION STUDENTS' COURSE INTERACTIONS WITH THREADZ ON THE CANVAS PLATFORM

by

Fabian Froehlich

B.A., Macromedia University of Applied Science, Cologne, Germany, 2015

A THESIS SUBMITTED IN PARTIAL FULFILLMENT OF THE REQUIREMENTS FOR THE DEGREE OF MASTER OF ARTS in THE FACULTY OF GRADUATE AND POSTDOCTORAL STUDIES (Media and Technology Studies Education)

THE UNIVERSITY OF BRITISH COLUMBIA (Vancouver)

June 2020

© Fabian Froehlich, 2020

The following individuals certify that they have read, and recommend to the Faculty of Graduate and Postdoctoral Studies for acceptance, the thesis entitled "Social network analysis as a progressive tool for learning analytics: A sample of teacher education students' course interactions with Threadz on the Canvas platform", submitted by Fabian Froehlich in partial fulfillment of the requirements for the degree of Master of Arts in Media and Technology Studies Education.

Examining Committee:

Jillianne Code, Assistant Professor, Department of Curriculum and Pedagogy (Supervisor)
Sandrine Han, Professor, Department of Curriculum and Pedagogy (Supervisory Committee Member)
Leah Macfadyen, Instructor, Department of Language and Literacy Education (Supervisory Committee Member)
Stephen Petrina, Professor, Department of Curriculum and Pedagogy (Additional Examiner)

Abstract

This thesis investigates social network analysis as a progressive tool for learning analytics through the lens of curriculum theory. The theoretical framework is based on the Tyler rationale and Dewey's concept of progressivism. A quasi-experiment was conducted relying on an embedded mixed-method research design. Teacher education students (n=18) participated in three online discussions. Two online discussions allowed students to access social network analysis visualizations through Threadz, a Canvas plugin. The overall inquiry focuses on how this exposure of learning analytics data influences the students and how the application of learning analytics might be related to educational ideologies. The methods of investigation rely on social network analysis, inferential statistics and content analysis. The conclusion states that exposing learning analytics data to students might have an impact on their behaviour in online discussions.

Lay Summary

This thesis investigates how information on online posts in a discussion forum might have the potential to influence students and their engagement behaviour. During their regular course, students had to participate in online discussions. They could access a tool which illustrated whom they replied to and who reacted to their online discussion posts. The research takes the students' perceptions into account, and how the data might have influenced their discussion engagement. The results show that students react differently to the online post discussion data. Some acknowledged a benefit, while others declined to look at the data at all.

Preface

This research project was originally conceptualized by the author, Fabian Froehlich. The author is also solely responsible for writing this thesis, under the guidance of the supervisor and the oversight of the committee. Ethics approval for this research was provided by the University of British Columbia Behavioural Research Ethics Board: Certificate H19-02417.
Table of Contents

Abstract
Lay Summary
Preface
Table of Contents
List of Tables
List of Figures
Acknowledgements
Chapter 1: Introduction
    1.1 Introduction
    1.2 Statement of the problem
    1.3 Research questions
    1.4 Purpose of the study
    1.5 Positionality
    1.6 Key terms
    1.7 Thesis overview and structure
Chapter 2: Literature Review
    2.1 Theoretical framework
    2.2 Learning analytics and educational data mining
    2.3 Social network analysis
        2.3.1 Socio-centric measures
        2.3.2 Ego-centric measures
Chapter 3: Research Methodology
    3.1 Research methodology
    3.2 Restatement of the research questions
    3.3 Research design
    3.4 Procedures
    3.5 Ethics
Chapter 4: Methods
    4.1 Threadz
    4.2 Social network analysis workflow
    4.3 Threadz-access
    4.4 Survey
    4.5 Merging sources
    4.6 Summary of data sources
Chapter 5: Data Analysis and Findings
    5.1 Population and sample
    5.2 Socio-centric SNA comparison of the three online discussions
        5.2.1 Comparing OD1 and OD2
        5.2.2 Comparing OD1 and OD3
        5.2.3 SNA visualization
    5.3 T-test of ego-centric SNA measures
    5.4 Threadz-access counts
        5.4.1 Threadz-access correlating with ego-centric SNA measures
    5.5 ANOVA – Threadz-access
    5.6 Summary of SNA, t-tests, correlations and ANOVA
    5.7 The outlier
    5.8 Gamifying learning analytics
    5.9 Survey item correlation
    5.10 Content analysis of open-ended survey items
Chapter 6: Discussion
    6.1 Research question 1
    6.2 Research question 2
    6.3 Research question 3
    6.4 Limitations & future directions
    6.5 Conclusion
Bibliography
Appendices
    Appendix A Survey
    Appendix B Course syllabus EDUC

List of Tables

Table 3.3.1 Embedded mixed method research design characteristics
Table 5.1.1 Descriptive statistics of survey item Q7, in which students indicate their own behaviour on a scale from 0-100
Table 5.2.1 Socio-centric network metrics of the three online discussions
Table 5.3.1 Paired sample t-test comparing ego-centric SNA measures of OD1 and OD2
Table 5.3.2 Paired sample t-test comparing ego-centric SNA measures of OD1 and OD3
Table 5.4.1 Descriptive statistics of Threadz-access of OD3
Table 5.4.2 Correlating Threadz-access and ego-centric SNA measures of OD3
Table 5.5.1 ANOVA of ego-centric SNA measures of OD3, with students separated into 3 user groups (no Threadz-access, moderate, heavy)
Table 5.7.1 Descriptive statistics of Threadz-access in total, for OD2 and OD3
Table 5.9.1 Correlating survey items with Threadz-access and word count

List of Figures

Figure 3.3.1 Diagram of an embedded mixed method research design
Figure 3.4.1 Timeline of the intervention
Figure 4.1.1 Screenshot Threadz – Network sociogram
Figure 4.1.2 Screenshot Threadz – Chord diagram
Figure 4.1.3 Screenshot Threadz – Timeline
Figure 4.1.4 Screenshot Threadz – Data set table
Figure 4.5.1 Abstract display of the CSV file structure
Figure 4.6.1 Display of data sources and analytical methods
Figure 5.2.1 SNA visualization of OD1
Figure 5.2.2 SNA visualization of OD2
Figure 5.2.3 SNA visualization of OD3
Figure 5.3.1 Plot showing the increase and variability of the Degree measure of OD1 and OD3
Figure 5.7.1 Threadz access counts for OD2 (black) and OD3 (blue)
Figure 5.8.1 Student's post in OD3
Figure 5.8.2 Student's post in OD3
Figure 5.8.3 Student's post in OD3
Figure 6.5.1 The Epistemology – Assessment – Pedagogy triad

Acknowledgements

I offer my enduring gratitude to the faculty, staff and my fellow students at UBC, who have inspired me to start my work in educational research. I owe particular thanks to Dr. Jillianne Code, who supported me in every aspect of my life as an international student. I feel honored to inquire into the mysteries of education under her supervision. I thank Dr. Sandrine Han for introducing me to the world of educational research and accompanying me from the beginning of my academic journey. I also thank Dr. Leah Macfadyen for challenging my perspective on learning analytics and guiding me through the deep waters of quantitative research. The whole Learning Analytics project team and CTLT contributed to my understanding of learning analytics and to my personal growth. Thanks for that. I would also like to give huge thanks to Alan Jay, who helped me on numerous occasions with questions I had throughout the program. Special thanks are owed to my parents and brothers, who have supported me throughout my whole life.

Chapter 1: Introduction

1.1 Introduction

Over the past decade, learning analytics (LA) has emerged as one of a number of rising trends in higher education (Booth, 2012; Johnson, Adams, & Cummins, 2012). This thesis focuses on LA and an applied example of social network analysis (SNA) in an educational context. The main inquiry focuses on what LA can do for students and how exposing SNA data might affect students' engagement behaviour in online discussions. Following a student-centered approach, this thesis contributes to the fast-evolving field of educational data mining (EDM) and LA. While some scholars consider LA to be harmful (Dringus, 2011), this study argues that it might be the key to progressive learning, supporting constructivism and connectivism as perspectives on learning (Siemens, 2008). LA might also be able to bridge the gap between a scientific-management approach (Tyler, 2004) and a progressive teaching philosophy (Dewey, 2004). Acknowledging the challenges applied LA has to face, such as "location and interpretation of data, informed consent, privacy, and deidentification of data; and [the] classification and management of data" (Slade & Prinsloo, 2013, p. 1511), this study focuses on a different inquiry. Looking at LA through the lens of curriculum theory, this thesis supports the idea that progressive education and the field of LA are beneficial for each other. Progressive education acknowledges the significance of social interactions during the process of learning (Vygotsky, 1978). Emphasizing the role of community and experiences (Dewey, 1938) in the process of learning and meaning making contrasts with the concept of traditional education, which focuses on teacher-centered lectures (Edwards, 2011).

LA interrelated with progressive education might have the potential to impact education, especially higher education, through personalised and adaptive learning. The concept of personalised and adaptive learning (Xie, Chu, Hwang, & Wang, 2019) is based on the theory of self-regulated learning (Schunk & Zimmerman, 2012).
The main idea of those approaches is to enable learners to manage and optimize their own learning behaviour, and, in the first place, to become aware of it. LA can assist students in this endeavour by illustrating data to reflect on in order to optimize their learning experience (Tabuenca, Kalz, Drachsler, & Specht, 2015). LA is a field of inquiry which measures, collects and analyzes data in an educational context, aiming to optimize the process of learning (SOLAR, 2011). SNA is an analytical method which was initially developed by anthropologists and is therefore grounded in the social sciences (Scott & Carrington, 2011). It investigates the structures and dynamics of networks and the relations between entities. LA adapts this method, employing it in an educational context.

This study relies on a quasi-experimental embedded mixed-methods design and focuses on an undergraduate blended course in teacher education which uses online discussions. In two out of three online discussions Threadz (https://threadz.ewu.edu/) [Computer Software] (University of Eastern Washington, 2014), an open source SNA tool, was enabled. It can be integrated into the learning management system (LMS) Canvas (https://www.instructure.com/canvas/).

1.2 Statement of the problem

Evidence from previously conducted research "strongly suggests that SNA, particularly when combined with content analysis, can provide detailed understanding of the nature and type of interactions between members of the network, allowing for optimisation of course design, composition of learner groups and identification of learners in danger of dropping out" (Cela, Sicilia, & Sanchez, 2014, p. 219). LA, and SNA results in particular, are able to summarize students' engagement for educators, policy makers or course designers. Shifting the perspective and following a student-centered approach, this study investigates what happens if this kind of data and visualization is available to students. This consideration might have an impact on pedagogy and teaching practices. "The potential for learning technologies to enhance teaching and learning may be poorly understood and incongruent with individual perceptions and beliefs surrounding good teaching practice." (Macfadyen & Dawson, 2012, p. 160). This thesis examines whether exposing LA data, in the form of SNA, to students contributes to progressive education. The following research questions frame this work:

1.3 Research questions

RQ1. Does calling students' attention to particular learning analytics data change their way of learning?

RQ2. Do students presented with social network analysis data on online course discussions adjust their engagement behaviour?

RQ3. What are students' perceptions of the utility of learning analytics tools?

1.4 Purpose of the study

LA tools have been shown to be effective in increasing students' motivation and achievement (Pardo, Han, & Ellis, 2016). However, there is limited understanding of students' perception of effective feedback and of how to target feedback to be effective (Poulos & Mahony, 2008). This thesis focuses on the impact of LA on student engagement behaviour in online discussion forums. Student engagement behaviour in the context of this thesis is defined as socio-centric and ego-centric network measures, which are explained in detail in chapter 2.3. This study regards the position of students within a network, measured by SNA, as an indicator of their engagement behaviour.
The usage of LMSs which allow the integration of LA tools is not a new phenomenon (Educause Learning Initiative, 2012). But the increasing sophistication of these systems allows educational stakeholders to collect, compile and mine new sources of learning-behaviour data. The decrease of technological challenges related to data access over the past years (Ferguson et al., 2014) leads to the "linking of rapidly maturing information technologies to a renewed interest in how, when, and why people learn." (Zemsky & Massy, 2004, p. 2). This study examines the impact of exposing SNA data, in the form of sociograms and chord diagrams from an online discussion forum, to students in an educational context. The quasi-experiment investigates applied LA and the possibilities made available by the data which already sits in UBC's LMS. Inquiring into the effect on the learning experience of receiving feedback on one's own engagement in an online environment might be valuable because "many students have not utilized the cognitive processes (i.e. metacognition/reflective thinking) necessary for self-directed learning to occur and are completely unfamiliar with evidence-based learning strategies, making self-directed learning a challenge." (Pate, Lafitte, Ramachandran, & Caldwell, 2019, p. 492). Can LA, as a form of effective feedback, be the key to triggering reflective thinking?

This study focuses on online discussion forum posts and explores how SNA might impact students in an educational context. Forum posts created by students are valuable artefacts in this context. "The power of an intellectual product lies in its potential influence on the minds of the consumers by providing them with knowledge, persuading them on an issue position, or forming an opinion on a subject." (Ha & Yun, 2014, p. 47). Threadz, the SNA tool, can illustrate influential individuals, such as teachers, students or teaching assistants, based on the online discussion's interaction data, turning this invisible dynamic into a graphic display and revealing possible power relations and influencers. SNA encodes usage behaviour, leading to a better understanding of social relationships, which becomes crucial in the context of education. "Ideology critique wants to remind us that everything that exists in society is created by humans in social relationships and that social relationships can be changed." (Fuchs, 2014, p. 13). The idea that social relations play a key role in facilitating learning is based on Lev Vygotsky's (1978) work, which influenced researchers in the sociocultural field and education. Considering the fundamental role that interaction plays in learning and cognition offers an opportunity for LA, and Threadz in particular, to foster critical thinking, and might improve the learning experience.

However, one should also acknowledge that the increase in monitoring in an educational context, which might be perceived as surveillance, has an impact on the work and identities of students, faculty, and administrators (Knox, 2010). The feeling of being monitored might affect students' engagement and perception of LA tools. In order to answer the research questions, this study relies on the following instruments: a pre-post survey, SNA, and descriptive and inferential statistics. Together these instruments shape an embedded mixed-method research approach, which is explained in detail in chapter 3.1.
The choice of tools, methodology and research questions was informed by a thorough literature review, which will be presented in the next chapter.

1.5 Positionality

The research questions were informed by courses about curriculum theory, which I took during my academic journey at UBC. I acknowledge that I am neither an expert in curriculum theory nor in LA. I am just a graduate student with a lot of questions. The value this thesis might contribute to the fields lies in the perspective from which it approaches LA. Talking to practitioners and theorists within the environment of UBC's Centre for Teaching and Learning Technology and Educational Technology Support made me wonder if there might be a misconception about LA. Theorists might condemn the field of LA as technical-behavioral and positivistic, while practitioners might perceive LA as a handy way to improve and assess learning. Of course, the reality is more complex. Drawing from theorists and practitioners such as Dewey, Kliebard, Siemens and Tyler (see Chapter 2: Literature Review for details), this thesis let me explore the beauty of LA.

The background of this research is related to the work of the LA projects team at UBC and the SNA tool Threadz, which visualizes interaction data of online discussion forums within the LMS Canvas. As a student I found this tool fascinating and showed it to my professors who employed online discussions in their courses. It was intriguing to see the networks of different discussions. I personally identified hotspots in online discussions through the sociogram, reading it as a roadmap to find a vivid discussion partner. I wondered if other students felt the same about the tool, or if I was the only learner who cared about it. Inquiring into and surveying the students' reactions to a LA application ties in with the bigger picture of this thesis: how LA may enhance student engagement in online discussions, and how educational ideologies might play a role.

1.6 Key terms

Learning analytics (LA): "Learning analytics is the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimizing learning and the environments in which it occurs. A related field is educational data mining." (SOLAR, 2011)

Educational data mining (EDM): "Educational Data Mining is an emerging discipline, concerned with developing methods for exploring the unique and increasingly large-scale data that come from educational settings and using those methods to better understand students, and the settings which they learn in." (IEDMS, 2011)

Social network analysis (SNA): Social network analysis focuses on "relationships among social entities, and on the patterns and implications of these relationships." (Wasserman & Faust, 1994, p. 3)

Progressivism: In the context of this thesis, progressivism refers to an educational ideology (Kliebard, 2002) which is based on John Dewey's thoughts on education (Dewey, 2004). It is a counter-movement to traditional, lecture-based teaching, emphasizing the importance of social elements during the process of learning (Vygotsky, 1978). Progressivism is also referred to as a social-romantic ideology in the context of education.

Assessment: Assessment can be defined as a "wide variety of methods or tools that educators use to evaluate, measure, and document the academic readiness, learning progress, skill acquisition, or educational needs of students." (Great Schools Partnership, 2015)
The literature does not offer a common definition of assessment and sometimes uses it interchangeably with the term evaluation. One source suggests: "Assessment is defined as a process of appraising something or someone, i.e. the act of gauging the quality, value or importance. As against, evaluation focuses on making a judgment about values, numbers or performance of someone or something. Assessment is made to identify the level of performance of an individual, whereas evaluation is performed to determine the degree to which goals are attained." (Surbhi, 2017)

1.7 Thesis overview and structure

This thesis is divided into six chapters. Chapter 1 provides the background, research questions, purpose of the study, a brief definition of key terms, and the positionality or lens through which this work was undertaken. Chapter 2 provides a review of the literature and the theoretical framework in which to conceptualize this research. Chapter 3 presents the mixed methodological framework used in this study. Chapter 4 explains the methods and tools used in this research, displaying the structure of the analysis process in detail. Chapter 5 presents the analysis itself and the results of the findings. Chapter 6 concludes with a discussion of the research findings, limitations, implications and future directions.

Chapter 2: Literature Review

In order to identify and analyse literature related to curriculum theory, LA and SNA, a literature review was conducted. The following databases were searched in order to identify peer-reviewed research articles, conference papers and proceedings that addressed this study's objective, using the keywords "curriculum", "learning analytics" and "social network analysis": UBC library search, ERIC – Education Resource Information Centre, JSTOR and Google Scholar. Recommendations and readings from committee members' syllabi are included in the review. In order to acquire a deeper understanding of LA, and considering the blended course environment of this study, I also included the search terms "educational data mining" and "blended". All search attempts were conducted with a focus on an educational context. Databases were searched between September 2019 and April 2020. Only studies published in English were considered. A brief overview of the content findings will be presented next.

2.1 Theoretical framework

This thesis offers a perspective on LA drawing from a curriculum theory lens. The attempt to measure or assess the process of learning has ideological consequences. Using SNA implies the consideration of learning as a social process, and thus the acknowledgement of a social-romantic ideology. This section covers the different streams and the interrelationships of several ideological stances related to curricula, which are worth considering when applying any sort of assessment or measurement in an educational context. Therefore, this section encompasses the theoretical foundation of curriculum and educational ideology. In order to classify LA as a field which investigates education, positioning it within the bigger picture of educational ideologies might be of value. The following sources might be helpful for understanding the theoretical context of curriculum and ideology, and approaches to measuring or assessing anything related to learning.

It is important to distinguish between educational ideologies (Flinders & Thornton, 2004) and learning theories (Johnson, 2019) such as behaviorism, cognitivism, constructivism, or connectivism.
The wording of those learning theories might change depending on the literature. Those theories describe different approaches and beliefs about learning and learning preferences or "styles" (Pashler, McDaniel, Rohrer, & Bjork, 2008). They investigate how content is taught and how humans learn, shaping a pedagogy. Ideologies take a step back and explore the philosophical framework behind the learning theory and its implications. Ideologies in education offer a rationale for the justification and choice of a learning theory or pedagogy. Ideology in the context of education is related to the beliefs, customs, culture and values shaping the purpose of education. The curriculum is the result and origin of those forces. Eisner (1979) analyzes this phenomenon in detail, and the following quote highlights the function of educational ideologies:

It should not be inferred from my remarks that ideologies are, somehow, a kind of infection in education that is to be cured by taking the proper medicine. Nor should it be inferred that ideologies somehow interfere with the exercise of "pure rationality." Because education is a normative enterprise, it cannot be approached value free. (p. 52)

By choosing and supporting a learning theory as an educator, one makes a judgement, maybe just unconsciously. This value statement can be described by educational ideologies. To appreciate the values of LA and EDM, an understanding of the idea of scientific management might be helpful. At its core, scientific management is all about improvement, as LA and EDM also state in their official definitions. Scientific management is "geared to the needs of a developing industrial society." (Doll Jr., 2004, p. 253). An understanding of the Tyler rationale and the concept of evaluation and assessment is closely related to the idea of scientific management. The Tyler rationale and scientific management are grounded in a technical-behavioral ideology (Kliebard, 2004). Even though LA is not primarily assessment, the same principle applies: "For evaluation purposes, the response should be that unmeasurable goals are of little or no use." (Popham, 1969, p. 73). Tyler (2004) himself phrases it this way:

Certain kinds of information and knowledge provide a more intelligent basis for applying the philosophy in making decisions about objectives. If these facts are available to those making decisions, the probability is increased that judgments about objectives will be wise and that the school goals will have greater significance and greater validity. (p. 52)

The catch is: if something is not measurable, how can it be improved? In order to enable improvement in education, it had to be turned into a measurable science, which is still the foundation for current curricula and instructional design. According to Slade & Prinsloo (2013, p. 1519), LA can function as a "moral practice", which means that efficiency in an educational context can be based on a value system. Defining improvement is a judgement. Acknowledging that LA might be seen as a moral practice rather than a mere form of measurement and evaluation, literature about ideology and curriculum design becomes relevant (Adler, 2004; Bobbitt, 1924; Bruner, 2006; Dewey, 2004; Doll Jr., 2004; Eisner, 1979; Gray, 1998; Hern, 1996; Kliebard, 2004; Mathison, 2004; McKernan, 2007; Petrina, 2004; Popham, 1969; Ross & Vinson, 2013; Smith, 2003; Teitelbaum, 1991; Tyler, 2004). Following this notion, the Tyler rationale applied to education focuses on: Objective – Teaching – Assessment.
From a scientific management point of view this means that if a teacher is not able to assess an objective, the objective is not worth teaching. The term assessment in the context of curriculum theory explores the relationship between teaching and objective. Assessment defined as mere grading or marking is not appropriate in regard to this research, which understands assessment as a "wide variety of methods or tools that educators use to evaluate, measure, and document the academic readiness, learning progress, skill acquisition, or educational needs of students." (Great Schools Partnership, 2015). Tyler remarks that prioritizing assessment in the process of curriculum design allows evidence-based decision-making. This perspective might indicate that knowledge in a scientific management system serves the 'managers', the decision-makers. It is not guided towards the learner, and it is problematic from an epistemological point of view. "Evidence-based education seems to favor a technocratic model in which it is assumed that the only relevant research questions are about the effectiveness of educational means and techniques, forgetting, among other things, that what counts as 'effective' crucially depends on judgments about what is educationally desirable." (Biesta, 2007, p. 5).

Concerning a technical-behavioral ideology, it is the industry that phrases what is educationally desirable. During the early 20th century the American labour market needed a standard certificate for workers and a quality assurance of their skills. Education seen through a technical-behavioral lens functions as certification and validation regardless of the ingredients. Vocationalism is on the rise (Wells, 2015). To achieve a basis for certification, standardizing the curriculum becomes a necessity. This argument is based on the idea that "free market competition is viewed as the route to assure a quality product." (Ross & Vinson, 2013, p. 19). Another idea of this ideology is the emphasis on conformity. In the early 20th century this need for unity arose from the diverse student body caused by the heterogenic demographics of America, influenced by a history of immigration. All learners should become productive citizens and find their place in a market society, perpetuating the status quo. The technical-behavioral approach is not about science, but it is scientistic itself. Proof of that is the application of the Tyler rationale (Tyler, 2004) and the idea of scientific management (Bobbitt, 2004). "Through the method of activity analysis (or job analysis, as it was also called), Charters was able to apply professional expertise to the development of curricula in many diverse fields, including secretarial studies, library studies, pharmacy, and especially teacher education." (Kliebard, 2004, p. 38). A need for a more efficient education arose with the employer's need for new methods, new materials, and new types of experience, which must be employed. "A study of the learners themselves would seek to identify needed changes in behavior patterns of the students which the educational institution should seek to produce." (Tyler, 2004, p. 53).

Bobbitt brings a more balanced approach to the table. His idea of education is that schools should prepare students to find their ideal place in society, to contribute to economy and society.
This means education encourages students as consumers in the capitalist system, emphasizing producing, consuming, and vocationalism (McKernan, 2007).

But Bobbitt, unlike other educators who turned to scientific management, was not content merely to apply certain management techniques to education, such as maximum utilization of the school plant; he provided the professional educators in the twentieth century with the concepts and metaphors—indeed, the very language—that were needed to create an aura of technical expertise without which the hegemony of professional educators could not be established. (Kliebard, 2004, p. 38)

Thanks to his influence the principle of scientific management could be applied around the world in every higher education institute. The global compatibility comes with a downside: "The school creates an environment that does not put much premium on imagination, on personal spirit, or on creative thinking. It emphasizes a form of rationality that seeks convergence on the known more than exploration." (Eisner, 1979, p. 55). A counter-approach to this whole movement is the idea of the social-romantic ideology. Philosopher and educator John Dewey emphasised a social-romantic ideology (Dewey, 1938). Teitelbaum (1991) provides a sketch of the milestones in the history of curriculum and ideologies, which shows that the social-romantic ideology and the stream of scientific curriculum making are both discussing the aims and purpose of education. The findings resulting from the literature review inform the research methodology of this study, which will be explained in detail in the next chapter.

2.2 Learning analytics and educational data mining

Learning analytics (LA) offers valuable insights to educators, instructors and practitioners on educational data that might be helpful for the improvement of teaching and learning. SOLAR (2011), the Society for Learning Analytics Research (https://www.solaresearch.org/), defines LA as "the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimising learning and the environments in which it occurs".

According to Siemens & Baker (2012), LA and EDM emerged due to the fact that large data sets from students became available. Educational software like LMSs and online learning environments began to generate data sources such as system logs, surveys, quiz data, and clickstream data. One approach of educational data mining, which is highlighted by the reviewed literature, is to analyze data in order to design pedagogical interventions to support students' learning behaviour (Baker et al., 2016; Black, Dawson, & Priem, 2008; Bogarín, Romero, & Cerezo, 2015; Bovo, Sanchez, Heiguy, & Duthen, 2013; Minaei-Bidgoli & Punch, 2003; Romero, Espejo, Zafra, Romero, & Ventura, 2013; Romero, Ventura, & García, 2008; Slaninoa, Martinovic, Drazdilova, & Snasel, 2014; Wise, 2014). An early focus of LA research and implementation was the identification and support of "at-risk" students. LA acting as an early alert system has the ability to improve students' success and retention (Sclater, Peasgood, & Mullan, 2016, p. 4). A detailed review of the field of predictive analytics is beyond the scope of this review.
LA research and practice has pursued many further aspects of learning design and learning success and might be able to provide valuable insights at a macro (government, institutional), meso (course, class) and micro (learner, student) level (Buckingham Shum, 2012), as the following examples might show. Several tools exist in universities to promote and enhance the students' educational experience. According to the JISC Learning Analytics in Higher Education report of April 2016 (Sclater et al., 2016, p. 36), some instructors at the University of Wollongong used SNAPP, a SNA visualization tool for online discussions similar to Threadz. Edith Cowan University established the C4S system (Connect for Success), which paired students automatically with their supervisor if they fell off track (Amara & Richards, 2013). Another example of LA are learning dashboards, which present educational data in a collaborative interactive setting (Charleer, Klerkx, & Duval, 2014; Park & Jo, 2015). Research indicates that some examples promote the students' awareness of their performance, positive responses to course goals and an improved overall progress (Charleer, Klerkx, Odriozola, & Duval, 2013).

The previous examples try to contribute to a practical improvement in teaching and learning with an emphasis on application as a deliberate goal. Another field which examines learning data is the field of EDM. According to some authors, one has to distinguish between EDM (Banihashem, Aliabadi, Ardakani, Delaver, & Ahmadabadi, 2018) and LA. While EDM emerged around 1995 according to Romero & Ventura (2007), Siemens hesitates to define a birth year of LA due to the fact that it emerged from several disciplines such as "education, statistics, computer science, information science, sociology, and computer-supported collaborative learning" (Siemens, 2013, p. 1395). Both fields share the same methods but focus on slightly different outcomes. LA puts emphasis on pedagogy and learning experience. Gottipati & Shankararaman (2014, p. 3) highlight the difference between the fields as follows: "Though these two fields are closely related, LA goes beyond data mining techniques and includes other methods, such as statistical and visualization tools or social network analysis." EDM also covers the field of academic analytics (Campbell & Oblinger, 2007), which focuses on academic career pathways and course structures of a program on an institutional level (Denley & Oblinger, 2012). The major stakeholders in LA are students and instructors. According to Siemens (2013, p. 1382), LA "draws from and extends" educational data mining methodologies. Potential main benefits of LA applications might be an improved curriculum design, enhanced students' and instructors' performance and personalised learning services (Du et al., 2019). Current research on LA can be grouped in three categories: "(1) predicting a student's performance or the likelihood of dropout, (2) detecting a student's learning progress via analysis and (3) providing feedback or modelling based on analytic results." (Du et al., 2019, p. 9).

One method which has been adopted by LA researchers is SNA. It was already in use in other fields, such as anthropology and sociology (Schepis, 2011), before being utilized in a LA context. The theory and concept of SNA is firmly grounded in systematic empirical data.
Network data is visualized in the form of a sociogram or chord diagram, revealing interactions between actors in a conversation and their relations (Carolan, 2014). Literature on this method, which plays a central role in this thesis, will be reviewed in the next section.

According to Du et al. (2019, p. 3), the most popular research topics among LA and EDM are "student behaviour modelling and performance prediction, the support of students' and teachers' reflections, and the awareness and the improvement of feedback and assessment services." Furthermore, research indicates that appropriately used learning technologies may be able to enhance student engagement (Dawson, Burnett, & O'Donohue, 2006; Macfadyen & Dawson, 2010).

While the availability of educational data enables the above-mentioned analysis methods improving education, some literature mentions challenges and risks which come along with the abundance of educational data. A LACE (Learning Analytics Community Exchange) report (Griffiths et al., 2016, p. 2f) highlights that the "educational process cannot be isolated in hermetically sealed areas" which are easy to survey. The limitation of tracking activity outside of an institutional system is a recurring theme in LA challenges (Slade & Prinsloo, 2013). In regard to this thesis, this might be a crucial point. This research study analyzes only the Canvas online discussions. Engagement and interactions of students related to the course happening in other online environments, in the classroom or in person were not surveyed. That said, the data this research collected might originate from a "sealed area", the Canvas online discussion. This possibility is acknowledged by the above quoted literature. Mentioning challenges and risks here serves the purpose of being aware of the limitations this research design faced.

2.3 Social network analysis

SNA employs well defined measures in order to deliver an organised overview of network characteristics and dynamics (Carolan, 2014a; Scott & Carrington, 2011). SNA and the theories around it can be traced back to the work of anthropologist J.A. Barnes, who systematically analyzed social links in the 1950s. Sociologists and anthropologists were looking for a way to understand human behaviour in a large-scale context, such as a complex community. Relying on the concept of mathematical graph theory (Mitchell, 1974), Barnes' new methods enabled researchers to "uncover the form and substance of relationships" (Hopkins, 2017, p. 644). Blank (2017, p. 18) states that "social network theory is the single biggest theoretical and empirical success story of the social sciences over the past decade." According to Cela et al. (2014), SNA proved to be an extremely powerful tool in an e-learning environment for educational purposes. Using data to visualize social interactions is a valuable method in an educational context, taking into account that from a progressive educational perspective social relationships are considered fundamental for meaning making (Vygotsky, 1978).

A SNA consists of several core metrics, which will be explained in detail in the next paragraphs. Those metrics can be computed according to the composition of nodes and edges, sometimes also called strings or lines. A node represents an actor in a social network, while an edge embodies an interaction between two actors (Carolan, 2013). One must distinguish between directed and undirected networks.
Directed networks such as online discussions consist of 19interactions with an actor, who acts upon a target, while undirected networks, such as a friend list on a social media platform – friend lists only have a mutual interaction. There is nothing such as a unilateral friendship. Directed networks or graphs can be displayed as undirected, but not the other way around. This is the case with the sociogram of Threadz. It displays the online discussion in canvas as an undirected network in its sociogram. The chord diagram displays a directed network as does the exported data. The data retrieved from Threadz and analyzed in Gephi4 (Bastian, Heymann, & Jacomy, 2009) describes directed networks (see chapter 4Methods).SNA offers socio-centric measures describing the whole network. Those are the average degree, network diameter, average path length, graph density, modularity, average clustering coefficient and. SNA measures on an ego-centric level describe attributes of an individual (node) within the network. Those are: degree centrality, indegree, outdegree, closeness, betweenness and eigen-centrality. 2.3.1 Socio-centric measuresThe socio-centric measures allow a comparison of different networks, in this case three online discussions. The average degree measure indicates the average count of interactions of every node within the network. “Degrees refers to the number of ties an actor either sends (out-degree) or receives (in-degree).” (Carolan, 2014b, p. 106). 4 https://gephi.org/20The network diameter and average path length measure offer valuable clues about the size and information flow of a whole network. “A network's diameter refers to the longest path between any two actors”, while the average path length measures the mean distance between all individuals (Carolan, 2014b, p. 102). A high diameter value indicates that information can travel from one end of the network to the other.Another socio-centric measure is the clustering coefficient, which indicates the affinity of the network to build groups. A high cluster value points out hot spots of dense connectivity within the network, while a low value might be proof for an evenly distributed network and the abundance of cliques. The modularity measure also reveals if the network can be broken down into “some number of mutually exclusive groups” (Carolan, 2014b, p. 117).The graph density investigates how many interactions happen, as opposed to the total number of interactions that are possible. That said, the density value indicates the number of actual relationships in a network divided by the total number of possible relationships (Wasserman & Faust, 1994). For example, in an online discussion with 3 people, every person is able to contact the two other members. If this is the case the density value equals 1. If all the three persons decide to only message one other member, the density value would be 0.5. “We can use density to study how well connected a network is; How well do people know each other?” (Sie et al., 2012, p. 6). The authors also suggest that in an educational context a decentralized network is preferable in order to ensure that the network is not dependent on a few actors. 212.3.2 Ego-centric measuresImportant for this research are also the ego-centric measures of indegree and outdegree. The outdegree indicates how frequently a student posts a comment, while the indegree measures the frequency of incoming interactions, such as receiving a post (Cela et al., 2014). 
2.3.2 Ego-centric measures

Important for this research are also the ego-centric measures of indegree and outdegree. The outdegree indicates how frequently a student posts a comment, while the indegree measures the frequency of incoming interactions, such as receiving a post (Cela et al., 2014). The degree centrality of an individual can be computed by counting the edges, or links, which connect the node to other nodes. Degree centrality of an individual indicates popularity and access to resources in the network (Howard, 2015, p. 6).

Closeness is a centrality measure indicating the distance between nodes. For example, if a social media platform suggests a friend, this friend is probably a friend of another friend of yours. In this case you have to "jump" through one node to get to this new friend. A small closeness value indicates greater proximity to other nodes, while a larger value indicates that one has to "jump" over several nodes to reach another node in the distance (Scott & Carrington, 2011). The closeness centrality of an individual (a node) is calculated as a measure of their overall 'average' closeness to all other individuals in a network. According to Howard (2015, p. 7), the closeness measure of an individual can tell us "who can most efficiently obtain information on other nodes" and "who could most quickly spread information in a network".

The betweenness centrality is an indicator of individuals (nodes) who potentially act as major players in the network: "high betweeness centrality means that the node representing an actor is on relatively more shortest paths between other actors" (Newman, 2010, p. 185).

Eigen-centrality takes the links of an individual node and their connections into account, revealing "wide-reaching influence" nodes and important actors on a macro scale (Howard, 2015, p. 7f). All those ego-centric measures are important in this research, because they were used in a correlation test.
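As a companion to the previous sketch, the following hypothetical networkx snippet computes the ego-centric measures just described for a single actor; again, the toy graph and the use of networkx rather than Gephi are assumptions made purely for illustration:

import networkx as nx

# Hypothetical (author -> target) reply edges; B also replies to C, so the
# graph is not bipartite and eigenvector centrality's power iteration converges.
G = nx.DiGraph([("A", "B"), ("A", "C"), ("B", "A"), ("C", "A"),
                ("D", "A"), ("B", "C")])

print(G.out_degree("A"))                  # outdegree: replies A sends
print(G.in_degree("A"))                   # indegree: replies A receives
print(nx.degree_centrality(G)["A"])       # normalized degree centrality
print(nx.closeness_centrality(G)["A"])    # closeness: proximity to other nodes
print(nx.betweenness_centrality(G)["A"])  # share of shortest paths through A
print(nx.eigenvector_centrality(G.to_undirected())["A"])  # eigen-centrality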
SNA tools typically compute network metrics and visualize the network in the form of a sociogram (Carolan, 2014b, p. 9). SNA uses quantitative methods to investigate "constructs" (Gravetter & Wallnau, 2014, p. 20), in this thesis' case engagement. SNA does not claim to assess the quality of discussion contributions (Schepis, 2011). This method acknowledges that a significant component of the learning process is social, emotional, and affective, which echoes the progressive school of thought (Dewey, 2004; Vygotsky, 1978). SNA does not measure any of the above-mentioned elements by analyzing the learner's output. It emphasizes the process of learning rather than the product of learning. SNA opens a window into the social behaviours of learners and learning communities.

To summarize, SNA "is the mathematical analysis of interactions, relationships, positions in the network, and the network itself." (Sie et al., 2012). Two levels of analysis are of interest in this research: the network level (socio-centric), which includes measures such as density and centrality in order to compare different networks, and the individual level (ego-centric), which relies on metrics such as in- and out-degree, analyzing the position of individual nodes within a network. The literature suggests balancing a socio-centric and an ego-centric approach while conducting a SNA (De Laat, Lally, Lipponen, & Simons, 2007). The authors argue that using both methods acts as triangulation. This research follows this approach, using SNA data generated from the LMS; in addition, a pre- and post-survey asks participants about their sense of belonging and their awareness of their in- and out-degree within the network to enrich the findings.

Avella, Kebritchi, Nunn, & Kanai (2016) reviewed 112 LA journal articles from 2000 to 2015. From this small set of LA studies the authors identified visual data analysis and social network analysis as the most frequently applied methods. However, Du et al. (2019) presented a more exhaustive meta-review of 911 studies between 2011 and 2017, which classified SNA as one of many applied methods. SNA qualifies as a valid analytic method for this study due to its constant occurrence in research publications during the last 10 years and as a means of investigating learner engagement (Dawson, 2010).

One further argument why SNA might prove valuable in an educational context is the assumption that "if self-regulated learning begins (in part) by comparing ourselves with others who are not like us, then this raises interesting questions about how learning environments (including digital ones afforded by LMS courses) might be designed to optimize student opportunities for social comparison." (Fritz, 2016, p. 32). SNA is a perfect tool for realizing one's own centrality in a network and for enabling social comparison related to online discussions.

Dawson (2010) forges strong arguments about why SNA might be a powerful tool in the context of education, referencing social learning theories formulated by Dewey and Vygotsky. Furthermore, he values the methodology of SNA and highlights "correlations between SNA centrality measures and student sense of community, noting that degree and centrality were noted to be positive predictors of student sense of community." (Dawson, 2010, p. 739). That said, it is a legitimate approach to correlate SNA measures with constructs such as sense of community, engagement or other constructs.

Those findings inform the research methodology of this study, which will be explained in detail in the next chapter.

Chapter 3: Research Methodology

3.1 Research methodology

A mixed-method approach is the best fit for the educational context used in this research due to the richness of data which can be gathered through both quantitative and qualitative methods (Johnson & Onwuegbuzie, 2004). An embedded mixed-method research design allowed the linkage of qualitative data and quantitative data. The quantitative data, SNA, was exposed to the students through Threadz, a LMS plugin. SNA analyzed the online discussions, but also became part of the online discussion itself, allowing students to reflect on the network they were situated in. The themes around how students reacted to Threadz and how it might affect the online discussion were explored through open-ended survey questions. This allowed students to express their personal experiences, which were also mirrored by SNA measures. This consideration justified the use of an embedded mixed-method research design modeled after Brady & O'Regan's (2009) study.

3.2 Restatement of the research questions

I considered a mixed-methods design the appropriate choice to address the three stated research questions:

RQ1. Does calling students' attention to particular learning analytics data change their way of learning?

RQ2. Do students presented with social network analysis data on online course discussions adjust their engagement behaviour?

RQ3. What are students' perceptions of the utility of learning analytics tools?

The research questions address an existing area of LA research. Recently published works explore such questions and discuss research and findings in this and related areas (Du et al., 2019; Khe, Wing, & Connie, 2010; Sie et al., 2012; Tamim et al., 2019).
The second research question, too, can build upon a rich body of existing research and literature (Whitelock-Wainwright, Gašević, Tejeiro, Tsai, & Bennett, 2019). However, by using SNA as both intervention and analytic method, this research might be able to add some valuable insights to the existing body of knowledge.

3.3 Research design

I made use of an embedded mixed-method design based on Creswell's framework (Creswell & Plano Clark, 2011). I display my research design in a table and diagram due to their "substantial history of use in the mixed methods literature." (Creswell & Plano Clark, 2011, p. 108). The embedded design involved collecting and analyzing two types of data, quantitative and qualitative, "within a design framework associated with the other type of data, such as when a researcher chooses to embed a qualitative strand within a quantitative experiment." (Creswell & Plano Clark, 2011, p. 123). Related to my project, this meant that I exposed quantitative data in the form of a sociogram and a chord diagram of the online discussions to the participants through Threadz. As a second step I asked them through a survey what they thought of this data. Qualitative data revealed opinions and beliefs about the exposure of the sociogram, which is part of the intervention. The ego-centric and socio-centric measures of the SNA functioned as quantitative data as part of the research analysis. Inquiring into student perceptions of the LA tool led to the above-stated experimental design. The guiding theoretical framework of LA and SNA provided a solid ground for the chosen embedded mixed-method approach.

This study's methods were shaped by a quantitative and qualitative data collection for testing whether the LA plugin Threadz had a significant impact on learners' engagement behaviour related to online discussions. Due to the limits of this thesis, engagement behaviour in this study was operationalized as SNA measures. It should be acknowledged that SNA itself does not claim to indicate engagement behaviour by definition. However, within the context of a Canvas online discussion, interactions between students contributed to the in-degree or out-degree value of an individual and were interpreted as part of the engagement behaviour. The quantitative analysis included t-tests, an ANOVA and correlation tests based on the hypothesized relationship to the tool usage. The tests were conducted in order to detect statistically significant relations (Gay, Mills, & Airasian, 2012). Additionally, a pre- and post-intervention survey was conducted. The design of the qualitative component was based on open-ended survey items addressing questions related to perception, accessibility and engagement. The qualitative analysis focused on thematic development across the students' perspectives and experiences, which justifies the qualitative approach embedded in the research design. The quantitative analysis asked "what happened?" and revealed existing relations in the data. The qualitative analysis tried to answer "why did it happen?", exploring the reasons behind the quantitative findings. Furthermore, the low sample size of this research study (n=18) is a strong argument for following this embedded mixed-method approach rather than focusing on a purely quantitative method. The sample size is smaller than 30, which is the recommended minimum for most statistical analyses (Gravetter & Wallnau, 2014).

Qualitative data was analysed through inductive coding (Auerbach & Silverstein, 2003).
An open coding approach was followed by an axial coding phase forming categories and dimensions (Strauss & Corbin, 1990). That said, the responses to open-ended survey items were categorized based on their content without an initial codebook. Recurring thematic responses were counted and expressed as a ratio.

The intervention, which is explained in detail in the next chapter, is a quasi-experiment (Cook & Campbell, 1979). The results were analyzed in regard to the guiding theoretical model, merging the quantitative and qualitative strand. According to Creswell & Plano Clark (2011, p. 139), this embedded mixed-method design type can be expressed through a notation: QUAN (+qual) = enhance experiment

The following table, modeled after Creswell's comparison (Creswell & Plano Clark, 2011, p. 134), lists all critical components of an embedded mixed-method research methodology:

Table 3.3.1 Embedded mixed method research design characteristics

Content area and field of study: Higher education and the application of learning analytics
Philosophical foundation: Pragmatic and dialectic; progressive education (Dewey, 1938; Vygotsky, 1978); ideologies in education (Bobbitt, 1924; Dewey, 2004; Doll Jr., 2004; Teitelbaum, 1991)
Theoretical foundations: Learning analytics (Macfadyen & Dawson, 2012); social network analysis (Dawson, 2010; Wasserman & Faust, 1994); scientific management (Bruner, 2006; Tyler, 2004); self-regulated learning (Schunk & Zimmerman, 2012)
Content purpose: To evaluate the impact of the social network analysis tool Threadz related to online course discussions

Quantitative strand
  Sample: N = 18 undergraduate students in a teacher education program
  Data collection: Pretest and post-test measures gathered over one month; online discussion post metadata (post count, word count); cross-sectional survey design; demographics related to education (learner's profile)
  Data analysis: Regression, t-test analysis, correlational tests, descriptive statistics; inquiring into engagement through social network analysis

Qualitative strand
  Sample: N = 18, same sample as above
  Data collection: Online discussion post content; open-ended survey questions
  Data analysis: Thematic analysis

Mixed methods features
  Reason for mixing methods: Need to relate quantitative findings of the social network analysis with qualitative descriptions of students' experience and perception of the learning analytics tool; need quantitative data to validate qualitative findings and reveal hidden patterns
  Priority of the strands: Quantitative priority
  Timing of the strands: Concurrent
  Primary point of mixing (point of interface): Design
  Mixing of the strands: Embedded: the qualitative strand is embedded within the quantitative strand; merge: qualitative data on the individual level and quantitative data cumulated as SNA socio-centric data; merge: impact results and case study results in regard to the guiding theoretical model
  Mixed methods design type: Embedded (quantitative instruments ask "what is going on?", qualitative instruments ask "why is it going on?")
  Notation: QUAN (+qual) = enhance experiment

The following diagram illustrates the research design and is based on Creswell's research design framework (Creswell & Plano Clark, 2011, p. 126):

Figure 3.3.1 Diagram of an embedded mixed method research design

However, I would like to acknowledge that "the distributive nature of networks and the inability to track activity outside of an institution's internal systems also affect the ability to get a holistic picture of students' life worlds." (Slade & Prinsloo, 2013, p. 1515).
To justify the embedded mixed-method approach of this study, I would like to stress the purpose of this inquiry: to better understand participants' thoughts, feelings, and actions about online asynchronous discussion forums, in order to determine whether students presented with SNA data on online course discussions adjust their engagement behaviour, and what students' perceptions of the utility of learning analytics tools are. As stated above, learning is a complex endeavour, and measuring the impact of this single tool is only one attempt to inquire into the interrelationship between students' behaviour and the application of LA.

3.4 Procedures

This research project focused on the question: Does calling students' attention to particular learning analytics data change their way of engagement in online discussions? The study focused on the application of Threadz, an SNA tool integrated into the Canvas LMS. The intervention was conducted as follows:

1. Students participated in the first online discussion (OD1) as part of the regular course proceedings;
2. Students filled out a pre-survey, which asked about demographics, general digital user behaviour, language proficiency and the experience of participating in OD1 and in online discussion forums in general;
3. The instructor enabled the LA tool Threadz for students through Canvas. From that point on, students could access an SNA visualization illustrating the structure of their online discussions;
4. Students participated in a second and third online discussion (OD2, OD3) as part of the regular course proceedings;
5. Students filled out a post-survey, which asked about the experience of the online discussion forum after enabling Threadz.

Figure 3.4.1 Timeline of the intervention

This intervention took place in an EDUC course, 2020, offered in a curriculum department at an urban western Canadian university. An abbreviated version of the syllabus can be found in the appendix of this thesis (p. 100). The three online discussions were embedded in the regular course design. The assignment associated with the online discussions was phrased as follows:

  Use your own sources and if necessary, library sources, and any other information connected to the inquiry course material. You will be graded using the following rubric on your original post, and at least two responses. Keep it succinct, maximum 250 words (plus references). (Appendix B, p. 109)

I hypothesized that the students' engagement might change after enabling Threadz, which might promote self-awareness in online discussion forums. This study tested whether displaying comparative feedback to learners in the form of SNA can influence their behaviour, which is expressed through socio-centric and ego-centric SNA measures. The study analyzed the correlation between Threadz-access and the learner's position within the network. The survey covered the students' perception of SNA in an educational context: does it help or distract? Details about the survey can be found in chapter 4.4 and Appendix A.

3.5 Ethics

Ethical issues are addressed in the BREB form (H19-02417). Due to the linkage of data from the LMS and the secondary use of data, approval by the BREB committee had to be obtained according to the TCPS 2 protocol. The study does not include any at-risk population and is classified as minimal-risk research.
The BREB committee has approved this study.

Chapter 4: Methods

In order to answer the three research questions, I decided to rely on SNA because it is an analytic method which allows me, and the students who have access to it during the intervention, to compare and analyze changes in networks. Observing whether online discussions utilizing Threadz differ from online discussions without the plugin is the focus of the analysis. SNA is able to detect a difference because SNA "measures provide a framework for interpreting and developing an understanding of the observed patterns of exchanges that occur between social actors." (Dawson, 2010, p. 738). SNA is one of a number of moderately popular methods in LA studies (Du et al., 2019). My research has the potential to contribute to this pool of knowledge because it uses SNA as an intervention and an analytic method at the same time.

4.1 Threadz

A central role in this thesis is played by Threadz. It is an SNA plugin for Canvas, which means Threadz draws on Canvas discussion forum data to develop network visualizations of course data. The plugin can be enabled within Canvas relying on LTI, Learning Tools Interoperability, an education technology specification that allows an LMS to integrate external tools. Threadz was developed at Eastern Washington University (Lewis, 2016). The developers explain that: "The data coming out of the LMS needs to be formatted in a specific way in order to use the functions that process the data for the D3 graphic displays. The php array titled 'arrTopic' holds the information about a discussion topic as well as the discussion content (participants, posts, replies) in SESSION. The individual discussion topic data is saved into the array using the topic id as the key name." (Lewis, 2015). Threadz can be enabled for students and instructors, or for instructors only. The following figures illustrate the different network visualizations the user has access to. (The student names shown are templates; the figures do not represent the collected data of this study.)

Figure 4.1.1 Screenshot Threadz - Network Sociogram

The figure above shows the network sociogram view in Threadz for an online discussion. The relative circle size of the nodes can represent four different values: posts/replies sent (outdegree), replies received (indegree), post-reply ratio and word count. The user can choose one of those options and the circle size adjusts instantly. Threadz lists nodes without any activity below the sociogram; in the context of an online discussion, these represent participants who have not posted anything yet. On the right side of the display, Threadz offers a simple instruction and manual on how to interpret sociograms in general.

Figure 4.1.2 Screenshot Threadz – Chord-diagram

The figure above illustrates the chord-diagram view of Threadz. As with the sociogram before, Threadz adds a short description of how to use and interpret the diagram. As opposed to the sociogram, the chord diagram presents a directed network. In the context of an online discussion, participants can distinguish between posts and replies. For example, student PGt sent a post to student NHt and has one remaining post to which nobody has replied yet.

Figure 4.1.3 Screenshot Threadz – Timeline

Another feature of Threadz is the timeline, which displays the post counts occurring on a certain date. Threadz also provides interface pages labeled Statistics and Data Set. Statistics displays tables dedicated to the post count and word count of participants.
Every user has the option to export those tables in the form of a csv file. The same goes for the Data Set interface. It displays a table which lists in detail all the interactions of the participants in one online discussion, including timestamp and message text.

Figure 4.1.4 Screenshot Threadz - Data Set table

For this study's analysis, only columns 2 and 3 were of interest. Column 2 indicates the source of an interaction, while column 3 indicates the target of the interaction. We can see the example described in the chord diagram earlier: row 3 tells us that student PGt (Priya Godehard) has interacted with student NH. Each sociogram, chord diagram and data set is specific to a single discussion forum and not the overall course. For this course, three sociograms and three chord diagrams existed. I exported three sets of data, for OD1, OD2, and OD3.

4.2 Social network analysis workflow

For the purpose of this study, the whole Threadz data-set table (Figure 4.1.4) was exported as a csv file. The data was cleaned in MS Excel, removing all columns except 2 and 3. The source column of initial posts was filled with the source label "Prompt" in order to be processed by Gephi5 (Bastian et al., 2009), an open-source SNA tool which allows users to visualize and analyze networks. The cleaned table was imported into Gephi version 0.9.2 as a directed network. Gephi allows computation of socio-centric and ego-centric SNA measures. Both were copied and exported into separate csv files. This procedure was conducted three times, for OD1, OD2 and OD3.

5 https://www.gephi.org
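The same cleaning and import steps could equally be scripted. The sketch below reproduces them with pandas and networkx under the assumptions, hypothetical since the real export headers were not reported, that the file is named threadz_dataset_OD1.csv and that columns 2 and 3 hold source and target:

```python
# A hedged sketch of the workflow above, scripted instead of done in
# Excel/Gephi. File name and column positions are assumptions.
import pandas as pd
import networkx as nx

raw = pd.read_csv("threadz_dataset_OD1.csv")        # hypothetical file name
edges = raw.iloc[:, [1, 2]].copy()                  # keep columns 2 and 3 only
edges.columns = ["source", "target"]
edges["source"] = edges["source"].fillna("Prompt")  # initial posts get the "Prompt" label

# A directed multigraph keeps repeated replies between the same pair,
# mirroring the directed network imported into Gephi.
G = nx.from_pandas_edgelist(edges, source="source", target="target",
                            create_using=nx.MultiDiGraph)
print(G.number_of_nodes(), "nodes,", G.number_of_edges(), "interactions")
```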
4.3 Threadz-access

The browser extension TamperMonkey6 was used to run a "Data Access Report" script (Jones, 2019) to obtain Threadz-access counts. This script enables a button on the Canvas People page that creates Access Report data for all students in the course and exports it automatically as a csv file. The online discussions were open for a week; accordingly, after each discussion closed, I ran the script to get the counts of Threadz visits per student. The report lists all Canvas-access interactions of a student, so unrelated entries which do not indicate Threadz visits were filtered out. For OD3 the results were subtracted from the previous counts. However, this data poses one of many limitations of this study: the Threadz-access count does not indicate which online discussion the students looked at in Threadz. Even though an online discussion was closed, students might have revisited it.

6 https://www.tampermonkey.net/

4.4 Survey

In order to answer the research questions and to enrich the SNA findings, a survey consisting of quantitative items and open-ended questions was created. "Because it is far easier to measure quantity than quality in discussions posts" (Garbrick, 2018, p. 1), the survey is intended to complement the findings of the LA data. The pre-survey was distributed before the intervention, and the post-survey after the intervention took place. Both surveys together cover five sections: demographics, general behaviour, perception of online discussions, expectations of learning analytics tools, and usage of Threadz.

Survey items on general behaviour relied on a slider scale to indicate self-efficacy related to the usage of social media apps, language proficiency and a sense of belonging within the cohort (Bandura, 2006; Zimmerman, Bandura, & Martinez-Pons, 1992). Questions related to expectations of the LA tool were taken from Whitelock-Wainwright et al. (2019) and adjusted accordingly to fit the context of this study. Survey items addressing the usage of Threadz assess whether students used the tool at all and whether they understood its function. Some of those items cross-validate the data obtained from the Canvas data access report summary.

The open-ended questions enrich the data set of this study and contribute to the investigation of the three research questions. They also allow students to express their opinion on Threadz and to address topics which may not be covered by the previous items (Fink, 2009).

The pre-survey was distributed on January 13th, 2020 and received 13 responses. On January 20th, 2020 the post-survey received 16 responses. Threadz was enabled on January 13th. The survey was created and distributed through the survey platform Qualtrics7. On both dates I visited the class, explained the purpose of the study and invited the students to fill out the survey. All questions of the participants were addressed to their satisfaction, and the link to the survey was posted on the Canvas announcement page of the course. The students could use 15 minutes of the class to fill out the survey. After filling out the pre-survey, I demonstrated how to access and use Threadz. The survey results were exported from Qualtrics as a csv file using numeric values.

7 https://www.qualtrics.com

4.5 Merging sources

The three csv files, the two Qualtrics surveys and Gephi's ego-centric SNA export, were merged together in Excel. Adding two columns for the students' Threadz-access counts and further columns for the word and post counts of the online discussions completed the table. Open-ended questions from the survey were removed. The merged csv file contains all the data described in the previous sub-chapters, except the socio-centric SNA and the qualitative survey item responses, which were handled in separate csv files.

  ID                   Survey data   Ego-centric SNA    Post/word count    Threadz-access
  Participant's name   Pre | Post    OD1 | OD2 | OD3    OD1 | OD2 | OD3    OD2 | OD3

Figure 4.5.1 Abstract display of the csv file structure

This csv file was imported into jamovi (The jamovi project, 2019), which is based on R, a free software environment for statistical computing (R Core Team, 2018), in order to run statistical analyses such as t-tests and a correlation analysis. The two t-tests compared ego-centric SNA measures of OD1 and OD2, and of OD1 and OD3. A correlation analysis took survey item responses, post/word counts, ego-centric SNA measures and Threadz-access into account. An ANOVA compared grouped students. The results of those tests are described in the next chapter.
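As an illustration, the merge described in this section might look as follows in pandas rather than Excel. The file names and the join key ("participant") are assumptions made for the sketch, since the actual column labels were not reported:

```python
# A sketch of the merging step under the assumptions named above.
import pandas as pd

pre    = pd.read_csv("qualtrics_pre.csv")
post   = pd.read_csv("qualtrics_post.csv")
sna    = pd.read_csv("gephi_ego_measures.csv")     # per-student ego-centric SNA export
counts = pd.read_csv("threadz_access_counts.csv")  # from the Canvas access report

merged = (pre.merge(post,   on="participant", suffixes=("_pre", "_post"))
             .merge(sna,    on="participant")
             .merge(counts, on="participant"))
merged.to_csv("merged_for_jamovi.csv", index=False)  # then imported into jamovi
```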
4.6 Summary of data sources

Resulting from the embedded mixed-method research design and the above-stated methods, this study draws from four sources of data: SNA, Threadz-access (Canvas Data Access Report), Canvas online discussions, and a pre- and post-survey. The SNA metrics are quantitative, as is the Threadz-access count. The pre- and post-survey consist of open-ended survey items and Likert scales (Jamieson, 2004), so this study draws from qualitative and quantitative sources. The same goes for the Canvas online discussions themselves. On the one hand, OD3 offered valuable qualitative content, such as students congratulating each other for their post-count achievements. On the other hand, OD3 was a source for the SNA and offered metrics like word and post count, which were exported from Threadz. This thesis did not assess whether online discussion contributions were related or added value to the course topic itself.

The following figure summarizes the data sources described in this chapter and their purpose for analysis.

Figure 4.6.1 Display of data sources and analytical methods

All those data sources allow the investigation of the above-stated research questions, illustrating the students' engagement behaviour, attitudes and beliefs related to Canvas online discussions and an LA tool in a blended course environment. According to Prinsloo, Slade, & Galpin (2012, p. 1), this study would be considered to follow a 'closed' view approach due to the fact that this research is "focusing only on data produced by students' interactions with institutions of higher learning". This statement highlights one limitation of this 'closed' study: 93% of the participants indicated that they "engage in other social media platforms with the cohort discussing course content" (pre-survey, Q8_5). This engagement cannot be measured by the instruments of this research, which only monitor participation in Canvas online discussions. However, the research question focuses on the exposure of LA data to students and their perception. Accordingly, the collected data of this study may still be valuable and offer legitimate insights into the cohort's behaviour related to online discussions happening on Canvas.

Chapter 5: Data Analysis and Findings

5.1 Population and sample

The population of the EDUC course which participated in this study consisted of 18 students (N=18). Data acquired through Canvas, e.g. Threadz-access counts and SNA data, draws from the whole population. However, pre- and post-survey results include a sample of 12 out of 18 students (n=12). Unfinished surveys were excluded from the study. Also, some students filled out either a pre- or a post-survey but not both; those surveys were excluded as well. Every student finished at least one survey, consenting to their data collection. The average student age of the sample was 37, and the sample consisted of 8 male and 4 female students. All except one student indicated English as their mother tongue. All the students who filled out the surveys were familiar with Canvas online discussions and had used the LMS before. Furthermore, in the survey students were asked about their general behaviour in their overall daily life. They could indicate their state of agreement with survey items expressing their proficiency in different fields on a slider scale from 0 to 100 (Kero & Lee, 2015).

Table 5.1.1 Descriptive statistics of survey items Q7, in which students indicate their own behaviour on a scale from 0-100

             Social media   Engagement in       English language   Confidence in        Comfort in
             usage          class discussions   proficiency        discussion skills    cohort
  Mean       69.8           66.8                95.7               79.7                 84.7
  SD         25             32.5                8.64               23.7                 20.6
  Minimum    20             5                   73                 25                   42
  Maximum    100            100                 100                100                  100

The table displays certain traits of the population. Apart from English language proficiency, there existed a wide range of responses throughout all other survey items, as shown by the large standard deviations. The largest variance in responses could be found for Q7_2, "I engage in discussions in class in person.", indicating that different preferences for in-class discussions existed among the students. The social media usage of the students also varied; only 3 students indicated a value of 100 on the slider. Overall, learners felt comfortable in the cohort (5 students reported a comfort level of 100, while one student indicated 42). Based on these characteristics, this was not a homogeneous cohort.
One should be aware that LA embedded in blended courses may face several challenges (Liu, Gong, & Zhao, 2018), such as low online engagement because significant teaching, learning and peer engagement occurred face to face rather than online.

5.2 Socio-centric SNA comparison of the three online discussions

The following table was created based on the results computed in Gephi and shows the socio-centric SNA results for OD1, OD2 and OD3. Threadz was not enabled in OD1, which functioned as a baseline. OD1 and OD2 shared the same characteristics regarding graph density, modularity and clustering coefficient, while OD3 differed from OD1 in every aspect. The next paragraphs explain this comparison in detail.

Table 5.2.1 Socio-centric network metrics of the three online discussions

                                OD1    OD2    OD3
  average degree                3.2    3.7    4.8
  network diameter              6      7      4
  graph density                 0.2    0.2    0.3
  modularity                    0.2    0.2    0.1
  avg. clustering coefficient   0.3    0.3    0.4
  avg. path length              2.5    2.7    1.9

5.2.1 Comparing OD1 and OD2

Comparing OD1 and OD2 revealed an increase of 14% in the average degree measure. This indicates that students wrote or received 14% more posts compared to OD1. The network diameter increased by 1, which shows that the length of the longest path between two students in the online discussion grew; the network was more spread out. Graph density, modularity and average clustering coefficient stayed the same, which indicates that the same amount of interaction happened and that the same percentage of possible connections between nodes existed (Hernández-García, González-González, Jiménez Zarco, & Chaparro-Peláez, 2016, p. 7). The path length increased by 8%, which discloses that the mean distance between all pairs of actors grew. This aligns with the finding of an increased network diameter. To summarize, minor increases in degree, diameter and path length are indicators of a "wider" network. The interaction happening within the network was distributed over slightly more actors compared to OD1.

5.2.2 Comparing OD1 and OD3

The average degree increased by 50%, the graph density by 67% and the average clustering coefficient by 34%. Those amplified measures indicate that OD3, compared to OD1, counted more interactions, resulting in a denser online discussion in which actors were connected to more participants, as the lower average path length indicates. OD3 also recorded the lowest diameter value. The network diameter describes the distance between the most separated nodes, expressing "the largest number of nodes that must be traversed in order to travel from one node to another." (Hernández-García et al., 2016, p. 7). OD3 had a diameter of 4, which indicates that the discussion had fewer "cliques" (Carolan, 2014a); fewer separated discussion threads occurred than in the other two online discussions. Modularity, which divides the graph into different clusters based on the numbers of interactions of each node, decreased by 50%. Some research suggests that modularity could be used as an indicator of engagement within the context of online discussions (Garbrick, 2018, p. 99). The clustering coefficient, which relates to the modularity, measures "how likely it is that two people connected to each other through someone else are also connected directly to each other; networks with high clustering tend to be cliquish and are known as cliques if everyone is connected directly to everyone else." (Sclater, 2017, p. 257). The maximum value for this measure would be 1. The values of 0.3 and 0.4 indicate that students were more likely clustered and that those clusters did not communicate with each other. Overall, the network of OD3 had a major increase in interactions and was denser; participants were more closely connected compared to OD1.
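For readers who want to reproduce such network-level measures, the sketch below computes them with networkx on a small invented discussion graph. Gephi's implementations (for example, its modularity resolution parameter) may differ slightly from the library defaults used here:

```python
# A sketch of socio-centric measures on a hypothetical undirected
# projection of one discussion's reply network.
import networkx as nx
from networkx.algorithms import community

U = nx.Graph([("Ann", "Ben"), ("Ben", "Cho"), ("Cho", "Ann"),
              ("Dee", "Ann"), ("Eve", "Dee"), ("Eve", "Ben")])

avg_degree  = sum(d for _, d in U.degree()) / U.number_of_nodes()
density     = nx.density(U)                    # share of possible ties that exist
clustering  = nx.average_clustering(U)         # "cliquishness" of the network
parts       = community.greedy_modularity_communities(U)
modularity  = community.modularity(U, parts)   # strength of the cluster division
diameter    = nx.diameter(U)                   # longest shortest path (one component needed)
path_length = nx.average_shortest_path_length(U)

print(f"avg degree {avg_degree:.1f}, density {density:.2f}, clustering {clustering:.2f}, "
      f"modularity {modularity:.2f}, diameter {diameter}, avg path {path_length:.2f}")
```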
5.2.3 SNA visualization

Visualizing the online discussions with the help of Gephi offers the following illustrations:

Figure 5.2.1 SNA visualization of OD1
Figure 5.2.2 SNA visualization of OD2
Figure 5.2.3 SNA visualization of OD3

The three sociograms represent the directed networks and the above-mentioned measures of the three online discussions. "Force Atlas" was the chosen layout algorithm, which is a common choice in SNA (Wise, Cui, & Jin, 2017), in order to ensure a clear organization of nodes and edges. A node (dot) represents a student, while an edge (string) defines the interaction between them. The visuals reveal that OD1 and OD2 consisted of one community, while OD3 might be separated into two different clusters, an upper and a lower one. This would align with the measured clustering coefficient.

A major difference between the three networks might be the centre of OD3: one student received and sent strong interactions. The thick strings indicate a higher than average interaction rate. Most of the nodes in OD1 had the same distance to each other, while at the lower part of the OD2 network, nodes were situated closer to each other, starting to form a cluster. However, the clustering coefficients of OD1 and OD2 were the same. The nodes of the OD3 network all had different distances to each other, which aligns with the measure of a lower network diameter and the slight forming of clusters. To summarize the previous findings, OD1 and OD2 share the same characteristics; only minor increases in OD2 could be found. OD1 and OD3, in contrast, show major differences due to an increase in the graph density and average degree of the whole network. In order to determine whether the average degree value of OD1, 3.2, is statistically significantly different from 3.7 or 4.8, the next chapter pays attention to the degree value on an ego-centric level.

5.3 T test ego-centric SNA measures

A network is the sum of its individual parts. This section explores in detail the differences of the three networks on an ego-centric level, conducting two paired t tests to compare the students' ego-centric SNA measures of OD1 and OD2, and of OD1 and OD3. Threadz was not available in OD1. In contrast to the previous chapter, which explored the measures of networks, the following measures are attributed to individual participants of those networks.

The null hypothesis stated that there was no statistically significant difference in the ego-centric SNA measures, H0: μ = 0. The alternative hypothesis stated that there was a change in the ego-centric SNA measures, H1: μ ≠ 0. The significance level was α = .05, as I conducted a two-tailed test. Based on the findings of the previous chapter, the assumption is that the test fails to reject the null hypothesis.
Table 5.3.1 Paired sample t-test (Student's t) comparing ego-centric SNA measures of OD1 and OD2

                                                       95% Confidence Interval
                            statistic   df     p       Lower     Upper
  indegree                  -1.524      17.0   0.146   -3.312     0.5344
  outdegree                 -0.652      17.0   0.523   -1.412     0.7454
  closeness centrality      -0.960      17.0   0.350   -0.183     0.0686
  betweenness centrality    -0.936      17.0   0.363   -23.245    8.9621
  eigen-centrality          -1.276      17.0   0.219   -0.218     0.0536
  degree                    -1.915      17.0   0.073   -3.620     0.1757

A test of normality (Shapiro-Wilk) was conducted. The assumption of normality is important for small samples; where it is violated, the data is not normally distributed, and results for those variables should be interpreted with caution or not at all (Gravetter & Wallnau, 2014). The indegree variable is the only one which violates the assumption of normality, which means that results related to the indegree value should be interpreted with caution or not at all.

The above-stated results show that there was no significant change in ego-centric SNA measures. We fail to reject the null hypothesis: students did not behave differently in OD2 compared to OD1 based on ego-centric SNA measures, even though the t test for degree (t(17) = -1.915, p = .073, CI [-3.620, 0.1757]) was close to indicating a significant finding.

The following t test analyzes the same phenomenon comparing OD1 and OD3. The null hypothesis stayed the same.

Table 5.3.2 Paired sample t-test (Student's t) comparing ego-centric SNA measures of OD1 and OD3

                                                       95% Confidence Interval
                            statistic   df     p       Lower     Upper
  indegree                  -2.18       17.0   0.043   -4.041    -0.07027
  outdegree                 -1.69       17.0   0.108   -1.622     0.17720
  closeness centrality      -2.01       17.0   0.061   -0.166     0.00417
  betweenness centrality     2.00       17.0   0.062   -0.593    22.52047
  eigen-centrality          -2.54       17.0   0.021   -0.292    -0.02691
  degree                    -3.09       17.0   0.007   -4.674    -0.88181

A test of normality (Shapiro-Wilk) was conducted. The indegree comparison is again the only one which violates the assumption of normality; this variable should be disregarded. The results of this t test indicate that I can reject the null hypothesis for the following ego-centric SNA measures: eigen-centrality and degree. The effect size for degree, Cohen's d, was large (d = 0.729). There was a significant increase in the average values of these measures. The increase of the degree value indicated that every student in the third online discussion had more interactions than in the first one. A rise in eigen-centrality shows that students had a wide-reaching influence and were connected to other students who were not isolated.

Figure 5.3.1 Plot showing the increase and variability of the degree measure of OD1 and OD3

To summarize, the students' behaviour in online discussions, measured by SNA, did not differ significantly from OD1 to OD2. The comparison of OD1 and OD3 reveals statistically significant differences in ego-centric network measures. Those findings answer the closing question of the previous chapter: the degree measure of OD3 is significantly different from that of OD1. With regard to RQ2, this might indicate that enabling Threadz changed the nature of network participation. The next paragraphs explore an aspect of this possibility by analyzing the Threadz-access counts of OD3, a correlation test and an ANOVA. The ANOVA compares ego-centric network measures of three groups of students, grouped by their Threadz-access counts.
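The paired comparisons reported above were run in jamovi; an equivalent check could be scripted with scipy, as sketched below. The two arrays are illustrative stand-ins for the 18 students' degree values, not the study's data:

```python
# A sketch of the paired t-test plus the Shapiro-Wilk normality check.
from scipy import stats

degree_od1 = [2, 4, 3, 5, 1, 6, 2, 3, 4, 2, 5, 3, 2, 4, 3, 1, 2, 4]
degree_od3 = [5, 6, 4, 8, 3, 9, 4, 5, 6, 3, 7, 5, 4, 6, 5, 2, 4, 6]

diffs = [b - a for a, b in zip(degree_od1, degree_od3)]
print(stats.shapiro(diffs))                      # normality of the paired differences

t, p = stats.ttest_rel(degree_od1, degree_od3)   # two-tailed by default
print(f"t = {t:.2f}, p = {p:.3f}")
```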
5.4 Threadz-access counts

The following table shows the descriptive statistics of the Threadz-access counts for OD3. On average, students visited Threadz almost 4 times within the week, and one student visited 17 times. The standard deviation is larger than the mean, indicating a spread-out dataset. The students demonstrated heterogeneous behaviour when it comes to Threadz visits.

Table 5.4.1 Descriptive statistics of Threadz-access of OD3

  N                     18
  Mean                  3.89
  Median                2.00
  Standard deviation    4.68
  Minimum               0
  Maximum               17
  Skewness              1.60
  Kurtosis              2.40
  33.33rd percentile    1.00
  66.67th percentile    4.00

The low skewness (<2) and kurtosis (<3) values allowed me to conduct a correlation test (Gravetter & Wallnau, 2014). Based on the percentile limits, I grouped the students into three categories for the ANOVA (see chapter 5.5).

5.4.1 Threadz-access correlating with ego-centric SNA measures

This study uses the Spearman's rho correlation, treating the Threadz-access counts as ordinal data.

Table 5.4.2 Correlating Threadz-access and ego-centric SNA measures of OD3

                                Closeness    Betweenness   Eigen-
  Threadz-access                centrality   centrality    centrality   indegree   outdegree   degree
  Spearman's rho                0.100        0.124         0.241        0.293      0.237       0.281
  p-value                       0.693        0.623         0.336        0.238      0.344       0.258

Correlating the Threadz visit count of every student with their ego-centric SNA measures did not reveal any statistically significant findings. Accordingly, the count of Threadz accesses did not correlate with the students' behaviour measured by SNA. To gain further insights, the next chapter describes an ANOVA and how I grouped the students based on their Threadz-access.

5.5 ANOVA – Threadz-access

Based on their Threadz-access counts of OD3, students were grouped into three frequency bins: no users (0 Threadz visits), moderate users (1-4 visits) and heavy users (>4 visits). Every group consisted of 6 out of 18 students. The null hypothesis stated that there is no difference in the ego-centric SNA measures among the three groups, H0: μ1 = μ2 = μ3. The alternative hypothesis stated that the ego-centric SNA measures differ in at least one of the groups.

Table 5.5.1 One-way ANOVA (Welch's) of ego-centric SNA measures of OD3, with students separated into 3 user groups (no Threadz-access, moderate, heavy)

                             F       df1   df2    p
  indegree                   0.783   2     8.51   0.487
  outdegree                  2.654   2     9.85   0.120
  closeness centrality       1.156   2     8.70   0.359
  betweenness centrality     1.628   2     9.73   0.245
  eigen-centrality           0.727   2     8.28   0.512
  degree                     1.760   2     8.18   0.231

A test of normality (Shapiro-Wilk) was conducted. Indegree and closeness centrality violate the assumption of normality. Welch's ANOVA is able to compare groups with unequal variances. The ANOVA failed to reject the null hypothesis. The results indicate that there was no significant difference in ego-centric SNA measures between the three different groups of Threadz users. This test demonstrates that the frequency of Threadz access does not seem to be related to ego-centric SNA metrics; the relationship cannot be phrased as "the higher the Threadz-access count, the higher the change in behaviour measured by SNA". The ANOVA compared three groups. One group, the no-users, is defined by a Threadz-access count of 0; six students fell into this category. The ANOVA results show that their network metrics do not differ significantly from those of students who used Threadz. These are important findings regarding RQ2 and its focus on the exposure of LA data: students who looked at Threadz and students who did not access the plugin at all still had quite similar ego-centric SNA metrics.
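Both tests could be reproduced outside jamovi; the sketch below shows Spearman's rho and a plain one-way ANOVA with scipy on invented counts. Note that scipy's f_oneway assumes equal variances, whereas jamovi's Welch variant (used above) does not, so a dedicated Welch implementation would be needed for an exact replication:

```python
# A sketch with illustrative values, not the study's data.
import numpy as np
from scipy import stats

access = np.array([0, 0, 0, 0, 0, 0, 1, 2, 2, 3, 3, 4, 5, 6, 8, 9, 12, 17])
degree = np.array([3, 5, 2, 4, 6, 3, 4, 5, 3, 6, 4, 5, 7, 4, 6, 5, 8, 6])

rho, p = stats.spearmanr(access, degree)   # access counts treated as ordinal
print(f"rho = {rho:.3f}, p = {p:.3f}")

# Frequency bins as above: none (0), moderate (1-4), heavy (>4 visits).
none     = degree[access == 0]
moderate = degree[(access >= 1) & (access <= 4)]
heavy    = degree[access > 4]
f, p = stats.f_oneway(none, moderate, heavy)   # classic ANOVA, not Welch's
print(f"F = {f:.2f}, p = {p:.3f}")
```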
5.6 Summary of SNA, t tests, correlations and ANOVA

The socio-centric SNA showed that OD1 and OD2 were similar, while the network of OD3 showed different characteristics than OD1. The t test of the ego-centric SNA measures indicated that OD3 is statistically significantly different from OD1. On an ego-centric level, taking the Threadz visits into account, the previous analysis demonstrated that the impact of Threadz, if there is one, cannot be expressed through a simple correlation or comparison of means. The absence of significant correlations and the ANOVA results might indicate that using Threadz more does not lead to more change in ego-centric SNA metrics. Based on those quantitative findings, RQ2 might be answered with no. However, one outlier data point in this small sample may have had a significant impact on the overall results. This outlier will be discussed in the next paragraphs.

5.7 The outlier

Visualizing the Threadz-access counts of OD2 and OD3 creates the following histogram:

Figure 5.7.1 Threadz-access counts for OD2 (black) and OD3 (blue)

The histogram shows that 4 out of 18 students did not access Threadz during OD2 and OD3, and that students accessed Threadz slightly more during OD2. Overall, the students accessed Threadz on average eight times in total across both discussions (Table 5.7.1). The outlier has a dramatic effect on the mean, and the median of 5 might be a more meaningful representation of the average Threadz-access behaviour. 4 students did not visit Threadz at all. One student accessed Threadz 35 times during OD2; this data point is an extreme outlier. Visual inspection of a distribution with the help of a histogram (Figure 5.7.1) or of cut-offs can reveal outliers. The histogram displays the outlier (Threadz-access = 35) and a positively skewed distribution. The distribution of OD2 had a standard deviation of 7.94. The distance between the mean and the outlier was nearly four times the standard deviation, which rendered this data point an extreme outlier (Dawson, 2011; Osborne & Overbay, 2004).

Table 5.7.1 Descriptive statistics of Threadz-access in total, for OD2 and OD3

                        Total   OD2    OD3
  N                     18      18     18
  Mean                  8.44    4.61   3.89
  Median                5.00    2.50   2.00
  Standard deviation    11.8    7.94   4.68
  Minimum               0       0      0
  Maximum               52      35     17
  Skewness              3.26    3.64   1.60

The skewness (3.64) in OD2 is caused by the outlier.
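The cut-off logic described above can be made explicit in a few lines. The sketch flags values by their distance from the mean in standard deviation units, using invented counts that merely mimic the shape of the OD2 distribution (one extreme value):

```python
# A sketch of a standard-deviation cut-off for extreme values.
import numpy as np

counts = np.array([0, 0, 0, 0, 1, 1, 2, 2, 2, 3, 3, 4, 4, 5, 6, 7, 8, 35])
mean, sd = counts.mean(), counts.std(ddof=1)    # sample standard deviation
print(f"mean {mean:.2f}, sd {sd:.2f}")

for value in counts:
    z = (value - mean) / sd
    if abs(z) > 3:    # a common cut-off; in samples this small, |z| near 4 is the ceiling
        print(f"{value} lies {z:.2f} standard deviations from the mean")
```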
"There is no standard rule or agreement about how to manage outliers that are a legitimate part of a data set." (Kovach & Ke, 2016, p. 206). On the one hand, extreme scores might represent legitimate findings and could possibly guide new research questions, in this case: why did Threadz work well for this particular student? On the other hand, this outlying data point might just be caused by a "novelty effect" (Sung, Yang, & Lee, 2017). Enriching the discussion about this outlier, a closer look at the student's qualitative data offered further insights into his motivation and attitudes. The student stated in the pre-survey:

  I enjoy it [online discussions] and consider it valuable, provided students participate. As with group work, and other practices that appear to allow limited oversight by instructors, online discussion can encourage 'free-riding'.

This student had a positive attitude towards online discussions and might be an avid student even without access to Threadz. In the post-survey the student stated:

  It's [Threadz] useless for me when we don't have adequate time to see everyone's post and comment on it.

The student's statement verified findings in the literature: Mansur, Yusof, & Othman (2011) investigated learner contributions in e-learning settings, showing that lack of time affects the amount of contributions in a network. Furthermore, the student's statement revealed that the learner was avidly using Threadz in order to observe his peers' posts. The student's critique was justified and caused by the research design of this study. In order to get a snapshot and track the evolution of the Canvas online discussions, students had a timeframe of one week to engage in each online discussion. After one week the online discussion closed and a new one opened. Apparently, the outlier felt motivated by Threadz to post and comment but was frustrated by the limiting research design; he was not able to submit posts after the deadline. This also explains why there is no extreme outlier in OD3 anymore: the outlier from OD2 got frustrated and stopped using Threadz. This observation might also indicate that some learners really found value in being able to compare their activity to others. Another interesting aspect of the outlier's characteristics was his high sense of belonging in the pre-survey (85 on a scale of 100). This is not surprising according to Suh (2005), who attests that popular students are more active in networks. Both findings add value to the inquiry of RQ3.

5.8 Gamifying learning analytics

I have to give credit to the audacity of this student cohort. As mentioned earlier, qualitative data critiqued the research design of this study, and exploring the content of OD3 revealed an interesting finding. The following posts originated from OD3:

Figure 5.8.1 Student's post in OD3

By design, there is no traditional ranking system in Threadz. However, the students used the relative size of the orange circles, the nodes, as a ranking indicator. Apparently, students abused the research design to gamify the online discussion: according to this post, they used Threadz as a ranking system and tried to win the game. Several studies confirm that LA has been "applied in educational context as an attempt to reduce motivational problems" (Klock, Ogawa, Gasparini, & Pimenta, 2018, p. 133). However, gamifying the online discussions was not the intent of this study. Assuming that some students also used the online discussion as a game, exposing LA data to students might have had an impact on their behaviour, because they would not have acted this way if they had had no access to their engagement data through the SNA. The reason why students started to treat the online discussions as a game remained unclear based on the qualitative data. One suggestion might be that they were experimenting with something novel in order to see how it works. The literature confirms that students taking part in gamified LA experiments have higher satisfaction, but there is no significant impact on engagement or interaction (Klock et al., 2018). Speaking of satisfaction, the following post shows that some students really enjoyed the game they invented:

Figure 5.8.2 Student's post in OD3

However, all those comments which were not related to the course topic itself were embedded in relevant course content posts, except the first comment. The students established a meta-level, where they talked about their engagement while the regular discussion went on.
Even if statistical instruments might not measure this change, one should regard the following quote: "Learning is not best judged by change in minds but by changing participation; in changing practices." (Lave, 1996, p. 162). Regarding the qualitative data drawn from discussion posts, exposure to LA data had an impact on some students and changed their participation and practice in this online discussion. Due to the exposure to SNA, students might have formed an "affinity group" (Gee, 2007, p. 203) based on a common endeavour, which was getting to the top of the Threadz ranking. Furthermore, students might have realized the bigger picture of their own role situated within a network of relationships. Even if students made fun of Threadz, using it as a toy, they became aware of their interrelationships with each other. The students also complimented each other for their achievements, as the following post shows:

Figure 5.8.3 Student's post in OD3

This post, which appeared at first glance like a simple cheer-on, might show that some students acquired data literacy skills: the student was able to read and interpret the SNA visualization. "To survive the postmodern adventure and the rapidly changing cybersociety, children need critical media and technoliteracy to become informed consumers of cyberaesthetics." (Chung, 2010, p. 67). Even if the students were gamifying LA, this contributed to their data literacy skills, which are crucial according to Sweeny. Overall, the students' behaviour might show that Threadz has the potential to turn this online discussion into a psychosocial moratorium (Schwartz, Alan, & Skhirtladze, 2006), which is defined as "a learning space in which the learner can take risks where real-world consequences are lowered." (Gee, 2007, p. 62). Those students who engaged in the online discussions probably would not have had the audacity to stand up in class and just talk for the sake of talking, and then announce that they just talked to get the highest talking-time ranking. This ties in to the idea of a real identity and a virtual identity (Shaffer, 2008). In this study, Threadz triggered a certain behaviour which usually occurs while playing video games, which is notable given the average student age of 37. Adult learners might respond differently to LA exposure than younger students. But age is not the only consideration; cultural factors and personal aversions towards technology might have played a role in this context (Cannon, 2018; National Academies of Sciences, Engineering, Medicine, 2018). The next paragraphs reveal further insights about the cultural background and personal attitudes of the students.

5.9 Survey item correlation

Researchers using an LMS in blended courses found that discussion posts and the amount of communication between peers were significantly correlated with student performance (Conijn, Snijders, Kleingeld, & Matzat, 2017). In this study, the main communication might not have happened through Canvas.
The literature indicates that research results investigating LA in blended courses can strongly vary (Conijn et al., 2017).Table 5.9.1 Correlating survey items with Threadz-access and wordcountOther social media platforms(Q8_5)Preference of in class discussion (Q8_2)OD 2 average wordcount Pearson's r -0.671 -0.579p-value 0.017 0.049OD 3 average wordcount Pearson's r -0.289 -0.594p-value 0.362 0.042Total Threadz access Pearson's r 0.105 -0.583p-value 0.746 0.047Comfort in cohort (Q7_7) Pearson's r 0.649 0.272p-value 0.022 0.392Students who expressed in the pre-survey that they “prefer in class discussions over online discussions” (Survey item Q 8_2) were more likely to have a lower word count in all online discussions than students who disagree with the statement (r(11) = -.806, p = .002). Students 62who used other social media channels to discuss about specific course content (survey item Q8_5) were more likely to have a lower average wordcount. Learners, who felt more comfortable in in-class discussions might took advantage of this in a blended course rather than focusing on online discussions. There was a negative correlation between the access of Threadz and the valuing of in-class discussion. The more students preferred in-class discussions the more likely they neglected Threadz (r(11) = .583, p = .047). Also interesting to note is the fact that there was a positive correlation (r(11) = .649, p= .022) between the survey item Q8_5: “I engage in other social media platforms with the cohort discussing course content.” and Q7_7: “I feel comfortable in my cohort.”. This suggests that online discussions might contributed to a sense of belonging and the overall well-being of students in a course. However, these discussions did not happen under surveillance in Canvas. Students used other channels to communicate, which might be one of the elements, which limited student contribution in online discussions (Khe et al., 2010). Unfortunately, those discussions happening outside of the institutional learningenvironment could not be measured and assessed through this research.5.10 Content analysis of open-ended survey itemsThe general opinion about online discussions in an educational context was mixed but with a positive tendency. 5 out of 12 reported that they benefited from online discussions and regard them as useful. 4 students opposed online discussions and felt it “removes the human aspect from discussion”. This is a recurring theme in the literature (Dawson, 2010) that educational technology might foster social isolation. All students, who opposed online discussions, also questioned the usability of Threadz while 3 out of 12 responses confirmed that 63Threadz had an impact on the way they engaged in online discussions. “Threadz stimulated me to contribute more to the discussion than I would have otherwise”, stated one learner. This confirmed findings of previous research (Wise, 2014), but in this study this experience seems to be the exception. 4 out of 12 students classified Threadz as a “distraction”. Asking the studentsabout their opinion on Threadz mirrored the findings of the previous paragraphs about gamification. 4 students reported that it “turned into a bit of a game to see who could get the larger connection ball.” Keeping in mind that this undergraduate course consisted of teacher candidates and working educators. The last open-ended question investigated if they would use Threadz in their own courses. 
4 students expressed their interest to utilize Threadz in their own courses as a teacher. 3 participants stated maybe, while the rest of 5 students would not try it.To summarize, the above-mentioned findings indicated that some students’ engagement in online discussions was influenced by the visualization of SNA data. The majority stated that Threadz had no influence on them. With regards to the third research question it can be summarized that the experience and perception of Threadz varied and depended on the general attitude towards online discussions in an educational context. Another finding indicated that the visualization of SNA data was perceived as distracting by some students. It is “important that analytics have demonstrable benefits for learners and educators and do not distract or mislead them.” (Ferguson et al., 2014, p. 120). However, while some students might enjoy the experience of Threadz, others might not. As always in education, there is no one-size-fits-all solution.64Chapter 6: DiscussionThis chapter discusses the findings of the data analysis and addresses the strengths and limitations of this study phrasing recommendations for future research. Overall the research design proved to be valuable offering detailed insights in order to answer most of the research questions. 6.1 Research question 1Does calling student’s attention to particular learning analytics data change their way of learning? This research question posed a challenge. The research design failed to asses a baseline of the students’ usual way of learning. The SNA indicated that the average participation of the learnerincreased gradually during the online discussions. One might reason that students started to dedicate more attention to the online discussion at the end than in the beginning of the one-month course. However, I could not assume that this had an impact on their learning. Literature highlights that accumulated empirical evidence supports the assumption that LA is able to support student learning (Huberth, Chen, Tritz, & McKay, 2015). If this was the case, in this study it could not be inferred solely relying on the employed instruments. The findings of qualitative data (chapter 5.10) did not support the hypothesis that in this case LA changed the students’ way of learning. Highlighting the following comment from an open-ended survey item, challenged the theoretical assumption of progressive education embedded in LA: 65I feel it [online discussions] removes the human aspect from discussion and [that is] not really what society needs these days.As stated previously (chapter 2), a social-romantic ideology values social interactions and community as a key element in education (Dewey, 1938; Vygotsky, 1978). By employing SNA LA subscribes to this principle and acknowledges the significance of social interactions in the context of learning. The marriage of SNA and LA was motivated by the urge to foster social learning and peer interactions among students in higher education institutes (Chen & Zhu, 2019). According to the quoted student, an online environment in general works against this idea of social learning, forging a strong argument for the need of LA to emphasize and illustrate the social component of it.  In regard to RQ1, this study fails to gather meaningful quantitative data to answer this question. Qualitative data offers only a vague and anecdotal image. 
This study can neither reject nor confirm the assumption that students change their way of learning due to exposure to LA data in this particular context.

6.2 Research question 2

Do students presented with SNA data on online course discussions adjust their engagement behavior?

Most of the gathered data informed an answer to this question. The correlation study (chapter 5.4), which took the students' Threadz-access and ego-centric SNA measures into account, demonstrated that there is no statistically significant correlation measured by the study's instruments: there existed no linear relationship between Threadz visits and behaviour measured by SNA. However, a statistically significant difference in the ego-centric SNA measures existed comparing OD1 and OD3. This research could also identify an outlier, who confirmed the assumption that Threadz-access might have an impact on engagement behaviour in online discussions. However, this did not apply to most of the students, which aligns with prior findings in the literature. According to Czerkawski (2016) and Greenhow & Askari (2017), online discussion environments come with a need to support learner participation and engagement from pedagogical and technological perspectives. Focusing on the fact that there existed an outlier in the network who attested that he was influenced by the display of Threadz (chapter 5.7), I might consider that this of course affected the overall network dynamics. The behaviour change of one member might have influenced other participants, too; that is the nature of a network. Following this line of thought renders the Threadz-access counts questionable as a cause of change. Even those members who did not visit Threadz were part of a network in which this LA tool was enabled, influencing the whole network, including those who did not access it. The SNA and the t-tests (chapter 5.2) showed that after enabling Threadz there was no significant difference between OD1 and OD2. This might indicate that students presented with social network analysis data on online course discussions did not adjust their engagement behaviour. However, an SNA and t-test between OD1 and OD3 revealed a significant increase in engagement measured by ego-centric SNA metrics. Considering the previously mentioned correlation study (chapter 5.4.1), this change might not be correlated with access to Threadz. Those are the findings focusing on quantitative analysis results. The content analysis of open-ended survey items and discussion posts revealed that the qualitative analysis results contradict the quantitative findings: while the correlation test and the ANOVA (chapter 5.5) did not reveal any statistically significant findings related to Threadz-access, at least five students stated in the online discussion forum and in the open-ended survey questions (chapter 5.10) that visiting Threadz influenced their engagement behavior.
However, the contradiction of qualitative and quantitative findings might enrich this research rather than diminish it. It emphasizes the importance of choosing the right instrument for the right measure. Summarizing the results with regard to RQ2, this study highlights that in this case the answer might be blurry: quantitative findings reveal that OD2 does not differ significantly from an online discussion without Threadz while OD3 does, and qualitative findings suggest that SNA data exposure had a direct impact on the engagement behaviour of some students.

6.3 Research question 3

What are students' perceptions of the utility of learning analytics tools?

This question relies solely on qualitative findings. The results of the content analysis of the discussion posts (chapter 5.8) indicated that the SNA data influenced the learners. Previous findings in the literature show that LA tools incorporating SNA have the ability "to support the values of student success, accountability, discussion engagement, and usability, but they can also press students to participate more actively and hereby hinder their autonomy as discussion participants" (Chen & Zhu, 2019, p. 4). Statements of the students (chapter 5.10) aligned with Chen and Zhu's premise. Some students in this study expressed their critique through the survey, stating that Threadz distracts and promotes discouraging behaviour such as bragging about SNA rating achievements, even though the design of Threadz does not come with an obvious rating system. This perception is also supported by the literature:

Being able to see all connections of a student at once can benefit the sense of community especially when a student sees herself well connected. However, it may also pose questions to a student's self-image if she is less connected with peers in the community. (Chen & Zhu, 2019, p. 5)

Some students perceived Threadz as beneficial; others avoided the tool. The research question can be answered: in this study, students' perceptions of the utility of LA tools, Threadz in particular, were mixed.

6.4 Limitations & future directions

The research design could be optimized according to the previous findings. More online discussions with at least 30 participating students should be observed in order to run reliable statistical tests (Gravetter & Wallnau, 2014); a rough power calculation at the end of this section illustrates why. Furthermore, online discussions should not be closed after one week, to allow long-lasting engagement. In addition, the Threadz access data would need to reveal which particular online discussion students looked at. The contradiction of qualitative and quantitative findings does not pose a limitation; further research might explore whether this is an inherent phenomenon of embedded mixed-method research.

The interrelationship of LA and social well-being (Cebrian & Vaquero, 2013) is another interesting topic, which was highlighted by qualitative contributions and seems to play a role in online discussions. Considering the willingness of students to participate in studies utilizing LA tools, the literature states that

the adoption of such a learning analytics tool is often decided by the instructor and its acceptance among students may vary. Therefore, students may be further divided into those who choose to use the tool and those who reject the tool for various reasons. (Chen & Zhu, 2019, p. 4)

This might be a reasonable step for further research.
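To illustrate the sample-size recommendation above, here is a rough power sketch using the pwr package in R; the package and the assumed medium effect size (d = 0.5) are my own illustrative choices, not procedures reported in this thesis.

# A rough power sketch: with n = 18 paired observations, only large effects
# are detectable at conventional power; d = 0.5 is an assumed medium effect.
library(pwr)

pwr.t.test(n = 18, d = 0.5, sig.level = 0.05, type = "paired")        # power ~ 0.5
pwr.t.test(d = 0.5, sig.level = 0.05, power = 0.80, type = "paired")  # n ~ 34

Read this way, the threshold of roughly 30 participants is less a magic number than the point at which medium effects become detectable with approximately 80% power.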
6.5 Conclusion

One of the problems of using the technological innovations in education is focusing on the exploitation of capabilities of technology rather than using it to meet the instructional need. (Chung, 2010, p. 64)

This quote embodies the main issue of this research study and the essence of my data analysis. Participating in online discussions might not have been pedagogically motivated in this case. This might be an issue of teaching and learning design, which LA highlights in some cases: LA may simply show what is happening in a course and emphasize the gap between reality and expectations. Simply switching on Threadz might not directly result in higher engagement. Learning technologies should serve a pedagogical purpose "beyond simply providing virtual spaces for bringing learners together" (Dawson, 2010, p. 736). The students of this cohort met every day, and the instructor's teaching style, following connectivism, contributed to a vivid exchange of knowledge in class. Discussions were facilitated and students had plenty of room to express their opinions. Accordingly, the online discussions of this particular course might not have been the right space in which to apply the instruments of this study. Thanks to the post-survey data, I have evidence that students discussed course-related content on other social media channels, which could not be monitored within the framework of this study. Canvas and the course online discussions were the only space where LA data could be gathered and exposed to students.

The previous observations and the discussion highlight that the results of this research study may not be suited to report significant findings that answer the initial research questions. "Network analysis methods should take into account the nature of MOOC discussion forums and partition social networks based on pedagogical purpose" (Wise et al., 2017, p. 9). On the one hand, one might argue that the phrasing of the assignment related to the intervention ("You will be graded using the following rubric on your original post, and at least two responses.") might have limited the nature of the online discussions. On the other hand, these are the pedagogical boundaries LA has to operate in on a daily basis.

Throughout the research I came across authors who claim that "one of the major subtasks of learning analytics is to determine the efficacy of a curriculum" (Gottipati & Shankararaman, 2014, p. 1). Furthermore, these scholars (Gottipati & Shankararaman, 2014; Singh, Singh, & Zerihun, 2018) argue that LA already has a strong impact on curriculum design and student performance in some cases, which aligns with Avella et al.'s (2016) review. Looking through the lens of curriculum theory, which was established in the first chapters of this thesis, this study demonstrated that LA can be seen as a form of assessment.

By design – or else by accident - the use of a learning analytics tool is always aligned with assessment regimes, which are in turn grounded in epistemological assumptions and pedagogical practices. (Knight & Buckingham Shum, 2017, p. 17)

Epistemological assumptions and pedagogical practices relate to educational ideologies, which were reviewed earlier. In particular, the incident in which some students (ab)used LA to develop a ranking system during the experiment illustrates that LA can function as assessment "by design – or else by accident".
Regarding LA as a form of assessment, LA relates directly to the concepts of objective and teaching according to the Tyler rationale (objective – teaching – assessment). This triangular relationship is also acknowledged by some LA scholars (Knight, Buckingham Shum, & Littleton, 2014, p. 25):

Figure 6.5.1 The Epistemology – Assessment – Pedagogy triad

The term "epistemology" relates to Tyler's thoughts on the objective, while "pedagogy" relates to teaching in this context. The triangle highlights the importance and impact of LA and its central role in curriculum design. This research study, with its Threadz intervention, focused on the meso- and micro-level of an LA application, but also tries to consider the relationship to ideology and epistemology. If LA is not regarded as a form of assessment, it is hard to argue how it can improve educational practices. Assessment is a crucial component of education and has shaped it throughout the last 70 years following the Tyler rationale (Farenga & Ness, 2015). "Learning analytics, as a new form of assessment instrument, have potential to support current educational practices, or to challenge them and reshape education; […]" (Knight & Buckingham Shum, 2017, p. 19). This claim and the potential of LA are based on the assumption that LA is a form of assessment. However, accepting that LA is assessment might depend on the definition and understanding of the term. As mentioned earlier, within the framework of this thesis assessment is defined as "a wide variety of methods or tools that educators use to evaluate, measure, and document the academic readiness, learning progress, skill acquisition, or educational needs of students" (Great Schools Partnership, 2015). It should be acknowledged that different definitions exist and that a debate about the difference between evaluation and assessment in an educational context lies beyond the scope of this research.

Another concluding thought relates to the research design, which uses SNA as an intervention (in the form of Threadz) and additionally as an analysis method (through Gephi; a sketch of such an export appears at the end of section 6.1). This research design embodies one remarkable feature of LA: "The new possibility is that educators and learners […] are for the first time able to see their own processes and progress rendered in ways that until now were the preserve of researchers outside the system." (Knight & Buckingham Shum, 2017, p. 18). Educators, students and researchers have access to, and can use, exactly the same kind of data at the same time. Some students took advantage of this during the study, as the anecdotal case of the student who complained about the closed online discussion forums showed. It is no longer the privilege of the researcher to use educational data. Educators and students become part of the analytic loop: "the system becomes reflexive (people change in response to the act of observation, and explicit feedback loops)" (Knight & Buckingham Shum, 2017, p. 18). However, this study also showed that merely enabling the data does not mean that all students will use it and benefit from it. The students' beliefs had an impact on their perception and use of the LA plugin. When students opposed Threadz because they did not see social value in online environments, they might still subscribe to a social-romantic ideology, which aligns with LA's appreciation of SNA. In conclusion, ideologies play a role in every educational decision-making context, even if they are not perceived on a conscious level. They are underlying forces shaping the educational landscape.
When the instructor enabled Threadz in the course, it was with the intent to support students, based on the assumption that LA brings benefits for the learner. And it did, for some of them. As mentioned earlier, the classroom consisted of students with heterogeneous characteristics. LA has the potential to serve as a form of effective feedback that encourages engagement, as some of the students demonstrated. By mining educational data from the LMS and visualizing the SNA, LA bridged the gap between a scientific-management approach and a social-romantic ideology: it mined educational data in order to promote progressivism.

Bibliography

Adler, M. J. (2004). The Paideia Proposal. In S. J. Thornton & D. J. Flinders (Eds.), The curriculum studies reader (2nd ed., pp. 183–186). Abington, UK: RoutledgeFalmer.
Amara, A., & Richards, D. (2013). Learning analytics in higher education: A summary of tools and approaches. In 30th ascilite conference (Vol. 30).
Auerbach, C. F., & Silverstein, L. B. (2003). Qualitative data: An introduction to coding and analysis. New York: New York University Press.
Avella, J. T., Kebritchi, M., Nunn, S. G., & Kanai, T. (2016). Learning analytics methods, benefits, and challenges in higher education: A systematic literature review. Journal of Interactive Online Learning, 20(2), 1–17.
Baker, R., Wang, Y., Paquette, L., Aleven, V., Popescu, O., Sewall, J., … Bergner, Y. (2016). Educational data mining: A MOOC experience. In S. ElAtia, D. Ipperciel, & O. Zaïane (Eds.), Data mining and learning analytics: Applications in educational research (1st ed.). Hoboken, New Jersey: John Wiley & Sons, Inc.
Bandura, A. (2006). Guide for constructing self-efficacy scales. In F. Pajares & T. S. Urdan (Eds.), Self-efficacy beliefs of adolescents (pp. 307–337). Greenwich, UK: Age Information Publishing.
Banihashem, S., Aliabadi, K., Ardakani, S., Delaver, A., & Ahmadabadi, M. (2018). A systematic literature review in learning analytics. Interdisciplinary Journal of Virtual Learning in Medical Sciences, 1(June). https://doi.org/10.5812/ijvlms.63024
Bastian, M., Heymann, S., & Jacomy, M. (2009). Gephi: An open source software for exploring and manipulating networks. In International AAAI Conference on Weblogs and Social Media.
Biesta, G. (2007). Why "what works" won't work: Evidence-based practice and the democratic deficit in educational research. Educational Theory, 57(1), 1–22. https://doi.org/10.1111/j.1741-5446.2006.00241.x
Black, E. W., Dawson, K., & Priem, J. (2008). Data for free: Using LMS activity logs to measure community in online courses. Internet and Higher Education, 11(2), 65–70. https://doi.org/10.1016/j.iheduc.2008.03.002
Blank, G. (2017). Online research methods and social theory. In N. Fielding, R. Lee, & G. Blank (Eds.), The SAGE handbook of online research methods. London: SAGE Publications. https://doi.org/10.4135/9781473957992
Bobbitt, F. (1924). How to make a curriculum. Boston: Houghton Mifflin.
Bobbitt, F. (2004). Scientific method in the making. In D. J. Flinders & S. J. Thornton (Eds.), The curriculum studies reader (2nd ed., pp. 9–16). Abington, UK: RoutledgeFalmer.
Bogarín, A., Romero, C., & Cerezo, R. (2015). Discovering students' navigation paths in Moodle. Proceedings of the 8th International Conference on Educational Data Mining, 556–557.
Booth, M. (2012). Learning analytics: The new black. EDUCAUSE Review, 47(4), 52–53.
Bovo, A., Sanchez, S., Heiguy, O., & Duthen, Y. (2013). Analysis of students clustering results based on Moodle log data. Proceedings of the 6th International Conference on Educational Data Mining, 306–307. Retrieved from http://www.educationaldatamining.org/EDM2013/papers/rn_paper_54.pdf
Brady, B., & O'Regan, C. (2009). Meeting the challenge of doing an RCT evaluation of youth mentoring in Ireland: A journey in mixed methods. Journal of Mixed Methods Research, 3(3), 265–280.
Bruner, J. S. (2006). In search of pedagogy: Volume I. New York: Routledge.
Buckingham Shum, S. (2012). Learning analytics policy brief. UNESCO. Retrieved from https://iite.unesco.org/pics/publications/en/files/3214711.pdf
Campbell, J. P., & Oblinger, D. G. (2007). Academic analytics. EDUCAUSE, October.
Cannon, M. (2018). Digital media in education: Teaching, learning and literacy practices with young learners. Springer International Publishing. https://doi.org/10.1007/978-3-319-78304-8
Carolan, B. V. (2014a). Social network analysis and education: Theory, methods & applications. Los Angeles: SAGE Publications Ltd.
Carolan, B. V. (2014b). Structural measures for complete networks. In Social network analysis and education: Theory, methods & applications (pp. 97–110). Thousand Oaks, CA: SAGE Publications. https://doi.org/10.4135/9781452270104.n5
Cebrian, M., & Vaquero, L. M. (2013). The rich club phenomenon in the classroom. Scientific Reports, 3, 1–8. https://doi.org/10.1038/srep01174
Cela, K., Sicilia, M., & Sanchez, S. (2014). Social network analysis in e-learning environments: A preliminary systematic review. Educational Psychology Review, 27(1), 219–246. https://doi.org/10.1007/s10648-014-9276-0
Charleer, S., Klerkx, J., & Duval, E. (2014). Learning dashboards. Journal of Learning Analytics, 1(3), 199–202. https://doi.org/10.18608/jla.2014.13.22
Charleer, S., Klerkx, J., Odriozola, S. L., & Duval, E. (2013). Improving awareness and reflection through collaborative, interactive visualizations of badges. In 3rd Workshop on Awareness and Reflection in Technology-Enhanced Learning. Retrieved from http://ceur-ws.org/Vol-1103/paper5.pdf
Chen, B., & Zhu, H. (2019). Towards value-sensitive learning analytics design. In LAK19 (pp. 2–5). https://doi.org/10.1145/3303772.3303798
Chung, S. K. (2010). Cyber media literacy art education. In R. W. Sweeny (Ed.), Inter/actions/inter/sections: Art education in a digital visual culture (pp. 63–71). Reston, VA: NAEA.
Conijn, R., Snijders, C., Kleingeld, A., & Matzat, U. (2017). Predicting student performance from LMS data. IEEE Transactions on Learning Technologies, 10(1), 17–29. https://doi.org/10.1109/TLT.2016.2616312
Cook, T. D., & Campbell, D. T. (1979). Quasi-experimentation: Design & analysis issues in field settings. Boston, MA: Houghton Mifflin.
Creswell, J. W., & Plano Clark, V. L. (2011). Designing and conducting mixed methods research (2nd ed.). Los Angeles: SAGE Publications.
Czerkawski, B. C. (2016). Networked learning: Design considerations for online instructors. Interactive Learning Environments, 24(8), 1856–1863. https://doi.org/10.1080/10494820.2015.1057744
Dawson, R. (2011). How significant is a boxplot outlier? Journal of Statistics Education, 19(2).
Dawson, S. (2010). "Seeing" the learning community: An exploration of the development of a resource for monitoring online student networking. British Journal of Educational Technology, 41(5), 736–752. https://doi.org/10.1111/j.1467-8535.2009.00970.x
Dawson, S., Burnett, B., & O'Donohue, M. (2006). Learning communities: An untapped sustainable competitive advantage for higher education. The International Journal of Educational Management, 20(2), 127–139.
De Laat, M., Lally, V., Lipponen, L., & Simons, R. (2007). Investigating patterns of interaction in networked learning and computer-supported collaborative learning: A role for social network analysis. International Journal of Computer-Supported Collaborative Learning, 2(1), 87–103.
Deacon, D., Bryman, A., & Fenton, N. (1998). Collision or collusion? A discussion and case study of the unplanned triangulation of quantitative and qualitative research methods. International Journal of Social Research Methodology, 1(1), 47–63.
Denley, T., & Oblinger, D. G. (2012). Austin Peay State University: Degree Compass - Game changers. EDUCAUSE - Education and Information Technologies.
Dewey, J. (1938). Experience and education. New York: Macmillan.
Dewey, J. (2004). My pedagogic creed. In S. J. Thornton & D. J. Flinders (Eds.), The curriculum studies reader (2nd ed.). Abington, UK: RoutledgeFalmer.
Doll Jr., W. E. (2004). The four R's: An alternative to the Tyler rationale. In D. J. Flinders & S. J. Thornton (Eds.), The curriculum studies reader (2nd ed.). Abington, UK: RoutledgeFalmer.
Dringus, L. P. (2011). Learning analytics considered harmful. Journal of Asynchronous Learning Networks, 16(3), 87–100.
Du, X., Yang, J., Shelton, B. E., Hung, J., & Zhang, M. (2019). A systematic meta-review and analysis of learning analytics research. Behaviour & Information Technology. https://doi.org/10.1080/0144929X.2019.1669712
Educause Learning Initiative. (2012). Seven things you should know about LMS alternatives. Retrieved from https://library.educause.edu/resources/2010/7/7-things-you-should-know-about-lms-alternatives
Edwards, C. H. (2011). Educational change: From traditional education to learning communities. Lanham, MD: Rowman & Littlefield Education.
Eisner, E. W. (1979). The educational imagination: On the design and evaluation of school programs. New York: Macmillan.
Elliott, J. (2005). Using narrative in social research. Thousand Oaks, CA: SAGE Publications.
Farenga, S. J., & Ness, D. (2015). Encyclopedia of education and human development. Taylor and Francis. https://doi.org/10.4324/9781315704760
Ferguson, R., Clow, D., Macfadyen, L., Essa, A., Dawson, S., & Alexander, S. (2014). Setting learning analytics in context: Overcoming the barriers to large-scale adoption. Journal of Learning Analytics, 1(3), 120–144. https://doi.org/10.1145/2567574.2567592
Fink, A. (2009). How to conduct surveys: A step-by-step guide. Los Angeles: SAGE Publications.
Flinders, D. J., & Thornton, S. J. (2004). The curriculum studies reader (2nd ed.). New York: RoutledgeFalmer.
Fritz, J. L. (2016). Using analytics to encourage student responsibility for learning and identify course designs that help. Doctoral dissertation, University of Maryland, Baltimore County. Retrieved from https://search.proquest.com/docview/1795528531
Fuchs, C. (2014). Social media as participatory culture. In Social media: A critical introduction (pp. 52–68). https://doi.org/10.4135/9781446270066.n3
Garbrick, A. H. (2018). Factors influencing student engagement in an online asynchronous discussion forum measured by quantity, quality, survey, and social network analysis. The Pennsylvania State University. Retrieved from https://search.proquest.com/docview/2081222040
Gay, L. R., Mills, G. E., & Airasian, P. (2012). Educational research: Competencies for analysis. New York: Pearson.
Gee, J. P. (2007). What video games have to teach us about learning and literacy (1st ed.). New York: Palgrave Macmillan.
Gottipati, S., & Shankararaman, V. (2014). Learning analytics applied to curriculum analysis. In 2014 AIS SIGED: IAIM Conference. Retrieved from http://aisel.aisnet.org/siged2014/2
Gravetter, F. J., & Wallnau, L. B. (2014). Essentials of statistics for the behavioral sciences (8th ed.). Belmont, CA: Wadsworth Cengage Learning.
Gray, J. (1998). False dawn: The delusions of global capitalism. London: Verso.
Great Schools Partnership. (2015). Assessment. Retrieved from https://www.edglossary.org/assessment/
Greenhow, C., & Askari, E. (2017). Learning and teaching with social network sites: A decade of research in K-12 related education. Education and Information Technologies, 22(2), 623–645. https://doi.org/10.1007/s10639-015-9446-9
Griffiths, D., Drachsler, H., Kickmeier-Rust, M., Steiner, C., Hoel, T., & Greller, W. (2016). Is privacy a show stopper for learning analytics? A review of current issues and solutions. Learning Analytics Community Exchange, 6.
Ha, L., & Yun, G. W. (2014). Digital divide in social media prosumption: Proclivity, production intensity, and prosumer typology among college students and general population. Journal of Communication and Media Research, 6(1), 45–62.
Hern, M. (1996). Deschooling our lives. Gabriola Island, BC: New Society Publishers.
Hernández-García, Á., González-González, I., Jiménez Zarco, A. I., & Chaparro-Peláez, J. (2016). Visualizations of online course interactions for social network learning analytics. International Journal of Emerging Technologies in Learning, 11(7), 6–15. https://doi.org/10.3991/ijet.v11i07.5889
Hopkins, M. (2017). A review of Social network analysis and education: Theory, methods, and applications. Journal of Educational and Behavioral Statistics, 42(5), 639–646. https://doi.org/10.3102/1076998617698111
Howard, R. (2015). Social network analysis and visualization. Cambridge Intelligence White Paper. Retrieved from https://cambridge-intelligence.com/social-network-analysis/
Huberth, M., Chen, P., Tritz, J., & McKay, T. A. (2015). Computer-tailored student support in introductory physics. PLoS One, 10(9). https://doi.org/10.1371/journal.pone.0137001
Instructional Technology Design and Development. (n.d.). Threadz [Computer software]. Cheney, WA: Eastern Washington University. Retrieved from https://threadz.ewu.edu/
International Educational Data Mining Society. (2011). Educational data mining. Retrieved from http://educationaldatamining.org/
Jamieson, S. (2004). Likert scales: How to (ab)use them. Medical Education, 38(12), 1217–1218. https://doi.org/10.1111/j.1365-2929.2004.02012.x
Johnson, A. (2019). Essential learning theories: Applications to authentic teaching situations. Lanham, MD: Rowman & Littlefield Publishers.
Johnson, L., Adams, S., & Cummins, M. (2012). The NMC horizon report: Higher education edition. Austin, TX: New Media Consortium.
Johnson, R. B., & Onwuegbuzie, A. J. (2004). Mixed methods research: A research paradigm whose time has come. Educational Researcher, 33(7), 14–26. https://doi.org/10.3102/0013189X033007014
Jones, J. (2019). Canvancement.
Kero, P., & Lee, D. (2015). Slider scales and web-based surveys: A cautionary note. Journal of Research Practice, 11(1), 1–5.
Khe, F. H., Wing, S. C., & Connie, S. L. N. (2010). Student contribution in asynchronous online discussion: A review of the research and empirical exploration. Instructional Science, 38(6), 517–606. Retrieved from https://www.jstor.org/stable/23372901
Kliebard, H. M. (2002). Changing course: American curriculum reform in the 20th century. New York: Teachers College Press.
Kliebard, H. M. (2004). The rise of scientific curriculum making and its aftermath. In D. J. Flinders & S. J. Thornton (Eds.), The curriculum studies reader (2nd ed.).
Klock, A. C. T., Ogawa, A. N., Gasparini, I., & Pimenta, M. S. (2018). Integration of learning analytics techniques and gamification: An experimental study. Proceedings - IEEE 18th International Conference on Advanced Learning Technologies, ICALT 2018, 133–137. https://doi.org/10.1109/ICALT.2018.00039
Knight, S., & Buckingham Shum, S. (2017). Theory and learning analytics. In C. Lang, G. Siemens, A. Wise, & D. Gasevic (Eds.), Handbook of learning analytics (1st ed., pp. 17–22). SOLAR. https://doi.org/10.18608/hla17.001
Knight, S., Buckingham Shum, S., & Littleton, K. (2014). Epistemology, assessment, pedagogy: Where learning meets analytics in the middle space. Journal of Learning Analytics, 1(2). Retrieved from https://epress.lib.uts.edu.au/journals/index.php/JLA/article/view/3538
Knox, D. (2010). Spies in the house of learning: A typology of surveillance in online learning environments. Retrieved from http://www.mun.ca/edge2010/wp-content/uploads/Knox-Dan-Spies-In-the-House.pdf
Kovach, C., & Ke, W. (2016). Handling those pesky statistical outliers. Research in Gerontological Nursing, 9(5), 206–207. https://doi.org/10.3928/19404921-20160902-01
Lave, J. (1996). Teaching as learning in practice. Mind, Culture and Activity, 3(3), 149–163. Retrieved from https://people.ucsc.edu/~gwells/Files/Courses_Folder/documents/lave1996_teaching.pdf
Laven, H. (2012). Self-reported data may be unreliable. Journal of Primary Health Care, 350.
Lewis, M. (2015). LMS data format. Retrieved April 24, 2020, from https://github.com/mcjelewis/threadz/blob/master/LMS-data-format.txt#L3
Lewis, M. (2016). Visualize discussions threadz. Retrieved April 24, 2020, from https://community.canvaslms.com/groups/admins/blog/2016/01/07/visualize-discussions-threadz
Liu, Y., Gong, L., & Zhao, W. (2018). Using learning analytics to promote student engagement and achievement in blended learning: An empirical study. ACM International Conference Proceeding Series, 19–24. https://doi.org/10.1145/3241748.3241760
Macfadyen, L. P., & Dawson, S. (2010). Mining LMS data to develop an "early warning system" for educators: A proof of concept. Computers & Education, 54(2), 588–599. https://doi.org/10.1016/j.compedu.2009.09.008
Macfadyen, L. P., & Dawson, S. (2012). Numbers are not enough. Why e-learning analytics failed to inform an institutional strategic plan. Journal of Educational Technology & Society, 15(3), 149–163. Retrieved from http://www.jstor.org/stable/jeductechsoci.15.3.149
Mansur, A. B., Yusof, N., & Othman, M. S. (2011). Analysis of social learning network for wiki in Moodle e-learning. In 4th International Conference on Interaction Sciences (ICIS) (pp. 1–4).
Mathison, S. (2004). A short history of educational assessment and standards-based educational reform. In S. Mathison & E. W. Ross (Eds.), Defending public schools: The nature and limits of standards-based reform and assessment. Westport, CT: Praeger Publishers.
McKernan, J. (2007). Curriculum and imagination: Process theory, pedagogy and action research. London, UK: Routledge.
Minaei-Bidgoli, B., & Punch, W. (2003). Using genetic algorithms for data mining optimization in an educational web-based system. In Genetic and Evolutionary Computation Conference (pp. 2252–2263). Chicago, USA.
Mitchell, J. C. (1974). Social networks. Annual Review of Anthropology, 3, 279–299.
National Academies of Sciences, Engineering, and Medicine. (2018). How people learn II: Learners, contexts, and cultures. Washington, DC: The National Academies Press.
Newman, M. (2010). Networks: An introduction. Oxford: Oxford University Press.
Osborne, J. W., & Overbay, A. (2004). The power of outliers (and why researchers should always check for them). Practical Assessment, Research & Evaluation, 9(6).
Pardo, A., Han, F., & Ellis, R. A. (2016). Exploring the relation between self-regulation, online activities, and academic performance: A case study. In Sixth International Conference on Learning Analytics & Knowledge.
Park, Y., & Jo, I.-H. (2015). Development of the learning analytics dashboard to support students' learning performance. Journal of Universal Computer Science, 21(1), 110–133.
Pashler, H., McDaniel, M., Rohrer, D., & Bjork, R. (2008). Learning styles: Concepts and evidence. Psychological Science in the Public Interest, 9(3), 105–119. Retrieved from www.jstor.org/stable/20697325
Pate, A., Lafitte, E. M., Ramachandran, S., & Caldwell, D. J. (2019). The use of exam wrappers to promote metacognition. Currents in Pharmacy Teaching and Learning, 11(5), 492–498. https://doi.org/10.1016/j.cptl.2019.02.008
Petrina, S. (2004). The politics of curriculum and instructional design/theory/form: Critical problems, projects, units, and modules. Interchange, 35(1), 81–126. https://doi.org/10.1023/B:INCH.0000039022.53130.d5
Popham, W. J. (1969). Instructional objectives. Chicago: Rand McNally.
Poulos, A., & Mahony, M. J. (2008). Effectiveness of feedback: The students' perspective. Assessment & Evaluation in Higher Education, 33(2). https://doi.org/10.1080/02602930601127869
Prinsloo, P., Slade, S., & Galpin, F. (2012). Learning analytics: Challenges, paradoxes and opportunities for mega open distance learning institutions. 2nd International Conference on Learning Analytics and Knowledge, 130–133. https://doi.org/10.1145/2330601.2330635
R Core Team. (2018). R: A language and environment for statistical computing. Retrieved from cran.r-project.org/
Romero, C., Espejo, P., Zafra, A., Romero, J. R., & Ventura, S. (2013). Web usage mining for predicting final marks of students that use Moodle courses. Computer Applications in Engineering Education, 21(1), 135–146. https://doi.org/10.1002/cae.20456
Romero, C., & Ventura, S. (2007). Educational data mining: A survey from 1995 to 2005. Expert Systems with Applications, 33, 135–146.
Romero, C., Ventura, S., & García, E. (2008). Data mining in course management systems: Moodle case study and tutorial. Computers and Education, 51(1), 368–384. https://doi.org/10.1016/j.compedu.2007.05.016
Ross, E. W., & Vinson, K. D. (2013). Resisting neoliberal education reform: Insurrectionist pedagogies and the pursuit of dangerous citizenship. Cultural Logic: Marxist Theory & Practice, 17–45.
Schepis, D. (2011). Social network analysis from a qualitative perspective. In Australian and New Zealand Marketing Academy Conference (pp. 1–7). Perth: University of Western Australia.
Schunk, D. H., & Zimmerman, B. J. (2012). Self-regulation and learning. In W. M. Reynolds & G. E. Miller (Eds.), Handbook of psychology - Educational psychology (Vol. 7, pp. 45–68). Hoboken, NJ: John Wiley & Sons.
Schwartz, S. J., Alan, M., & Skhirtladze, N. (2006). Psychosocial moratorium. In M. H. Bornstein (Ed.), The SAGE encyclopedia of lifespan human development. https://doi.org/10.4135/9781506307633.n662
Sclater, N. (2017). Learning analytics explained. New York: Routledge. https://doi.org/10.4324/9781315679563
Sclater, N., Peasgood, A., & Mullan, J. (2016). Learning analytics in higher education: A review of UK and international practice. JISC. Retrieved from jisc.ac.uk
Scott, J., & Carrington, P. J. (2011). The SAGE handbook of social network analysis. London: SAGE Publications.
Shaffer, D. W. (2008). How computer games help children learn. New York: Palgrave Macmillan.
Sie, R. L. L., Ullmann, T. D., Rajagopal, K., Cela, K., Bitter-Rijpkema, M., & Sloep, P. B. (2012). Social network analysis for technology-enhanced learning: Review and future directions. International Journal of Technology Enhanced Learning, 4(3–4), 172–190. https://doi.org/10.1504/IJTEL.2012.051582
Siemens, G. (2008). Learning and knowing in networks: Changing roles for educators and designers. ITFORUM for Discussion, 27, 1–26.
Siemens, G. (2013). Learning analytics: The emergence of a discipline. American Behavioral Scientist, 57(10), 1380–1400. https://doi.org/10.1177/0002764213498851
Siemens, G., & Baker, R. S. J. D. (2012). Learning analytics and educational data mining: Towards communication and collaboration. In 2nd International Conference on Learning Analytics and Knowledge (pp. 252–254).
Singh, M. K., Singh, N., & Zerihun, Z. (2018). Impact of learning analytics on curriculum design and student performance. Hershey, PA: Information Science Reference. https://doi.org/10.4018/978-1-5225-5369-4
Slade, S., & Prinsloo, P. (2013). Learning analytics: Ethical issues and dilemmas. American Behavioral Scientist, 57(10), 1510–1529. https://doi.org/10.1177/0002764213479366
Slaninoa, K., Martinovic, J., Drazdilova, P., & Snasel, V. (2014). From Moodle log file to the students network. Advances in Intelligent Systems and Computing, 239. https://doi.org/10.1007/978-3-319-01854-6
Smith, D. G. (2003). Curriculum and teaching face globalization. In International handbook of curriculum research (pp. 44–60). New York: Routledge.
SOLAR. (2011). What is learning analytics? Retrieved from https://www.solaresearch.org/about/what-is-learning-analytics/
Strauss, A. L., & Corbin, J. M. (1990). Basics of qualitative research: Grounded theory procedures and techniques. Thousand Oaks: SAGE Publications.
Suh, H. (2005). Identifying peer interaction patterns and related variables in community-based learning. In Computer support for collaborative learning: The next 10 years! (p. 657).
Sung, Y., Yang, J., & Lee, H. (2017). The effects of mobile-computer-supported collaborative learning: Meta-analysis and critical synthesis. Review of Educational Research, 87(4), 768–805. https://doi.org/10.3102/0034654317704307
Surbhi, S. (2017). Difference between assessment and evaluation. Retrieved from https://keydifferences.com/difference-between-assessment-and-evaluation.html
Tabuenca, B., Kalz, M., Drachsler, H., & Specht, M. (2015). Time will tell: The role of mobile learning analytics in self-regulated learning. Computers and Education. https://doi.org/10.1016/j.compedu.2015.08.004
Tamim, R. M., Bernard, R. M., Borokhovski, E., Abrami, P. C., Schmid, R. F., & Mohammed, H. Bin. (2019). What forty years of research says about the impact of technology on learning: A second-order meta-analysis and validation study. Review of Educational Research, 81(1), 4–28. https://doi.org/10.3102/0034654310393361
Teitelbaum, K. (1991). Class, curriculum, and the writing of history - a rejoinder. Curriculum Inquiry, 21(2), 223–230. https://doi.org/10.1080/03626784.1991.11075365
The jamovi project. (2019). jamovi. Retrieved from www.jamovi.org
Tyler, R. W. (2004). Basic principles of curriculum and instruction. In D. J. Flinders & S. J. Thornton (Eds.), The curriculum studies reader (2nd ed.). Abington, UK: RoutledgeFalmer.
Vygotsky, L. S. (1978). Mind in society: The development of higher psychological processes. Cambridge, MA: Harvard University Press.
Wasserman, S., & Faust, K. (1994). Social network analysis. Cambridge: Cambridge University Press.
Wells, C. (2015). Finding the center as things fly apart: Vocation and the common good. In D. Cunningham (Ed.), At this time and in this place: Vocation and higher education (pp. 583–605). New York: Oxford University Press. https://doi.org/10.1093/acprof
Whitelock-Wainwright, A., Gašević, D., Tejeiro, R., Tsai, Y. S., & Bennett, K. (2019). The student expectations of learning analytics questionnaire. Journal of Computer Assisted Learning, 35(5), 633–666. https://doi.org/10.1111/jcal.12366
Wise, A. F. (2014). Designing pedagogical interventions to support student use of learning analytics, 203–211. https://doi.org/10.1145/2567574.2567588
Wise, A. F., Cui, Y., & Jin, W. Q. (2017). Honing in on social learning networks in MOOC forums, 383–392. https://doi.org/10.1145/3027385.3027446
Xie, H., Chu, H. C., Hwang, G. J., & Wang, C. C. (2019). Trends and development in technology-enhanced adaptive/personalized learning: A systematic review of journal publications from 2007 to 2017. Computers and Education. https://doi.org/10.1016/j.compedu.2019.103599
Zemsky, R., & Massy, W. F. (2004). What happened to e-learning and why. Retrieved from http://www.irhe.upenn.edu/WeatherStation.html
Zimmerman, B. J., Bandura, A., & Martinez-Pons, M. (1992). Self-motivation for academic attainment: The role of self-efficacy beliefs and personal goal setting. American Educational Research Journal, 29(3), 663–676.

Appendices

Appendix A  Survey

Threadz – Pre-Survey

Start of Block: Demographics
Q1 Thanks for taking part in this study. Completing the survey will be taken as consent. At no time of this course will your instructor have access to the data. Your grades won't be affected. The full consent form can be found here: Consent form
Q2 Please enter your student number:
Q3 Please enter your full name:
Q4 Please enter your birth year:
Q5 What gender do you identify with?
• Woman (1)
• Man (2)
• Non-binary person (3)
• Prefer not to answer (4)
Q6 Is English your native language?
• Yes (1)
• No (2)
End of Block: Demographics

Start of Block: General behavior/Self perception
Q7 Please answer the following statements about you using the slider to indicate your degree of agreement (slider from 0 = not at all, through somewhat, to 100 = a lot).
• I use social media apps.
• I engage in discussions in class in person.
• I feel proficient speaking English.
• I feel proficient writing in English.
• I feel confident about my discussion skills.
• I feel comfortable in my cohort.
End of Block: General behavior/Self perception

Start of Block: Online discussions
Q8 Please answer the following statements about canvas online discussions. (Response options: Strongly disagree (1), Disagree (2), Somewhat disagree (3), Somewhat agree (4), Agree (5), Strongly agree (6), Prefer not to answer (7).)
• I perceive the canvas online discussions as clearly laid out. (1)
• I prefer in class discussions over online discussions. (3)
• I believe I received more replies than I wrote. (4)
• I noticed peers, who haven't received any replies. (5)
• I engage in other social media platforms with the cohort discussing course content. (7)
Q22 Nobody replied to my posts in the online discussions.
• Yes (1)
• No (2)
End of Block: Online discussions

Start of Block: Learning Analytics Expectations
Q9 The last two online discussions will enable Threadz. Threadz is a learning analytics tool, which visualizes online discussions using social network analysis. It will look as followed: [screenshot of a Threadz visualization omitted] Based on this short first impression and your general assumption about learning analytics, we are interested in your expectations in general.
Q10 I expect the learning analytics tool to (slider from 0 = strongly disagree, through neither agree nor disagree, to 100 = strongly agree)
• encourage my peers to support one another.
• be in an easy read format.
• make me aware of the university's ability to monitor my actions as a result of collecting my data.
• be meaningful and accessible for me.
• be used to build better relationships between myself and my peers/teaching staff.
• monitor my own engagement behavior.
• encourage me to adjust my own engagement behavior upon the feedback provided.
End of Block: Learning Analytics Expectations

Start of Block: Open 2
Q11 What's your opinion on online discussion in an educational context?
End of Block

Threadz – Post-Survey

Start of Block: Demographics
Q30 Thanks for taking part in this study. Completing the survey will be taken as consent. At no time of this course will your instructor have access to the data. Your grades won't be affected. The full consent form can be found here: Consent form
Q3 Please enter your full name:
Q2 Please enter your student number:
End of Block: Demographics

Start of Block: Online discussions
Q8_Post Please answer the following statements about canvas online discussions. (Response options: Strongly disagree (1), Disagree (2), Somewhat disagree (3), Somewhat agree (4), Agree (5), Strongly agree (6), Prefer not to answer (7).)
• I perceive the canvas online discussions as clearly laid out. (1)
• I prefer in class discussions over online discussions. (3)
• I believe I received more replies than I wrote. (4)
• I noticed peers, who haven't received any replies. (5)
• I engage in other social media platforms with the cohort discussing course content. (7)
Q12_Post Nobody replied to my posts in the online discussion.
• Yes (1)
• No (2)
End of Block: Online discussions

Start of Block: Learning Analytics Expectations
Q10_Post After using Threadz let us know about your experience. The learning analytics tool (slider from 0 = strongly disagree, through neither agree nor disagree, to 100 = strongly agree)
• encourages my peers to support one another.
• is in an easy read format.
• makes me aware of the university's ability to monitor my actions as a result of collecting my data.
• is meaningful and accessible for me.
• is used to build better relationships between myself and my peers/teaching staff.
• monitors my own engagement behavior.
• encourages me to adjust my own engagement behavior upon the feedback provided.
End of Block: Learning Analytics Expectations

Start of Block: Threadz
Q5 Please answer the following statements about Threadz. (Response options: Strongly disagree (1), Disagree (2), Somewhat disagree (3), Somewhat agree (4), Agree (5), Strongly agree (6), Prefer not to answer (7).)
• I barely used Threadz. (1)
• Threadz changed the way I perceived canvas online discussions. (2)
• I do not understand what Threadz is doing. (3)
• Threadz helped me to engage in discussions. (4)
End of Block: Threadz

Start of Block: Open 2
Q6 Did calling your attention to particular learning analytics data (in this case Threadz data) change your way of learning?
Q7 What's your opinion on Threadz?
Q9 From a teacher's perspective, would you use Threadz in your class?

Appendix B  Course syllabus

EDUC COURSE DESCRIPTION
Inquiry is understood as a deliberate, sustained and systematic process—beyond the everyday reflection required in teaching. Professionals explore what they do and how they do it; it involves sharing one's inquiries with colleagues. It involves classroom teachers, individually and collectively, in a cycle of action, reflection, sharing and adaptation. Teachers are given opportunities for practice, and to address challenges and issues that arise through discussion and reflection, try out new or revised practices, and evaluate the results. The cycle then begins anew based on the outcomes, responses, and possibilities emerging from the inquiry.

EDUC: Inquiry Seminars
The inquiry process across the BEd (Secondary) program consists of:
1. Teacher inquiry & support, preparation towards project
2. Refining and sharing the inquiry project; links to practice
3. Reflecting, links to practice, ongoing questions and learning over the year

LEARNING OUTCOMES
Upon completion of this course, teacher candidates will have:
• A thorough knowledge of the BC curriculum; specifically, the Applied Design, Skills, and Technologies curriculum;
• Knowledge and skills necessary to preparing and planning curriculum (i.e., unit + lesson plans for the Practicum); and
• Knowledge and skills relevant to teacher inquiry (i.e., resolution of an inquiry project).

COURSE SCHEDULE

WEEK 1
Jan 6 Monday – Representation of Process
Topic: Philosophy of Teaching and Technology
Guiding Questions: What is this course about? What are the emphases?
Activities: → Introduce syllabus, assignments → Review inquiry process → ePortfolio setup

Jan 7 Tuesday – Links to Practice
Topic: Philosophy of Teaching and Technology
Guiding Questions: What is your personal teaching philosophy? What is your philosophy of technology?
Online Mini Inquiry Topic 1 OPEN

Jan 8 Wednesday – Practicum Planning and Inquiry Project
Topic: Independent Inquiry
Guiding Questions: How does my inquiry relate to the new BC curriculum? What are the emphases of my inquiry?
Activities: → Philosophy of teaching cont'd → Online mini inquiry cont'd → ePortfolio cont'd

Jan 9 Thursday – Consultation / Collaboration Sessions
Topic: Philosophy of Teaching and Technology
Guiding Questions: What resources are available for unit and lesson planning in technology education? How can we support each other as a community of technology educators?
Activities: → APA citation workshop → Philosophy of teaching cont'd → Online mini inquiry cont'd → ePortfolio cont'd

Online Mini Inquiry Topic 1 DUE (January 10)
