Teaching and Learning in Hard and Soft Disciplines at North American Universities

Ahmed Osama (a) and Lesley Andres (b)

(a) PhD Student, Department of Civil Engineering, University of British Columbia
(b) Professor, Department of Educational Studies, University of British Columbia

In this study we employed the National Survey of Student Engagement (NSSE) and the Faculty Survey of Student Engagement (FSSE) from various universities in the United States and Canada to investigate teaching and learning practices in hard and soft disciplines. Using seven years of quantitative data extracted from the surveys, we assessed the trends of different teaching practices and learning outcomes in hard and soft disciplines. We also examined the association between the studied teaching practices and the resulting learning outcomes, and performed regression analysis to develop a model for the deep learning index. Results demonstrated that faculty in hard disciplines were more dependent on teacher-centered approaches than those in soft disciplines; the latter relied mainly on student-centered approaches. Moreover, student-centered approaches to teaching were found to have a positive impact on the deep learning subscales, while teacher-centered approaches affected those subscales negatively.

Keywords: Deep Learning, Hard and Soft Disciplines, Teacher-Centered Approaches, Student-Centered Approaches, National and Faculty Surveys of Student Engagement.

Corresponding author: Department of Civil Engineering, University of British Columbia, Vancouver, BC, Canada V6T 1Z4. Email: ahmed.osama@civil.ubc.ca.

Introduction

Teaching and learning are not mutually exclusive processes; rather, teaching approaches are associated with the learning outcomes of students (Kember 1997). Over the past three decades, research on students' and teachers' perceptions concerning teaching and learning has become one of the most important areas of educational research. This area of inquiry commenced in the 1970s by examining students' pre-instructional conceptions of various science contents, with the goal of improving the learning process. By studying university teaching and learning, the following principles were developed to improve undergraduate education: "(1) encouraging contacts between students and faculty; (2) developing reciprocity and cooperation among students; (3) using active learning techniques; (4) giving prompt feedback; (5) emphasizing time on task; (6) communicating high expectations; and (7) respecting diverse talents and ways of learning" (Chickering and Gamson 1987). Investigations of students' understandings across most science domains are also documented in a bibliography by Duit (2009). It is plausible to assume that if a teacher's focus is on what he or she does, or on transmitting knowledge, students are more likely to adopt a surface approach to learning with an emphasis on consuming knowledge. Alternatively, if a teacher embraces student-centered approaches to teaching, students are more likely to adopt a deep approach to learning and focus on a more profound comprehension of the topics that they are studying (Entwistle, Skinner, Entwistle, and Orr 2000). The study at hand paves the way towards affirming such conclusions through statistical analysis of the relationships between deep learning subscales and adopted teaching approaches in different university disciplines.
Using data from yearly student and faculty engagement surveys administered at different institutions in the United States and Canada, we examined four main dimensions. Three assess teaching quality: the percentage of in-class time devoted to different teaching activities (lecturing, experiential activities, and other activities), the time during the week that faculty devote to teaching activities relative to research and advising activities, and the time spent by students preparing for class. The fourth assesses student learning using a deep learning index and its subscales: reflective learning, integrative learning, and higher-order learning. The results reveal the relationship between teaching practices and learning outcomes across university disciplines. Unpaired t-tests were conducted to explore the contrasts between hard and soft disciplines in relation to teaching and learning practices. In addition, correlation analyses were performed to measure the association between different teaching components and learning outcomes. Regression models were then developed to predict deep learning indices using the teaching variables specified above.

Background

Research on the teaching conceptions of university teachers reveals a wide range of variation in teaching approaches that are directly linked to student learning (Postareff, Lindblom-Ylänne, and Nevgi 2007; Kember and Kwan 2002). These approaches vary from teaching as presenting structured knowledge to teaching as facilitating understanding and stimulating conceptual and intellectual progress. They can be grouped into two main categories: teacher-centered approaches and student-centered approaches. Teachers who consider teaching as transmitting knowledge are more likely to adopt a teacher-centered approach, while those who conceive of teaching as facilitating learning tend to use student-centered approaches.

In the teacher-centered approach, transmitted knowledge is mainly constructed by the teacher (Postareff, Lindblom-Ylänne, and Nevgi 2007), who considers students to be passive recipients of information and does not take their existing knowledge into account. In this approach, learning outcomes are expressed in quantitative rather than qualitative terms, without giving enough credit to students' comprehension of knowledge. Moreover, teachers may try to make learning easier for students by organizing their teaching thoroughly and structuring knowledge in a way that is easier to remember (Kember and Kwan 2002; Prosser, Trigwell, and Taylor 1994). Lindblom-Ylänne and Nevgi (2003) conducted several interviews with faculty members to describe teacher-centered approaches to teaching. Interviewees believed that they were not experienced teachers, which was why they preferred giving mass lectures to teaching small groups. They also found it difficult to activate students and draw them into discussions, and felt they lacked the tools to do so; hence, they relied more on transmitting knowledge through a teacher-centered approach. Some also mentioned that they did not like teaching very much but had to do it to work in a university. Student-centered approaches to teaching, in contrast, focus more on students and their learning than on teachers and their teaching.
Transmission of knowledge may be a component, but not an end in itself, as teaching becomes more interactive and attentive to students' existing conceptions (Postareff, Lindblom-Ylänne, and Nevgi 2007). A student-centered teacher tries to identify the diverse needs of students as a first step in planning the course (Biggs 1999; Samuelowicz and Bain 2001). In this approach, teaching is about facilitating student learning: students are encouraged to construct their own comprehension and to work towards becoming independent learners. Student-centered teachers were found to use a broader range of teaching methods than teachers who adopted a teacher-centered approach (Coffey and Gibbs 2002). In a study by Lindblom-Ylänne and Nevgi (2003), faculty interviewees said that they adopted a student-centered approach because they realized the importance of placing students at the centre of the learning process. Accordingly, these faculty members started thinking about how best to teach students and how students experience different learning situations, and came to see their job as facilitating student learning rather than simply standing in front of the class and delivering information.

There are different views regarding whether student-centered and teacher-centered approaches are completely distinct. One view holds that a student-centered teacher might sometimes use teacher-centered features depending on the teaching context, while others maintain that teacher-centered approaches cannot be combined with student-centered teaching elements (Åkerlind 2003). Shifts from teacher-centered to student-centered orientations are possible (Samuelowicz and Bain 2001), although some claim that enormous effort is needed to change underlying beliefs (Kember 1997). Lueddeke (2003) demonstrated that teachers from hard disciplines (e.g., engineering, biology, and physics) were more likely to adopt an information transmission/teacher-focused (ITTF) approach to teaching, while teachers from soft disciplines (e.g., sociology, arts, and education) took a more conceptual change/student-focused (CCSF) approach. Lindblom-Ylanne, Trigwell, Nevgi, and Ashwin (2004) reported similar findings. More specifically, they determined that teachers from pure hard sciences (e.g., chemistry) scored significantly lower on the CCSF scales than teachers representing pure soft sciences (e.g., history) and applied soft sciences (e.g., education). Moreover, they found that teachers from applied hard sciences (e.g., engineering) scored significantly higher on the ITTF scale than teachers from pure and applied soft disciplines.

Improvements in learning outcomes have been reported when there is a shift from teacher-centered approaches towards more student-centered approaches in hard science disciplines. Crouch and Mazur (2001) revealed that when peer instruction techniques were implemented with students attending introductory physics courses, their ability to solve traditional quantitative problems improved dramatically. Wood (2009) demonstrated how traditional teaching methods fail the majority of biology students, and concluded that widespread adoption of promising practices based on sound research in STEM (science, technology, engineering, and mathematics) classes can have a major impact on better preparing undergraduate students for their future endeavours.
Also, Deslauriers, Schelew, and Wieman (2011) measured learning of specific topics in a large-enrolment physics class when it was taught as a traditional lecture by an experienced instructor and when it was taught by a trained but inexperienced instructor using instruction based on research in cognitive psychology and physics education. They found increased student attendance, higher engagement, and more than twice the learning in the section taught using research-based instruction.

Research Objectives

This research examines trends in teaching practices and learning outcomes across soft and hard disciplines. Moreover, it examines the association between the studied teaching practices and the resulting learning outcomes, and attempts to develop regression models that predict deep learning subscales from measures of teaching practices.

Data Collection

The National Survey of Student Engagement (NSSE) was conceived in 1998 as a new approach to gathering information about the quality of the collegiate experience. Since then, it has been administered at over 1,100 postsecondary institutions in the United States and Canada to evaluate student engagement and what students gain from their experience in higher education (Kuh 2001). Student engagement is defined as "the time and energy that students devote to educationally purposeful activities and the extent to which the institution intentionally creates opportunities and provides resources for students to participate in activities that lead to student success" (Kuh 2003). According to Kezar (2006) and Bradforth et al. (2015), student engagement in activities should increase student learning. The NSSE instrument assesses engagement in effective educational practices by setting five targets for an engaged campus: academic challenge, student interactions with faculty, active and collaborative learning, enriching educational experiences, and supportive campus environments (Kuh 2001). Postsecondary institutions use their own and comparative NSSE quantitative data to identify various dimensions of undergraduate students' learning experience. That experience can be improved through changes in policies, practices, and resource distribution to align more closely with the best practices of undergraduate education (Webber, Laird, and BrckaLorenz 2013). NSSE is widely used by various stakeholders in higher education, such as students, parents, advisors, researchers, and policy makers, for a wide range of purposes, from assisting with higher education enrolment choices to assessing institutional characteristics and improving the quality of education.

The Faculty Survey of Student Engagement (FSSE) questionnaire was designed to complement NSSE by collecting information about how faculty spend their time and structure their classroom activities. It collects information about faculty expectations and perceptions of undergraduate students' engagement in educational activities, and endeavours to determine the extent to which faculty promote student learning and development and interact with students in their courses. Extensive quantitative analyses of NSSE responses can play a significant role for researchers, educators, and decision makers in widening the comprehension of various ways of promoting student success in postsecondary education (Kuh 2003). It is reasonable to assume the same role for the analysis of FSSE data.
Numerous studies have confirmed the reliability and validity of the NSSE and FSSE data (Carini, Kuh, and Ouimet 2001; Umbach and Wawrzynski 2005; Pascarella, Seifert, and Blaich 2010).

Survey Components

NSSE and FSSE include many components that assess different aspects of engagement from student and faculty perspectives; however, this study focuses on four main components because of data limitations for the others. These components are:

- Percentage of time spent teaching: In FSSE, faculty are asked to report the percentage of time that they spend on research, teaching, and other professorial activities in a 7-day work week, disaggregated by disciplinary area.

- Students' time spent preparing for class: NSSE asks students to report how many hours per week they spend preparing for different classes, disaggregated by the disciplinary area of the selected courses.

- Teaching activities: FSSE investigates the percentage of in-class time devoted by faculty to different teaching practices, disaggregated by disciplinary area. Teaching practices are divided into three main categories: lecturing, experiential activities, and other activities (e.g., discussions, student presentations, small group activities).

- Deep approaches: An emphasis on deep approaches to learning is captured through a combination of three subscales (integrative learning, higher-order learning, and reflective learning). The integrative learning subscale contains items that measure how much students participate in activities that require integrating ideas from various sources, including diverse perspectives in their academic work, and discussing ideas with others outside of class. The higher-order learning subscale focuses on the extent to which students believe that their courses support advanced thinking skills such as analysing the basic elements of an idea, experience, or theory, and synthesizing ideas, information, or experiences into new, more complex interpretations. The reflective learning subscale was designed to complement the higher-order and integrative learning items that had been on the core survey for several years (Laird, Shoup, Kuh, and Schwarz 2008).

The components of each deep learning subscale, along with the corresponding FSSE question format for each item, are listed in Table 1. The results from those components are aggregated to obtain an index for each subscale, and the different subscale indices are then averaged to obtain the overall deep learning index.
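As a minimal illustration of this aggregation (not the authors' actual scoring procedure), the sketch below computes each subscale index as the mean of its component items and the overall deep learning index as the mean of the three subscales. The column names and response values are hypothetical, and the sketch assumes the items have already been placed on a common scale.

# Sketch only: hypothetical item columns, one row per respondent for a selected course.
import pandas as pd

responses = pd.DataFrame({
    "integrative_1": [3, 2, 4], "integrative_2": [2, 3, 4],
    "higher_order_1": [4, 3, 3], "higher_order_2": [3, 2, 4],
    "reflective_1": [2, 3, 3], "reflective_2": [3, 3, 4],
})

def subscale_index(df, prefix):
    # Average the component items belonging to one deep learning subscale.
    return df.filter(like=prefix).mean(axis=1)

integrative = subscale_index(responses, "integrative")
higher_order = subscale_index(responses, "higher_order")
reflective = subscale_index(responses, "reflective")

# Overall deep learning index: mean of the three subscale indices.
deep_learning = pd.concat([integrative, higher_order, reflective], axis=1).mean(axis=1)
print(deep_learning.round(2))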
Characteristics and Demographics of Institutions and Respondents

The dataset includes faculty and student responses from institutions across the United States, with 3% of the participating institutions from Canada. Table 2 shows the number of institutions participating in the surveys across the years. The characteristics of the institutions varied as follows:

- Slightly more than half are private institutions.
- Undergraduate enrolment ranged from 1,000 to 20,000, and nearly half of the institutions have a smaller student population.
- Half are master's-granting and one third are baccalaureate-granting institutions.
- Around three in ten are less competitive and three in ten are highly competitive.

Faculty respondents are fairly evenly divided among academic ranks (professor, associate professor, assistant professor, and lecturer); most are employed full time (more than 75%) and most are Caucasian (more than 70%).

Table 3 summarizes the means and standard deviations, by gender, of the numbers of students and faculty participating in the NSSE and FSSE in the study years. The number of participating female students is significantly lower than that of males in engineering, while it is significantly higher in arts, education, social science, and biological science. Male faculty respondents significantly outnumber female respondents in engineering, business, and physical science; the reverse holds in education. Under-representation of women in the disciplines of science, mathematics, and engineering (SM&E) has long been an issue in the United States (Moore 2001; National Science Foundation 2004; Kahveci, Southerland, and Gilmer 2006). Women are still less likely than men to choose a career that involves SM&E and are more likely than men to earn bachelor's degrees in non-science and non-engineering fields. Among those who do choose a major in SM&E, the majority are still concentrated in certain fields such as biology, psychology, and the social sciences (National Science Foundation 2004; Kahveci, Southerland, and Gilmer 2006). Accordingly, although there is a gender imbalance in responses to the NSSE and FSSE in the years under investigation, it is reasonable to assume that the respondent numbers fairly represent students and faculty by gender in those disciplines.

Description of the Data Set

Seven years of data (2006-2012) from the administration of NSSE and FSSE at different institutions were collected for seven disciplines (arts and humanities, education, social sciences, business, physical sciences, biological sciences, and engineering) and combined for use in this study. A summary of the survey component results is provided in aggregated form in Table 4 for all disciplines across the seven years of analysis (six years only for the deep learning indices, because 2012 data are missing). Forty-nine data points (forty-two for the deep learning indices) were used in the analysis. Each data point represents the average of the respondents' data for the participating institutions on a teaching/learning component in a specific discipline in a certain year.

Data Analysis

Teaching and Learning Trends in Hard and Soft Disciplines

Unpaired t-tests were performed to determine whether there is a significant difference in teaching and learning trends between hard and soft disciplines. In this study, hard disciplines include engineering, physical science, and biological science, while soft disciplines include arts and humanities, business, education, and social sciences. Table 5 shows the percentage of time spent by faculty on teaching activities throughout the week for the different disciplines. Engineering faculty members tend to devote the least time to teaching activities compared to other disciplines, while arts and humanities faculty devote the most time to this task. Faculty from soft disciplines tend to spend significantly more time on teaching activities than faculty in the hard disciplines, as revealed by the unpaired t-test results. However, across the years, there is an upward trend for disciplines other than the humanities. The average preparation time for a class reported by students is portrayed in Table 6. Engineering students reported spending the highest number of hours preparing for class compared with other disciplines, while business students reported the least time.
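The hard-versus-soft contrasts reported in Tables 5 through 8 are standard two-sample (unpaired) t-tests on discipline-year averages. As a minimal sketch of this kind of comparison, the example below applies SciPy's unpaired t-test to made-up values standing in for the discipline-year averages of weekly teaching time; the Welch variant shown here does not assume equal variances, and the paper does not state which variant the authors used.

# Sketch only: illustrative numbers, not the study data.
from scipy import stats

# Hypothetical discipline-year averages of the percentage of weekly time spent teaching.
hard = [58.1, 57.4, 59.0, 56.8, 58.9, 57.7, 60.2]
soft = [60.5, 61.2, 60.9, 59.8, 61.7, 60.3, 61.4]

# Unpaired (two-sample) t-test; equal_var=False gives Welch's test.
t_stat, p_value = stats.ttest_ind(hard, soft, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")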
For students in all disciplines, there is a trend across time toward more preparation time for class, more so for engineering students than for those in other disciplines.

Table 7 portrays faculty reliance on lecturing compared with experiential activities and other activities (e.g., discussion, small group activities, student presentations) in each discipline. Compared with faculty from the soft disciplines, faculty from the hard disciplines tend to depend more on lecturing and experiential activities as their main teaching activities, with little change in the former and no clear pattern in the latter between 2005 and 2013. Conversely, faculty from soft disciplines rely mainly on other activities for teaching. This is confirmed by the unpaired t-test results, which reveal highly significant differences in teaching practices between hard and soft disciplines. Again, the trend over time is rather constant.

The deep learning subscales were also monitored for the different disciplines across the study years (Table 8). Unpaired t-test results demonstrate that deep learning indices are significantly lower in hard disciplines than in soft disciplines. Engineering, along with the physical sciences, scored the lowest on the reflective and integrative learning indices compared to other disciplines. In contrast, engineering faculty reported higher levels of higher-order learning than most of the other disciplines. This finding suggests that the low levels of reported deep learning in hard disciplines (especially engineering) could be related to a heavier dependence on lecturing as the main teaching activity; we confirm this conclusion shortly through correlation analyses and regression models. The high reported levels of higher-order learning in engineering may be due to the high percentage of experiential activities (e.g., labs and field work) that are usually an essential part of most engineering courses. Even so, taken together the hard disciplines still report lower levels of higher-order learning than the soft disciplines, although this contrast is less pronounced than for the other deep learning indices, where the hard disciplines score significantly lower than the soft disciplines in integrative and reflective learning. Moreover, education faculty report the highest levels on almost all of the deep learning subscales. Across the seven years under consideration, the percentages fluctuate; however, the hard disciplines are more likely than the soft disciplines to be located at the lower end of the scale.

Association Between Teaching Practices and Learning Outcomes

Correlation analyses were performed to assess the association between teaching components and the different deep learning subscales. Table 9 shows the resulting Pearson correlation coefficients. The overall deep learning index was found to be strongly and negatively correlated with student preparation time and with the percentage of time spent lecturing, and positively correlated with the percentage of time spent teaching through other activities. The individual deep learning subscales (higher-order, reflective, and integrative learning) show varying levels of association, as also reported in Table 9. It should be noted that all the correlation analysis results are significant at the 99.9% significance level.
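As a brief sketch of how such an association can be computed (with hypothetical discipline-year averages rather than the study data), the Pearson coefficient between a teaching variable and a deep learning measure could be obtained as follows.

# Sketch only: hypothetical discipline-year averages.
from scipy import stats

lecturing_pct = [62, 55, 48, 40, 35, 30, 25]               # % of in-class time spent lecturing
deep_learning_index = [2.2, 2.3, 2.4, 2.5, 2.6, 2.7, 2.8]  # corresponding deep learning index

r, p = stats.pearsonr(lecturing_pct, deep_learning_index)
print(f"Pearson r = {r:.2f}, p = {p:.4f}")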
Regression Models

Finally, regression models were developed to predict the deep learning subscales using the variables measured in the surveys. Some of the studied variables were found to be significant predictors of the reflective and integrative scales, but not of the higher-order learning scale. This corresponds with the correlation results, where no teaching variable was strongly correlated with the higher-order learning index. The overall deep learning scale was then modeled using linear regression. For integrative learning, the regression model contains only one significant variable, the percentage of in-class time spent lecturing. Through this variable and an intercept, the integrative learning index can be predicted with an adjusted R² of 0.34 at a 99.9% significance level. For reflective learning, two variables are significant: the percentage of in-class time spent lecturing and the time spent by students preparing for class. The adjusted R² is 0.42, with a 95% significance level for the student preparation time coefficient and a 92.5% significance level for the lecturing coefficient. Table 10 summarizes the regression models for the reflective and integrative learning indices. Finally, the deep learning index was modeled using the lecturing and student preparation variables, which yielded an acceptable adjusted R² of 0.42 with a sufficient significance level of 97.5% for the variables' coefficients. Table 11 shows the regression statistics of the developed model, and Figure 1 illustrates the model's three-dimensional surface.

Discussion and Recommendations

In this study, we conducted statistical analyses of national student and faculty surveys of student engagement in hard and soft disciplines in order to answer the following questions: how do teaching and learning practices vary across hard and soft disciplines; what is the association between the adopted teaching practices and the resulting learning outcomes; and is it possible to predict learning outcomes from measures of teaching practices? The hard disciplines demonstrated lower levels of time devoted by faculty to teaching, along with higher levels of student preparation time for classes and of lecturing in class; in other words, a teacher-centered approach. In contrast, faculty from soft disciplines reported devoting more time to teaching and relying more on student-centered teaching practices through various activities other than lecturing. Hard disciplines had significantly lower deep learning subscales (reflective, integrative, and higher-order learning) than soft disciplines. If the goal is to foster deep learning in students, faculty in hard disciplines may wish to reconsider their teaching practices. The percentage of lecturing in class and the time spent by students preparing for class were strongly and negatively correlated with deep learning, and regression analyses were used to assess the extent to which students experienced deep learning.

The findings of this study could be shared with faculty from different disciplines to promote a discussion of teaching and learning strategies in relation to desired learning outcomes. It has been demonstrated that presenting findings such as these provokes an array of responses (Laird, Smallwood, Niskodé-Dossett, and Garver 2009). Some faculty members may react positively by reconsidering their current practices and implementing curriculum improvements. However, many faculty may ignore the findings because of a lack of trust in the data collection methods.
Others may suggest that the results are valid but simply do not apply to their particular students. Although some such feedback can be useful for institutions when they consider ways to improve their assessment processes, Astin (2012) points out that this line of thinking and arguing often hinders actions for educational improvement. In this study, however, collecting student engagement and learning outcomes information from faculty themselves through FSSE may offer a way to counter these responses: it is more difficult to ignore assessment results when student findings are contrasted with faculty members' responses.

The quantitative data extracted from NSSE and FSSE allowed us to monitor the impact of different educational practices. However, the primary focus on quantitative data in both surveys likely misses some of the in-depth information that faculty and students could express in their own words, suggesting that there may be other important factors influencing learning, development, and academic achievement. Further research could examine the practices that education faculty adopt to keep learning outcomes so high, and how faculty in the hard disciplines could benefit from those practices.

Institutions can present and discuss student engagement results with their faculty in many ways. The means for disseminating outcomes may vary from simply circulating final reports to more structured approaches, such as designing customized training sessions, workshops, and reports by department or university. There are successful examples of how campus leaders have shared engagement findings with their faculty (Laird, Smallwood, Niskodé-Dossett, and Garver 2009). Washington State University asked the President's Teaching Academy, a selected group of honoured faculty, to review the NSSE findings and to develop ideas for improving the undergraduate experience. Leaders at the University of Georgia developed a series of NSSE campus conversations to create an opportunity for many campus stakeholders, including deans, departmental faculty, the student government association, academic advisors, members of the teaching academy, and the university curriculum committee, to gather and discuss NSSE results. The South Dakota School of Mines and Technology internally developed a card game for small groups of faculty to discuss interesting NSSE and FSSE findings. Concordia College shares student engagement findings with the campus community through a monthly newsletter produced by the Office of Assessment and Institutional Research.

After interrogating the findings from this and other NSSE and FSSE studies, professional development of faculty may be required. There is some debate about whether faculty training in higher education has an effect on teaching. Coffey and Gibbs's (2000) study revealed that faculty in some UK universities showed significant improvement in learning, enthusiasm, organization, and rapport scores, as measured by student evaluation of educational quality questionnaires, after one semester of two- and three-semester-long training programs. Another study used the approaches to teaching inventory (Prosser and Trigwell 1999) in 22 universities in eight countries to study the effectiveness of university faculty training (Gibbs and Coffey 2004). A group of trained faculty and their students were followed at the beginning of the training period and again one year later.
The training group became less teacher-centered and more student-centered by the end of the four to 18 months of training, and their teaching skills improved significantly after the training as judged by their students. Their students also took a deep approach to learning to a greater extent after the instructors had been trained, although this change was small. The authors, however, point out that they are not in a position to demonstrate whether it was the training itself that produced the positive changes. Despite these studies, Norton, Richardson, Hartley, Newstead, and Mayes (2005) consider the effect of faculty training in higher education to be questionable, arguing that there is no significant evidence that training has a strong effect on teaching behaviour. They conducted a study of university faculty in the UK using a questionnaire that measured different attributes of their beliefs and intentions about teaching in higher education. No significant differences on the teaching beliefs and intentions scales were found between those who had taken training in teaching and learning in higher education and those who had not. These results suggest that genuine improvement will come about only by addressing the underlying conceptions of teaching and learning held by university faculty. Moreover, genuine change in teaching practices in higher education will not be possible without concurrent change in the way we educate prospective faculty members (Owens 2010).

Lastly, in this study we employed a quantitative design to demonstrate the relationship between teaching practices in different disciplines and the resulting learning outcomes. Further research, for example through qualitative interviewing, is needed to investigate more deeply how to advance more efficient teaching and learning practices in academia.

References

Åkerlind, Gerlese S. 2003. "Growing and developing as a university teacher: Variation in meaning." Studies in Higher Education 28 (4): 375-390. doi:10.1080/0307507032000122242

Astin, Alexander W. 2012. Assessment for excellence: The philosophy and practice of assessment and evaluation in higher education. Lanham: Rowman and Littlefield Publishers.

Biggs, John B. 1999. Teaching for quality learning at university. Buckingham: Open University Press.

Bradforth, Stephen E., Emily R. Miller, William R. Dichtel, Adam K. Leibovich, Andrew L. Feig, James D. Martin, Karen S. Bjorkman, Zachary D. Schultz, and Tobin L. Smith. 2015. "University learning: Improve undergraduate science education." Nature 523 (7560): 282-284. doi:10.1038/523282a

Carini, R., G. Kuh, and J. Ouimet. 2001. "Using focus groups to establish validity and reliability of a college student survey." Paper presented at the Association for Institutional Research Forum, Long Beach, CA.

Chickering, Arthur W., and Zelda F. Gamson. 1987. "Seven principles for good practice in undergraduate education." AAHE Bulletin 3 (7): 3-7. Retrieved from http://files.eric.ed.gov/fulltext/ED282491.pdf

Coffey, Martin, and Graham Gibbs. 2000. "Can academics benefit from training? Some preliminary evidence." Teaching in Higher Education 5 (3): 385-389. doi:10.1080/713699136

Coffey, Martin, and Graham Gibbs. 2002. "Measuring teachers' repertoire of teaching methods." Assessment and Evaluation in Higher Education 27 (4): 383-390. doi:10.1080/0260293022000001382
Crouch, Catherine H., and Eric Mazur. 2001. "Peer instruction: Ten years of experience and results." American Journal of Physics 69 (9): 970-977. doi:10.1119/1.1374249

Deslauriers, Louis, Ellen Schelew, and Carl Wieman. 2011. "Improved learning in a large-enrollment physics class." Science 332 (6031): 862-864. doi:10.1126/science.1201783

Duit, Reinders. 2009. Bibliography STCSE: Students' and teachers' conceptions and science education. Kiel: University of Kiel.

Entwistle, Noel, Don Skinner, Dorothy Entwistle, and Sandra Orr. 2000. "Conceptions and beliefs about 'good teaching': An integration of contrasting research areas." Higher Education Research and Development 19 (1): 5-26. doi:10.1080/07294360050020444

Gibbs, Graham, and Martin Coffey. 2004. "The impact of training of university teachers on their teaching skills, their approach to teaching and the approach to learning of their students." Active Learning in Higher Education 5 (1): 87-100. doi:10.1177/1469787404040463

Kahveci, Ajda, Sherry A. Southerland, and Penny J. Gilmer. 2006. "Retaining undergraduate women in science, mathematics, and engineering." Journal of College Science Teaching 36 (3): 34-38. Retrieved from http://web.b.ebscohost.com/ehost/pdfviewer/pdfviewer?sid=231ef03e-ee43-4aa3-9c3f-ec5c4d8cee87%40sessionmgr115andvid=1andhid=102

Kember, David. 1997. "A reconceptualization of the research into university academics' conceptions of teaching." Learning and Instruction 7 (3): 255-275. doi:10.1016/S0959-4752(96)00028-X

Kember, David, and Kam-Por Kwan. 2002. "Lecturers' approaches to teaching and their relationship to conceptions of good teaching." In Teacher thinking, beliefs and knowledge in higher education, 219-239. Netherlands: Springer.

Kezar, Adrianna J. 2006. "The impact of institutional size on engagement." Journal of Student Affairs Research and Practice 43 (1): 87-114. doi:10.2202/1949-6605.1573

Kuh, George D. 2001. "Assessing what really matters to student learning: Inside the National Survey of Student Engagement." Change: The Magazine of Higher Learning 33 (3): 10-17. doi:10.1080/00091380109601795

Kuh, George D. 2003. "What we're learning about student engagement from NSSE: Benchmarks for effective educational practices." Change: The Magazine of Higher Learning 35 (2): 24-32. doi:10.1080/00091380309604090

Laird, Thomas F. Nelson, Michael J. Schwarz, Rick Shoup, and George D. Kuh. 2005. "Disciplinary differences in faculty members' emphasis on deep approaches to learning." Paper presented at the Annual Meeting of the Association for Institutional Research, Chicago, May.

Laird, Thomas F. Nelson, Rick Shoup, George D. Kuh, and Michael J. Schwarz. 2008. "The effects of discipline on deep approaches to student learning and college outcomes." Research in Higher Education 49 (6): 469-494. doi:10.1007/s11162-008-9088-5

Laird, Thomas F. Nelson, Robert Smallwood, Amanda Suniti Niskodé-Dossett, and Amy K. Garver. 2009. "Effectively involving faculty in the assessment of student engagement." New Directions for Institutional Research 2009 (141): 71-81. doi:10.1002/ir.287

Lindblom-Ylänne, S., and A. Nevgi. 2003. "The effect of pedagogical training and teaching experience on approach to teaching." Paper presented at the 11th EARLI Conference, Padua, Italy, August.

Lindblom-Ylanne, S., K. Trigwell, A. Nevgi, and P. Ashwin. 2004. "Variation in approaches to teaching: The role of discipline and teaching context." Paper presented at the EARLI SIG Higher Education Conference, June.

Lueddeke, George R. 2003. "Professionalising teaching practice in higher education: A study of disciplinary variation and 'teaching-scholarship'." Studies in Higher Education 28 (2): 213-228. doi:10.1080/0307507032000058082

Moore, Randy. 2001. "The 'pretty redhead' who changed science education." Journal of College Science Teaching 31 (3): 194-196. Retrieved from http://search.proquest.com/openview/608b3c85b35c88e4998336f474396096/1?pq-origsite=gscholar

National Science Foundation. 2004. Women, minorities, and persons with disabilities in science and engineering (NSF 04-317). Arlington: ERIC Clearinghouse.

Norton, Lin, T. E. Richardson, James Hartley, Stephen Newstead, and Jenny Mayes. 2005. "Teachers' beliefs and intentions concerning teaching in higher education." Higher Education 50 (4): 537-571. doi:10.1007/s10734-004-6363-z

Owens, Stephanie. 2010. "We teach how we've been taught: Expeditionary learning unshackling sustainability education in US public schools." Education 2010. Retrieved from http://www.jsedimensions.org/wordpress/content/we-teach-how-weve-been-taught-expeditionary-learning-unshackling-sustainability-education-in-u-s-public-schools_2013_06/

Pascarella, Ernest T., Tricia A. Seifert, and Charles Blaich. 2010. "How effective are the NSSE benchmarks in predicting educational outcomes." Change 42 (1): 16-22. doi:10.1080/00091380903449060

Postareff, Liisa, Sari Lindblom-Ylänne, and Anne Nevgi. 2007. "The effect of pedagogical training on teaching in higher education." Teaching and Teacher Education 23 (5): 557-571. doi:10.1016/j.tate.2006.11.013

Prosser, Michael, and Keith Trigwell. 1999. Understanding learning and teaching: The experience in higher education. Suffolk: Society for Research into Higher Education and Open University Press.

Prosser, Michael, Keith Trigwell, and Philip Taylor. 1994. "A phenomenographic study of academics' conceptions of science teaching and learning." Learning and Instruction 4 (3): 217-231. doi:10.1016/0959-4752(94)90024-8

Samuelowicz, Katherine, and John D. Bain. 2001. "Revisiting academics' beliefs about teaching and learning." Higher Education 41 (3): 299-325. Retrieved from http://www.jstor.org/stable/3447978?seq=1#page_scan_tab_contents

Umbach, Paul D., and Matthew R. Wawrzynski. 2005. "Faculty do matter: The role of faculty in student learning and engagement." Research in Higher Education 46 (2): 153-184. doi:10.1007/s11162-004-1598-1

Webber, Karen L., Thomas F. Nelson Laird, and Allison M. BrckaLorenz. 2013. "Student and faculty member engagement in undergraduate research." Research in Higher Education 54 (2): 227-249. doi:10.1007/s11162-012-9280-5

Wood, William B. 2009. "Innovations in teaching undergraduate biology and why we need them." Annual Review of Cell and Developmental Biology 25: 93-112. doi:10.1146/annurev.cellbio.24.110707.175306
Table 1. Components of deep learning subscales (Laird, Schwarz, Shoup, and Kuh 2005)

Integrative learning:
- Work on a paper or project that requires integrating ideas or information from various sources (b)
- Have class discussions or writing assignments that include diverse perspectives (different races, religions, genders, political beliefs, etc.) (c)
- Put together ideas or concepts from different courses when completing assignments or during class discussions (b)
- At least once, discuss ideas from your readings or classes with you outside of class (d)
- Discuss ideas or readings from class with others outside of class (other students, family members, co-workers, etc.) (b)

Higher-order learning:
- Analysing the basic elements of an idea, experience, or theory, such as examining a particular case or situation in depth and considering its components (a)
- Making judgments about the value of information, arguments, or methods, such as examining how others gathered and interpreted data, and assessing the soundness of their conclusions (a)
- Applying theories or concepts to practical problems or in new situations (a)
- Synthesizing and organizing ideas, information, or experiences into new, more complex interpretations and relationships (a)

Reflective learning:
- Examine the strengths and weaknesses of their views on a topic or issue (b)
- Try to better understand someone else's views by imagining how an issue looks from that person's perspective (b)
- Learn something that changes the way they understand an issue or concept (b)

Notes:
(a) Faculty were asked how much (1 = Very little, 2 = Some, 3 = Quite a bit, 4 = Very much) a selected course emphasized this.
(b) Faculty were asked how important (1 = Not important, 2 = Somewhat important, 3 = Important, 4 = Very important) it was for students to do this in a selected course.
(c) Faculty were asked how often (1 = Never, 2 = Sometimes, 3 = Often, 4 = Very often) students in a selected course engaged in this.
(d) Faculty were asked the percentage (1 = None, 2 = 1-24%, 3 = 25-49%, 4 = 50-74%, 5 = 75-100%) of students in a selected course that did this.

Table 2. Number of institutions participating in the surveys, by year

Year                         2006  2007  2008  2009  2010  2011  2012
Participating institutions   131   162   160   148   154   157   117

Table 3. Means and standard deviations of participating student and faculty numbers, by gender

Discipline           Gender   Faculty mean (SD)   Student mean (SD)
Arts and Humanities  Male     2,316 (669)         13,491 (2,634)
Arts and Humanities  Female   2,313 (637)         27,892 (5,650)
Education            Male     422 (157)           4,870 (670)
Education            Female   850 (243)           22,852 (3,520)
Social Science       Male     1,271 (437)         11,602 (1,882)
Social Science       Female   1,150 (293)         30,492 (4,344)
Engineering          Male     527 (181)           15,066 (3,162)
Engineering          Female   99 (35)             4,642 (878)
Biological Science   Male     588 (227)           8,407 (1,366)
Biological Science   Female   419 (82)            16,988 (2,749)
Business             Male     994 (221)           21,423 (3,313)
Business             Female   542 (91)            28,686 (4,526)
Physical Science     Male     1,297 (333)         5,301 (903)
Physical Science     Female   612 (109)           5,486 (966)

Table 4. Survey dataset summary

Survey component                                                   Min    Max    Mean   SD
Hours per week spent by students preparing for class               3      4.6    3.74   0.39
Percentage of time faculty report spending on teaching per week    50     66     59.38  4.42
Percentage of lecturing in class                                   22.5   62     45.13  13.22
Percentage of experiential activities in class                     4      28     13.15  6.89
Percentage of other activities in class                            10     62     41.71  16.66
Higher-order learning index                                        2.1    3.25   2.74   0.28
Integrative learning index                                         1.85   3.02   2.50   0.27
Reflective learning index                                          1.8    3.41   3.41   1.80
Deep learning index                                                2      3.14   2.51   0.26

Table 5. Percentage of time spent teaching in different disciplines

[Trend chart: percentage of weekly time spent teaching by discipline, 2005-2013]

Unpaired t-test (hard vs. soft disciplines): mean 58.09 vs. 60.82; variance 23.89 vs. 11.41; observations 21 vs. 28; p = 0.035.
Table 6. Time spent by students preparing for class in different disciplines

[Trend chart: hours of class preparation time per week by discipline, 2005-2013]

Unpaired t-test (hard vs. soft disciplines): mean 4.15 vs. 3.50; variance 0.049 vs. 0.058; observations 21 vs. 28; p < 0.001.

Table 7. Percentage of teaching activities in different disciplines

[Trend charts: percentage of in-class time spent on lecturing, experiential activities, and other activities by discipline, 2005-2013]

Unpaired t-tests (hard vs. soft disciplines):
- Lecturing: mean 57.35 vs. 35.64; variance 4.37 vs. 90.79; observations 21 vs. 28; p < 0.001.
- Experiential activities: mean 18.42 vs. 9.69; variance 24.65 vs. 30.80; observations 21 vs. 28; p < 0.001.
- Other activities: mean 24.21 vs. 54.66; variance 33.26 vs. 33.42; observations 21 vs. 28; p < 0.001.

Table 8. Deep learning subscale indices for different disciplines

[Trend charts: reflective, integrative, higher-order, and deep learning indices by discipline, 2005-2011]

Unpaired t-tests (hard vs. soft disciplines):
- Reflective learning: mean 2.15 vs. 2.64; variance 0.05 vs. 0.09; observations 18 vs. 24; p < 0.001.
- Integrative learning: mean 2.32 vs. 2.63; variance 0.08 vs. 0.03; observations 18 vs. 24; p < 0.001.
- Higher-order learning: mean 2.63 vs. 2.83; variance 0.11 vs. 0.04; observations 18 vs. 24; p = 0.035.
- Deep learning: mean 2.30 vs. 2.66; variance 0.034 vs. 0.044; observations 18 vs. 24; p < 0.001.

Table 9. Correlation analysis results (Pearson r)

- Higher-order learning: student preparation time -0.362; faculty weekly teaching time -0.089; lecturing -0.312; experiential activities -0.148; other activities 0.309.
- Integrative learning: student preparation time -0.380; faculty weekly teaching time 0.071; lecturing -0.597; experiential activities -0.179; other activities 0.548.
- Reflective learning: student preparation time -0.637; faculty weekly teaching time 0.152; lecturing -0.562; experiential activities -0.225; other activities 0.539.
- Deep learning: student preparation time -0.612; faculty weekly teaching time 0.072; lecturing -0.593; experiential activities -0.241; other activities 0.571.

Table 10. Integrative learning and reflective learning regression models

Integrative learning index (IL) model: IL = β0 + β1 × Lec
- Adjusted R² = 0.34; observations = 42
- Intercept: β0 = 3.07 (p < 0.001)
- Lecturing (Lec): β1 = -0.0125 (p < 0.001)

Reflective learning index (RL) model: RL = β0 + β1 × Lec + β2 × Prep
- Adjusted R² = 0.42; observations = 42
- Intercept: β0 = 4.42 (p < 0.001)
- Lecturing (Lec): β1 = -0.0076 (p = 0.070)
- Student preparation (Prep): β2 = -0.437 (p = 0.032)
Table 11. Deep learning regression model

Deep learning index (DL) model: DL = β0 + β1 × Lec + β2 × Prep
- Adjusted R² = 0.42; observations = 42
- Intercept: β0 = 3.85 (p < 0.001)
- Lecturing (Lec): β1 = -0.007 (p = 0.024)
- Student preparation (Prep): β2 = -0.272 (p = 0.011)
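For readers who want to apply the fitted deep learning model, the coefficients reported in Table 11 can be plugged into the regression equation directly. The sketch below does exactly that; the coefficients come from Table 11, while the input values (and therefore the resulting prediction) are illustrative only and fall within the ranges reported in Table 4.

# Evaluate DL = b0 + b1*Lec + b2*Prep using the coefficients reported in Table 11.
def predict_deep_learning(lecturing_pct, prep_hours, b0=3.85, b1=-0.007, b2=-0.272):
    # Predicted deep learning index for a given percentage of in-class lecturing
    # and weekly hours of student preparation.
    return b0 + b1 * lecturing_pct + b2 * prep_hours

# Illustrative inputs: 45% lecturing and 3.7 hours of weekly preparation.
print(round(predict_deep_learning(45, 3.7), 2))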
