UBC Social Ecological Economic Development Studies (SEEDS) Sustainability Program Student Research Report

Videos vs Infographics: The Effectiveness of Different Media Types in Climate Education

Aanchal Chatterjee, Cheryl Ng, Li-Hao Chen, Tanika Sirohi, Xiluva Hill
University of British Columbia
Course: PSYC 421
Themes: Food, Climate, Procurement
Date: April 14, 2020

Disclaimer: "UBC SEEDS Sustainability Program provides students with the opportunity to share the findings of their studies, as well as their opinions, conclusions and recommendations with the UBC community. The reader should bear in mind that this is a student research project/report and is not an official document of UBC. Furthermore, readers should bear in mind that these reports may not reflect the current status of activities at UBC. We urge you to contact the research persons mentioned in a report or the SEEDS Sustainability Program representative about the current status of the subject matter of a project/report."

Executive Summary

This study compares the effectiveness of videos and infographics in increasing climate literacy among individuals aged 18 to 30 in Vancouver. Two groups of participants first filled out a brief questionnaire assessing their climate literacy, then either watched a video or read an infographic about measuring one's carbon footprint, and finally took the same questionnaire again. An independent-samples t-test on participants' score improvements showed that participants who watched the video improved more than those who viewed the infographic. However, the effect size (Cohen's d) was small and the p-value non-significant, so the results are inconclusive. They suggest that future studies need more stringent experimental measures and more engaging media strategies to increase participants' engagement with the material, as well as a larger sample size and a longer data-collection period, to derive more conclusive results.

Introduction

Within the last decade, increasing coverage of climate change in the news, along with the widespread availability of climate science data, has generally increased the public's awareness of climate change (Niepold et al., 2007). However, this awareness tends to pertain mainly to knowledge about large-scale climate causes and impacts, rather than knowledge of one's individual impact on the climate. Globally, there is still relatively low understanding amongst the general public of how to relate the science to everyday behaviours and individual impact (Shafer et al., 2009). The National Oceanic and Atmospheric Administration of the USA defines a climate-literate person as someone who is able not just to understand and communicate climate science, but also to translate this knowledge into environmentally responsible decisions (NOAA, 2008). Based on research over the last 20 years, there is a clear need to improve the latter aspect of climate literacy amongst the general public. Our study thus focuses on one key aspect of climate literacy: the carbon footprint. In a study of 965 members of the public in North America, Wynes et al. (2019) found that individuals tend to underestimate the carbon footprint of various behaviours, reflecting low carbon numeracy.
Yet in order for individuals to act pro-environmentally, a fundamental level of awareness of their own environmental footprint is needed (Lee et al., 2015; Wells et al., 2011). It is therefore important to increase the general public's carbon numeracy through educational platforms. The question, however, is which educational platform(s) would be most effective. Although much has been written about methods to improve the public's climate literacy (e.g. Cooper, 2011; Satchwell, 2013; Shafer et al., 2009), there is relatively little literature specifically comparing the efficacy of different media forms on climate literacy. This study aims to fill that research gap by comparing the effectiveness of infographics versus videos in improving an individual's climate literacy. These two media forms were chosen because individuals' attention spans have been shown to be decreasing in today's digital age (Levitin, 2014), making short, catchy and engaging media the most effective educational tools (Cordero, 2012; Hsin & Cigas, 2013; Tenenbaum et al., 2012).

Our research question is as follows: which educational medium is more effective at increasing an individual's climate literacy, a video or an infographic? We hypothesize that watching a short, catchy video about the carbon footprint will result in a greater improvement in climate literacy than reading an infographic with similar content. This is because videos tend to be more interactive and engaging than infographics, leading to deeper learning: videos incorporate two forms of learning, visual and auditory, whereas infographics are less interactive and incorporate only one, visual.

Methods

Experimental design

In our study, participants were split into two groups: Group 1 watched a video about the carbon footprint, while Group 2 read an infographic about the carbon footprint (Appendix 1). Both media forms contained the same content, delivered in different ways. For Group 1, the video was selected from a shortlist of videos on the carbon footprint that we found on YouTube. We only considered videos that were freely available online, because our project is about public engagement, which relies heavily on content that is easily accessible to the public. To maximize participants' retention of the information presented, we chose the video based on these criteria: (i) under three minutes long; (ii) contains three to five key points that can be easily discerned from watching the video just once; (iii) contains up-to-date information (published within the last 10 years); (iv) uses simple, non-technical language. We chose the final video by a team vote on the most engaging and informative option, and crafted our survey questions based on the five most important points from this video. We then converted these essential points into an infographic for Group 2, using exactly the same information and phrasing as the video to ensure the consistency of content received by both participant groups.

To measure the difference that the two media forms made to participants' climate literacy, we had both groups answer the same questions in a pre-test and a post-test survey (Appendix 2).
These questions were crafted based on the content of the media forms, ensuring that all answers could be found by either watching the video or reading the infographic, without the need for any additional knowledge or research. To ensure that participants had fully engaged with the video or infographic, we required them to check a box in the survey confirming that they had spent at least two minutes watching or reading (respectively) before they could proceed to the post-test survey. We also made sure that participants did not know they would be tested on the same information after the intervention, so as to simulate real-world conditions as far as possible. We were fully aware that this might lead to participants not paying close attention to the material. However, we believed that participants' level of engagement with the material would also shed light on how engaging and effective the material was; coupled with simulating real-world conditions, in which people may come across visual material without being tested on it, we felt this was a worthwhile trade-off.

Data collection

All surveys were disseminated online via the UBC Qualtrics tool, in accordance with the UBC research ethics policy, in order to protect participants' data and anonymity. Participants were young adults between the ages of 18 and 30 living in Vancouver at the time of the study. We first pilot-tested the survey with ten of our peers to verify its clarity and validity. We then sent the survey to 300 individuals via email and social media, aiming for at least 188 respondents, based on an expected effect size of 0.29, an alpha level of 0.05 and a power of 0.8 (Ebrahimabadi et al., 2018). We received 135 responses in total, of which 94 were usable.

Data analysis

We first calculated an improvement score for each of the 94 participants by subtracting the pre-test score from the post-test score (improvement = post − pre), yielding a score ranging from -4 to 4, where negative values indicate a deterioration in knowledge and positive values indicate some degree of learning. We organised the data into a simple table with participant number, condition and improvement score, which we analysed in the JASP software. We carried out an independent-samples Student's t-test to see whether the difference in score improvements between the conditions was significant.

Results

Using an independent-samples t-test, we found that participants in Group 1 (video) had higher improvement scores than those in Group 2 (infographic): mean improvement scores were 1.111 for Group 1 and 0.735 for Group 2 (Table 1). The improvement scores also showed a clear distinction between the two conditions (Figure 1).

Table 1: Improvement Score for Video and Infographic Condition

Figure 1: Improvement Score Graph for Video and Infographic Condition

However, Cohen's d gave an effect size of -0.332, which is considered small (Table 2). Another critical finding was the p-value of 0.111, which, at an alpha level of 0.05, means that our results are not significant. Thus, we retain our null hypothesis: participants in the video condition did not necessarily retain information better than those in the infographic condition, and it remains possible that infographics are equally or more effective than videos in educating the public about climate change.

Table 2: Independent Samples T-Test Results
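The analysis above was carried out in JASP. Purely as an illustrative aid, and not the authors' actual workflow, the following Python sketch reproduces the same steps with the scipy, numpy and statsmodels packages: the a priori sample-size calculation (d = 0.29, alpha = 0.05, power = 0.8), the improvement-score calculation, the independent-samples Student's t-test, and Cohen's d. The pre-test and post-test arrays are simulated placeholders (the study's raw data are not included in this report), and the per-group counts of 48 and 46 are assumed only so the totals sum to the 94 usable responses.

    import numpy as np
    from scipy import stats
    from statsmodels.stats.power import TTestIndPower

    # A priori power analysis with the parameters reported above:
    # d = 0.29, alpha = 0.05, power = 0.8 (two-sided, two independent groups).
    n_per_group = TTestIndPower().solve_power(effect_size=0.29, alpha=0.05, power=0.8)
    print(f"required sample size: about {int(np.ceil(n_per_group))} per group")  # about 188

    # Hypothetical pre-/post-test scores (0-4 per participant); placeholders only.
    rng = np.random.default_rng(421)
    video_pre, video_post = rng.integers(0, 5, 48), rng.integers(0, 5, 48)
    info_pre, info_post = rng.integers(0, 5, 46), rng.integers(0, 5, 46)

    # Improvement score = post-test minus pre-test (possible range -4 to 4).
    video_improve = video_post - video_pre
    info_improve = info_post - info_pre

    # Independent-samples Student's t-test (equal variances assumed).
    t_stat, p_value = stats.ttest_ind(video_improve, info_improve)

    # Cohen's d from the pooled standard deviation of the two groups.
    n1, n2 = len(video_improve), len(info_improve)
    pooled_sd = np.sqrt(((n1 - 1) * video_improve.var(ddof=1)
                         + (n2 - 1) * info_improve.var(ddof=1)) / (n1 + n2 - 2))
    cohens_d = (video_improve.mean() - info_improve.mean()) / pooled_sd

    print(f"t = {t_stat:.3f}, p = {p_value:.3f}, Cohen's d = {cohens_d:.3f}")

Run on the real improvement scores, this kind of script would yield the values reported in Tables 1 and 2 (d = -0.332, p = 0.111); with the simulated placeholders it simply demonstrates the procedure.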
Discussion

There are a few possible reasons for our results. To begin with, the small effect size and non-significant p-value were likely due to our small sample size. Owing to time constraints, we could not collect questionnaire responses from as many participants as we had aimed for, and the 94 usable responses fell short of the 188 needed for adequate statistical power. Future studies should therefore aim for a larger sample, which would be more likely to generate conclusive results about the effectiveness of different media learning tools.

Additionally, the lack of a significant difference between the two conditions could be attributed to the experimental design. Our inability to fully control whether participants engaged with the video or infographic introduced a key confounding variable into our experiment. Although we tried to minimize its effect by making participants check a box stating that they had spent at least two minutes watching or reading, we could not guarantee that participants answered truthfully. Even if they did, they may not have paid attention or tried to retain the information during those two minutes, for many reasons including boredom, fatigue, and not knowing that they would be tested on the material. This confound means that the difference between pre-test and post-test scores may not necessarily reflect the effect of the material itself. Future studies would need to include stringent measures to ensure participants fully engage with the material. For instance, researchers could replicate our questionnaire but signal to participants at the pre-test that a post-test will follow, to measure whether this increases learning. The experiment could also be administered in person, so that the researcher can monitor participants' level of engagement. Finally, conducting the experiment in a lab setting rather than online may help to minimize distractions and other confounding variables that could undermine the study's validity.

If participants did get bored and lose interest while viewing the material, this suggests that the material may not have been engaging enough in the first place, which raises further questions about how to improve the quality of such materials. Our study thus reveals the need to include secondary measures in similar studies. For example, participants could answer follow-up questions about how much time they spent viewing the material, how engaging they found it, and how much they felt they had learnt from it. Responses to these questions could yield valuable insight into how to design more effective educational material about climate change.

In the same vein, it is crucial to remember that our study included only two media types: videos and infographics. There is, however, a plethora of other visual media that can be used in climate education, such as songs, dance, visual art, and documentaries. Future research can examine the effectiveness of each of these media types and conduct cross-comparisons to find out which type(s) are most effective for various audiences.
Studies can also look into educational materials for different demographics, e.g. by age, level of education, and culture. Finally, our study measured only short-term retention, but longer-term retention and the translation of climate science knowledge into climate-friendly behaviour are even more important. There thus needs to be continued research on ways to encourage individuals to move from knowledge and attitudes to daily behaviour and action.

Conclusion & Recommendations

Our world is a cauldron of rising problems, of which climate change will remain a key contributor for the indefinite future. Its global effects on weather patterns, socio-economic systems and both human and wildlife communities will continue and intensify unless effective interventions are taken. While large-scale, top-down interventions can be highly efficient, individual behavioural change can arguably have a more lasting impact. It is thus important to improve the general public's climate literacy, ensuring they understand not just the science behind climate change but also how their own actions can contribute to the solution. In this regard, media tools can be the best way to disseminate information to the public quickly and effectively, given the widespread accessibility of digital media today. Furthermore, the most effective tools are those capable of truly causing behaviour change, and previous studies have found that web-based interventions are more effective at increasing specific knowledge and changing behaviour than non-web-based interventions (Wantland et al., 2004). Our study contributes to ongoing research on the effectiveness of various media tools in climate education, and provides recommendations for future research.

For our client, the UBC Botanical Garden, this study highlights the importance of ensuring that educational media are engaging and informative in order to capture the audience's attention and maximize learning. Since participants in our study may have experienced boredom or fatigue while viewing the video or infographic, the UBC Botanical Garden could consider more interactive forms of media (e.g. online games) or a combination of visual media to capture a wider audience. Our study also implies the need to provide incentives for engaging with educational media. We suggest that the UBC Botanical Garden run an online campaign to recruit participants for a replication of our study, with a larger sample size and a longer data-collection period, while also educating the community. The Botanical Garden could set up both media learning tools to be assigned randomly throughout the campaign, with participants joining the study through links shared on websites and public social media pages. The incentive to learn should be tied to the cause at hand (Vark, 2014); in this case, the opening message (before the pre-test) can inform participants that they will be entered into a draw for a prize if they improve their score on the post-test. The prize should tie into the cause, such as free tickets to the TreeWalk (showing them the beauty of conserved carbon), or coupons for the gift shop where winners can get houseplants and other items that act as daily visual reminders of their carbon-footprint contributions. Ultimately, Vancouver has a relatively highly educated and environmentally conscious population, providing a great testbed for climate education.
As a key provider of environmental outreach and education, the UBC Botanical Garden is well placed to contribute to improving climate literacy amongst Vancouverites, and our study has revealed possible steps forward in doing so.

References

Cooper, C. B. (2011). Media literacy as a key strategy toward improving public acceptance of climate change science. BioScience, 61(3), 231-237.

Cordero, E. (2012). The use of social media to improve climate literacy: The Green Ninja Project. Bulletin of the American Meteorological Society, 93(12), 1813-1814.

Davis, L., Rountree, M., & Davis, J. (2016). Global cause awareness: Tracking awareness through electronic word of mouth. Journal of Nonprofit & Public Sector Marketing, 28(3), 252-272.

Gallagher, S., O'Dulain, M., O'Mahony, N., Kehoe, C., McCarthy, F., & Morgan, G. (2017). Instructor-provided summary infographics to support online learning. Educational Media International, 54(2), 129-147.

Hsin, W. J., & Cigas, J. (2013). Short videos improve student learning in online education. Journal of Computing Sciences in Colleges, 28(5), 253-259.

Ledley, T. S., Gold, A. U., Niepold, F., & McCaffrey, M. (2014). Moving toward collective impact in climate change literacy: The Climate Literacy and Energy Awareness Network (CLEAN). Journal of Geoscience Education, 62(3), 307-318.

Lee, T. M., Markowitz, E. M., Howe, P. D., Ko, C. Y., & Leiserowitz, A. A. (2015). Predictors of public climate change awareness and risk perception around the world. Nature Climate Change, 5(11), 1014-1020.

Levitin, D. J. (2014). The organized mind: Thinking straight in the age of information overload. New York, NY: Penguin.

Niepold, F., Herring, D., & McConville, D. (2007). The case for climate literacy in the 21st century. In 5th International Symposium on Digital Earth. Retrieved 7 April 2020, from http://rose.geog.mcgill.ca/geoide/files/geoide/CaseForClimateLiteracy.pdf

Satchwell, C. (2013). "Carbon literacy practices": Textual footprints between school and home in children's construction of knowledge about climate change. Local Environment, 18(3), 289-304.

Shafer, M. A., James, T. E., & Giuliano, N. (2009, January). Enhancing climate literacy. In 18th Symposium on Education, American Meteorological Society, Phoenix, AZ.

Tenenbaum, L. F., Kulikov, A., & Jackson, R. (2012, December). Headlines: Planet Earth: Improving climate literacy with short format news videos. In AGU Fall Meeting Abstracts.

Vark, C. (2014, May 28). World Hunger Day: Can Twitter end world hunger? The Guardian. Retrieved 7 April 2020, from https://www.theguardian.com/global-developmentprofessionalsnetwork/2014/may/28/social-media-raising-awareness-world-hunger

Wantland, D. J., Portillo, C. J., Holzemer, W. L., Slaughter, R., & McGhee, E. M. (2004). The effectiveness of web-based vs. non-web-based interventions: A meta-analysis of behavioral change outcomes. Journal of Medical Internet Research, 6(4), e40.

Wells, V. K., Ponting, C. A., & Peattie, K. (2011). Behaviour and climate change: Consumer perceptions of responsibility. Journal of Marketing Management, 27(7-8), 808-833.

Wynes, S. C., Donner, S. D., & Zhao, J. (2019, December). The limits of public carbon numeracy. In AGU Fall Meeting 2019. AGU.
Appendices

Appendix 1: Media used in our experiment

Video: https://www.youtube.com/watch?v=YseZXKfT_yY&t=2s

Infographic: [image included in the original report]

Appendix 2: Pre-test and post-test survey questionnaire

1. What are five key aspects of our everyday lives that contribute to our carbon footprint?
2. "Food miles" refers to the distance over which your food travels from the farm to your plate. True or false?
3. On average, how many kilograms of waste does an individual produce a day?
   ○ 1 kg
   ○ 2 kg
   ○ 3 kg
   ○ 4 kg
4. How much of the world's water is actually usable?
   ○ 0.03%
   ○ 0.3%
   ○ 1%
   ○ 3%
