UBC Social Ecological Economic Development Studies (SEEDS) Student Report

Education Programs in the LFS Orchard Garden and Agora Café: Creating a Survey for the Think&EatGreen@School Summer Institute

Candy Chang, Tannis Dobson, Dorothy Kwok, Jenn Perrin, Miranda Tse

University of British Columbia
LFS 450
April 8, 2011

Disclaimer: "UBC SEEDS provides students with the opportunity to share the findings of their studies, as well as their opinions, conclusions and recommendations with the UBC community. The reader should bear in mind that this is a student project/report and is not an official document of UBC. Furthermore, readers should bear in mind that these reports may not reflect the current status of activities at UBC. We urge you to contact the research persons mentioned in a report or the SEEDS Coordinator about the current status of the subject matter of a project/report."

The University of British Columbia Food System Project (UBCFSP)
Scenario 4: Education Programs in the LFS Orchard Garden and Agora Café
Creating a Survey for the Think&EatGreen@School Summer Institute
Group 5
Instructor: Dr. Andrew Riseman

Table of Contents
Abstract
Introduction
Methodology
Findings and Discussion
Recommendations
    For Stakeholders
    For Future LFS 450 Students
    For Next Year's Teaching Team
Project Evaluation
Conclusion
Acknowledgements
References
Appendices

ABSTRACT

The Think&EatGreen@School project is developing a Summer Institute, a three-day seminar that provides teachers with a professional development opportunity to learn how to strengthen the connections between components of the food system within their schools. The goal of this project was to design a survey that will determine what kinds of workshops should be implemented in the Summer Institute to provide teachers with ideas and resources for garden-based professional development.

A literature review was first conducted using Designing Surveys: a guide to decisions and procedures by Ronald Czaja and Johnny Blair. Information was then collected from various websites, with thinkeatgreen.ca as our primary source. While developing the survey, we were mindful to keep the questions succinct; to minimize the use of open-ended questions, negative connotations, and leading questions; and to use universal language. The final survey was completed after extensive revision and editing. The online survey software SurveyMonkey was used at first, but because a subscription was required for surveys exceeding ten questions, we were advised by the LFS IT department to use the Google spreadsheet program instead. This turned out to be a good option, as it automatically interprets and analyzes the survey responses.
One limitation of using the Google survey is that we are unsure whether the format complies with the ethics regulations for distribution; we therefore recommend that our stakeholders address this issue accordingly. To future LFS students, we recommend following our methods for creating a survey, as they were successful for this project.

INTRODUCTION

Think&EatGreen@School is a research project that addresses the issues of food security, food system sustainability, and adaptation to climate change within the context of Vancouver schools. At UBC, the Faculty of Education has teamed up with the Faculty of Land and Food Systems in the implementation of the Summer Institute as an extension of Think&EatGreen@School. The Summer Institute is a three-day seminar that provides teachers with a professional development opportunity to learn how to strengthen the connections between components of the food system and their schools. Our goal was to create a survey to find out what teachers already know about school garden systems and what workshops would be valuable for them to learn about. Upon completion of this project, the Summer Institute will be able to implement successful workshops in an outdoor classroom / teaching and learning garden, as an extension of the Land and Food Systems Orchard Garden (LFSOG) at UBC.

Providing this information to educators, with the expectation that they will implement it in their curriculum, generates an opportunity for children to see first-hand the workings of a local food system. This is important because, in order to maximize production, our current globalized food system neglects stewardship of the land (Think&EatGreen@School, 2011). These practices are neither sustainable nor responsible and thus require a fresh outlook.
The UBC food system is a good example of a food system that strives to create positive change by improving the health of both the environment and people. It does this using its Values Statement, SPICE, which stands for Sustainable, People first, Innovative, Caring and Excellence (UBC Food Services, 2011). This system can be used as a base model for teachers when they participate in the Summer Institute.

The purpose of our project is to design a survey that will determine what kinds of workshops should be implemented in the Summer Institute to provide teachers with ideas and resources for garden-based professional development. Prior to creating the survey, we conducted a literature review. The literature review helped us become more comfortable with designing and implementing a survey, given our lack of experience in the field. It taught us the basic concepts and ideas that encompass the creation of surveys and allowed us to create a successful survey.

METHODOLOGY

In hopes of finding an appropriate source to review, we began our research on Google Scholar on January 12th. We used phrases such as 'survey writing', 'survey processes', and 'survey guide', but found all of the literature to be too confusing and not very useful to us. We then did further research in the UBC library, where on January 19th we found an ideal survey-writing book, Designing Surveys: a guide to decisions and procedures by Ronald Czaja and Johnny Blair. This book contains examples of successful and unsuccessful survey questions, supported by flow charts and tables. In preparation for our literature review, we divided the book into sections amongst our group members. Each group member analyzed and interpreted their section, and on February 8th we combined our work into a PowerPoint presentation. On February 9th, we presented our literature review to our Teaching Assistant (TA) and fellow classmates.
The review consisted of the following topics: factors in questionnaire development, understanding the data collection process, how to construct a survey, criteria for survey questions, and organizing the questions. Our presentation also included an interactive portion where we showed our classmates examples of bad survey questions and asked them what kinds of revisions they would make. Through classmate interaction, we discovered areas and aspects of survey writing that might not have been noticed individually.

After our literature review, we began our primary data collection by researching the Summer Institute through the project website, www.thinkeatgreen.ca, on March 4th at 3:40pm. We then met with our project stakeholders to gain a full understanding of the Summer Institute and to determine our stakeholders' expectations for us and our survey. We started our project by creating a SurveyMonkey account at www.surveymonkey.com. This is an online tool that aids in the development and preparation of surveys for distribution. We also created a Gmail account so that survey participants have a contact for further inquiries. We intend for this email to be used by future LFS 450 students who work on this scenario.

Next, we began researching other sources and examples of surveys online in order to gather thoughts and ideas on what we would like to ask in our survey and on the kinds of questions we would like to use. On February 9th, Google searches were conducted using phrases such as 'garden surveys', 'garden-based learning surveys', and 'teaching-garden surveys', in order to gain a sense of what types of questions should be used in our survey (Appendix 1).

On February 16th, we posted a tentative timeline on Vista regarding the progress of our project (Appendix 2). We then worked as a group to create a pilot survey (please see Appendix 3) and executed a 'trial run' on our fellow LFS break-out room members, who critically reviewed the rough draft on March 9th.
Afterward, we compiled the feedback from our classmates and restructured our survey to incorporate their suggestions. Upon reaching agreement with regard to the survey revisions, we submitted a copy to our TA and our stakeholders to be reviewed once again. After analyzing their feedback and incorporating their revisions into our survey, we emailed it back to them for a second overview. On March 17th, they replied through email and requested further changes. After revising and incorporating these changes (Appendix 4: editing process), we developed a final copy of our survey, which was entered into SurveyMonkey for our stakeholders meeting on March 30th. Due to a budget constraint for the SurveyMonkey account, we decided to talk to Duncan McHugh, the LFS Multimedia Developer, and Morgan Reid, the Learning Technologies Specialist, for advice on how to proceed. After further discussion, they suggested that we use Google Forms, found at www.spreadsheet.google.com/gforms, where the final copy of our survey was posted; it can be seen in Appendix 5.

After countless revisions, we emailed the finalized copy to our stakeholders so that they can distribute the survey to the participating teachers on our behalf. We anticipate that the surveys will be distributed in mid-to-late April, with responses expected within a week's time. Due to time constraints, we will not receive the responses in time; therefore, we were unable to compile and analyze the data. The Google Forms website, however, allows the participants' answers to be entered into a spreadsheet for future analysis. Further instructions regarding analysis can be found at http://docs.google.com/support/bin/answer.py?answer=139706.

FINDINGS AND DISCUSSION

Overall, we found that developing and implementing a survey is much harder than we all expected.
All of us came into this project as amateurs with little to no experience working with or creating surveys, so we were surprised at how difficult and tedious the actual process was. After conducting a literature review using the book Designing Surveys: a guide to decisions and procedures by Ronald Czaja and Johnny Blair, we found that there are many different steps and components that must be taken into consideration when designing a survey. One of the main objectives of survey development is to keep the survey questions short and concise while still ensuring that all relevant information is captured (Czaja & Blair, 1996, p. 60). "As much as [survey questions should] sound natural, they are not simply forms of conversation. Ideally, survey questions are shorn of the vagueness, ambivalence, asides, and digressions of everyday speech" (Czaja & Blair, 1996, p. 63). Another objective within survey development is to be able to "write questions that are unadorned and uncomplicated, as explicit and single minded as a lawyer's interrogation. While we want self-administered survey questions to be read smoothly, it is important to recognize that a survey question is a very special construct with a clearly focused purpose" (Czaja & Blair, 1996, pp. 62-63). In other words, unlike everyday language, survey questions must be able to stand on their own without any supplementary explanation. Czaja and Blair suggest minimizing the use of open-ended questions, where respondents are not given explicit answer choices, and using closed-ended questions, where respondents are given explicit answer choices, as much as possible when creating a survey. This is because "data from open-ended questions are essentially narratives that must be interpreted and coded. After the survey is over, the researcher is still a step away from having results that can be analyzed quantitatively" (Czaja & Blair, 1996, p. 63), which increases the time the researcher spends on analysis.
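To make the difference concrete: because closed-ended answers are already explicit categories, tallying them takes only a few lines of code once the responses have been exported from the Google spreadsheet (for example, as a CSV file). The sketch below is our own illustration and was not part of the original project; the question wording and answer values are hypothetical placeholders.

```python
import csv
import io
from collections import Counter

# Hypothetical CSV export of survey responses; in practice this text
# would be read from a file downloaded from the Google spreadsheet.
sample_csv = """Does your school have a school garden?,Is the garden used for academic instruction?
Yes,Yes
Yes,No
No,
Yes,Yes
"""

def tally_closed_ended(csv_text, column):
    """Count the answer choices given for one closed-ended question."""
    reader = csv.DictReader(io.StringIO(csv_text))
    # Ignore blank cells left by respondents who skipped the question.
    return Counter(row[column] for row in reader if row[column].strip())

counts = tally_closed_ended(sample_csv, "Does your school have a school garden?")
print(counts)  # Counter({'Yes': 3, 'No': 1})
```

A tally like this is essentially all the quantitative step requires for closed-ended questions; open-ended responses would still have to be read and coded by hand, which is exactly the extra effort Czaja and Blair warn about.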
The use of open-ended questions can also be risky if different researchers analyze them, because such questions are often open to subjective interpretation, which can produce misleading survey results. Because "the reliability of the data obtained through survey research [also] rests on the uniform interpretation by respondents" (Czaja & Blair, 1996, p. 63), it is critical to avoid complex technical terms and to use universal language so that all respondents, of different ethnicities, backgrounds, and professions, are able to understand the survey questions.

We also learned about types of questions we should avoid when constructing a survey. Czaja and Blair stressed that it is important to avoid agree-disagree questions, where respondents are given a statement and asked whether they agree or disagree with it, because "research has shown that there is a tendency toward agreement, regardless of the question's content" (Czaja & Blair, 1996, p. 73). We learned that we had to be careful to avoid the structurally flawed double-barrelled question, which often "unintentionally has two parts, each of which the respondents may feel differently about." A question that asks, "Do you think your school garden and other school gardens throughout the Lower Mainland are doing an excellent, good, or poor job at educating children on healthy eating?" is an example of a double-barrelled question. A respondent who thinks their school garden is doing an excellent job while other school gardens are doing a poor job has no way to answer. We also discovered that we need to avoid questions that contain negative connotations, as well as leading questions.
Questions with negative connotations contain misleading words or phrases that can sway the participant's response towards an answer they may not have originally chosen. Leading questions may likewise influence a respondent's decision by 'leading' them towards a certain answer through the use of persuasive or positive words or phrases.

Although we ended up borrowing questions from other reliable surveys online, such as those from the University of California (University of California Agriculture and Natural Resources, 2011) and from the California School Garden Network (California School Garden Network, 2010), we modified the questions to ensure that they were appropriate based on Czaja and Blair's suggestions. We decided to borrow rather than create our own questions because an advantage of borrowing questions from previous studies is that we can, or may need to, compare results to previous findings. Such a comparison is much more problematic if our questions are different from those used previously: if our findings differ from the earlier study, we cannot say how much of the difference is due to the changed questions. In addition, questions from other surveys may have already undergone a great deal of testing, which may save us some effort (Czaja & Blair, 1996, p. 60).

After we established the most important questions we wanted to include in our survey, we started to write an introduction that would complement and precede the final survey. Czaja and Blair state that the introduction is the place to explain what the study is about, who is conducting it, who the sponsor is, why the study is important, and what will be done with the study results (Czaja & Blair, 1996, p. 34). The introduction should contain information to "convince potential respondents that the study is important enough for them to devote their personal resources of time and effort in to it" (Czaja & Blair, 1996, p. 78).
This is why, in our introduction, we decided to provide information on what the Think&EatGreen program is about, along with its website link, details on the Summer Institute, and an incentive for completing the survey (i.e., a chance to win free enrolment in the Summer Institute). After the introduction was complete, we organized our chosen survey questions into sections and ordered them from general to more specific, with the first question in the survey being the most general and easiest to answer. This is important because "most refusals occur at the introduction or during the very first few questions in self administered surveys" (Czaja & Blair, 1996, p. 83). The purpose of making the questions progressively harder and more detailed is to capture the respondent's attention and motivate them to finish the survey. Making the survey easy for respondents is also crucial: it is best to offer possible answers to select from, appropriate space for each answer, simple instructions, and, overall, an appealing appearance where size, word spacing, and survey length are all taken into consideration (Czaja & Blair, 1996, p. 92).

Presenting our literature review as a PowerPoint to our classmates allowed us to consolidate our knowledge and find out what they thought should be included in our survey. During the interactive session of our presentation, where we presented poor examples of survey questions and asked our classmates how they would improve them, we discovered that many of our classmates had difficulty revising the questions to make them more appropriate. We also found that receiving opinions and outlooks from so many classmates of different backgrounds and beliefs was very beneficial: aspects and areas of survey development that might never have been brought to our attention were raised through classmate interaction.
For example, a classmate pointed out that it is important to give proper definitions of uncommon words such as 'vermiculture'. Another classmate suggested incorporating a scale from one to five in our final survey. However, we explained that scales are not necessarily the best tool to use because, from our own personal experience, we believe that respondents may often choose the neutral option in order to finish the survey faster. The importance of spelling, grammar, and punctuation was also reinforced when one classmate became confused about what we were actually asking in one of the survey questions due to poor punctuation. Avoiding ambiguous questions also came up as an important topic through classmate interaction, which prompted us to reanalyze a few of the questions to make them clearer.

During the first encounter with our stakeholders, we were informed of the Think&EatGreen website, where we were able to find more information about the program and the Summer Institute. This project is a "community university collaborative project on food security in Vancouver schools and institutional adaptations to climate change" (Think&EatGreen, 2011). At the time of the meeting, we were unsure whether we would be able to receive results back from our survey due to time constraints; hence, our stakeholders told us to prepare an analysis guide for future LFS 450 students as part of the project, in case we were not able to do the analysis ourselves. We began developing an analysis guide in Excel, yet we later found that the Google spreadsheet program already provides users with an analysis guide. Therefore, we felt it was more appropriate to use the Google spreadsheet's analysis guide, as it is less prone to human error. Our stakeholders also told us that we should stress that the survey should take only five minutes, as teachers often have very busy schedules.
Furthermore, our stakeholders loved the idea of including an incentive for completing the survey, such as free enrolment in the Summer Institute, as they thought it would really increase the response rate.

After conducting the pilot survey with our classmates, we found that many revisions still had to be made. Many commented that the option "If yes, please specify" was confusing for question two in the pilot survey, which asked: "Is the school garden used for academic instruction?" For the list of workshop types, many found that the options were too broad or that the terminology needed further clarification. Our classmates informed us that we should allow participants more than one option in response to a particular survey question. Therefore, in question fourteen of our final survey, we allowed participants to select their top three choices of workshops they would like to attend, instead of constraining them to only one choice. This enabled us to acknowledge that many participants may have more than one reason for attending the Summer Institute. We found that changes had to be made to the list of subjects in question three: we needed to widen the spectrum of subjects to cover the curriculum of both elementary and secondary school. As the list of workshop choices became longer due to suggestions from classmates and stakeholders, we found that grouping workshops under similar headings would be adequate. For example, we divided the workshop questions into two sections, one targeting garden management and the other garden-based education. Question six, which asked participants what they were least interested in doing during the Summer Institute, was eliminated from the pilot survey because we found that it added no significance to the survey.

We discovered that communicating with each other and with others through email was much more effective than face-to-face communication.
We found emailing to be very efficient, as it allows for timely responses and the ability to send messages to multiple recipients. Using email also helped with organization by keeping our communication logs in Word format for future reference if necessary. Meetings in person were often very difficult to arrange with the stakeholders and with each other due to busy and conflicting schedules. Although we only met with our stakeholders once to discuss our project, it was a very effective meeting in which they gave us a multitude of information and recommendations. Despite our group's hectic schedules, we managed to find plenty of quality time outside of class to meet and work on the project as a group. One aspect to note during group meetings was the language barriers between group members. Because most of us have different backgrounds and ethnicities, we sometimes had a hard time understanding each other, which caused temporary confusion and a little frustration. However, we all felt that having this diversity within our group helped our project in the long run, because we all lent different opinions, beliefs, and outlooks to the finished project.

RECOMMENDATIONS

For Stakeholders:

While designing this survey, we learned from our literature review that demographic questions should be placed at the end of a survey instead of the beginning, where our stakeholders had suggested we put them. For our finalized survey, we placed the demographic questions at the beginning of the survey to meet the stakeholders' suggestions. It is recommended that a final decision be made by the stakeholders, Julia Ostertag and Chessa Adsit-Morris, on the placement of these demographic questions.
In addition to the results gained from the pilot survey reviewed by our classmates and the many revisions we made, we still recommend a thorough overview of the survey to incorporate any changes that our stakeholders see fit. This overview is recommended for both the stakeholders and the Teaching Assistant for this project.

Following the completion of the final survey, special attention should be paid to ensuring that the use of Google Forms to distribute the survey complies with the rules and regulations of the UBC Research Ethics Board. This review can be done while the survey is being prepared for distribution and ideally should be done by Julia and Chessa to guarantee that the survey will not introduce errors into the overall research project.

For future survey design projects, we recommend that the stakeholders provide a list of specific criteria to be met in the design of the survey. Ideally this should be done before the start of the term, as the list will give students a better idea of what the stakeholders want and need, and will put the students in the same mindset as the stakeholders from the beginning of the term.

For Future LFS 450 Students:

At the beginning of the term, we suggest that future LFS 450 students start by conducting a literature review on survey writing, given the time and understanding required to learn the techniques of proper survey writing. Planning a timeline for the group is also highly recommended, as it will help keep everyone in the group on track throughout the project.

Because survey design can be a tough and often tedious process, we think some motivation may be needed: it may help students to see the results of our completed survey before starting their project.
Due to time constraints, we did not complete a full analysis of our survey; however, the convenience of Google spreadsheets should make this process very easy. If the analysis is not already completed by Summer Institute members, it might be a good idea to have future LFS 450 students do it. All the information and instructions needed to compute the data are available on the Google Forms website at http://docs.google.com/support/bin/answer.py?answer=139706.

We recommend continuing to use Google Forms, as the program is easy to use and accessible to anyone with an internet connection. Another recommendation is to use and maintain the Gmail account that was created this year for any questions and concerns that a teacher may have about the survey. The maintenance of this email account can be overseen by future LFS students, and by the stakeholders once the term is over for the students.

Many papers have suggested that garden programs should start at an early stage of life; thus, our survey is designed mainly for elementary and high-school teachers. We do, however, think that university teachers should be considered as well, because it is never too late to get involved.

For Next Year's Teaching Team:

Our group felt that this year's survey project was very successful, and we have no recommendations for next year's teaching team. So long as the TAs continue to provide informative feedback, future groups should not have difficulties.

PROJECT EVALUATION

Our project was to design a survey to help determine which workshops teachers are interested in attending at the three-day Summer Institute. Finding sources on survey design was not hard compared to the actual process of designing a survey.
Even though we understood the basics of designing a survey, because most of us had little to no experience writing surveys, we never thought that designing a good survey would require so much time and background knowledge.

Over the last four months, we overcame many challenges, ranging from the usual group-work issues to the project itself. Since we are all taking more than one course and thus have different schedules, not all of us could always attend meetings outside of class. Another problem was communication: getting the right words and phrases across and coming to an agreement can sometimes be hard. However, these challenges did not stop us from getting the work done well. Other challenges came from the project itself. The choice of words and phrases can be genuinely difficult: some carry negative connotations, some positive; others can turn a question into a leading question; and sometimes a certain word or phrase did not fit, even though it seemed as though it should be included. Our project evaluation was done throughout the whole term, because every time we revised and edited the survey, we were required to evaluate it. After all the editing, revising, meetings with our stakeholders, and discussions with our LFS 450 TA, our finalized survey has become polished and professional. We evaluated our project through our patience, commitment, time, and endless effort.

CONCLUSION

Even though we might think we know how to write and design a good survey, designing one is actually a long process that cannot be done overnight. A great deal of background knowledge is needed to make the process of survey writing flow easily, and the survey must be tested and revised again and again to make it appropriate.

Upon completion of this project, we conclude that survey writing is intensive and tedious, as it is hard to find a voice that fits a wide audience and is not ambiguous.
Another challenge in survey writing was creating questions whose responses can be analyzed in an appropriate manner. We found that the key to a successful survey for the stakeholders, survey writers, and workshop designers was a collaborative effort, so that all relevant information was included.

In conclusion, we believe we have created an appropriate survey that meets the needs of our stakeholders and our goal in LFS 450 Scenario 4. This is our greatest success in LFS 450.

ACKNOWLEDGEMENTS

We extend our gratitude to LFS 450, Will Valley, and our stakeholders, Julia and Chessa, for their helpful advice, guidance, support, and commitment, which made this project successful.

REFERENCES

California School Garden Network. (2010). California School Garden Education Survey. Retrieved from: http://www.lifelab.org/wp-content/uploads/CA_School_Garden_Survey_2011.pdf

Czaja, R. & Blair, J. (1996). Designing Surveys: a guide to decisions and procedures. Thousand Oaks, California: Pine Forge Press.

Kentucky School Garden Network. (2011). Kentucky School Garden Network. Retrieved from: http://surveymonkey.com/s/schoolgardens

Sustainable SMU. (2011). Sustainable SMU Learning Garden Survey. Retrieved from: http://www.surveymonkey.com/s/RZKH66G

Think&EatGreen@School. (2011). Summary of Proposed Research. Retrieved from: http://www.thinkeatgreen.ca/summary-of-proposed-research

UBC Food Services. (2011). Values Statement. Retrieved from: http://www.food.ubc.ca/about-us/values

University of California Agriculture and Natural Resources. (2011). Garden Based Learning Workgroup Planning Survey.
Retrieved from:http://ucce.ucdavis.edu/survey/survey.cfm?surveynumber=464&back=none19APPENDICESAppendix 1: ULRs for survey designhttp://aggie-horticulture.tamu.edu/kindergarden/survey/survey.htmhttp://ucce.ucdavis.edu/survey/survey.cfm?surveynumber=464&back=none,http://www.surveymonkey.com/s/RZKH66G, http://surveymonkey.com/s/schoolgardens,http://www.lifelab.org/wp-content/uploads/CA_School_Garden_Survey_2011.pdf,http://blogs.cornell.edu/garden/grow-your-program/evaluation-toolkit/surveys/http://www.grownyc.org/blog/?p=63Appendix 2: TimelineAppendix 3: Pilot SurveyAt UBC the Faculty of Education has teamed up with the Faculty of Land and Food Systems inthe implementation of the Think&EatGreen@School project. This research project addresses theissues of food security, food system sustainability, and adaptations to climate change within thecontext of Vancouver schools. Our goal is to implement an outdoor classroom/teaching andlearning garden as an extension of the Land and Food Systems Orchard Garden (LFSOG) atFeb 16 Timeline and Qs for stakeholders uploaded onto VistaFeb 23 Meeting with stakeholders // Develop QuestionsMar 2 Develop QuestionsMar 9 Finalize Questions // Test run of survey in classMar 16 Paper Outline DueMar 23 Work on paperMar 30 Work on presentationApr 6 Final Presentation in ClassApr 8 Final Paper DueFormatted: Heading 1, Centered, Linespacing: 1.5 linesFormatted: Font: Calibri, 11 ptFormatted: Line spacing: Multiple1.15 liFormatted: Book Title, Font color:Custom Color(RGB(54,95,145)), NotSmall caps, Not Expanded by /Condensed by20UBC. The Think & Eat Green @ Project is developing the Summer Institute, which is a weeklong seminar that provides teachers with a space and resources for garden-based professionaldevelopment. 
Further information about the Think&EatGreen@School Project can be found at http://thinkeatgreen.ca.

This survey is designed to determine which garden-based workshops Vancouver school teachers are interested in learning about at the Summer Institute. The survey is very short and should take only about 5 minutes of your time. Upon completion of the survey prior to March ___, 2011, you will be entered to win a free entrance ticket to the Summer Institute. If you have any questions regarding the survey, please feel free to contact us at ubc.garden.survey@gmail.com.

Survey Questions:

1. Does your school have a school garden? If yes, please answer the following questions. If no, please skip to question #3.
A. Yes
B. No

2. Is the school garden used for academic instruction?
A. Yes
B. No
If yes, please specify how it is used for academic instruction:

3. Which of the following subjects do you think should be incorporated into the Summer Institute? Select all that apply.
A. Business
B. Computers
C. Science (Physics/Chemistry/Biology)
D. English
E. Creative Arts (drama, music, graphic arts, film, visual arts)
F. Environmental Studies
G. Career and Personal Planning (CAPP)
H. History or Social Studies
I. Home Economics
J. Languages
K. Mathematics
L. Nutritional Health
M. Physical Education
N. Leadership (Service Learning/Community Service)
O. Other (please specify)

4. What types of workshops would you be interested in taking at the Summer Institute? Select all that apply.
A. Understanding the local ecosystem
B. Garden design
C. Incorporating annual plants (a plant that germinates, flowers, and dies in a year or season)
D. Incorporating perennial plants (a plant that lives for more than 2 years)
E. Soil building
F. Composting techniques (vermiculture: a process where waste is eaten and converted by worms)
G. Seed starting/seed saving
H. Nutrient cycling
I. Pollination (the role of pollinators in pollination)
J. Integrated pest management
K. Season extension (including greenhouse growing)
L. Water management
M. Weed management
N. Harvesting
O. Nutrition and human health
P. Cooking
Q. Preserves
R. Other (please specify)

5. What would you be MOST interested in doing at the Summer Institute? Check the THREE options of MOST interest to you.
A. Discussing policies and issues relevant to community and school gardens
B. Finding out about resources, support, and available tools that support school and community gardening
C. Learning through sharing ideas and best practices
D. Learning from presentations of other community and school garden projects and their experiences
E. Learning technical information on how to start and manage a garden
F. Networking, making connections, and building relationships
G. None of these options are of interest to me
H. Other (please specify)

7. School Name
8. What grade(s) do you teach?*
9. Your Name*
10. Email Address*
*Optional: only used to notify prize winner

Appendix 4: Survey Draft #3 with comments and suggestions from stakeholders

Appendix 5: Finalized Survey