UBC Theses and Dissertations

Online participatory systems: a case study of a web platform for public consultation. Castro Miravalles, Claudia Maria, 2016.

ONLINE PARTICIPATORY SYSTEMS: A CASE STUDY OF A WEB PLATFORM FOR PUBLIC CONSULTATION

by

Claudia Maria Castro Miravalles

B. Environmental Sciences and Arts, Universidad Central de Chile, 2003

A THESIS SUBMITTED IN PARTIAL FULFILLMENT OF THE REQUIREMENTS FOR THE DEGREE OF MASTER OF SCIENCE in THE FACULTY OF GRADUATE AND POSTDOCTORAL STUDIES (Forestry)

THE UNIVERSITY OF BRITISH COLUMBIA (Vancouver)

April 2016

© Claudia Maria Castro Miravalles, 2016

Abstract

The aim of this study was to address the issues discussed by scholars regarding the potential and challenges of online participatory methods, and to contextualize them in a case study of a web platform for public consultation. To accomplish this, a framework was first established through a literature review, and the theories described in the academic literature were then contrasted with the empirical results of this study. Using a mixed methods approach, an online survey (N=118) and phone interviews (N=23) were conducted. The survey and interviews addressed satisfaction with the current experience of participating online, the trade-off between transparency and privacy, governance involvement, and familiarity with technology in the context of online public consultation, in the hope of learning more about the concerns users have regarding the potential replacement of more traditional methods of consultation with online methods. Based on the experience of the participants in this study, we were able to identify some of the elements that increase and decrease satisfaction when participating in public consultation online, which seems critical in guiding efforts to increase online engagement. It was also found that participants prefer features such as summary tables, graphs, and pictures when presented with data related to the topic under consultation, and that they value having multiple options for providing feedback on a participatory web platform. Finally, we discuss the possibility of further research aimed at providing participants the opportunities they look for to engage in a more nuanced way. Although this research sheds light on online behavior, it also demonstrates that this is a field that requires constant exploration as the interaction of Internet users with participatory platforms evolves.

We think that understanding how theory and practice come into play in the empirical scenario of a public consultation web platform could help online platforms and mechanisms not only to improve online participants' experience but, most importantly, to focus on ways to achieve the larger goals of public consultation via the Internet.

Preface

This study was designed, implemented and written by Claudia Castro under the overall supervision and guidance of Dr. Michael Meitner. The survey was designed by Claudia, revised by her supervisor Dr. Meitner, and approved by her committee members, Dr. David Tindall and Dr. Maged Senbel. Dr. David Tindall provided guidance on research methods. A scientific paper arising from this study is currently in review. This research was assessed and approved by the University of British Columbia Behavioral Research Ethics Board, under Certificate Number H14-01191.

Table of contents

Abstract
Preface
Table of contents
List of tables
List of figures
List of abbreviations
Acknowledgements
Chapter 1: Introduction
  1.1 Information communication technologies and online participation
  1.2 Public consultation
  1.3 Context of the research questions
  1.4 Research questions
  1.5 Literature review
    1.5.1 Context of participatory processes
    1.5.2 Public consultation
      Transparency versus privacy
      Efficiency
      Involvement
    1.5.3 Online governance involvement
    1.5.4 Social media for participation
    1.5.5 Participation and participatory processes online
    1.5.6 Limits of public consultation
Chapter 2: Methods
  2.1 Research design
  2.2 Online survey
  2.3 Interview
  2.4 Sampling
    2.4.1 Population of interest
    2.4.2 Recruiting
    2.4.3 Data collection
  2.5 Ethical issues
Chapter 3: Results
  3.1 Demographics overview
    3.1.1 Gender
    3.1.2 Level of education
    3.1.3 Income range
    3.1.4 Age
    3.1.5 Planners and general public
  3.2 Online survey areas of research
    3.2.1 Governance involvement
    3.2.2 Familiarity with technologies and use of online participation tools
    3.2.3 Privacy concerns, and transparency versus privacy
    3.2.4 Satisfaction of the use of online participatory platforms
    3.2.5 Online public consultation
  3.3 Interviews
    3.3.1 Coding of the content
    3.3.2 Satisfaction of the use of online participatory platforms
      Increase satisfaction
      Decrease satisfaction
    3.3.3 Trade-off between transparency and privacy
    3.3.4 Public consultation platforms
    3.3.5 Role of public consultation
    3.3.6 General comments
Chapter 4: Discussion
  4.1 Satisfaction of the use of participatory web platforms
  4.2 Trade-off on transparency and privacy
  4.3 Governance involvement of participants
  4.4 Further areas of potential investigation
    4.4.1 Limitations of the study
Chapter 5: Conclusion
  5.1 Recommendations
References
Appendices
  Appendix A Online survey
    A.1 Email invite
    A.2 Consent
    A.3 Survey page 1
    A.4 Survey page 2
    A.5 Survey page 3
    A.6 Survey page 3 (continued)
    A.7 Survey page 4
    A.8 Survey page 5
    A.9 Survey page 6
    A.10 Survey page 6 (continued)
    A.11 Survey page 7
    A.12 Survey page 7 (continued 1)
    A.13 Survey page 7 (continued 2)
    A.14 Survey page 8
  Appendix B Interview
    B.1 Follow up interview protocol

List of tables

Table 3.1 - Classification of nodes for the question: "What things make you more or less satisfied and why?"
Table 3.2 - Classification of nodes for the question: "What are the issues that are important to consider when trading off your personal privacy with increased transparency of decision making in your community?"
Table 3.3 - Classification of nodes for the question: "What is the main benefit or potential benefit of public consultation software?"
Table 3.4 - Classification of nodes for the question: "What is the role of online public consultation? To inform decision-making processes, or to help communities create their own vision of the future?"
List of figures

Figure 1.1 Context of case study within online platforms
Figure 2.1 Methods used in the research
Figure 2.2 Welcome page of the case study web platform where the survey was hosted
Figure 2.3 Frequency distribution of types of questions implemented in the online survey
Figure 3.1 Frequency distribution of gender of participants
Figure 3.2 Frequency distribution of level of education of participants
Figure 3.3 Mean scores of level of education compared to transparency versus privacy preference of participants
Figure 3.4 Frequency distribution of income of participants
Figure 3.5 Frequency distribution of age of participants
Figure 3.6 Planners vs. non-planner participants
Figure 3.7 Voting frequency per type of election
Figure 3.8 Frequency of attended face-to-face meetings of participants
Figure 3.9 Frequency distribution of how often participants read social media
Figure 3.10 Frequency of content creation of participants
Figure 3.11 Preferred web features for online consultation
Figure 3.12 Familiarity with other participatory platforms
Figure 3.13 Frequency that participants play online games
Figure 3.14 Frequency that participants respond to online surveys
Figure 3.15 How comfortable participants feel about sharing personal information
Figure 3.16 Frequency distribution of participants inclined toward transparency or privacy
Figure 3.17 Satisfaction levels of participants with the use of participatory web platforms
Figure 3.18 Favorable beliefs towards online public consultation
Figure 3.19 Negative beliefs towards online public consultation
Figure 3.20 Coding steps

List of abbreviations

ANOVA – Analysis of Variance
BC – British Columbia
BREB – Behavioral Research Ethics Board
CONICYT – From the acronym in Spanish of the National Commission for Scientific and Technological Research
CORE – Course on Research Ethics
Eta2 – Measure of effect size for use in ANOVA
F – F ratio
NVivo – Qualitative Data Analysis (QDA) computer software package produced by QSR International
Dr. – Doctor
ICT – Information Communication Technologies
PC – Public Consultation
PI – Principal Investigator
SD – Standard Deviation
SPSS – Statistical Package for the Social Sciences
TCPS2 – Tri-Council Policy Statement

Acknowledgements

I would like to express my heartfelt gratitude to several people who were part of this journey of professional and self-growth, and to God and the universe for placing them in my life path. There are many other people who, in different ways, were part of this expedition and somehow helped me with encouraging words, advice, or by listening. To all of them, thank you. All the support, encouragement and love I received from them made this experience surpass all the expectations I had of becoming a student one more time.

First, I would like to thank my mother, who gave me the last little push I needed to keep chasing my dream of becoming a graduate student, and also for all her encouragement, coaching, and for teaching me by example to persevere despite adversity. Thanks to my brother, Tito, for awakening the curiosity for sciences in me. I would like to thank you, Sean, for your patience while I was a "student"; you have shown the greatness of a strong man who believes in women's potential and power, and especially in me.

From the Faculty of Forestry I would like to thank my supervisor Dr. Michael Meitner for guiding me through this process with great spirit, for our discussions that allowed me to increase my analytical thinking, for sharing your research experience with me, for your revisions, and especially for your valuable feedback. I thank you, Dr. Tindall, for guiding me in the territories of quantitative research; your classes and directions in this field and the feedback provided to this research were of great benefit. Dr. Senbel, thanks for your revisions, feedback, and for guiding my directed studies credit.

From the department of Forest Resource Management I thank Scheyla, David, Jerry W, Jerry M, and Carl for providing assistance with technical issues and administrative help. Thanks to Gayle, Robin and Susan who always, with great disposition, guided me through the mysterious grounds of administrative paperwork and requisites.

To my lab fellows at the IDEAL lab: Lorien, thanks for engaging in discussions to fix the world's problems, for your humor, and for those times you brought your four-legged companion, Grace, who would "woof" and make the days entertaining in the lab. Angela and Pille, thanks for your warm hearts; Julian, for sharing your insight; and Miki, for your words of encouragement. I would also like to thank a group of great people whom I am now honored to call my friends: Vanessa, Alvaro, Estefanía and José, and especially Claudia, for making the school experience, and this world, a better place.

I would also like to thank Colleen Hardwick and Yuri Artibise for their valuable insight on online public consultation. I would like to express my appreciation to a group of wonderful people who are part of the UBC community and who filled this period of time with fond memories. First, I thank all those who provided me with the opportunities I was given, from working as a TA, marker, and invigilator at the Sauder school, and as a research project coordinator in Costa Rica, to auditing French classes and volunteering at the farm. You all allowed me to feel deeply blessed for the fortune of becoming a UBC alumna, and to be able to embrace it. Finally, I thank CONICYT and Mitacs for the funding opportunities provided during this time.
Chapter 1: Introduction

Online public consultation is a participatory tool that is currently used by public and private organizations to reach out to people, via the Internet and Information Communication Technologies (ICT), to find out their opinions on certain topics. These tools are generally set up on web platforms to provide information, gather feedback, and communicate the results of the consultations back to participants, stakeholders, or both. The private and public sectors have been developing their web platforms and ICT systems with the aim of improving the communication interface with users.

This trend of gathering public input via the Internet continues to grow, and has spawned a growing population of digitally literate users and organizations. This thesis investigated people's perceptions of an online consultation platform. Through a case study approach, special attention was paid to the users' experience with the consultation platform. Since the Internet is a tool for the diffusion of ICT between people and organizations, it was decided that the approach of this study would not be limited to public consultation platforms. We looked at tools for online public consultation and participatory initiatives, and together they allowed us to explore online systems intended for citizen involvement with governance in their communities. The aim was to gain broader insight and to find opportunities for improvement of online participatory web platforms. The author of this thesis would like to acknowledge that the literature reviewed in this introductory chapter covers only a portion of the scholarly research available in the various areas of research interest.

1.1 Information communication technologies and online participation

Almost 40 years ago our society started adopting Information Communication Technologies (ICT), especially the Internet, and their use progressively became popular worldwide. As a result of this growth in popularity, the Internet is now ubiquitous in our society. ICT have become more interactive for Internet users since the widespread adoption of web 2.0 platforms (Seel, 2012). These platforms allow their users to become "virtually present" when performing online activities, which today is shaping the way we go about our daily lives; years ago, these activities would have required our physical presence. For example, in many countries it is widely popular to perform tasks such as banking, shopping, and reading the news online, and in countries such as Estonia electronic voting was adopted 10 years ago (Tsahkna, 2013).

These web platforms also host virtual communities where users interact or share their opinions on various topics. Canada is also in a digital age: by 2016 the percentage of Canadian households that had access to the Internet was 88.5% [1]. According to the World Bank (2012) [2], new ICT promote growth and support social and cultural advances, among other advantages for development.

[1] ("Canadian Internet Survey," 2012)
[2] ("Information & Communication Technologies Home," n.d.)

As society is headed towards a digital age globally, numerous scholarly papers have been written about the potential (Martínez-Ballesté et al., 2013; Park, 2011) and challenges (Joinson et al., 2010; Zviran, 2008) of the use of the Internet for governance and participatory initiatives.
Since the 1980s, scholars have attempted to predict the future of Internet use and how it would impact our lives as a society (Campsie, 2007). Consequently, this is the driving factor of this research: to think about present and future scenarios in which the main focus is how we, as a society, interact online for governance involvement and decision-making.

The capabilities of web 2.0 allow users to constantly communicate in very accessible ways, for example by using social media, and there seems to be no lack of innovation in how to use the technology of these applications (Ciuccarelli et al., 2014). However, from the focus of this research study, it is more important to understand how these interactive features of web platforms are being used to maximize their usefulness for public consultation. This research focuses on the interface between Internet users in their role as citizens and the use of web platforms that serve for participation and decision making on governance.

1.2 Public consultation

Public consultation is a type of participatory process, within the context of governance, where people are encouraged to provide their opinions on the issues relevant to the consultation. The main goals of this process are to increase transparency, efficiency, and the involvement of people affected by decisions on policies, development projects or laws (Rodrigo & Andrés-Amo, 2006). Implementing this process traditionally involves participatory activities that depend on the timing and financial resources available. These activities frequently include public meetings, open houses, workshops, and public advisory committees (Environment, 1990; Rodrigo & Andrés-Amo, 2006). In fact, participatory initiatives are always looking for ways to become more effective, and because one of the goals of public consultation is to make processes more efficient, different sectors involved in governance have responded by incorporating ICT into their participatory initiatives.

With the growth in popularity of the Internet, governance is moving into a digital age where the options offered to participants provide the chance of becoming virtually present. In this way, participatory initiatives found more opportunities to gather people's opinions by using the Internet. Since the early 1980s, discussions on the implementation of online participation (Walbridge, 1982) and the underlying theoretical framework started shaping the path for other uses of online services such as online public consultation (Hague & Loader, 1999). Meanwhile, public consultation in its traditional forms became a key aspect of the planning process during the last four decades (Shipley & Utz, 2012), and the incorporation of the Internet into public consultation processes began in earnest in the early 2000s.

As an example, Min (2007) discussed the advantages of online deliberation, which is a form of involvement and engagement, suggesting that it is as valuable as traditional methods of participation. Furthermore, Min (2007) argues, both types of deliberation can increase participants' knowledge of the issue, their political efficacy, and their willingness to participate in politics. While theoretically the Internet has been considered a promising tool for participation, scholars have found issues with its implementation and applications.
The issues associated with online participation most commonly cited are privacy concerns (Cho & Larose, 1999, p. 425; Jensen et al., 2005, p. 204; Park, 2011, p. 219; Solove, 2006, p. 26; Zviran, 2008, p. 97), technological barriers (Hargittai & Hinnant, 2008, p. 602; Van Aerschot & Rodousakis, 2008, p. 317; Warschauer, 2002, p. 4), Internet users' behaviors such as time spent online (Boulianne, 2009, p. 195; Preece et al., 2001, p. 348), lurking (Nonnecke et al., 2006, p. 8; Van Mierlo, 2014, p. 20), preferences in uses of the Internet (Boulianne, 2009, p. 205), and online engagement (Coleman & Gøtze, 2002, p. 24).

Scholars have found that all of the issues stated previously affect, in one way or another, the goals of public consultation: to increase transparency, efficiency, and involvement. These issues are the ones that motivated my curiosity to carry out this research. This thesis paid attention to these issues and contextualized them in a case study, with the aim of understanding both the challenges and the potential of increasing the use of online participatory methods.

Figure 1.1 Context of case study within online platforms

The objectives of this study were:
• To explore the levels of satisfaction that users of the case study platform experience and find out if this relates to their opinions regarding the trade-off between transparency and privacy
• To compare and contrast the results of this study with the findings of relevant literature
• To develop the first case study of an online public consultation platform

1.3 Context of the research questions

Some scholars (Bonfadelli, 2002; Hargittai & Hinnant, 2008; Van Aerschot & Rodousakis, 2008) argue that there is a challenge in implementing online participation initiatives based on the technological barriers that users might encounter. Although the argument was pointed out more than 10 years ago, there is a hypothesis still debated by scholars concerning the "knowledge gap" as a social consequence of the information society. In other words, Bonfadelli (2002) identified the challenges of implementing online participation in terms of the knowledge gap and how people use the Internet. He identified barriers to people benefitting from the information society, among them the lack of basic computer skills, the lack of access to new media due to its cost, and the limited user friendliness of web platforms.

The interface between the Internet and its users has improved greatly since the findings of Bonfadelli (2002). Fortunately for online participation systems, using the Internet has become a daily activity for most of the developed world, and society started increasing its digital literacy skills more than 30 years ago, contributing to the development and growth in popularity of the Internet. However, there are still other barriers to cross. For example, the concept of the second-level divide (Min, 2010) refers to factors such as motivations and Internet skills. This means that although a whole set of vocabulary and expressions has been added to people's general awareness of technologies, which reflects an increase in digital literacy, technologies are constantly evolving. These challenges and barriers have given rise to criticism of online participation (Verdegem & Verleye, 2009): online public consultation (PC) systems are criticized because of the digital divide and the privacy concerns of participants. However, despite these criticisms, and given the outcomes of applying online participation in different fields, it seems evident that there are opportunities to increase citizens' involvement through online public consultation.
For example, it is known that technology provides the ability to manage big data (McAffee & Brynjolfsson, 2012), allows people to participate at their own pace and time, and enables the building of networks for online communities with similar interests (Preece & Maloney-Krichmar, 2005), to name some of the advantages that technology possesses. Online PC has benefits such as providing flexibility to people constrained by their schedules, allowing them the opportunity to give feedback at their own convenience of time and pace. Also, for those who find it intimidating to share their opinion in public, an online platform might be a more comfortable space to voice personal opinions. Finally, new technologies such as touchscreens or voice-activated devices are making it easier for the elderly to learn and participate.

In Canada, some online participation initiatives have already been implemented [3], and this research study investigated one of them using a case study approach: PlaceSpeak, a location-based web platform for consultation established circa November 2011 [4]. It makes an interesting case study due to its increasing popularity. The company provides a web platform system that allows public and private organizations to consult stakeholders on topics. Stakeholders participate by becoming members [5] of the web platform. What differentiates the case study participatory web platform from others is that it validates the genuineness of its members by different methods. By validating memberships, the intention is to eliminate the "anonymity" factor of participants, which is another issue that challenges participation carried out online and that has been described by scholars (Wallace, 1999).

[3] PlaceSpeak ("PlaceSpeak," n.d.), MetroQuest (MetroQuest, n.d.), Bang The Table ("Bang the Table," n.d.), Engagement HQ ("EngagementHQ," n.d.).
[4] (Artibise, 2011)
[5] PlaceSpeak has a membership system for participants to join at no cost.

During the last 15 years, there has been increasing interest in carrying out research related to web platforms and the impacts of the use of the Internet and online tools for decision making and planning. However, the scarcity of scholarly articles based on case studies of web-based participatory platforms is what motivated this study. It seemed useful to explore the empirical experience of users of a participatory platform. Nevertheless, it is important to mention that during the timeframe in which this research took place, the number of scholarly publications dealing with topics of online participation and web 2.0 increased. In this thesis, we used references from academic literature on both online participation and online public consultation, because these participatory initiatives have theoretical backgrounds in which online behavior has been examined. The idea of transferring the knowledge gained in one field to enhance the understanding of another similar field is not new. As an example, while conducting the literature review it was found that Hrastinski (2009) had taken this approach, arguing that to understand online learning, online participation first had to be understood.

1.4 Research questions

The research questions were based on the challenges that, according to scholars, online participation faces: privacy concerns, technological barriers, Internet users' behaviors, uses of the Internet, and engagement.
These concepts and their rationale are developed in detail in the literature review carried out for this thesis, which follows this section. The research questions were intended to focus on the challenges of online participation that seemed most relevant to the case study web platform, while setting them into context. Using the case study web platform, PlaceSpeak, the research questions are:
• Are users of the platform satisfied with the current online participation experience?
• What is the online participants' trade-off between privacy and transparency?
• What is the participants' involvement in government?

To address these questions, an online survey was carried out in which users of the case study web platform referred to their personal experience with online participation and online public consultation. Additionally, due to the nature of this exploratory research methodology, a short telephone follow-up interview was carried out. The purpose of the interview was to deepen the understanding of the participants' experience and perceptions regarding three topics related to the use of online participation tools: the trade-off between transparency and privacy, the role of participatory tools, and the elements that shaped their levels of satisfaction when participating online. It is important to mention that, although specific research questions were stated, it was expected that the data analysis would also shed light on other aspects to improve the understanding of the perceptions and experience of participants while engaging in online participation.

1.5 Literature review

In the previous section, the Introduction to this thesis, we presented the context of the research and elaborated on the justification for this research study in a broad sense. It was also explained that, according to scholars, online participation brings challenges as well as possibilities for its use. In this sense, it is necessary to set the context of participatory processes from a broader perspective, to be able to explore and extrapolate to online public consultation, as scholarly articles use interrelated theoretical backgrounds regarding online behavior.

The following subsections present the literature review that was conducted to understand the different components, factors, and context of online participation. Section 1.5.1 presents the context of participatory processes. Section 1.5.2 provides a general description of the main goals of public consultation; its subsections (transparency, efficiency and involvement) expand on each of the goals of public consultation, contrasting them with the challenges that the online version of participation usually faces as a participatory process. Section 1.5.3 presents perspectives on online governance involvement. Section 1.5.4 discusses the use of social media for participation. Section 1.5.5 discusses participation and participatory processes online, with the aim of providing a broader vision of online participation, and Section 1.5.6 presents the limits that scholars argue exist for public consultation.

1.5.1 Context of participatory processes

Due to the diversity of topics that potentially exist in participatory processes, there is great variety in both the people who participate (Reed et al., 2009; Uyesugi & Shipley, 2005) and the methods that are implemented (Rodrigo & Andrés-Amo, 2006).
In this sense, some authors firmly state that the outcomes of some participatory processes may be counterproductive and even frustrating for the participants (Adams, 2004; Berman, 1997; Innes & Booher, 2004). Adams (2004), for example, argues that public meetings are not achieving their main goal of "giving citizens the opportunity to directly influence decisions made by governing bodies", and that they are instead useful for sending messages to officials and setting agendas. Innes and Booher (2004), for example, claim that the theory and practice of participation struggle with the conflict between individual and collective interests, and with democracy and voices that are never heard.

So far, most of the research has focused on what motivates people to participate (Gray, 1989), what groups are poorly or inadequately represented in participatory processes (Johnson et al., 2004; Jung et al., 2001) in contrast to groups that play a more important role (Prell et al., 2009), and how privacy concerns and perceptions shape people's online behaviors (Boulianne, 2009). All of these factors potentially interact with variations in people's interests (Bonfadelli, 2002), motivation levels, cultural and experiential backgrounds (Bimber, 2000; Hindman, 2000), and ages (Loges & Jung, 2001). For example, research focused on environmental public consultation found that younger people tend to delegate responsibility to government, decreasing their involvement (Wray-Lake et al., 2010).

From the perspective of motivations behind online behavior and participation in online communities, Nov et al. (2010) carried out an insightful analysis of different online participation methods, finding several correlations between the motivations (both extrinsic and intrinsic) that underpin types of participation. They found that, within the intrinsic motivations, the "enjoyment" of newer community members was reflected in increased metainformation [6] sharing; the opposite occurred for older members. On the other hand, within the extrinsic motivations to participate in online communities, they found that "self-development" and "reputation" factors were also related to the number of contributions that participants provided in online communities.

[6] Metainformation: information about information. For example, if a picture is considered information, then the size of the file, the resolution, and the date the picture was taken would be meta-information (Nov et al., 2010).

Within the discussion of the low representation of certain groups, young people could be misrepresented due to lack of interest (Bakker & de Vreese, 2011), the elderly due to a lack of technology use or digital knowledge, and other people due to low access to the Internet. According to Loges and Jung (2001), inadequate representation can be due to technological barriers, referred to in the literature as the digital divide. The literature indicates that the elderly, people with mobility impediments (Gallagher et al., 2011), or people living in remote communities (Hindsworth & Lang, 2009) may be misrepresented. Bonfadelli (2002) even argued for the concept of a double digital divide, where barriers arise from a knowledge gap as well as gaps in skills of use and attitudes towards the Internet.
An analysis of public participation by Innes and Booher (2004) focused on the duality that underlies the debate between citizens and government and that encourages adversarial participation. The authors proposed a collaborative model in which civic leaders and interest groups participate. A detailed analysis of public participation processes allowed the authors to conclude that the keys to success were dialogue, networks, and institutional capacity. Dialogue is the communication and listening practice in which people are heard respectfully and understand others' points of view. By networks, the authors referred to people building personal and professional networks, understanding others' perspectives, and in most cases building trust. Innes and Booher stated that social capital made participants more knowledgeable and competent, allowing them to believe more in their ability to make change happen.

In the literature, institutional capacity is identified as a combination of social, intellectual, and political capital (Cars et al., 2002). The concept of social capital [7] explored by Wellman (2001) was brought back into discussion by authors studying online participation, who argue that social capital is a complex component to analyze and that there are differences between older people and teenagers with regard to attitudes towards the use of social networking (Pfeil et al., 2009). From a theoretical point of view, the concepts of social, intellectual, and political capital provide a theoretical background that justified exploring the socio-demographics of our case study sample.

[7] James Coleman (1988), sociologist, considers social capital to be social resources existing in multiple dimensions of a social structure to facilitate social action to produce various benefits. It includes social relations of all sorts, such as networks, obligations and expectations, and norms and sanctions (Harvey, 2014).

Despite the concerns raised by some scholars, others believe that online PC systems do offer advantages over traditional methods (Andrews et al., 2003; Chen, 2007; Rhodes, 2003). Today, online participation is becoming more popular and aims at higher participation rates (Gil De Zuniga et al., 2009; Hoffman et al., 2013; Lam, 2004; Ostman, 2012). It is widely agreed that the Internet has the potential to increase connectivity and reach more people, but also, if used in a collaborative way, it can foster positive environmental change (Pike et al., 2005). Indeed, throughout the time that the Internet has been in existence, scholars have described and speculated regarding the trends and issues of the use of the Internet for participation (Hilbert et al., 2009). Later, Shipley and Utz (2012) reviewed the past two decades of theory and practice of PC, and in terms of web-based techniques they highlighted the need for further research to determine "whether these methods can gain ground in reducing public cynicism and distrust through their ease of use and multipronged approach."

In addition, although measuring the success and satisfaction of people involved in participatory processes is not easy, some authors have attempted to do so (Barki & Hartwick, 1994; Baroudi et al., 1986; Verdegem & Verleye, 2009). Baroudi (1986), for example, explored the impact of user involvement on information system development, concluding that user information satisfaction may lead to increased system usage.
Even though this evidence is at the level of systems development, and not within the context of Internet use, the authors suggest that more exploratory studies are needed to investigate the impacts of involvement at different levels and to determine types of users. The study carried out by Jack Baroudi (1986) built some of the background needed to understand people's behavior related to systems usage in general. Barki and Hartwick (1994) focused on developing measures of participation and involvement as two different constructs. They identified specific dimensions of the behavioral construct of user participation and the psychological construct of user involvement and attitude. Barki and Hartwick provided a starting point for the study of the relationship between users' participation, involvement, and attitude during system implementation. Later, in 2009, scholars built a theoretical model to measure e-government satisfaction through 29 indicators (Verdegem & Verleye, 2009). This model combined the domains of predicting user acceptance and measuring satisfaction with e-government, and in the same work Verdegem and Verleye (2009) related it to some of the Information Communication Technology acceptance theories.

These investigations reflect that online participation issues are complex and the result of an evolving scenario. We witness a society rapidly adapting to technology, and web platforms constantly trying to address users' privacy concerns. With the advances of behavioral research, the challenge is to try to efficiently adapt web platforms and prepare for what online users expect to see by addressing their concerns.

By taking the relevant issues in the literature on online participation into account and by developing a better understanding of the concepts previously described, we are better prepared to explore how online public consultation has evolved over time. For example, Sanna Malinen (2015) carried out a systematic review exploring user participation in online communities and arrived at the conclusion that so far most of the research has been based on quantitative studies comparing volume of activity, remarking that further research should "investigate the quality of participation and particularly, the influence of participation for community" (Malinen, 2015). Research in this field is even more relevant now that legislation requiring public consultation is considered in most of the planning efforts of developed countries and is a requirement of the World Bank and other aid institutions working in developing countries (Reed et al., 2008; Shipley & Utz, 2012).

1.5.2 Public consultation

As public consultation aims to improve people's involvement, the incorporation of online systems seems to be the natural course of adaptation of participation, especially with the current progress in ICT (Panopoulou, 2009). Now that people introduce a variety of technological aids into everyday life, governments, planners, and managers have also been trying to adapt (Chen, 2007). Moreover, studies regarding the development of online participation in Europe showed increasing activity in the field, referred to in the literature as e-participation (Panopoulou, 2009). In Europe, e-participation in 2009 translated into people contributing by email responses to consultation topics or participating in discussions online.
With the increasing amount of data available on the Internet, it is possible to gather information on the several types of participatory initiatives currently running around the world and to get a better picture of how they are shaping our present and future.

Transparency versus privacy

Considering that PC aims to increase transparency, some scholars' research has focused on privacy concerns as an issue that may affect online behavior and, hence, transparency. From the opposite perspective, allowing participants to stay anonymous (Chess & Purcell, 1999; Joinson, 1999; Panopoulou, 2009; Wallace, 1999) generates doubts regarding the quality of the participants. Joinson et al. (1999), for example, argue that a user's behavior differs depending on context (Internet versus real-world behavior).

The concept of privacy is a complex one. There have been numerous attempts to define it from different perspectives (Paine et al., 2007; Westin, 2003), and, besides, each person has their own perception of privacy (Kwasny et al., 2008). Some authors argue that the first attempts at defining privacy provided legal precedents for a right to privacy (Prosser, 1960; Solove, 2006). In this sense, what originated as an interest in clarifying what was understood by privacy concerns is not far from the privacy issues that Internet users deal with today. Later, and related to the topic of this research, the definition of privacy was used to support and guide technology development (Seničar et al., 2003) by specifying requirements for web sites that work with personal data in order to avoid liabilities (Joinson et al., 2010; Regan et al., 2013; Spiekermann & Cranor, 2009). Nevertheless, it is commonly agreed that new perspectives and areas of research provide guidance for developing new ways to respond to people's beliefs and expectations of how privacy is handled (Cho & Larose, 1999; Dine et al., 2008; Foxman & Kilcoyne, 1993). For example, Cho and Larose (1999) focused on exploring aspects of privacy when carrying out online surveys and concluded with a set of recommendations that today are widespread practices or part of the privacy policies of some web platforms. Among these recommendations are: not using cookie identifiers [8], disclosure statements, encryption [9] of data, multiple response options [10], and privacy certifications.

[8] Cookie: "A file that is stored on a client computer that is using a browser. It is initially deposited there by a server and is used to store information that might be required over one particular session or over a number of sessions with a browser. One use for cookies is to identify users and prepare customized Web pages for them. For example, a cookie might be used to store the identity of someone who has used an ecommerce site that sells some commodity by taking the name of the user and their email address. This cookie can then be read by a server next time the buyer uses the browser in order to personalize a greeting and to relieve them of the repetitive effort of providing such data." (Ince, 2013)
[9] Encryption: "The process of transforming some text known as the plain text into a form which cannot be read by anyone who does not have knowledge of the mechanisms used to carry out the encryption. The transformed text is known as the cipher text. There are a wide variety of algorithms available to carry out this process." (Cho & Larose, 1999)
[10] Multiple response options: "For those respondents who still are concerned about informational privacy on the Internet, the alternative response media of mail, telephone, or personal interview can be offered. By providing conventional (i.e., snail mail) mailing addresses, Internet surveys can be printed out and returned via the mail but without respondents' return addresses if they so choose." (Schaefer & Dillman, 1998)

Efficiency

From the perspective of increasing efficiency, which is another goal of public consultation, online public consultation platforms provide useful technological tools for administering the data generated through participatory processes. In an era of increasing population growth and pace of life, it is fundamental that participation offers efficient solutions. Staying current with changing scenarios is valued by society; for example, keeping updated information available to the public makes the process dynamic, and a better tool that records, displays, and manages information from surveys or discussions makes online public consultation more attractive. In this sense, whether efficiency is sought from a marketing perspective, to gain trust (McKnight et al., 2002) and advance business development, or from the perspective of e-government (Chen, 2007), the interest in understanding the scope of privacy in both cases is to make administration more efficient (Wang & Bryer, 2013).
Involvement

The third goal of public consultation is increasing participants' involvement. In this regard, some authors have described how the Internet facilitates involvement (Gil De Zuniga et al., 2009; Wellman et al., 2001). Hargittai and Hinnant (2008), on the other hand, explained how a group of people with specific socio-demographic characteristics may not be represented in instances of online participation. Other scholars who have analyzed the technological barriers to increased involvement in online participation have discussed specific technological solutions such as simpler interfaces. Similarly, there has been progress in investigating ways to improve online survey response rates by paying attention to specific considerations of survey design to make surveys more appealing and efficient (Dillman et al., 2008). For example, Andrews et al. (2003) depicted the challenges of reaching hard-to-involve Internet users and developed a list of considerations for survey design to reach these audiences.

Previous online participation initiatives serve to reveal the potential as well as the gaps of online public consultation. Nevertheless, we were unable to find in the literature a case study that contextualized the three main goals of public consultation (transparency, efficiency and involvement) and explored how they are affected by the typical issues of online participation systems described in this thesis. Therefore, this research study provides an analysis of one such online public consultation tool currently in use in the Greater Vancouver region.

1.5.3 Online governance involvement

Governance involvement is a topic that has been addressed from several perspectives in the academic literature. Therefore, it makes sense to pay attention to the instances where stakeholders can get involved in governance from a broad perspective, beyond the times of elections [11].

[11] For example, Canadian government web platforms have implemented numerous online public services such as downloading and submitting forms or permits, and payment for services or taxes. In regards to electronic voting, the mandates on developing and testing for federal elections in Canada date from the year 2000. At the municipal level, Markham, in the province of Ontario, was the first municipality to introduce electronic voting, in 2003 (Schwartz & Grice, 2013).
There are many possibilities within participatory initiatives. Voting is the formal way to participate; however, other instances such as attending town hall meetings, signing petitions, or deliberating also contribute to communicating interests or concerns and to setting the agendas for policies or decision making. From the perspective of participatory web platforms, scholars have examined offline political participation aimed at influencing government and policy, and its relation to online involvement, in efforts to understand the potential benefits of online platforms for decision making or planning (Coleman & Gøtze, 2002).

Based on Thomas Bryer's rationale, only limited benefit to democratization processes would be obtained from investing in improving web platforms. As Bryer (2011) argues, there are big challenges in finding an optimal distribution of economic resources among the cost of producing a participatory process, the participation process itself (which basically transfers costs to participants through the time devoted to participating), and how these affect the democratization process (the quality of the participatory process). He analyzed the results of participatory processes based on those three elements and came to the conclusion that the combination only results in a low-cost democratization process, which is optimal, when there is an increase in production cost and high participation, but this would require citizen and administration capacity building.

1.5.4 Social media for participation

Throughout this thesis, some of the advances of technology and its current use for participation have been described. Gaining increasing attention is social media, which is seen not only as an application that functions as a means of communication, but also as a two-way communication medium for voicing political or governance opinions (Bryer & Zavattaro, 2011). Studies on this topic have taken place since 2008, and scholars have paid increasing attention to explanations for this phenomenon's popularity. Social media is a very fast-paced method of communication, and due to the way it is set up it is possible to analyze its content (data mining) and find trends over periods of time. It is important to point out that, during the last few years, according to Rojas and Puig-I-Abril (2009), the focus of research in this field has shifted towards exploring specific uses of communication technologies and their consequences for society (individual and collective). Initially, researchers focused on matters related to Internet use, exploring accessibility and inequalities.

1.5.5 Participation and participatory processes online

A broad range of online participatory activities can be found on the Internet and in the literature, from online participation for education (Hrastinski, 2009) to e-governance (Carlitz & Gunn, 2002) and decision-making.
William Pike (2005) developed the idea of 22 incorporating the Internet and communication technologies for collaboration on environmental change and discussed the advantages that the Internet infrastructure could provide, but also how collaborative science could be improved through the use of Internet and communication technology systems (Pike et al., 2005; Seltzer & Mahmoudi, 2012).  In general, technology has been used as a platform to communicate and provide information on the consulted issue, to deliver surveys to stakeholders, and also to arrange other participatory initiatives within the public consultation process (Rodrigo & Andrés-Amo, 2006).  Gunter (2006) evaluated the state of e-democracy measuring major central and local government services that were available and operational by the end of 2005 in Europe. During the first decade of the 21st century, Europe embarked on a project to enable the provision of public services by electronic means as much as possible. The changes that have been implemented in different countries in terms of ICTs reflect how the mass use of Internet technologies is happening across societies. Thanks to those countries that spearheaded online participatory initiatives, it has been possible to identify benefits and opportunities to improve these participatory platforms and online governance capacity.  In terms of participation and participants, (Li & Bernoff, 2011) explained that due to the variety of forms of interactions Internet behaviors and participation are not predictable. The author classified people according to the type of interactions as inactives, spectators, joiners, collectors, critics, and creators. In the study, 51% of the people were classified as inactive. This classification was related to age: people over 50 years old were classified as inactive due to their paucity of interaction with online technologies. Younger people were typically seen as joiners, as they tended to engage in different types of online platforms. If 23 the goal of PC is to increase people’s involvement in participatory processes, the number of inactive people is alarming. Li argued that the variations of involvement at different ages were likely to be highly variable over time, but inferences regarding the data are limited since the study is from 2007 and the societal context of the use of technology has changed. She also highlighted the benefit of online systems, which are traceable, and available in real time to people that are interested in following the process.  A recent study (Bakker & de Vreese, 2011) found clues for political participation amongst young people in the context of the use of the Internet for communication. Their study helped in gaining a better insight into the role of media in affecting participatory behavior. The survey carried out by the authors clarifies the understanding of governance involvement and participation of younger internet users (18 to 24 years-old).  Campsie (2007) explained what happens with online participation at the municipal level in Canada. The author described the different possible forms of collaboration in use -for example; allowing people to upload personal stories, photographs, documents, and other contributions- to help build knowledge of the community’s past. She also referred to another widely used example, Google© maps, and specifically to the interactive interface that allow users to upload information, add data to the satellite photograph, and link it to specific geographic locations. 
In this sense, participatory activity linked to georeferenced data has set precedents for more sophisticated platforms that are linked to decision making (Brown & Weber, 2011; Carver, 2001; Redaelli, 2012). Another example that Campsie (2007) cites is the use of social media to provide an interface for discussion between users, where they are able to share and discuss environmental issues and a variety of social concerns, and even promote involvement by signing a petition or setting up appointments for active participation. These examples were later put into practice by Ciuccarelli et al. (2014) and summarized in a book that expands on different applications that used social media data mining for urban planning. Campsie (2007) also made an interesting observation regarding the anonymity of participants when providing feedback and how important it was to be able to identify people, especially when feedback is related to decision-making. Campsie stated that online systems provide a supplemental form of participation, but surveys that remain anonymous may be questionable when the data is applied. Anonymity is an issue that had been discussed by Wallace (1999) and later by Chen (2007) and Park (2011). More specifically, after studying the behavior of internet users, Boulianne (2009) discussed the effects of the Internet on engagement and concluded that online news tended to have larger effects on engagement across time when compared to other measures.

Applications of participatory processes offer different viewpoints that apply to the study of online participatory systems. For example, in education, there have been several advances in the use of technology to facilitate people's participation (Matheus & Ribeiro, 2009). In the discipline of planning, the use of geographic information systems (GIS) as online tools to engage people (Carver, 2001) is already established. GIS systems are often linked to decision-making processes and serve as another case to analyze when thinking about online public consultation. The same issues discussed earlier, such as transparency, digital literacy, privacy, acceptance of technology, and ethics, are at the center of the discussion of governance and the electronic vote (Park, 2011; H. Wang et al., 2011). For example, Wellman et al. (2001) referred to the historical observations carried out by Robert Putnam from the 1960s to the 2000s that reflected a long-term decline in civic involvement. While this may sound discouraging, they believe that over time Internet use is becoming normalized as it is incorporated into routine practices of everyday life. They also found that heavy Internet use was associated with increased participation in politics and voluntary organizations (although not in their online versions), and that heavy Internet users were people who committed less to an online community. People's online interactions supplement face-to-face and phone communication without necessarily decreasing or increasing them. However, participation in online and offline voluntary organizations and in politics was positively associated (Wellman et al., 2001). Scheufele and Nisbet (2002) stated that the role of the Internet in promoting citizenship involvement was limited. They found that those who used the Internet for entertainment tended to feel less efficacious regarding their role as participants and, at the same time, were less informed about current events.
Thus, the authors stated that mass media has an important role in promoting citizenship, independently from the intensity and type of internet use (whether it was to obtain information or for entertainment).   More recently, and after carrying out an empirical study, Bakker and de Vreese (2011) found a positive relationship between different internet uses and political involvement within young people (18 to 24 years-old) —a more specific finding than studies from a decade earlier that had found a link between the types of internet use and political participation (Scheufele & Nisbet, 2002). Focusing on the type of internet use brings us back 26 to Boulianne’s (2009) line of thinking and findings, connecting the effects of online media on internet users across time. Another example of highly-used internet tools that relate to participation is social media services. In a study carried out in Colombia (Velasquez, 2012), it was found that expertise influenced participation, but also influenced discussion of topics; however, the effect of those variables was stronger for some topics. The survey presented in this study explores how users of the participatory web platform -used as a case study- use different types of social media web platforms, and its application in terms of finding a potential relation to participation and governance involvement. 1.5.6 Limits of public consultation This literature review makes it clear that revisiting the fundamental goals of PC is essential to find ways in which online participation helps facilitate transparency and efficiency in the consultation process. While assessing the utility of online participation systems, the discourse focus changes according to the discipline. It is agreed that new perspectives and areas of research provide room to develop methods to better handle online participation, in ways that actually respond to people’s beliefs and expectations regarding the online participation experience. The issues of online participation should be particularly considered by planners and managers involved in public consultation. Nowadays more participatory initiatives are complemented by (or migrate to) online versions, and while increasing transparency and efficiency are still goals of public consultation, it is hard to achieve these goals because of the trade-off between transparency and privacy that stakeholders make when engaging in these processes and their online behavioral nature. For 27 example, a study in Italy compared face-to-face participation versus online participation, and concluded that the approach and tools for participation are more accepted in theory than in practice (Garau, 2012).  Observations made by Campsie (2007) regarding technology and its use pointed out that, historically, predictions regarding technology have been wrong. For example, offices were expected to go paperless by the 1990s and people were expected to work only four days a week as technology made people more efficient. Furthermore, Campsie insisted that online systems brought new forms of communication, but they could not replace face-to-face meetings completely.  It is true that access to the digital world can be a limitation to participate online for those who have less access to computers or the Internet. In addition, some people find it difficult to express themselves in writing, and they may prefer not to share their concerns when feedback needs to be written. 
However, over time, technology has become more popular, and owning technology gadgets may reduce the technological barriers associated with lack of use of the Internet. Despite the popularity of the Internet, there are still some segments of society left out, according to Jung et al (2001). Indeed, Jung et al (2001) questioned methods previously used to measure the digital divide, which took into account time spent online and gadgetry ownership. Jung et al's study revealed clues regarding the scope of goals and activities online and the difference in use among different age groups. They found that "older people subjectively evaluate their internet connection to be as central as younger people do". Jung et al claim that the index they developed captures the multidimensional nature of the people-Internet relationship after people gain access to the Internet.

Initial theories concerning the Internet held that computer mediated communication (CMC) would be a more individualistic environment that reduced social cues and thereby undermined the social and normative influences on an individual. Brunsting and Postmes (2002), however, argued that the Internet, even as a socially isolated medium, can reinforce social unity. In their study, activists and non-activists were found to perceive that internet activism provided strategic freedom to choose what to participate in without dealing with the consequences. From the perspective of another behavioral study, exploring the effects of perceived satisfaction with online community newspapers, Chung and Nah (2009) provided insights into how the interactive features of news presentation elicited higher satisfaction among online community audiences. Chung and Nah's (2009) study reported that an increase in the number of interactive features of web sites was positively associated with the perceived satisfaction of users. Therefore, user satisfaction is one of the aspects explored in the survey of this thesis, to examine whether there is a relation between the preferred interactive features of participatory web platforms and satisfaction with the online participation experience.

In terms of privacy, Park (2011) examined online behaviors, analyzing the interrelation of several variables such as familiarity with technology, experience with the Internet, knowledge, and other socio-demographic variables. Park found evidence that describes a second digital divide, an issue already raised by Bonfadelli (2002), and furthermore expanded on the limitations that the digital divide places on exercising privacy control when an omni-competent user is assumed. This is due to the assumption that Internet users, in general, have increased their overall familiarity with technology.

In the following chapter, "Methods", we describe how this research thesis was implemented and how, through a survey and interviews, we intended to investigate the way in which online platforms attempt to increase involvement and make participatory processes more efficient.

Chapter 2: Methods

2.1 Research design
This project is exploratory in nature, as it intends to gain a better understanding of issues that have not been clearly identified or explored in the literature. As indicated in the literature review, online participation faces certain challenges, and there is debate about this among scholars. Some emphasize its potential and benefits, while others focus on the challenges that have to be sorted out to achieve a better participatory experience.
For this reason, we were interested in learning directly from participants about their experience of, and satisfaction with, using participatory platforms for consultation and, with respect to the challenges, in understanding the trade-off they perceive between transparency and privacy. Because the study combined several topics of research interest, an exploratory approach was chosen: the literature review revisited some of the issues that online participation faces (the digital divide, privacy concerns and online behavior), and although there are studies (Himelboim et al., 2012; Martínez-Ballesté et al., 2013; Min, 2007; Rivera-Sanchez, 2009) that have looked at these specific challenges of online participation, no study has investigated these issues concurrently. This research study is justified in that it is, to our knowledge, the first to contextualize these issues using a case study approach. The limitations of this study for generalization are acknowledged; nevertheless, it sheds light on the areas of research interest.

The research design involved mixed methods, as we decided to conduct both quantitative and qualitative data collection through an online survey and phone interviews. The quantitative data provided frequencies to describe the responses of the participants. The qualitative data gathered from the interviews provided the chance to further explore concepts of the research questions (see 1.4) that were not fully addressed in the survey, such as satisfaction with the use of online platforms, the trade-off between transparency and privacy, familiarity with technologies, and participants' general opinions about online public consultation.

Figure 2.1 Methods used in the research

It is estimated12 that 7,000 members were invited to participate in the online survey. Although we will not report a response rate, because web technicalities13 beyond our control prevented us from determining one, some measures were taken to increase the number of respondents. These measures are explained in detail in the sections on the online survey and interviews (Sections 2.2 and 2.3). To begin this exploratory study, an online survey was conducted as part of the quantitative data collection. As previously indicated, there were five general areas of research interest, and they are reflected throughout the questions of the online survey and interviews.

12 For details on recruiting and the estimate of invited members, see Section 2.4.2.
13 Due to the privacy policy of the online platform that hosted the survey, and because the invitation was sent through an automated process, PlaceSpeak was asked to provide an estimate of the number of invitations sent (see Section 2.4.2).

Additionally, in order to complement the research with qualitative data, the subsequent follow-up phone interview questions focused on deepening aspects related to online behaviors and perceptions regarding the role of participatory web platforms. This allowed us to further explore topics that could not be completely addressed with the information obtained from the online survey alone. Qualitative research allows for the exploration of complex events and phenomena, as well as for contextualizing studies (Babbie, 2007).
These methods allowed us to analyze the validity of the theoretical framework described by scholars, and to compare it with the context of the case study.  33 2.2 Online survey An online survey was developed and implemented to elucidate the issues that online participation faces according to the literature review.   Figure 2.2 Welcome page of the case study web platform where the survey was hosted. The final version of the survey was built using FluidSurvey, an online software/service for survey design, licensed to UBC Department of Sociology. The survey was divided into 5 main sections for each research interest, and consisted of 63 questions in total. These questions included 1 question for the general classification of participants based on the source of recruiting, 8 questions regarding demographic data, and 1 question enquiring about voluntary participation in a follow up interview (See 5.1Appendix A  - and Sub Appendices for the online survey). 34 In order to simplify the survey for participants, it was designed so that several questions had a structure based on similar statement sentences or similar multiple-choice responses. This way, participants could quickly work through the questions. This method minimizes the risk of lengthy questionnaires - which could be a reason for participants potentially withdrawing from the study.  To begin, the survey asked participants the following question: (Q1) Did we send you an email inviting you to participate in this survey? This question allowed us to estimate the number of participants that had joined the study depending on whether they responded to the email invite or because they learned about the survey during a visit to the PlaceSpeak web site. The topics that the survey explored were as follows: • governance involvement: This section had 4 questions in total, all based on a frequency scale type of response. Participants were asked: What is the frequency that you do the following activity; (Q2) vote in federal or national elections, (Q3) vote in provincial or state elections or (Q4) vote in municipal elections. The choices provided as response were; always / usually / about half the time / seldom / never / prefer not to answer. The final question of this section asked; (Q5) How many times have you participated in your life time in formal face-to-face public meetings? Choices for the frequency scale provided to respond to this question were: 0 times, 1 time, 2 to 4 times, more than 4 times. • familiarity with technology and use of online participation tools: This section had  three sets of questions, and three individual questions. 35 o The first set of questions of the familiarity with technologies section asked about the frequency that participants read social networks (Q6 – Q10) such as Facebook, Google+ or LinkedIn. The second set of questions asked about the frequency that participants create content online such as; (Q11) comment in online papers, (Q12) post in social media websites or (Q13) write in a blog. For these two sets of questions, the options were based on a frequency scale type of responses: daily / more than once a week / once a week / once a month / less than once a month / never. The third set of questions of this section asked: Have you ever used any of these other participatory platforms (Q14 – Q21)?: Crowdbrite, Delib, EngagingPlans, Granicus, IdeaScale, MetroQuest, MindMixer, OpenTownHall. 
The multiple-choice options provided to respond were: Yes, I liked it / Yes, it was fine / Yes, I did not like it / No, I have never used it.
o The last three individual questions of the familiarity with technologies section consisted of two multiple-choice questions and one open-ended question. The first question asked: (Q22) How often do you respond to requests to fill out surveys? For this question, the multiple-choice options were based on a frequency scale: always / usually / about half the time / seldom / never. The second question asked: (Q23) Do you play online games? The multiple-choice response options were based on a frequency scale: daily / more than once a week / once a week / less than once a month / never. The final open-ended question of this section was: (Q24) What games do you play online? A comment box was provided to respond by typing text.
• privacy concerns, and transparency versus privacy: In this section there was a group of multiple-choice questions and a slide bar question, also known as a semantic differential question. The multiple-choice questions asked participants to select from a Likert scale how comfortable they feel about sharing personal information (Q25 – Q30), such as email address, last name and profile picture, for example. The Likert scale provided the following response options: comfortable / somewhat comfortable / neutral / somewhat uncomfortable / uncomfortable. In the slide bar question, participants were expected to slide the bar towards their preference in the trade-off between transparency of participatory processes and personal privacy. The slide bar translated into scores that ranged from 0 (Transparency) to 11 (Privacy), including a neutral position at score 6.
• satisfaction of the use of online participatory tools: This section had two multiple-choice questions and one comment box question. The questions were: How satisfied are you with your experience with participatory platforms (Q31) in general, and (Q32) with PlaceSpeak? Response choices were provided on a Likert scale: highly satisfied / satisfied / neutral / unsatisfied / highly unsatisfied. The comment box question asked participants: (Q33) Have you had an experience with a consultation topic on PlaceSpeak that you really enjoyed? If so, what made it memorable?
• online public consultation: This section provided three sets of multiple-choice questions.
o The first two sets of questions together consisted of fourteen statements of belief (Q34 – Q47): the first set contained statements in favor of online public consultation (Q34 – Q40), and the second set contained negative statements about online public consultation (Q41 – Q47). The responses were based on a Likert scale: agree, somewhat agree, neutral, somewhat disagree, disagree.
o The third set of questions (Q48 - Q52) related to the features preferred by PlaceSpeak participants, such as videos, pictures, and graphs, for example. The Likert scale provided as responses: very important, important, moderate importance, little importance, unimportant, don't know.
• demographic questions: this set of 8 questions (Q53 - Q62) used mainly multiple-choice questions.
• follow up with interview question: (Q63) Would you be willing to participate in a short follow up interview regarding online participation? This was the last question of the survey, and asked participants to provide their contact information if they wanted to participate in the follow-up interview to expand on their responses.
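Before analysis, scale responses like those above have to be expressed numerically (the slide bar already yields scores from 0 to 11, while the Likert and frequency labels need to be coded). The following is a minimal, illustrative sketch in Python of how such a recoding could be done with pandas; the column names, sample values and coding direction are invented for the example and are not taken from the actual survey file.

```python
import pandas as pd

# Hypothetical raw responses; column names and values are illustrative only.
raw = pd.DataFrame({
    "q22_survey_frequency": ["always", "usually", "seldom", "about half the time"],
    "q31_satisfaction": ["satisfied", "neutral", "highly satisfied", "unsatisfied"],
    "q_transparency_privacy": [2, 9, 6, 11],  # slide bar score, 0 = transparency, 11 = privacy
})

# Ordered category labels -> integer codes (higher code = more frequent / more satisfied).
frequency_scale = ["never", "seldom", "about half the time", "usually", "always"]
satisfaction_scale = ["highly unsatisfied", "unsatisfied", "neutral",
                      "satisfied", "highly satisfied"]

def encode(series, ordered_labels):
    """Map ordered response labels to 1..k integer codes."""
    codes = {label: i + 1 for i, label in enumerate(ordered_labels)}
    return series.map(codes)

raw["q22_code"] = encode(raw["q22_survey_frequency"], frequency_scale)
raw["q31_code"] = encode(raw["q31_satisfaction"], satisfaction_scale)

print(raw[["q22_code", "q31_code", "q_transparency_privacy"]])
```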
This follow-up made it possible to better address the research questions.

The frequency distribution of the types of questions used in the online survey is indicated in Figure 2.3; the percentages only include questions related to the areas of research interest.

Figure 2.3 Frequency distribution of types of questions implemented in the online survey

The survey was hosted on PlaceSpeak's servers, and once it was approved by the UBC Ethics Board it was deployed online. Participants were able to fill out the online survey by accessing it through a hyperlink.

2.3 Interview
All participants who indicated their willingness to participate in the follow up interview were contacted by email, and phone appointments were set up at a convenient time with those who followed through on their interest in being interviewed. An interview protocol was designed prior to the online survey deployment and was considered a preliminary version. This version was revised based on the exploratory data analysis carried out with preliminary results of the online survey. The revised version of the interview (Follow up interview protocol, Appendix B.1) allowed for a deeper exploration of the topics that addressed the research questions not fully covered by the online survey data. The questions addressed satisfaction with the use of online participatory platforms, trade-offs between transparency and privacy, and the satisfaction with and roles/benefits of online participation software. The participants were contacted by phone. For organizational purposes, and to maintain the confidentiality of participants within the recorded files, each audio file was named with a 9-digit number that matched the participant number from the online survey. The interviewee's name was not mentioned once the recording of the interview started. The interviews (N=23) were transcribed using Dragon NaturallySpeaking dictation software. The transcriptions were verbatim and kept for records.

2.4 Sampling
The sampling technique was a convenience sample. The sample was comprised of PlaceSpeak members who gave consent to participate in the online survey and who had received an email invitation through the hosting web platform. The weaknesses of this type of sampling technique are acknowledged, as the results obtained do not allow generalizations to the population. All respondents were adults ranging from 18 to 89 years of age, all members of the case study online public consultation website, PlaceSpeak. The number of respondents achieved for the online survey was N=119, and for the short follow-up telephone interviews it was N=23.

2.4.1 Population of interest
The population of interest for this research is the users of the case study web platform, a location-based platform for public consultation called PlaceSpeak. There were several reasons why the members of the PlaceSpeak public consultation web platform were chosen as the population of interest and why the platform was selected as a case study. First, the fundamental idea underpinning a case study of a participatory platform is that its implementation and operation face some of the theoretical challenges debated by scholars and discussed throughout this thesis (the digital divide, and engagement through online participation at different ages). Second, this startup was conceived as a UBC alumni project and is currently growing in popularity.
To put the growing popularity of this web platform in perspective: at the beginning of this study the number of members was around 5,000 and, by November 2013, it had grown to over 7,300. Most of the members (71.42%) belong to the Mainland Vancouver area (Source: PlaceSpeak). Finally, another reason to choose this web platform as a case study was that it offered a good opportunity to understand, from an empirical example and first hand from the participants' perspective, some of the elements that relate to online engagement and participation for decision-making in a community.

2.4.2 Recruiting
We intended to reach out to all the members of the case study web platform. To do this, we originally set no geographical boundaries, including the entire world. As the notification email was an automated process administered by the case study web platform, the number of members that were finally invited to participate is based on an estimate14. At the time the online survey was deployed, all the members had their settings activated to receive notifications; therefore the invitation would have been sent to more than five thousand members world-wide. In case members had accidentally missed the email invitation, they were still able to participate through the hyperlink publicly available on the web platform while the survey was deployed online.

14 See details on the estimate in Section 2.1, Research design.

2.4.3 Data collection
The survey was deployed once the ethics approval from the UBC Research Ethics Board had been obtained. Starting in July 2014, data was collected for a period of 16 weeks. Data generated by the survey was hosted and stored on FluidSurvey servers, located in Canada. This data was also backed up on a server at UBC, which had secure access and was protected by a password. The questionnaire length of the online survey and the potential interest in the research topic might have affected the number of responses. The data was cleaned up by eliminating incomplete questionnaires that had responses to less than 50% of the questions, as well as duplicates. Although 171 questionnaires were started, the final number of cases that were included in the analysis and considered valid was N=118. The data obtained from the online survey and interviews combined provided insight to address the research questions, in such a way that the quantitative and qualitative data complemented each other. Regarding the gathering of the qualitative data, despite the initial interest of 56 respondents in following up with interviews, only 23 participants were available to be interviewed. The interviews were transcribed, and NVivo software was used to code the content. The classification of content into codes allows the researcher to identify and focus on the topics that are relevant for the research purpose. Another method of analysis used was a key word search within the content of the interviews; this helped identify words used repeatedly by participants that had not been identified in the initial coding of topics. The qualitative data analysis process is well documented by authors such as Bryman (2007) and Creswell (2008).

2.5 Ethical issues
This research study followed the ethical guidelines proposed by the Canadian Tri-Council Policy Statement 2 (TCPS), and was considered to present low risk to participants.
As this study involved the participation of subjects in an online survey and in a short phone interview, the study was submitted to the UBC Behavioral Research Ethics Board. Dr. Michael Meitner was presented as a Principal Investigator, and the author Claudia Castro as a Co-investigator and primary contact. Certificate number: H14-01191. 43 Chapter 3: Results A data set of 118 valid survey responses was used for the quantitative analysis, which was analyzed using SPSS software. 3.1 Demographics overview To understand the nature of the data obtained from the survey, descriptive statistics were used to provide insight into the demographics of the respondents, and to verify how the interview questions would complement data obtained from survey responses to address the research questions. In addition, ANOVA (Analysis of Variance) tests were conducted to determine if the survey responses were consistent across demographic variables.   3.1.1 Gender The respondents of the online survey were comprised of 55.3% males, 42.1%  female, and 2.6% identified with “other gender” (Figure 3.1).  Figure 3.1 Frequency distribution of gender of participants 44  A univariate ANOVA was conducted to compare the responses by gender.  To begin, it was necessary to adjust the gender variable because the number of respondents (N=3) under the “other gender” category were not enough to accurately calculate a mean for that group.  Thus, when comparing only male (N=63) and female (N=48) there were a few instances where their responses to the questionnaire were significantly different at the p = .05 level: • Male participants indicated they read significantly more from social media channels such as LinkedIn [F (1, 109) = 4.443, p<0.037 Eta2 = .039], and Twitter than female participants [F (1, 107) = 6.861, p<0.01 Eta2 = .06] (Figure 3.9). • Female participants feel more uncomfortable sharing their profile picture when participating online than male participants [F (1, 109) = 7.019, p<0.009 Eta2 = .06] (Figure 3.15). • Male participants disagreed significantly more than female participants when presented the following statements: - Public consultation online is less efficient because it requires training for people that are not familiar with computers [F (1, 109) = 16.328, p<0.0 Eta2 = .130] (Figure 3.19). - Public consultation online is more costly to implement compared to traditional methods of consultation [F (1, 109) = 7.826, p<0.006 Eta2 = .067] (Figure 3.19). 45 3.1.2 Level of education When participants were asked about indicating their highest level of education, 40.4% responded that they had received a postgraduate university degree, 28.1% received a 4 year degree, 15.8% received a 2 year degree or certificate, 14.9% had completed high school, and 0.9% had just completed primary. In summary, more than 66% received at least a 4 year degree or higher postgraduate education. This reflects a positively skewed histogram (Figure 3.2), and in general, a high level of education of the participants.  Figure 3.2 Frequency distribution of level of education of participants15                                                  15 Frequencies are based on 114 valid responses and 4 missing.  46 Using a univariate ANOVA test, the only response that was significantly different within levels of education was:  • Participants with higher education levels were significantly more inclined towards privacy (Figure 3.3). [F (4, 109) = 2.803, p<0.029 Eta2 = .093] (Figure 3.16). 
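The univariate ANOVA comparisons reported in this chapter were produced with SPSS. Purely as an illustrative sketch of the kind of test involved, and not the actual analysis, data, or variable names from the study, the following Python code computes a one-way F-test and the corresponding eta-squared effect size for a hypothetical 5-point Likert item grouped by gender.

```python
import numpy as np
from scipy import stats

# Hypothetical 5-point Likert codes for one survey item, split by gender (invented values).
male = np.array([4, 5, 3, 4, 5, 4, 2, 5, 4, 3])
female = np.array([3, 2, 4, 3, 2, 3, 4, 2, 3, 3])

# One-way (univariate) ANOVA: F statistic and p value.
f_stat, p_value = stats.f_oneway(male, female)

# Eta squared = SS_between / SS_total, the proportion of variance explained by the grouping.
all_values = np.concatenate([male, female])
grand_mean = all_values.mean()
ss_between = sum(len(g) * (g.mean() - grand_mean) ** 2 for g in (male, female))
ss_total = ((all_values - grand_mean) ** 2).sum()
eta_squared = ss_between / ss_total

print(f"F(1, {len(all_values) - 2}) = {f_stat:.3f}, p = {p_value:.3f}, eta squared = {eta_squared:.3f}")
```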
In Figure 3.3 (below), the higher the mean score value at each level of education, the more inclined participants were towards privacy.

Figure 3.3 Mean scores of level of education compared to transparency versus privacy preference of participants.

3.1.3 Income range
In terms of income, 58.5% of the participants of the survey stated having an income equal to or above 50 thousand dollars. This is reflected in the positively skewed distribution of the data in Figure 3.4.

Figure 3.4 Frequency distribution of income of participants

In the Province of BC, according to Statistics Canada, the median total income of BC families was $74,150 in 2013.16 The province of BC was chosen as a reference, as most of the participants were located in that province (62.7%). A univariate ANOVA was conducted to compare the different levels of income against the rest of the responses of participants in the questionnaire. The instances that were significantly different at the p = .05 level were:
• Participants with higher income levels voted significantly more frequently in municipal [F (7, 85) = 3.371, p<0.003 Eta2 = .0217], provincial [F (7, 85) = 2.524, p<0.001 Eta2 = .253], and federal elections [F (7, 85) = 5.704, p<0.00 Eta2 = .320] than the rest of the income ranges (Figure 3.4).
• Participants with higher income levels presented significantly higher agreement with the belief that public consultation excludes people without internet access [F (1, 7) = 2.383, p<0.028 Eta2 = .162] (Figure 3.19).

16 Census families include couple families, with or without children, and lone-parent families. Source: Statistics Canada, CANSIM, table 111-0009. Last modified: 2015-06-26.

3.1.4 Age
The average age of participants was 53 (SD=13.71, range 19 – 89). There were two peaks in the frequency distribution of age ranges: one between 60-66 years old, and another between 46-52 years old.

Figure 3.5 Frequency distribution of age of participants17

17 Frequencies are based on 114 valid responses and 4 missing.

A univariate ANOVA test was used to see if there were significantly different responses based on the age of participants. In the following cases there were significant differences at the p = .05 level:
• Older participants had significantly higher agreement with the statement "I believe public consultation online provides input that will have an effect on the outcome of the topic" [F (44, 69) = 1.587, p<0.042 Eta2 = .503] (Figure 3.18).
• Younger participants rated videos in a web platform as a feature of significantly higher importance [F (44, 69) = 1.822, p < 0.012 Eta2 = .537], as well as pictures [F (44, 69) = 1.804, p < 0.014 Eta2 = .535] (Figure 3.11).

3.1.5 Planners and general public
In addition to the demographic data previously presented, responses to the question that identified whether or not participants were related to the planning profession were used to find out if the responses of planners differed from those of the general public. Almost a third of the participants that responded to the survey (27.1%) stated that their job was in some way related to the planning profession (Figure 3.6).

Figure 3.6 Planners vs. non-planners participants.
A univariate ANOVA was conducted to compare the responses of participants and, in general, there were  few differences when comparing the responses except for the following, which were significantly different at a p = .05 level: 50 • Participants that consider themselves related to planning profession: - Attended Face-to-face meetings significantly more frequently than the general public [F (1,111) = 13.130, p < 0.00,  Eta2 = .106] (Figure 3.8). - Indicated finding pictures in a web platform as a feature of significantly higher importance compared to the general public [F (1,111) = 12.863, p < 0.00, Eta2 = .104] (Figure 3.11). - Scored higher agreement than the general public with the statement that online public consultation is only effective when combined with meetings or other public consultation activities [F (1,111) = 3.607, p < 0.06, Eta2 = .031] (Figure 3.19). 3.2 Online survey areas of research The following sub-sections analyze the rest of the survey results, highlighting specifically the areas of research interest that were covered by the online survey. 3.2.1 Governance involvement  When participants were asked about the frequency that they voted in municipal, provincial, and federal elections, there was a positively skewed distribution of participants towards higher frequency of voting in all cases (Figure 3.7)  51  Figure 3.7 Voting frequency per type of election In terms of the frequency with which they attended face-to-face meetings; results indicate that participants of the survey are inclined towards frequent participation in face-to-face meetings (Figure 3.8).   Figure 3.8 Frequency of attended face-to-face meetings of participants. 52 3.2.2 Familiarity with technologies and use of online participation tools Considering that the survey was deployed online, it is assumed that there was a certain level of technological familiarity among the survey respondents; since to be able to respond to the online survey it was required that participants interacted with the web 2.0 features of the platform that hosted the survey. The literature review of this thesis described that a common issue faced by online participation is the technological barriers that potential participants may come across; which are typically referred to as the digital divide.  The behavior of internet users such as frequency of use, and what they use the Internet for -information or for entertainment- may indicate how much of a challenge they experience in terms of technological barriers or the digital divide (Bonfadelli, 2002). In this sense, it is worth paying attention to the frequency of interaction with social media to see if it is an indicator of participation, and to reach a better understanding of how participants behave online. Participants were asked about the frequency with which they read or created online content. The peak frequency values of responses for reading social media were: reading daily on Facebook, and never posting on MySpace (circled in Figure 3.9).  53  Figure 3.9 Frequency distribution of frequency of reading social media of participants. In terms of the frequency of online content creation, the survey data shows that the peak frequencies were: never comment online or write in a blog. Nevertheless, in the case of content creation by posting in social media web sites, there were similar numbers of participants that post on social media on a daily basis and those who never post (circled in Figure 3.10). 54  Figure 3.10 Frequency content creation of participants. 
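Frequency distributions like those shown in Figures 3.9 and 3.10 simply count how many respondents selected each point of the frequency scale for each item. As a minimal sketch of how such a tabulation could be produced with pandas, the code below uses invented column names and values; it is for illustration only and is not the survey data.

```python
import pandas as pd

# Hypothetical coded responses: how often participants read two social networks.
responses = pd.DataFrame({
    "read_facebook": ["daily", "never", "daily", "once a week", "daily", "never"],
    "read_twitter":  ["never", "never", "daily", "once a month", "never", "once a week"],
})

scale = ["daily", "more than once a week", "once a week",
         "once a month", "less than once a month", "never"]

# Frequency distribution per item, in scale order (the kind of counts plotted in the figures).
counts = {
    item: responses[item].value_counts().reindex(scale, fill_value=0)
    for item in responses.columns
}
summary = pd.DataFrame(counts)
print(summary)
```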
With the same purpose of identifying potential technological barriers to participation that could arise from the type of information presented in consultation topics, participants were asked to indicate, from a multiple-choice list, how important it was that the case study public consultation platform provide certain features, for example videos, pictures, graphs, summary tables, and a view of their own responses compared to the rest. Responses show that the frequencies of "important" and "very important" choices had peaks for summary tables, graphs and pictures (circled in Figure 3.11).

Figure 3.11 Preferred web features for online consultation.

To understand the level of familiarity that participants had with participatory platforms, they were asked if they had used a series of other online participatory platforms. The data shows that, although they are acquainted with PlaceSpeak, there is a lack of familiarity among respondents with other participatory platforms (Figure 3.12).

Figure 3.12 Familiarity with other participatory platforms.

Another question implemented in the survey that aimed to help understand online behavior was the frequency of online gaming (Figure 3.13).

Figure 3.13 Frequency that participants play online games.

To understand familiarity with technologies and online behavior, participants were also asked about the frequency with which they responded to online survey invitations. The data shows that most participants of the survey usually respond to invitations to fill out surveys (Figure 3.14).

Figure 3.14 Frequency that participants respond to online surveys.

3.2.3 Privacy concerns, and transparency versus privacy
To address this area of research, one of the questions in the online survey asked participants how they felt about sharing certain personal information (Figure 3.15). The results showed that the highest peaks in the frequency of responses were for feeling uncomfortable sharing a phone number and home address, while the highest peaks for feeling comfortable sharing personal information were for sharing email and last name.

Figure 3.15 How comfortable the participants feel about sharing personal information.

Furthermore, we asked whether participants felt more inclined towards transparency or privacy. A univariate ANOVA was conducted to compare the effect on other responses; there was no significant difference at the p = .05 level between the groups inclined towards transparency or privacy. The graph (Figure 3.16) shows a bimodal distribution of the responses.

Figure 3.16 Frequency distribution of participants inclined toward transparency or privacy.

Although there was no significant difference when comparing the responses of those who completed the survey and those who followed up with an interview, there was a slight difference in their mean scores for the trade-off between transparency and privacy. The mean score for the online survey respondents was 6.18 (N=118), and for the interviewees 5.35 (N=23).

3.2.4 Satisfaction of the use of online participatory platforms
In terms of satisfaction with the use of online participation platforms, participants were asked about their levels of satisfaction with participatory web platforms in general, and with the case study participatory web platform, PlaceSpeak. In both cases, there were peaks of responses at satisfied and neutral (Figure 3.17).

Figure 3.17 Satisfaction levels of the use of participatory web platforms of participants.
3.2.5 Online public consultation
Some authors (Bonfadelli, 2002) have analyzed the use of the Internet to understand whether online behavior indicates a potential for participation. One of the factors that influences behavior is beliefs. In the survey of this research study, to gain a better understanding of participants' perception of online public consultation systems, participants were asked about their beliefs regarding online public consultation. To approach this topic, participants were presented with fourteen statements and asked to indicate their level of agreement/disagreement on a 5-point Likert scale. The first seven statements expressed favorable beliefs regarding online public consultation; the second set of statements reflected negative beliefs towards online public consultation.

For the statements in favor of online public consultation, the data shows that most participants agree with those beliefs (Figure 3.18).

Figure 3.18 Favorable beliefs towards online public consultation.

The second set shows that there is no general trend in the statements that refer to negative beliefs towards online public consultation (Figure 3.19).

Figure 3.19 Negative beliefs towards online public consultation.

3.3 Interviews
The audio files obtained from the interviews (N=23) were transcribed verbatim, and the text was then imported into NVivo software and analyzed for content. The coding of nodes and child-nodes per question occurred according to the steps described in Figure 3.20.

3.3.1 Coding of the content
According to Mills (2009), there are different ways to approach qualitative data coding and the identification of categories (nodes). For example, it can be done in an inductive way, which allows structure to emerge gradually from the data, or in a deductive way, in which predetermined categories are applied to the data. Although the preliminary content analysis was done with a more deductive type of approach, by generating automated coding of the responses that classified each response under its main question (node), it is important to highlight that, because this research had an exploratory approach, the analysis was mainly inductive. This inductive approach made it possible to identify, for example, that the concepts participants were referring to in one question were linked to concepts related to topics enquired about in another question. In this way, the content of each question was broken down into child-nodes to identify clusters of themes and ideas that participants referred to, beyond the clustering by question number identified in the early stage of the automated coding per question. Furthermore, in order to verify whether additional classifications were required, a word frequency analysis was performed at the end, in case additional analytical themes arose from it. This was done to verify that all themes were considered in the coding of the interviews.

As an example of the inductive approach to coding implemented in this research, when a participant was asked about the trade-off between transparency and privacy, they would sometimes loop back to refer to factors that had decreased satisfaction, or bring the concept of "trust" into the discussion. See the example below.

Transcription of interview: "…This is getting back to the 'professionally made survey'.
When the professional survey's people phone me -and Gallup poll [for example]- someone that I have brand name recognition of, I know that they have done a statistically valid survey. So, I have some level of trust, then. That my answers will not be manipulated, and when I am responding to an open ended [question], I have no idea whether people are controlling answers, and with all that, I am partially suspicious that the survey can be highly manipulated. With that case, I want to be as anonymous as possible. I mean, I don't want to disclose, because essentially, what I'm saying is, if the survey person is not disclosing much to me, why should I disclose much to them? That's sort of the implicit trade-off I am describing to you."

In cases such as the one cited above, although the participant is citing a specific example of a factor that decreases satisfaction, the response is understood as the participant making a case for why the trade-off is context dependent from their point of view. Furthermore, the coding of this theme was acknowledged to also refer to the theme of "trust", which is a theme mentioned by other interviewees as well. Qualitative analysis has been described by authors as requiring skills and creative thinking (Mills, 2009); in this sense, a quality assurance measure was implemented: the coded content was revised to ensure that all content had been coded and that the coding was appropriate for each theme based on this inductive approach. Finally, to simplify the analysis, child-nodes were then revised to find opportunities for clustering based on the discourse of the interviewees or the topics of research interest.

Figure 3.20 Coding steps

It is important to mention that, as part of the analysis, the distribution of the demographic variables of the interviewees showed no significant differences from the demographics of the respondents of the online survey.

3.3.2 Satisfaction of the use of online participatory platforms
In the online survey, participants were asked to rate how satisfied they were with the use of online platforms in general, and specifically with PlaceSpeak's participatory web site. However, no comment box was provided to expand on their responses. Therefore, when carrying out the interviews, and with the aim of better understanding users' satisfaction with online participatory platforms, interviewees were asked: "Could you tell us a little about what things would make you more or less satisfied, and why?".

Interviewees referred to factors that increased and decreased their satisfaction levels. A more detailed description of these is presented in the sections Increase satisfaction and Decrease satisfaction below. The general classification of nodes and child-nodes for the first question of the interview is displayed below (Table 3.1).

Table 3.1 - Classification of nodes for the question: "What things make you more or less satisfied and why?"

Increase satisfaction
Interviewees mentioned varied factors that increase their satisfaction levels, although it was still possible to relate some of these ideas and cluster them into child-nodes. There were two child-nodes that had the same number of sources; they only differed in the number of references made by interviewees. Six interviewees mentioned ideas clustered in the child-node: ability to expand on the responses. For the child-node: feeling that my time (participating) is being used efficiently, six interviewees cited examples related to this idea.
For example: they stated that their satisfaction increases when engaging in a topic that makes them feel it is worth their time, or feeling that their time is valued by the organization carrying out the consultation. Additionally, they find it important to know the amount of time they are expected to dedicate when participating. Also in regards to time, interviewees stated feeling that their level of satisfaction increases when they are able to participate at their own pace and can complete the consultation at a time of their choosing. Then, the idea of how topics and questions are asked, and what was the approach to the consultation topic - from a positive connotation perspective-, was the child-node that clustered five sources. There were different examples that related to the importance of having a good survey design, or questions designed for an easy response. Specifically, interviewees indicated that when deciding to participate online, they assess the nature of the topic, how it is presented, and how it relates to them as stakeholders.  Four interviewees indicated that the idea of the child-node: knowing what happens after the consultation, also increased their levels of satisfaction. They expressed interest in knowing who will have access to their responses, and how the data will be used. For example, knowing that their participation will influence the decision-making, and that associated records and results will be available after the consultation, were indicated as factors that increase their satisfaction by providing a sense of transparency of the participatory process. For the ideas classified in the node feeling your voice is heard as a participant, three interviewees referred to this as a general feeling, including one interviewee that stated feeling 67 that online participation is a more democratic process. Finally, for the set of child-nodes related to the factors that increase satisfaction mentioned by interviewees, three interviewees cited examples related to feeling that the topic of consultation is relevant for me as a stakeholder. Following is an overview of each child-node, related to the factors that increase the levels of satisfaction mentioned by interviewees, including some references obtained from them. In terms of having the ability to expand on their responses, the examples cited by interviewees 8, 7, 15 and 12 were: I8: “I would say, I may have an informed opinion, around a topic than a lot of the general public. It would be nice to be able to respond in a more nuanced way to survey questions. Maybe, even to the extent of, simple diagramming tools, or feel like you are helping to create something more than rather something that is very caged in pre-determined choices.” I7:“More satisfied would be the ability to make comments …. Sometimes the questions are not broad enough so you can’t answer it without answering incorrectly, because there is no other option… So, if there was a... comment field, for example, where you could put in the answer to another question. Sometimes they don’t ask the right question.” I15: “I think surveys truly allow me to express my opinion, sometimes that means there is open ended things [questions] where you can just enter your own views, and not being constrained by a particular multiple choice question. So those, probably allow more satisfaction.” I12: “In a survey perhaps, something at the end where you can write a little opinion. And, I prefer surveys on topics that I am more involved with. 
Because sometimes, I do have further things to say that are strictly my opinion, and I don’t feel the survey addresses them. So, I do like that. That gives me an opportunity, if there is a box at the end for filling additional 68 things, to have more of a say. But I usually do that when it is talking about topics I care about.” On the child-node where interviewees referred to the importance of feeling that my time (participating) is being used efficiently, interviewee 1, 11 and 3 stated for example: I1: “…A way of making me feel (interviewee referring to factor that increases satisfaction), as the person filling out the survey, is that you are conscious of me, wanting to give you good feedback. But … in a shorter, a most efficient way as possible...” I11: “…I like knowing how much time to put aside, or how many pages I’m going to go through. How long the thing is… so I can gauge. Sometimes I feel a little bit like; I just can’t carry on with this - this is taking me too long-.” I3: “I think my favorite thing is you’ve got the time to think about it and critically reflect on what you want to say (interviewee referring to online participation). Whereas if it's a face-to-face environment in a hall meeting or something. You don't have the time necessary to reflect because it not your timeline, or it's somebody else's timeline.” Regarding how topics and questions are asked, and what was the approach to the consultation topic (from a positive connotation perspective), specifically referring to the clarity of the survey, the survey’s design or the quality of the questions, interviewees 18, 13, 1 and 10 stated: I18: “Well, It depends very much on how clear the instructions are (interviewee referring to factor that increases satisfaction). So sometimes it's not at all clear what you're expected to do.”  I13: “…as far as I am concerned it is just the ease of use, and understanding what the question is. So, it is just satisfaction with the better design as I understand it. [That way] I’m satisfied, I enjoy it, and it gives me an opportunity to give my opinion on things.” 69 I1: “…when I am doing a survey …it’s easy if I don’t have to, actually, write something. When there’s a list of choices. So, those go quickly, and you go: OK, that makes sense to me. So, something that you are given a number of options, a menu. That’s clearly the easiest and fastest.” I10: “I use online, so much, for giving my opinion, like surveys, that’s one thing. And with surveys in general; I’m pretty satisfied because they are normally really well written.” Regarding those opinions referring to the child-node: knowing what happens after the consultation, interviewees 14, 21 and 22 indicated: I14“… I like to get a copy of the aggregate research, the results of the survey at the end of the whole thing (interviewee referring to the participatory process). That is a deciding factor.” I21:  “More satisfied would be to know that the question that I’m being asked … will have an outcome and it’s not just seeking information (interviewee referring to questions asked seeking participant’s personal information).” I22: “I like the fact I think there is a lot more flexibility with what you can do with it (Interview referring to how data of the online participation can be processed later). I think that when you participate online you are confident there is a record. And, it is much easier to present information online, and for people to give feedback on. 
So I think those are the main factors that I like it for.” In terms of the comments related to the child-node: feeling your voice heard, interviewees 20, and 12 indicated: I20: “...I find positive about it is, at least you get to express an opinion (interviewee referring to online participation), and particularly, I guess the opportunity of participating in any meaningful way in a political process for instance. It is satisfying to be able to, at least to express your opinion to someone somewhere, and hopefully have some effect.” 70 I12: “I feel satisfied when I get to do a poll. I am registered with two companies that send out polls. So I trust them both, and when I do the survey - I don’t know how much value it has - but I feel like I had an opportunity to give my opinion without going to any stress or writing letters to my MP (Member of Parliament), or anything. So I do feel good. I feel… little voice it’s better than no voice...” Finally, in terms of increasing satisfaction, there were a few comments of interviewees indicating that their satisfaction increased when feeling that what they are being asked is a relevant topic for them, as stakeholders. Interviewees 2 and 18 mentioned: I2: “Well certainly, if the survey or questions are relevant (interviewee referring to the factors that increase satisfaction when participating online). … Yes, I just go back and say if they are relevant and satisfactory.” I18: “Probably, whether if it's a subject I am interested on (increases the satisfaction), there are a lot of approaches online that I'm not interested.”  Decrease satisfaction The factors that decreased the level of satisfaction mentioned by the interviewees were clustered in three main child-nodes. Likewise with the factors that increased the level of satisfaction, (thirteen) interviewees provided examples referring to how topics and questions are asked, and what was the approach to the consultation topic but from a negative connotation perspective. Also, there were four interviewees that cited ideas clustered in the child-node: having concerns about the digital divide (technological barriers) for other participants. Finally, three interviewees cited examples that generated a child-node around the ideas of having concerns about the data management during and after the consultation.  Among the ideas related to how topics and questions are asked, and what was the approach to the consultation topic from a negative connotation perspective, were cited; 71 poorly formulated questions, surveys with design issues, not being able to expand on the responses, and the type of statements and judgments or assumptions made by surveyors. For example, interviewees 13, 14, 15, 16 and 10 stated: I13: “quite often when you see survey questions they are double in a way, and what I mean by that, is: that the deeper you read into the question it can be interpreted in either way…” I14: “A lot of times, the options that are there, don't really fit into the answer I want to give. So, being able to give a bit more detail. Or, any other type of answer where I can fill in the information makes a difference. If the answers don't fit, I should have the option of giving a different opinion…” I15“ I think it depends on the quality of the survey. [It would increase the satisfaction] if the survey it’s truly trying to extract an unbiased response. That’s the way I would put it. Many online surveys, and maybe they are just more prone to it. Because, sometimes they are done less professionally. 
I am pretty suspicious there is a very a strong bias in the type of question (interviewee referring to survey questions in general). The city of Vancouver, to be very specific, is a classic for this kind of survey.  Where, actually, if you are opposed to something, there’s never a question that allows you to say you are opposed to something for example. Why can they do that?… a very frustrating survey. So I guess, bottom line, it comes down to the quality of the question.” I16: “The thing that I find frustrating with some of the online surveys is they often ask you questions that you find really annoying, because they don't give you the options that you would like to answer. Often, the options that there should have been situated, as the person that was designing the survey, thought what people might be saying. But there is not enough choices or…you sometimes feel trapped in the choices you are given. I was just doing a regional survey on transportation options, and it was very obvious that it wasn’t driven by the liberal idea of what transportation and port improvements are.  The options browsed were only given from their perspective. There, is my frustration. Sometimes it’s leading answers, rather than giving you the real freedom to answer what you would like to….[And in 72 terms of factors that decreases satisfaction] just designing them in such a way that people don’t identify with the answers you provided as an option, there isn’t always a ways for expressing your choice.” I10: “I use online, so much, for giving my opinion, like surveys. That’s one thing. And, with surveys in general; I’m pretty satisfied because they are normally really well written.  But occasionally you get dumb questions you can’t answer properly.  That really irritates me. They make the question too complicated and you say yes, or definitely, or not.  Or, it doesn’t really make sense…” Four interviewees indicated having concerns about the digital divide (technological barriers) of other participants. For example, interviewees 11, 9 and 7 recalled: I11: “…and the other thing [that decreases satisfaction] is how familiar I am with the tool.  Or how like it, might be to something else I knew. So, I feel comfortable for me to make a translation. For example, the “Talk Vancouver” surveys that have come out, that I’ve signed up for. Well, It can be a bit awkward the first time, because it isn’t intuitive necessarily; unless it is like something you have done before. So I think we might lose people at the beginning and lose their input if they find it too difficult.” I9:“…So, when you say to people: well, come and participate in a webinar. And, you might as well say then: Why don’t you come on this rocket ship to the moon”[Laughing ironically]…and that’s 25% of my community. So, that’s the big problem.” I7: “…the things that complicated (the participant’s experience and decreased satisfaction), are [that it] is too difficult to process, to log in, and to find the information you are looking for. 
Well yes [it decreases satisfaction], and one I am thinking of, in particular, there are so many people who wanted to participate but they couldn’t get pass the log in [with PlaceSpeak], and so they just went away in frustration so their opinions were never recorded.” 73 For the child-node having concerns about the data management during and after the consultation, there were three examples cited by interviewees: I23: “the only thing I can possibly add as a criticism, is that someone in the web is a spin doctor, and they may or will write or present their case a little better than somebody else, even though the case they are presenting seems like is not the right way to go. That’s about the only thing I can see. I am also concerned about trolls… that will get on, and swap the comment section with their own specific agenda rather than addressing the issue that is asked.” I14: “Another one [factor that decreases satisfaction] is that it depends on whom is doing the survey. So, if it's a public survey type, where the data it's open to the public, or it's going to be published openly, versus a specific entity doing the survey. That they're going to keep the data for themselves. That makes a difference as well. So, it's really, for myself, it has to do with the entity that it's doing the survey that makes the difference, whether I really want to participate or not. But if there's a transparent, sort of a public institution versus a political group or a private institute that it's going to keep the data for themselves.” 3.3.3 Trade-off between transparency and privacy  In the online survey, participants were asked to indicate their response on a semantic differential scale, to communicate whether they felt more inclined towards transparency or privacy when participating online. Figure 3.16 showed that the frequency of responses of participants’ preferences had a bimodal distribution with a slight positive skew to the side of privacy preference. To help further investigate this topic we followed-up in the interview on this area of research interest, and interviewees were asked: “What are the issues that are important to consider when trading off your personal privacy with increased transparency of decision making in your community?”. 74 The responses of the interviewees were assigned to several clusters of ideas (child nodes) around the topics of transparency and privacy. The most popular content came from ten sources (interviewees) indicating that the trade-off depends on the context. A group of nine interviewees referred to ideas indicating that they felt inclined towards transparency. The third most popular opinion, with six sources citing examples, were ideas that reflected interviewees being inclined towards personal privacy. Another group of three interviewees stated believing that privacy does not exist anymore. Finally, only one interviewee indicated that he/she still prefers traditional methods of participation, as from his/her perspective traditional methods are more transparent for smaller communities. 75   Table 3.2 - Classification of nodes for the question: “What are the issues that are important to consider when trading off your personal privacy with increased transparency of decision making in your community?”.  On the child-node that stated that the trade-off depends on the context, interviewees 3, 22, 7,15 and 6 cited: I3: “ I think the issues [to consider when trading-off transparency versus personal privacy], are really hard to explain from my perspective. The issue is that. 
In the trade-off, that is the issue. So, I mean there are times when transparency is more important, and there are times when privacy is more important, and I think that is one of the struggles that we have in society. That we want to know exactly where that point is. But it changes, depending on the situation or who is involved.” I22: “…I am fairly comfortable sacrificing some of my privacy in order to get my voice heard, in terms of location, and that kind of thing.  I think sometimes it is necessary, 76 unfortunately, because if there is people that are anonymous, they are not always as honest. So it can get sometimes important. But I think, is important to keep an option to maintain some privacy there, for sure. And with issues with, where probably the more local the issue is, the more I am willing to sacrifice in stuff where I am participating online. For example; on a development that was going to happen in my neighborhood.  In that case, because the city [of Calgary] was running [the survey] which is an institution that I trust - and I have a higher degree of trust - and, the issue was in my immediate area, I was more comfortable giving, for example, my specific address online in that consultation. If it is something that is a bigger, more national issue maybe, and is an institution or business that I am less trusting of, then, that changes a little bit, and I may not be so comfortable giving away that much information. I think you just have to be mindful of who is doing [the survey].” I7: “I suppose it would depend on the topic. Because if it was something sensitive where people knowing your address for something, and then, there could be some follow back out of that depending on the topic. I think, in some cases, you would have to respect the privacy, and go under a number or something like that. Even if you’re voting you don’t even put your name. You need some privacy in some things. But there are a lot of things that people shouldn’t be afraid to put out their names because there is too much information being spit out there but nobody to account for it. I mean, it can be very harmful to a lot of people, and there is nobody even knows their source. So I think, it would depend on the subject.” I15: “…This is getting back to the ‘professionally made survey’. When the professional survey’s people phone me -and Gallop poll [for example]- someone that I have brand name recognition of, I know that they have done a statistically valid survey. So, I have some level of trust, then. That my answers will not be manipulated, and when I am responding to an open ended [question], I have no idea whether people are controlling answers, and with all that, I am partially suspicious that the survey can be highly manipulated. With that case, I want to be as anonymous as possible. I mean, I don’t want to disclose, because essentially, what I’m saying is, if the survey person is not disclosing much to me, why should I disclose much to them? That’s sort of the implicit trade-off I am describing to you.” 77 I6: Well, I would give more personal data if it’s to the government, if they are asking for… if it’s something that I need to go through part of a bureaucratic process. Then, I’ll know if it was required. But, if it’s just an optional survey in the community, I’m less likely to give my email, and even then, I don’t like giving my email. 
Yes, [I perceive email as a little bit more delicate] I don’t like to end up on spam lists.” For the child-node that grouped the ideas of nine interviewees that stated being inclined towards transparency, the following were cited as examples; interviewees 14, 7, 23 and 12 stated:  I14: “…the data collected should be made available. Not just the conclusions from it. But the aggregate data, itself, should be made publicly available. Transparently.” I7: “…If I got an opinion that I want to put online, I don’t mind putting my name on it. And, if more people had to identify themselves, as being the originators of the comments, maybe, there would be more intelligent things said…” I23: “Transparency is a very important issue to me. Because, if you don’t get the whole picture for the decision. Basically that’s it. How can you make an informed opinion if you don’t have all the facts? And, that seems to me the most important thing. Meanwhile, I have found that with….the city of Vancouver say. The City of Vancouver seems to send a lot of issues, but they engage in to what I call: manufactured consent. They just pull out figures that bolster their position without being factual, and to me, that is fundamentally wrong. I also like to hear both sides of an issue so that I can make an informed decision as well.” I12: “Well, I haven’t done any surveys where I was actually feeling threatened by anything [personal information wise] so I think I’ve been transparent.  If they ask me for my postal code, I realize that they can come to my door. But I probably haven’t said anything terrible, or have been asked anything, or a terrible question that I have felt my security in risk.”  From those six interviewees that stated being inclined towards privacy, some cited examples for this child-node such as: 78 I6: “I think privacy it’s a very important issue, and it’s something that I personally consider when I am considering giving feedback. That, I may not input if I am asked for too much personal data, and I’m less willing to share my opinion. So, there should be a really good reason for collecting any kind of personal information, and that shouldn’t have some record of information about confidentiality, and who is going to have that data, and have access to it. It’s very important.” I10: “…most surveys organize well with this one. So, your personal information is not attached to the information you’re giving out. I still worry of giving out any personal information, unless I know the source it’s reputable. And, sometimes, it’s not always possible. I am on Angus Reid, I am on a list of survey boards. So, I think you’ve asked perfectly, I mean, it doesn’t really matter what you’re asking me as long as nobody knows who I am.” I19: “I think it’s fair if a city is doing a survey online, that they know that I have participated. That they know anything that I’ve decided to make publicly available: like my telephone number, or my home address. But they do not have the right to know how I answered the questions… I’ve seen lots of online surveys that have the usual demographics questions at the end, about age groups; I don’t mind those numbers being provided, in a global sense. And I think I wouldn’t even mind getting my name identified, as being in to this grouping. 
But - unless I purposely chose to opt in to have my opinion identified to my name - I don’t think that whomever is doing the survey, should be able to access that type of information.”  Only a group of three interviewees mentioned that they do not believe that privacy exists. For example, interviewee 20 and 2 expressed: I20:“… I think that personal privacy it's all but dead. I think the ability to spy on citizens has increased astronomically in the last few years. So I don't post personal information online.  But, by the same token, I have no confidence, whatsoever, that we have any personal privacy left.” 79 I2: “I have little faith in that there is privacy. Every day it seems that I continue to read about certain places like Facebook, and like, are changing their terms of use almost daily. And they always seem to be in favor of decreasing the privacy that you have, and also decreasing the transparency that they have. Is quite negative doing. They don’t come out and tell you what’s going on. They don’t tell you. You usually find out by other forms of media”. Finally, only one participant indicated preferring direct participation, as opposed to its online version, since he considered that traditional participatory methods were more transparent, and served better to consult in smaller communities with older generations. I5: “…this is kind of a long answer, but where I live, on a small island, we have lots of opportunities for direct democracy; direct participation in democratic activities. But we also have a new generation of people who basically believe that online political activities are more transparent, and more inclusive. But the traditional forms of organization that require participation don't have them involved, because those people are feeling they are involved in a more transparent level of democracy online. From my perspective [online participation], is less participatory but more transparent, and each of them (online and traditional methods) gives up to privacy to some degree. I am of a different generation that sees [online participation] as - perhaps as more transparent, and possibly more participatory - but less practical because it doesn't involve congregating, and doing things in the traditional manner - . That doesn’t really say that they (those who participate online) are not involved in groups or activities, but they are perhaps more willing to gather to protest rather to participate in the mundane, day-to-day activities of democracy. The only issue I have is that I think as we grow - I’ve heard, even used this term: go forward or move forward - we learn how much we have traded-off for privacy for transparency. I am somewhat fearful what we will lose, as we move towards some kind of online transparency if that makes any sense.” 3.3.4 Public consultation platforms The third question of the interview asked: “What are the benefits of online consultation software?”. Seven of the interviewees provided examples such as the benefit of 80 facilitating accessibility to more participants. Following in popularity, there were five interviewees that instead of referring to the benefits referred to the constraints of online participation. Other benefits mentioned by interviewees included: two examples related to the idea that online public consultation provides options for meaningful citizen engagement, and two citations that indicated that public consultation facilitates reaching out to decision makers. 
Also, cited by individual interviewees was that online participatory tool have powerful information sharing potential and that online participation allows participating with flexibility of time / pace and place.   Table 3.3 - Classification of nodes for the question: “What is the main benefit or potential benefit of public consultation software?”. In the case of the seven interviewees that indicated that the benefit of public consultation software facilitates accessibility to more participants, interviewees 3,11, 5 and 12 recalled:  81 I3: “I think that it does have potential to open participation to people that don't like going out in public or people that have disabilities. I think it's difficult for people that have young children, or people that can't get out of the house of somebody. Is convenient there, so there's all of those that, quite often, their voices are not heard, otherwise and specially including street outreach. There's another advantage, I have a cold the for last little while, and I've been very active online but I would wouldn't have gone out to a public place and spread my germs anywhere” I11: “The benefit is that it gives another way, in a little threshold way, for some people, to participate that might not otherwise do so. And that, is not just the people who are totally hooked in to social media, but also people who are hooked down, who are disabled. Even people in poverty, many people have access and are very engaged online because it’s one way to stay connected with people, maybe they can’t get out all that much. So I think it’s a really important piece, and I think if it’s not happening there would be seen as a real deficit. It would be wrong not doing it, but if you do it exclusively, off course, that is not right either…” I5: “I can see referendum, decision making at a larger scale, voting [as benefits of online participation]. Voting it would be one thing. And I see that as being not a loss, I see it as a benefit in some ways.” I12: “I think it’s very easy to get a large group of opinions and you’re able to look at the demographics… so they place it in a town with similar demographics that are feeling the same. I think is a great way to just get lots information from the public easily.”  With a different perspective, instead of referring to the benefits of online participation, five interviewees mentioned some constraints of online participation rather than benefits. Within this group of comments, interviewee comments ranged from the concerns they have about the digital divide, the representativeness of different stakeholders 82 and some examples that they had seen of misuse of public consultation. For example, interviewees 4, 7 and 2 recalled: I4: “The main benefit would be, if we get more people on board, or people to participate more in their governments of their communities. I am a little bit skeptical that it will actually occur. In any community, you get only a fraction of the population to a meeting or to return a survey…In my little experience with this type of software that it's very similar, and again, you only get a certain segment of the population you get the people who have computers or are connected, and that, unfortunately, doesn't encompass the entire community. So, I see the benefit of more people being active but a bit concerned that it doesn't end up being sort of response rates of other forms of consultation.” I7: “I think [online participation] it’s very important, but it has to be done in a way that allows a good percentage of input. 
For example, the last one we [participated in]. There were 11,000 people involved, and the whole study was done on the responses of 54 people. So, I think that more public participation is required, and it should be… I don’t know what percentage, but 54 out of 11,000, I don’t think it’s acceptable. So, there has to be a way for them to do that, and to achieve that target, before they continue with the study. And with so many online forums or surveys, they would proceed with 54 people responding, out of 11,000, saying that that is acceptable.  But in the case of this last one, the 54 people who responded, did not represent the community. The study was presented to the town - this was over a two-year period -, so when adopted that, then it became part of the official plan. And, we are only three months into it now and it’s already no good. Because, there were things that came up, that were not even included, and the people not even knowing about it.  So, 54 people isn’t enough, [they] got to have a more representative number of the public to get a good feel for what they want.” I2: “…a good example that it comes to my mind it’s; a year ago or so when they closed down the Coast Guard station in Vancouver - the largest seaport on the western coast of the country -. And, the government said; we are closing it down. And it didn’t matter how many letters you would write to your local MP whether be in the North Shore, or Vancouver, or the 83 West side of Vancouver. That was not the way the party was going to decide everything. 90% of the people say we want the Coast Guard station, but it gets overruled. So, you can give all the information to the politicians, but it doesn’t necessarily mean that going to be heard to or listen to or even followed or thought about.  The benefit is definitely to the person that it’s seeking the information. But if they don’t use the information, it becomes a moot.” There were two interviewees that commented on the benefit of online consultation as providing a meaningful engagement experience. For example, interviewee 8 cited: I8: “I think the benefit should be: a sense of happiness. In general, a sense of civic pride, and contributing.  And then, happiness or satisfaction with the end result as well. Should be a generalized sense that the way that - it looks and works, and the building the developers were allowed to build, and the space between the buildings - how the infrastructure that is there, laid out and paid for; people should feel the importance of how those choices have been made. However, feels - a bit like - the apparent purpose it is to pick a box. It’s to show that politicians did not allow specialist to restrain the visions of their own. Just to go ahead and make that decision anyway. So, the ideal would be something that evolves participatorily, but the reality and the purpose of the work is more cynical.”  There were also two interviewees that mentioned ideas that clustered in the idea that the benefit of public consultation is reaching out to decision makers: I2: “I think the benefit is that politicians are able to understand and become knowledgeable on what the public wants and needs are. As to whether they listen to that information, or absorb that information, it’s a whole another question. You can present all sorts of polling information, for example, to a politician. 
But if that is not the way the party wants to go, then it gets ignored.” Finally, the benefits mentioned by interviewees that referred to the two child-nodes cited by individuals were: powerful information sharing potential, and the benefit of flexibility of time/pace and place. In those cases, interviewees cited: 84 I11: “I think that the use of online tools, because it’s possible to share huge amounts of information, do it visually, do it with infographics, even have a dialogue afterwards. That’s the real advantage if you are going to engage people for input. Then be able to show impact.” I6: “Benefit. The main thing, [is that it’s] easy to do it from home and at your own pace”   3.3.5 Role of public consultation The last question of the interview aimed at finding out about what the participants think the role of public consultation online is – based on two alternatives. Interviewees were asked: Which would you argue is more important for public consultation software: a) to focus on informing decision-making processes, or b) to help communities to create a more cohesive vision of the future?  Table 3.4 - Classification of nodes for the question: “What is the role of online public consultation?  To inform decision-making processes, or to help communities create their own vision of the future”. Ten interviewees stated that the role of online public consultation software is to help communities create a vision of their future. Then, seven interviewees were inclined towards 85 the role of informing decision-making processes. There were four interviewees whom indicated that both roles (to inform decision making and create a vision), are important. Finally, one individual interviewee indicated that the role of online public consultation software is to assist neighbors to create dialogue. In terms of the idea that the role of public consultation is to engage communities to create a vision of the future, interviewees 1, 5, 9 and 18 stated: I1: “If I have to give priority, I would say let’s spend some effort in engaging our communities in the vision of the future which refers back to my comments around: learning to emphasize, to listening, to ask good question, to train us - as community members - to be able to work well together. And then, we can launch into processes that help inform decision makers, or decision making processes better.” I5: “I suppose it would be a coherency. I think it could form a more coherent plan for the future. I don’t know that we could become all that much more informed, I think we are overwhelmed with information, and perhaps we know less than we did before.”  I9: “To me it’s about communities. To me the emphasis needs to be helping people in the community connect to each other ….and I would say that there different kinds of communities not just geographical communities. Because I think the technology can help with geographic communities as well, but the primary community are communities of interest around particular important issues and listen to what we are doing. On a much larger scale … and definitely on a global scale around… We’re not us in shortage of global issues we need to address. And, I am thinking on allowing people to involved addressing those issues, to be able to have access to each other in an informal but informative way… For example: I was - before I retired - a social housing developer, and I know that now people are forced to do housing at the local level with little or no provincial funding, and little or no federal funding. 
So, people at different parts of the country are doing very interesting and innovative things, but we are not in touch with each other.” 86 I18: “I think the community, creating a vision of the future. I think partly because I think that’s a very important aspect that the decisions that affect a lot of people need to be made as broadly as possible. And, people need to have an opportunity to have some input into the items that are going to affect them”.  Those participants who referred to the alternative that to inform decision-making processes is the main role of online public consultation software, interviewees 8, 22, 19 and 23 recalled:  I8: “I would say the first of those two, inform decision-making. Helping communities create a vision; I think the vision risks being frustrating, if the vision doesn’t follow on to the decision-making.”  I22: “I think the second one is a function of the first one a little bit..., but I think making it an informed decision… is probably the more important one. In terms of, where we would be less successful is in giving overall general guidance for a community.  For example: I don’t think people can do that as well online, or maybe in general. So, I think it would always have to be a scenario where giving people choices or options, as far as decision making goes, rather than a bit more open ended direction.” I19: “To inform decision making, because the second one will take care of itself on its own. The first one (to inform decision making). If you are grappling with a group that is not used to online survey to shape public policy, it is a group that it’s going to require far more attention, and… in terms of building communities that’s been going on for years online.” In the case of the three interviewees that indicated that both roles were important, they cited the following: I23: “…I don’t think it’s an either or situation, both issues are valid. It’s very valid that the public can make an informed decision in order to facilitate the second …You need the first one to facilitate the second one. You need a more cohesive vision of the future if you know what you’re dealing with. You can’t make a decision on the future without looking at what 87 your options are, and making an informed decision about the options. And those options are real, or are they not and not just someone else’s fight.” I15: “That’s a very good question, I would have a strong sympathy to both answers. I think that if public participation were used in the more thorough and rigorous manner, I think it would have tremendous power in the ladder category. In other words, I would love it, if it was giving me an accurate - as a citizen - picture of my fellow citizens. I think the constraints, though, are the level of participation. I’m a strong believer, for example, that we should be having more direct input to a government doing this kind of thinking. If that would be an incentive for people to actually vote. Let me give you an example: I had an opinion about the Point Grey bypass. I was very sensitive to the fact that it’s only my opinion, and they have certain bias and certain background. I would have truly loved to know what all of Vancouver really thought about that -unmediated by politicians -. So in other way: if I had a certain position of the topic that for example it’s different with the majority, I would be very happy to live with the fact that I was in the minority. 
And, the majority don’t feel this way and that would be very powerful and it would make me a lot more comfortable as a citizen rather than thinking a special interest group had a particularly strong sway on a particular topic, and we’re actually getting pushed by a special interest group, not by the general population.”  In the case of the interviewee that indicated the main roles of online public consultation was to assist dialogue amongst neighbors cited: I11: “I just found that kind of language very simplistic … may not be asking the right question, it’s more I think what this can help do, if there is that loop that gets closed at the end, and people can actually talk to each other or see how their neighbors feel, there is a chance to engage people in a dialogue with each other in a way… this shared vision of the future is kind of sometimes a bit of a pit fall. What I am trying to say is that this is like a tool that can be used to assist in the development of dialogue amongst the neighborhoods, even across neighborhoods. Because online you can find like-minded people who are living in an 88 area, in other area of the planet. But if you give them a tool, you can also start to engage in a dialogue with people that don’t agree with you or don’t understand your point of view, and that’s where this will be an errs in some kind of an engagement process for decision making. But if we don’t take it to that next step, what we do, is we end up having public meetings where there are certain people that show up every time that see that there is a microphone, they are on it, and they don’t let it go, and nothing really happens.” 3.3.6 General comments Throughout the interviews participants were encouraged to provide feedback in terms of their experience with the use of participatory platforms in general. In this last section of Chapter 3 – Results-, we present some of the comments arising from the interviewees opinions that were coded into the category of general comments, as they did not respond directly to the interview questions and/or were unique comments. However, these comments were still considered valuable as they were related to the participant’s expectations of how consultation topics should ask for feedback and they are summarized below: • To be presented with an objective but brief explanation of the contrasting perspectives on a topic so that participants may be informed about the rationale behind each point of view. • To have a list of opinions of stakeholders –or grouped opinions- that stand with the different viewpoints, so that facilitates the participant finding themselves within those positions. • Seeing information on how the responses obtained from a survey within the general consultation process might be used. 89 • Being informed about all the security measures for safeguarding people’s privacy that the organization calling for consultation is taking –beyond the ones that the participatory platform states-. • An interviewee indicated expecting to start seeing the chance of co-creation such that the engagement of communities is present from the very beginning of a process.  Following, in Chapter 4, the results presented previously will be discussed, and contrasted with the literature. 90 Chapter 4: Discussion The findings of this study are highlighted in this chapter. The discussion is presented in sections where each of the research questions is addressed by expanding on the relevant areas of interest. 
To build the discussion, each section draws on the quantitative data as well as the qualitative data collected through the interviews. The findings are also situated within the theoretical background and literature, and the interviewees' discourse around the different topics of research interest is discussed. Towards the end of this chapter, in Section 4.4, further areas of potential investigation are identified, based both on the findings of this study and on the discussion built around the scholarly literature that framed this thesis. To close the chapter, Section 4.5 acknowledges the limitations of this research. 4.1 Satisfaction of the use of participatory web platforms This section discusses the research question: Are the users of the online platform satisfied? The results of the online survey indicate that participants are divided between feeling "satisfied" and feeling "neutral" about their experience of using online public consultation platforms, both in general and in the case study (Figure 3.17). When comparing the frequency of responses across the range from "highly satisfied" to "highly unsatisfied", one could assume that the experience of participants has been predominantly satisfactory. However, the responses do not show a clear trend or a unanimous feeling among participants about their levels of satisfaction, which is why further qualitative exploration was carried out through the interviews. The work of Verdegem and Verleye (2009) provides some context for why the satisfaction levels reported in the survey of this thesis do not reflect a clear trend. Verdegem and Verleye (2009) developed a model to measure satisfaction in the context of e-government, employing both qualitative and quantitative methods to formulate adequate indicators. Their model was designed to explore participants' needs and expectations, an approach they justified by pointing to criticism that the development and provision of electronic public services requires a more user-oriented approach. The complexity of the theoretical model they built, which considered behavioral aspects, reflects that questions on a topic such as satisfaction require sound exploration. Furthermore, since satisfaction levels are linked to participants' needs and expectations (Verdegem and Verleye 2009), the satisfaction levels observed in the online survey of this thesis can be read as a reflection of the range of individual experiences presented in Section 3.3.2: Interviews - satisfaction of the use of online participatory platforms. The interviews provided a source of information on the specific factors that shaped participants' opinions about their levels of satisfaction, in the form of examples of what increased or decreased their satisfaction as users of participatory web platforms. The factors mentioned as increasing satisfaction were varied in nature, whereas the factors that decreased satisfaction were grouped into three main child-nodes (Table 3.1).
Interviewees who focused on the features that increase their level of satisfaction mentioned factors such as being given the ability to expand on their responses, feeling that the time they spend participating is valued, and examples of their preferred approach when being asked for feedback. Interviewees who referred to factors that decrease satisfaction provided examples that clustered into three main child-nodes: how topics and questions are asked (from a negative connotation perspective), concerns about technological barriers, and concerns about data management (Table 3.1). The findings of this research indicate that among the factors most valued by participants when getting involved in an online consultation are being able to expand on their responses, feeling that the time they dedicate to participating is valued, and seeing well-developed topics and questions. Nevertheless, in some cases interviewees held contrasting opinions: one interviewee appreciated being given the chance to respond in a comment box, while another found responding in comment boxes tedious. Considering both perspectives, it cannot simply be inferred that addressing the elements that decrease participants' satisfaction would have a positive impact on their satisfaction with online participatory platforms. The fact that there are contrasting opinions about the drivers of satisfaction suggests that satisfaction is mainly a consequence of personal experience. The variety of the interviewees' responses illustrates the rationale behind the model Verdegem and Verleye (2009) built to measure users' satisfaction with e-government. Their model, initially composed of 29 indicators, was reduced to 9 key indicators, reflecting that satisfaction with online systems is a more complex matter than simply asking participants about their levels of satisfaction. A conclusion of their study that is relevant here is that satisfaction may indeed have a decisive influence on large-scale adoption and use of e-government services (Verdegem & Verleye, 2009). This thesis had three main research questions and several areas of research interest to set in the context of a case study; building a long survey exclusively to measure satisfaction, as Verdegem and Verleye did, was therefore not our aim. Our aim was rather to better understand the experience of using online participatory platforms and to learn about participants' satisfaction with their use, and in doing so it was beneficial to combine the data obtained from the quantitative and qualitative portions of the research. The literature shows how scholars have studied the different ways in which people's behavior can affect user involvement and participation (Barki & Hartwick, 1994; Boulianne, 2009; Joinson, 1999; Zviran, 2008). These authors have contributed over time to the understanding of attitudes and behaviors, helping to explore the constructs of involvement, usage, and participation. Barki and Hartwick (1994), for example, advanced the understanding of the psychological aspects of people's behavior that explain how users' participation in the development of systems contributes to their usage of those systems.
Studies such as these have subsequently helped to build the theoretical constructs around the concepts of internet usage and online participation analyzed in this thesis. Great advances have been made in understanding users' online behavior but, naturally, all of these authors agree that further research is needed in this area; human behavior is a field that remains permanently open to, and in need of, exploration. In this sense, the online survey of this study asked participants not only the direct question about their satisfaction levels, but also about the features they prefer to see in participatory platforms (e.g. videos, pictures). Participants showed a preference for features such as summary tables, graphs, and pictures over features such as videos or seeing their responses compared with those of other participants (Figure 3.11). From the perspective of online behavior, Nov, Naaman and Ye (2010) investigated motivations for participation in online communities and suggest that comparing results on preferred features may provide some insight into participants' motivations. The results of this study identified the ways in which participants prefer to see data or information displayed. This is evidence that could support the development or improvement of participatory web platforms and, through further study, it could be confirmed how this affects participants' levels of satisfaction, which is key for increasing the use of such platforms. This notion of users' preferences is reinforced by Chung and Nah (2009), who explored perceived satisfaction with an online community newspaper (discussed in the section Efficiency). Chung and Nah (2009) identified customization features, such as content submissions, letters-to-the-editor, and e-mail byline links, as the sole significant positive predictor of perceived satisfaction in their study. How participants' satisfaction is shaped is also partially reflected in the survey question in which they were asked about their beliefs towards online public consultation, from both a favorable and a negative perspective. Most participants agreed with the statements that were supportive of online public consultation. On the other hand, for the statements expressing negative beliefs about online public consultation, participants were divided in their opinions, showing no clear trend. Keeping in mind that public consultation, at least in its traditional methods, aims at increasing the engagement of participants, it is crucial to find ways to increase the satisfaction of users. Regarding the other specific factors described by participants as decreasing satisfaction, there is an evident link between some of the interviewees' comments and what scholars have identified as the challenges faced in online participation. For example, in relation to what the literature calls the "digital divide", one interviewee mentioned being concerned that elderly participants might encounter technological barriers when participating online (I9, on page 63 above).
Also in relation to the potential technological barriers that participants may encounter, the literature indicates that this technological gap, the digital divide, exists alongside other challenges of implementing online participatory processes (presented in Section 1.5.1, Context of participatory processes). The other challenges that, according to scholars, online participation faces, such as concerns about personal privacy, are discussed further in Section 4.2 (Trade-off between transparency and privacy). Some participants, when asked about what shapes their satisfaction, provided examples of previous personal experiences of online involvement with governance issues and of how these experiences have shaped their satisfaction. These examples are presented in Section 4.3 (Governance involvement), as they relate to the context of that section. Although in this study the socio-demographic indicators did not show any relationship with participants' levels of satisfaction, scholars such as Loges and Jung (2001), Johnson (2004) and Pfeil et al. (2009) have debated how socio-demographics affect online behavior, reporting indicators such as educational level or age as predictors of information technology usage. Perhaps with a larger sample of the members of the case study platform, differences within these variables could emerge. Nevertheless, while no differences in satisfaction were observed across demographic groups in this study, it is important to acknowledge that Van Deursen and Van Dijk (2011) did observe differences between operational and formal internet skills in their study. When assigning internet tasks to users, they found differences based on age: although users displayed high operational and formal internet skills, their information and strategic internet skills were not sufficient to succeed in performing certain tasks on the Internet. They also emphasized that this could increase the potential disadvantages faced by groups with less education as more activities move onto the Internet. Since the ability to perform certain tasks on the internet also applies to online participatory platforms, it can be assumed that these skills will shape users' perception of their experience. From this perspective, although our study did not find any socio-demographic variable to be an indicator of satisfaction with the use of online participatory platforms, we consider it beneficial to have monitored socio-demographic variables in this thesis, as technologies and users' interactions with them continue to evolve. 4.2 Trade-off on transparency and privacy The previous section discussed the factors that participants mentioned as increasing or decreasing their satisfaction, and how this relates to participants' online behavior. Still with a focus on online behavior, this section addresses the research question: What is the participants' trade-off between transparency and privacy? The responses to the online survey question that asked participants whether they were more inclined towards transparency or personal privacy resulted in a bimodal distribution (Figure 3.16), although the way the data were distributed reflected a slight inclination towards a preference for personal privacy; a minimal sketch of how such scale responses can be tallied is shown below.
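As an illustration only, and not part of the original analysis, the following minimal Python sketch shows how responses on a seven-point semantic differential scale might be tallied into transparency-side, neutral, and privacy-side counts of the kind summarized below. The scale coding, the variable names, and the short synthetic response list are assumptions made for the example and do not reproduce the survey data.

from collections import Counter

# Synthetic example responses on a 1-7 semantic differential scale
# (1 = strongest preference for transparency, 7 = strongest preference for privacy).
# These values are illustrative only and are not the thesis data.
responses = [2, 2, 3, 3, 3, 4, 5, 5, 5, 6, 6, 7, 7, 7]

counts = Counter(responses)
transparency_side = sum(n for score, n in counts.items() if score < 4)  # scores 1-3
neutral = counts.get(4, 0)                                              # midpoint
privacy_side = sum(n for score, n in counts.items() if score > 4)       # scores 5-7

print("transparency side:", transparency_side)
print("neutral:", neutral)
print("privacy side:", privacy_side)

On real data, the per-score counts in this tally would correspond to the frequency distribution plotted in Figure 3.16.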
In fact, excluding those who chose a neutral score (N=6), the total number of respondents on the privacy side of the scale (N=60) was higher than the number on the transparency side (N=52), as seen in Figure 3.16. Furthermore, the bimodal curve showed a greater peak on the privacy side, with 20 respondents, compared with 17 respondents at the transparency peak. Indeed, at the extreme values of the scale, 12 participants chose the strongest inclination towards privacy compared with only 4 participants who chose the strongest inclination towards transparency. One of the research questions of this study aimed at finding out how participants make the trade-off between these two options, and although a slight difference was identified in the bimodal curve, the data from the online survey did not show a marked preference for transparency or privacy, nor how this choice was made. Therefore, further exploration was required through the interviews. In the interviews, participants were asked to expand on this topic in order to understand what shapes their trade-off between transparency and privacy when getting involved in participatory web platforms. Almost half of the interviewees (10 out of 23) highlighted that whether they lean towards transparency or privacy depends on the context (Table 3.2). Some also clarified that they make the trade-off decision based on the type of organization calling for the consultation, or on the topic they are being asked about, drawing on perceptions of trust built from previous experiences. This connects to a couple of comments from interviewees who mentioned that the City of Vancouver had used consultation practices that decreased their satisfaction (I23 and I15, pages 63 and 68), which relates directly to the scope of this research, since the case study platform is based in Vancouver. On the other hand, one example referred to the City of Calgary, which according to interviewee 22 has gained his trust (page 66). Furthermore, a couple of interviewees mentioned having issues trusting online technologies or systems (interviewee 6, Section 3.3.3: Trade-off between transparency and privacy, page 83). Another piece of information related to online users' privacy concerns, obtained from the online survey of this thesis, is that participants feel uncomfortable sharing their phone number and home address, while they indicated feeling comfortable sharing their email and last name (Figure 3.15). In terms of these specific privacy concerns, and of the preferred features mentioned earlier as increasing satisfaction, this study identified elements that could contribute to identifying opportunities for improving the design of participatory platforms. Acknowledging the varied nature of stakeholders and participants (Bimber 2000, Prell et al. 2009), one could expect attitudes towards transparency and privacy to vary with participants' socio-demographic indicators, as Dara O'Neil (2001) depicted a small difference across socio-demographic indicators for online privacy concerns in her studies.
However, this difference at the socio-demographic level was not observed in the responses to the online survey of this thesis, which partially confirms Priscilla Regan et al.'s (2013) finding that there is not enough evidence to indicate different online privacy attitudes across generations. On the other hand, Bhimani (1996) described the basic flaws of the Internet's structure in 1996, and despite the great progress since then in developing technologies that provide more secure and reliable systems for data security, it is evident that as a society we have not completely adapted to the limitations foreseen almost two decades ago. This is especially apparent considering that interviewees raised concerns about trust in the system and about privacy, or even stated that privacy no longer exists, given the capability of technology to track the online behavior of its users. While in the field of marketing trust is a key element of successful electronic commerce (McKnight et al. 2002), this thesis reinforces what Joinson et al. (2010) found in their study, namely that privacy and trust are interrelated to the point where high trust compensates for low privacy. That was the case pointed out by interviewee 22, who stated being willing to share personal information in online consultations on local issues because he trusts the local administration (the City of Calgary, cited on page 68). The comments of interviewees referring to the concept of trust, or to trusting online systems, were varied. One indicated that prioritizing transparency over privacy could discourage internet users from acting as internet trolls (interviewee 23, page 64). Another indicated that more transparent processes, in which participants are accountable or identifiable, would benefit the participatory process by potentially leading participants to provide more intelligent or thoughtful opinions (interviewee 7, page 67). This group of statements also supports the idea, discussed above on page 88, that participants' trade-off evaluation is context dependent. Tom Bakker and Claes de Vreese stated in 2011 that the effects of Internet use depend on a complex combination of personal and social characteristics. Learning about the specific experiences that have shaped participants' perceptions proved a valuable source of information for increasing the understanding of participatory online behavior, by exploring different personal perspectives on participants' opinions of, and experiences with, online participatory systems. Among the other issues associated with interviewees' concerns about the trade-off between transparency and privacy, some mentioned online security concerns, framed as a factor that decreases their levels of satisfaction. Zviran (2008) focused on online privacy concerns while distinguishing them from online security, which relates to data integrity. In the interviews, some participants appeared to use these concepts interchangeably, which seems to reflect society's growing awareness of how these technologies are used. Online security appears to have become a sensitive topic due to increased awareness of surveillance practices directed at internet users. As a result, discussions around anonymity are gaining importance, and initiatives to protect personal privacy and encourage anonymity are becoming more prevalent.
In summary, through this thesis we have increased our understanding of how the experience of internet users who engage in online participation for consultation, decision-making or community planning is shaped. Although public consultation aims at increasing transparency, it is necessary to acknowledge that there are limits to how far transparency can be increased when it requires revealing personal information on web platforms. However, since these experiences result from the interaction with these online systems, it is worth keeping in mind that learning what features participants want to see in these platforms, and linking that to ways of increasing trust in online participatory systems, could increase satisfaction and also help achieve the ultimate goal of public consultation: increasing the engagement of participants.

4.3 Governance involvement of participants

This section discusses the research question: What is the participants' involvement in governance? Based on the responses to the two questions of the online survey that explored the governance involvement of participants (Figure 3.7 and Figure 3.8), one could assume that participants are indeed quite involved in governance. It would also be reasonable to assume that participants who indicated being related to the planning profession would be more involved in governance than the rest of the participants; however, no significant difference was found between these groups, although the comments and the organization of ideas in the interviews were noticeably different. Some of the interviewees who indicated being related to the planning profession shared experiences or provided examples of how they considered participatory web platforms could be improved (3.3.6).

Wellman (2001) measured internet involvement and found that Internet use was associated with increased participation in politics, and also that heavy Internet users committed less to an online community. Considering that this literature was published in 2001, these findings may need revisiting now that the Internet is ubiquitous in our daily lives and the parameters that defined heavy internet users then may have changed. This is why this thesis focused on investigating other aspects of users' interaction with participatory platforms related to governance.

Starting from the finding that heavy internet users are less committed to an online community (Wellman 2001), this research study dedicated a section to the participants' familiarity with technology and internet use by exploring their use of social media and its frequency. The responses showed that participants are familiar with reading social media platforms (Figure 3.9). When the individual frequency-of-reading scores were combined into a single score across platforms, the results showed that 99 out of 118 participants read at least one of the social media platforms the survey enquired about at least once a week (a minimal sketch of this kind of calculation appears below). This could indicate that the operational internet skills of the users of the case study web platform are high, following the analysis of Van Deursen and Van Dijk (2011). Although aspects of internet use were learned from this portion of the thesis, there are still areas of uncertainty related to online behavior.
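As a rough illustration of the combined reading-frequency score described above, the sketch below flags respondents who report reading at least one platform at least weekly and counts them. The platform names, frequency categories, and threshold are hypothetical; the thesis does not specify the exact coding scheme used.

```python
# Frequency categories counted as "at least once a week" (assumed labels).
AT_LEAST_WEEKLY = {"daily", "weekly"}

# Hypothetical per-platform reading frequencies for a few respondents.
respondents = [
    {"Facebook": "daily",   "Twitter": "never",  "Blogs": "monthly"},
    {"Facebook": "never",   "Twitter": "never",  "Blogs": "never"},
    {"Facebook": "monthly", "Twitter": "weekly", "Blogs": "never"},
]

def reads_weekly(platform_freqs: dict) -> bool:
    """True if the respondent reads at least one platform at least weekly."""
    return any(freq in AT_LEAST_WEEKLY for freq in platform_freqs.values())

weekly_readers = sum(reads_weekly(r) for r in respondents)
print(f"{weekly_readers} of {len(respondents)} respondents read social media "
      "at least once a week on at least one platform")
```

Applied to the 118 survey responses with the actual frequency categories, this kind of combined indicator would yield the 99-out-of-118 figure reported above.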
As mentioned earlier, Tom Bakker and Claes de Vreese (2011) stated that the effects of Internet use depend on a complex combination of personal and social characteristics. Bakker and de Vreese (2011) also investigated the effects of Internet use on political involvement among young people (18 to 24 years old), finding a positive association. Although the average age of the respondents to the online survey was 53, the results of this research study seem to echo the findings of Bakker and de Vreese.

In terms of the governance involvement of participants and their beliefs regarding the potential of online participatory tools, the responses to the online survey showed an inclination towards agreeing with the beliefs that represented favorable views of online public consultation (Figure 3.18). Similarly, according to a study carried out by Elections Canada between 2004 and 2011, the attitudes of Canadians towards online registration suggest an increasing level of support among all electors during that period. On the other hand, the responses to the beliefs that were not favorable towards online public consultation did not show a clear trend, but the interviews provided some clarification. For example, during the interviews participants expressed optimism about the potential of online web platforms for public participation and, as in Coleman and Gøtze's study (2002), stated that they expect to see opportunities to contribute to policy making via online engagement. Specifically, interviewee 8 (page 59) mentioned that they would like more nuanced ways of participating and a platform that provides flexible ways to give feedback, as a way to bring the benefits to their own communities.

In general, this could be considered an indicator of a positive predisposition towards participatory platforms. On the other hand, for the survey question that asked participants about beliefs with a negative connotation towards online public consultation, the distribution of responses was not uniform. This also connects to the varied nature of the comments about factors that interviewees mentioned as increasing or decreasing their levels of satisfaction. Although a positive predisposition towards using web platforms is reflected in the responses of the participants and interviewees of this research study, we acknowledge the challenge of online participation connected to the theory of the 1% rule of internet users. This theory was recently tested and confirmed by van Mierlo (2014), who indicated that 99% of Internet users fall into the category of lurkers18. Although this could be a concern in terms of participation rates, online public consultation and participatory methods do not intrinsically aim at generating content, but at increasing transparency in the decision-making process, increasing citizen engagement, and adding efficiency to participatory processes. From that perspective, we think that the effects of the 1% rule on the outcomes of online public consultation would require further exploration.

18 Lurker: A member of a newsgroup or other online forum who reads messages but does not contribute to the discussion. (Chandler & Munday, 2011).
4.4 Further areas of potential investigation

In this research, it was possible to partially understand the state of the art of an online participatory platform. This research study had various areas of research interest and, as such, it was a challenge to investigate all of them in depth. Having learned about the complexity of the components of this research study, and acknowledging the limitations encountered during the investigation, it is worth highlighting and summarizing some of the areas of further research that could contribute to the field of online consultation. Below we highlight some of those potential areas of investigation.

First, linked to the findings of this research on the features that participants prefer to see displayed, it would be interesting to explore the effects of offering customization of the way background information is presented (such as summary tables, graphs and pictures), and how this affects participants' levels of satisfaction. Further investigation into the multiple aspects that shape online participation may also give developers ways to choose more suitable online participatory tools based on context, and to contribute to public consultation in ways that acknowledge the limitations of its online form while taking advantage of its strengths. This suggestion is based on what was investigated in this research study rather than on an objective comparison with other online participatory platform case studies, which were not found in the literature; the literature did, however, point to the fact that online tools are being used with the aim of helping communities. In addition, there was agreement within the literature about the benefit of employing qualitative data, such as that obtained from the interviews of this research study, for research in fields related to internet use and online participation, mostly because the topics of this research study interrelate in complex ways.

Using qualitative research in further investigation would allow participants' experiences to be described and would let participants draw the connections they perceive between the topic and other fields. Qualitative research using semi-structured or unstructured forms of inquiry would allow the flexibility to build these connections. For example, in the case of this thesis, when interviewees were asked about satisfaction with the use of online platforms some brought up the topic of privacy and, likewise, when other interviewees referred to the trade-off between transparency and privacy, they linked the discussion to factors that affect their levels of satisfaction.

We also consider there is potential for further exploring the experiences and perceptions of the participants of these online communities to learn more about the effects of online tools on social capital, with the aim of enhancing participatory planning and decision-making. Although this study did not initially anticipate exploring the topic of social capital, the literature inevitably connected with this concept at times, as we live in an online era in which communities, institutions, networks and organizations are virtually connected.
From what was observed during this research, there could be potential in benefiting from the apparently high motivation of these groups of participants to engage, and from the knowledge and experience of older generations. Beyond investigating what caused the average age of the participants in this case study to be 53, it might be interesting in future research to explore what the experience or knowledge of an older socio-demographic group may represent for society in terms of the feedback they provide for decision making. On the same topic, it would be interesting to explore how their civic knowledge and governance involvement experience is reflected in their opinions, for example by comparing how different age groups approach decision making on planning or governance topics, and how these differences relate to social capital.

Another area that could benefit from further investigation is the trade-off between transparency and privacy, and trust in online participatory systems. This research study aimed at learning how participants trade off transparency and privacy when engaging in consultation processes online, and it was learned that participants lean towards transparency or privacy depending on context. Further research could focus on how participants have developed trust in certain institutions or organizations, which could help in developing strategies to increase online civic engagement. According to Warren et al. (2014), social media, for example, has helped to increase trust in institutions. In this sense, it would be interesting to explore how a web platform that is independent from government can build trust among its stakeholders and influence decision making, in order to invigorate citizen engagement in planning and decision making.

Finally, taking into account the opinion of one interviewee who mentioned preferring direct participation over online participation (interviewee 5, page 70), and echoing the viewpoint of Brian Adams19 (2004), we see potential in a program-evaluation type of research that investigates a participatory process combining both traditional and online methods. Furthermore, given that online participation has both potential and limitations, it would help to study other participatory web platforms as well, to understand their strengths and weaknesses in different contexts.

19 Brian Adams (2004), who stated that public meetings are not meeting their main purpose of influencing decision-making, and are mostly working as a way to voice opinions to authorities or set the agenda.

4.4.1 Limitations of the study

To begin, it is important to mention that, with little information available on case studies of online participation platforms, there were limits to how objectively the case study of this thesis could be compared to other studies; however, the literature review served as a general guideline for this investigation. In this sense, it is acknowledged that there is a greater amount of scholarly work on the different areas of research interest of this thesis from the perspective of disciplines other than public consultation, and that the literature review was a selection of them. There are also certain limitations inherent to quantitative studies as well as qualitative studies, and this study evidently had constraints related to both research approaches. The first was a common limitation: the time available to carry out this research.
There could have been some benefit in extending the deployment period of the online survey, but for the sake of completing this study within a reasonable time frame for a graduate program, extending the deployment period was only possible to a certain extent; the originally planned period was extended from 4 to 16 weeks.

Because convenience sampling was used, there is potential bias in this research study in terms of which participants were available to take part in the study. The members of PlaceSpeak evidently use the Internet, but how frequently they use it and how familiar they are with web platforms may vary between members. The people who responded to the online survey had to go through several stages, any of which may have left certain participants out. For example, participants had to receive an email with an invitation; at that stage, there was a risk of difficulties opening the link that granted access to the survey and, once in the survey, a risk of participants dropping out at an early stage before finalizing the questionnaire. Additionally, although more than half of the participants who took the survey expressed interest in following up with an interview, only a third of them ultimately responded to our contact email for that purpose.

From the perspective of the analysis, non-random sampling such as that used here prevented us from making inferences about the population of all PlaceSpeak members. It is acknowledged that significance tests require, among other things, a random sample to be valid. Despite this limitation, some statistical tests were run, which helped to shed light on potential relationships between the variables of research interest at the analysis stage. Considering that the sample size and the sampling method were, in theory, not ideal, the additional effort of carrying out qualitative analysis through the follow-up interviews reduced some uncertainty by giving participants the chance to expand on their responses on the topics of research interest.

Chapter 5: Conclusion

With the current levels of use of information and communication technologies and technological devices, and as their use continues to increase as part of our daily lives, online participatory initiatives are only becoming more commonplace. The fast pace of life in cities, where there is constant development and a need for planning, policy making, and consultation for decision-making, turns online public consultation into a potential tool to help keep up with these fast-paced changes. There are advantages to using these online tools: easy access to and management of big data, reaching out to remote locations (provided connectivity is available), and potentially decreasing the costs of public participation as online systems stay in place and are perfected. Nevertheless, there are challenges that still have to be considered so that these online processes strengthen citizen engagement and participation in governance and decision-making, rather than being questioned by public opinion.
The objectives of this research arose from the review of a theoretical framework and empirical studies that support or challenge different aspects of online participatory processes. In deciding on the research methods, the instruments chosen took different perspectives into consideration, which later helped to address the research objectives from a pragmatic standpoint that acknowledges that the potential and challenges of participatory systems are interrelated.

This thesis contextualized the challenges and potential of online participatory systems in a case study, using qualitative and quantitative research methods. Regarding the levels of satisfaction of the users of the case study web platform, it is concluded that participants are satisfied, but the interviews showed that they nevertheless still yearn to see these platforms improved.

In summary, this research allowed us to learn from the interviewees' experience with the case study online consultation web platform. Among the findings of this investigation is that participants have distinct preferences for the features they like to see when interacting with an online consultation platform: preferences regarding the way the background information of a topic is presented, but also regarding the feedback options provided. The features participants prefer to see in a web platform when engaging in online participation are summary tables, pictures, and graphs. The findings also indicate that participants value being provided with options when getting involved in a consultation topic, which leads us to conclude that there is value in exploring customizable features for these platforms. Secondly, participants appreciate well-developed consultation topics that are clearly designed to make the best use of their time, thereby indicating that the participants' time is being valued. The interviewees take the time they dedicate to participating seriously; they expect respect for that time and desire to participate in a more nuanced way. As examples, they referred to how information is presented to them and to what options they are given to provide feedback: they mentioned disliking questions that lead to certain answers, and they like being given different ways to provide feedback or responses, such as multiple-choice options and/or optional comment boxes. As commonsense as these requests may seem, it is evident that institutions or organizations calling for consultation have room for improvement in responding to the concerns of participants. These two pieces of information, the preferred features and an understanding of what increases participants' satisfaction, could be key for consultation platforms that intend to increase the involvement of their participants; this is supported by the fact that these concepts link to the literature on understanding participants' needs and expectations and on how behavior can affect user involvement.
In conclusion, as more participatory initiatives are complemented by or migrated to online versions, it is crucial to look for ways to increase the satisfaction of their users, whether by implementing technological enhancements that provide more efficient interfaces or by tailoring the features of web platforms to attract the attention of participants. There are now many technological tools and web platforms that facilitate participation; however, having options available has not necessarily provided satisfactory experiences, for some of the reasons presented in the section on factors that decrease satisfaction. In general, there has been progress in participatory platforms to date, such as improved interfaces and enhanced Internet communication protocols that provide safer online environments for users. Previous research has paid attention to the experience on the user end to understand participants' motivations, with the aim of increasing engagement and the efficiency of the participatory process. Although this research did not explore the preference for customizable features on participatory web platforms, we think that customizable features may give participants the options they expect to see in online participatory platforms, allowing them to participate according to their needs and interests.

Regarding the second research objective, the results indicate that the trade-off between transparency and privacy that participants make when participating online proved to be context dependent. Although this statement seems broad, it is reflected in the quantitative responses: when participants were asked whether they leaned towards transparency or privacy when participating online for governance involvement and decision-making, the result was a bimodal distribution. Through the information obtained from the online survey and follow-up interviews, we increased our understanding of the context of online consultation processes on the case study platform, and we can acknowledge that context is what predominantly shapes the experience of internet users who engage in online participation for consultation, decision-making and community planning.

Furthermore, the interviews provided a sounder understanding of participants' perspectives and concerns. Of the concerns expressed during the interviews, it was noted that at least 50% of the interviewees leaned towards transparency, while the other half gave responses related to the idea that "privacy is context dependent". While respondents seemed to be divided between these two positions, a third popular response was that some of the participants (10 of them) mentioned having issues with trusting online participatory systems. They referred to trust regarding different elements, for example: the organizations calling for consultations, the methods used (surveys, polls), the management and storage of responses and personal data, and what happens with the participatory process after they have participated. Next in popularity, interviewees mentioned that they like being provided with options to set different levels of privacy. From these sets of responses, and in terms of the trade-off between transparency and privacy, it is possible to conclude that, when it comes to privacy, customization is once more a feature valued by participants.
This reinforces the idea that providing customization features could positively impact the use of online platforms for participation. Regarding the issues related to participants' trust in online participatory systems, it is also concluded that this issue goes beyond participatory platforms and relates more to participants' perception of the reputation of the institutions or organizations calling for consultations online. Nevertheless, as online participation becomes more popular and systems development moves towards features that increase participants' satisfaction, there could be potential for improving participants' trust in online systems through their experience with participatory consultation platforms run by a third party such as PlaceSpeak. That said, as the trade-off between transparency and personal privacy is predominantly context dependent, we conclude that trusting online systems is becoming key for civic online engagement in decision-making. A major barrier seems to be distrust of online systems and participants' apprehension about sharing their personal data when it is required for getting involved in participatory initiatives online. In terms of how the trade-off between transparency and privacy occurs when participants decide to engage in online participatory activities for governance or decision-making, we conclude that there is important work to be done by organizations and institutions to increase levels of trust. Learning about the different opinions on personal privacy and trust builds a frame for better understanding the experience of the participants of this study and potentially finding links to online behavior.

The last research objective focused on governance involvement. The results of the survey carried out for this thesis clarify the governance involvement and participation of the users of the case study web platform across different socio-demographics. From the perspective of the literature, it has been possible to identify benefits and opportunities to improve these participatory platforms and online governance capacity. Learning about the state of the art of research in these areas has been valuable for gaining a better understanding of Internet users' online behavior, although that understanding is still limited and potentially subject to change, given the contrasting literature that exists. Based on the findings of this research, we conclude that the users of this participatory platform are evidently involved in governance. More importantly, the group of participants that were interviewed could be considered the most engaged and, as such, they expressed very interesting points of view beyond what they were asked in this research. They were eager to provide examples of how online consultation systems should and should not be. These examples were discussed in section 3.3.5 and could be taken into consideration for improving the development of participatory platforms for decision-making and governance involvement.

Revisiting the three main goals of public consultation, namely to increase transparency, efficiency, and the engagement of participants in the process, it is possible to see opportunities for improving online participatory and consultation platforms. Taking into account the concerns expressed by the participants in this study could help achieve those goals.
As Wellman (2010) pointed out, Internet use is normalizing over time, which could mean that once the digital divide has decreased significantly, the Internet could become a more popular channel for governance involvement. In this "normalized" scenario, where people use the Internet for civic purposes more routinely, participants could constantly provide feedback, deliberate, and engage on topics related to decision making and planning on a regular basis. Getting to that scenario would likely require monitoring the satisfaction levels of users of online participatory platforms, improving the customization options for privacy and transparency, and encouraging governance involvement through social media and information and communication technologies.

Finally, it is important to state that, through the implementation of this research, we experienced first-hand the process of developing a consultation topic using a web platform (the case study). The challenges faced in this research process were therefore the same challenges that organizations running online consultation topics face, which provided us with first-hand experience for analysis. We understand that the web platform PlaceSpeak, our case study, provides the organizations calling for consultation with a best-practices guideline document, and we highlight the utility of this guideline, since the success of online participation initiatives depends on complex, multidimensional components. Through this type of research, however, it is possible to broaden the understanding of the nature of these multidimensional components and to gain empirical experience of the specific case. Thanks to this experience we have elaborated the following set of recommendations for the public consultation web platform.

5.1 Recommendations

Based on the findings of this research and the areas of investigation proposed in the previous chapter, we would like to offer some recommendations, not only to the consultation platform but also to those who call for consultations online. First, we find that there is value for the case study web platform in exploring the effects of increasing the customizable features available to participants, which could be promising for increasing their satisfaction levels. As indicated in this research, participants seem to value being able to choose the way they provide feedback and the way information related to the consulted topic is presented to them. For example, it would make sense to carry out a program-implementation type of study in which initial levels of satisfaction are measured, participants are then given the option of customizing certain features, and potential changes in satisfaction are then explored. The objective would be to monitor whether customization features actually improve the experience of online participation for its users. We also believe that it would be of great use to the organizations calling for consultation, and to the web platform hosting these consultations, to focus on increasing the satisfaction of users. To achieve this, we think the use of the preferred features mentioned by interviewees could be increased where possible and appropriate. Finally, we find it vital to highlight the importance of gaining the trust of stakeholders.
This is a complex terrain that we did not deeply explore through this research, however, the discourse of scholars and the interviewees of this research points out the relevance of the topic and the ways it interrelates with users’ satisfaction in online participation. Beyond the ability that the case study platform already has to provide the tools, guidelines and insight, the role and responsibility of organizations and institutions that call for consultation becomes paramount when there is so much flexibility to independently facilitate a public consultation topic.  119 References Adams, B. (2004). Public meetings and the democratic process. Public Administration Review, 64(1), 43–54. Retrieved from http://doi.org/10.1111/j.1540-6210.2004.00345.x Andrews, D., Nonnecke, B., & Preece, J. (2003). Electronic survey methodology: A case study in reaching hard-to-involve Internet users. International Journal of Human - Computer Interaction, 16(2), 185–210. Retrieved from http://doi.org/10.1207/S15327590IJHC1602 Artibise, Y. (2011). July 18, 2011 Archives - PlaceSpeak Blog. Retrieved April 18, 2016 from http://blog.placespeak.com/2011/07/18/ Bakker, T. P., & de Vreese, C. H. (2011). Good News for the Future? Young People, Internet Use, and Political Participation. Communication Research, 38(4), 451–470. Retrieved from http://doi.org/10.1177/0093650210381738 Bang the Table. (n.d.). Retrieved April 18, 2016, from http://bangthetable.com/ Barki, H., & Hartwick, J. (1994). Measuring User Participation, User Involvement, and User Attitude. MIS Quarterly, 18(1), 59–82. Baroudi, J. J., Olson, M. H., & Ives, B. (1986). An empirical study of the impact of user involvement on system usage and information satisfaction. Communications of the ACM, 29(3), 232–238. Retrieved from http://doi.org/10.1145/5666.5669 Berman, E. (1997). Dealing with cynical citizens. Public Administration Review, 57(2), 105–112. Retrieved from http://doi.org/10.2307/977058 Bimber, B. (2000). Measuring the gender gap on the Internet. Social Science Quarterly, 81(3), 1–11. Bonfadelli, H. (2002, March). The Internet and Knowledge Gaps: A Theoretical and Empirical Investigation. European Journal of Communication. Retrieved from http://doi.org/10.1177/0267323102017001607 Boulianne, S. (2009). Does Internet Use Affect Engagement? A Meta-Analysis of Research. Political Communication, 26(2), 193–211. Retrieved from http://doi.org/10.1080/10584600902854363 Brown, G., & Weber, D. (2011). Public Participation GIS: A new method for national park planning. Landscape and Urban Planning, 102(1), 1–15. Retrieved from http://doi.org/10.1016/j.landurbplan.2011.03.003 Brunsting, S., & Postmes, T. (2002). Social Movement Participation in the Digital Age: Predicting Offline and Online Collective Action. Small Group Research, 33(5), 525–554. Retrieved from http://doi.org/10.1177/104649602237169 Bryer, T. a. (2011). The Costs of Democratization. Administrative Theory & Praxis, 33(3), 341–361. Retrieved from http://doi.org/10.2753/ATP1084-1806330302 Bryer, T. a., & Zavattaro, S. M. (2011). Social Media and Public Administration. Administrative Theory & Praxis, 33(3), 325–340. Retrieved from 120 http://doi.org/10.2753/ATP1084-1806330301 Campsie, P. (2007). Online public consultation: The promise - and the reality. Municipal World, 117(12), 13–14,24. Canadian Internet Survey. (2012). Retrieved April 10, 2016, from http://www.statcan.gc.ca/daily-quotidien/131126/dq131126d-eng.htm Carlitz, R. D., & Gunn, R. W. (2002). 
Online rulemaking: A step toward E-governance. Government Information Quarterly, 19(4), 389–405. Retrieved from http://doi.org/10.1016/S0740-624X(02)00118-1 Cars, G., Healey, P., Madanipour, A., & de Magalhaes, C. (2002). Urban governance, institutional capacity and social milieux. Carver, S. (2001). Public participation using web-based GIS. Environment and Planning B: Planning and Design, 28(6), 803–804. Retrieved from http://doi.org/10.1068/b2806ed Chandler, D., & Munday, R. (2011). A Dictionary of Media and Communication. Oxford University Press. Chen, P. (2007). E-engagement: A guide for Public Sector Managers. Retrieved April 18, 2016 from http://www.oapen.org/search?identifier=459088 Chess, C., & Purcell, K. (1999). Public participation and the environment: Do we know what works? Environmental Science & Technology, (732), 2685–2692. Retrieved April 18, 2016 from http://pubs.acs.org/doi/abs/10.1021/es980500g Cho, H., & Larose, R. (1999). Privacy Issues in Internet Surveys. Social Science Computer Review, 17(4), 421–434. Retrieved from http://doi.org/10.1177/089443939901700402 Chung, D. S., & Nah, S. (2009). The Effects of Interactive News Presentation on Perceived User Satisfaction of Online Community Newspapers. Journal of Computer-Mediated Communication, 14(4), 855–874. Retrieved from http://doi.org/10.1111/j.1083-6101.2009.01473.x Ciuccarelli, P., Lupi, G., & Simeone, L. (2014). Visualizing the data city: Social Media as a Source of Knowledge for Urban Planning and Management. Springer briefs in aplpied sciences and technology. Retrieved from http://doi.org/10.1007/978-3-319-02195-9 Coleman, S., & Gøtze, J. (2002). Bowling Together: Online Public Engagement in Policy Deliberation. Retrieved April 18, 2016 from http://www.acteurspublics.com/files/epublic/pdf/scoleman-jgotze-bowling-together.pdf Dillman, D. A., Smyth, J. A., & Melani Christian, L. (2008). Internet, Mail, and Mixed-Mode Surveys: The Tailored Design Method (3rd ed.). Dinev, T., Hart, P., & Mullen, M. R. (2008). Internet privacy concerns and beliefs about government surveillance – An empirical investigation. The Journal of Strategic Information Systems, 17(3), 214–233. Retrieved from http://doi.org/10.1016/j.jsis.2007.09.002 EngagementHQ. (n.d.). Retrieved April 18, 2016, from http://engagementhq.com/ Environment, O. (1990). Public consultation the guide: a resource kit for ministry staff. 121 Ontario: Environment Ontario. Foxman, E., & Kilcoyne, P. (1993). Information technology, marketing practice, and consumer privacy: ethical issues. Journal of Public Policy & Marketing, 12(I), 106–119. Retrieved April 18, 2016 from http://www.jstor.org/stable/10.2307/30000116 Gallagher, P., O’Donovan, M.-A. M. -a., Doyle, A., & Desmond, D. (2011). Environmental barriers, activity limitations and participation restrictions experienced by people with major limb amputation. Prosthetics and Orthotics International, 35(3), 278–284. Retrieved from http://doi.org/10.1177/0309364611407108 Garau, C. (2012). Focus on Citizens: Public Engagement with Online and Face-to-Face Participation—A Case Study. Future Internet. Retrieved from http://doi.org/10.3390/fi4020592 Gil De Zuniga, H., Puig-I-Abril, E., & Rojas, H. (2009). Weblogs, traditional sources online and political participation: an assessment of how the internet is changing the political environment. New Media & Society, 11(4), 553–574. Retrieved from http://doi.org/10.1177/1461444809102960 Gray, B. (1989). Collaborating: finding common ground for multiparty problems (1st ed.). 
San Francisco: Jossey-Bass. Gunter, B. (2006). Advances in e-democracy: overview. In Aslib proceedings (pp. 6–15). Retrieved April 18, 2016 from http://www.emeraldinsight.com/journals.htm?articleid=1573061&show=abstract Hague, B. N., & Loader, B. (Eds.). (1999). Digital Democracy: Discourse and Decision Making in the Information Age. Londond and New York: Routledge. Hargittai, E., & Hinnant, A. (2008). Digital Inequality: Differences in Young Adults’ Use of the Internet. Communication Research, 35(5), 602–621. Retrieved from http://doi.org/10.1177/0093650208321782 Harvey, K. (2014). Encyclopedia of social media and politics. (K. Harvey, Ed.). Thousand Oaks, CA: SAGE Publications Ltd. Retrieved from http://doi.org/doi: 10.4135/9781452244723 Hilbert, M., Miles, I., & Othmer, J. (2009). Foresight tools for participative policy-making in inter-governmental processes in developing countries: Lessons learned from the eLAC Policy Priorities Delphi. Technological Forecasting and Social Change, 76(7), 880–896. Retrieved from http://doi.org/10.1016/j.techfore.2009.01.001 Himelboim, I., Lariscy, R. W., Tinkham, S. F., & Sweetser, K. D. (2012). Social Media and Online Political Communication: The Role of Interpersonal Informational Trust and Openness. Journal of Broadcasting & Electronic Media, 56(1), 92–115. Retrieved from http://doi.org/10.1080/08838151.2011.648682 Hindman, D. B. (2000). The Rural-Urban Digital Divide. Journalism & Mass Communication Quarterly, 77(3), 549–560. Retrieved from http://doi.org/10.1177/107769900007700306 Hindsworth, M. F., & Lang, T. B. (2009). Community participation and empowerment (1st 122 ed.). New York: Nova Science Publishers. Hoffman, L. H., Jones, P. E., & Young, D. G. (2013). Does my comment count? Perceptions of political participation in an online environment. Computers in Human Behavior, 29(6), 2248–2256. Retrieved from http://doi.org/10.1016/j.chb.2013.05.010 Hrastinski, S. (2009). A theory of online learning as online participation. Computers & Education, 52(1), 78–82. Ince, D. (2013). Cookie. In A Dictionary of the Internet (3 ed.). Oxford University Press. Retrieved April 18, 2016 from http://www.oxfordreference.com.ezproxy.library.ubc.ca/view/10.1093/acref/9780191744150.001.0001/acref-9780191744150-e-704?rskey=t9Xxl1&result=1 Information & Communication Technologies Home. (n.d.). Retrieved April 18, 2016, from http://www.worldbank.org/en/topic/ict Innes, J. E., & Booher, D. E. (2004). Reframing public participation: strategies for the 21st century. Planning Theory & Practice, 5(4), 419–436. Retrieved from http://doi.org/10.1080/1464935042000293170 Jensen, C., Potts, C., & Jensen, C. (2005). Privacy practices of Internet users: Self-reports versus observed behavior. International Journal of Human-Computer Studies, 63(1-2), 203–227. Retrieved from http://doi.org/10.1016/j.ijhcs.2005.04.019 Johnson, N., Lilja, N., Ashby, J. a., & Garcia, J. a. (2004). The practice of participatory research and gender analysis in natural resource management. Natural Resources Forum, 28(3), 189–200. Retrieved from http://doi.org/10.1111/j.1477-8947.2004.00088.x Joinson, A. (1999). Social desirability, anonymity, and Internet-based questionnaires. Behavior Research Methods, Instruments, & Computers, 31(3), 433–438. Retrieved from http://doi.org/10.3758/BF03200723 Joinson, A., Reips, U.-D., Buchanan, T., & Schofield, C. B. P. (2010). Privacy, Trust, and Self-Disclosure Online. Human-Computer Interaction, 25(1), 1–24. 
Retrieved from http://doi.org/10.1080/07370020903586662 Jung, J.-Y., Qiu, J.-L., & Kim, Y.-C. (2001). Internet Connectedness and Inequality: Beyond the “Divide.” Communication Research, 28(4), 507–535. Retrieved from http://doi.org/10.1177/009365001028004006 Kwasny, M. N., Caine, K., Rogers, W. A., & Fisk, A. D. (2008). Privacy and Technology: Folk Definitions and Perspectives. In CHI ’08 Extended Abstracts on Human Factors in Computing Systems Pages 3291-3296 (pp. 3291–3296). New York, New York, USA. Retrieved from http://doi.org/10.1145/1358628.1358846 Lam, W. (2004). Encouraging online participation. Journal of Information Systems Education, 15(4), 345–348. Li, C., & Bernoff, J. (2011). Groundswell: Winning in a World Transformed by Social Technologies. Boston, Mass.: Harvard Business Press, c2008. 123 Loges, W. E., & Jung, J.-Y. (2001). Exploring the Digital Divide: Internet Connectedness and Age. Communication Research, 28(4), 536–562. Retrieved from http://doi.org/10.1177/009365001028004007 Malinen, S. (2015). Understanding user participation in online communities: A systematic literature review of empirical studies. Computers in Human Behavior, 46, 228–238. Retrieved from http://doi.org/10.1016/j.chb.2015.01.004 Martínez-Ballesté, A., Pérez-Martínez, P. A., & Solanas, A. (2013). The Pursuit of Citizens ’ Privacy : A Privacy-Aware Smart City Is Possible. IEEE Communications Magazine, (June), 136–141. Matheus, R., & Ribeiro, M. M. (2009). Public online consultation of federal ministries and federal regulatory agencies in Brazil. Proceedings of the 3rd International Conference on Theory and Practice of Electronic Governance - ICEGOV ’09, 390. Retrieved from http://doi.org/10.1145/1693042.1693127 McAffee, A., & Brynjolfsson, E. (2012). Big Data: The Management Revolution. Harvard Business Review, 90(10), 60–68. McKnight, D. H., Choudhury, V., & Kacmar, C. (2002). The impact of initial consumer trust on intentions to transact with a web site: a trust building model. The Journal of Strategic Information Systems, 11(3-4), 297–323. Retrieved from http://doi.org/10.1016/S0963-8687(02)00020-3 MetroQuest. (n.d.). Public Involvement Software - MetroQuest. Retrieved April 18, 2016, from http://metroquest.com/ Mills, A. J. (2009). Encyclopedia of Case Study Research. SAGE Publications, Incorporated. Min, S.-J. (2007). Online vs. Face-to-Face Deliberation: Effects on Civic Engagement. Journal of Computer-Mediated Communication, 12(4), 1369–1387. Retrieved from http://doi.org/10.1111/j.1083-6101.2007.00377.x Min, S.-J. (2010). From the Digital Divide to the Democratic Divide: Internet Skills, Political Interest, and the Second-Level Digital Divide in Political Internet Use. Journal of Information Technology & Politics, 7(1), 22–35. Retrieved from http://doi.org/10.1080/19331680903109402 Nonnecke, B., Andrews, D., & Preece, J. (2006). Non-public and public online community participation: Needs, attitudes and behavior. Electronic Commerce Research, 6(1), 7–20. Retrieved from http://doi.org/10.1007/s10660-006-5985-x Nov, O., Naaman, M., & Ye, C. (2010). Analysis of participation in an online photo-sharing community: A multidimensional perspective. Journal of the American Society for Information and Technology, 61(3), 555–566. Retrieved from http://doi.org/10.1002/asi Ostman, J. (2012). Information, expression, participation: How involvement in user- generated content relates to democratic engagement among young people. New Media & Society, 14(6), 1004–1021. 
Retrieved from http://doi.org/10.1177/1461444812438212 Paine, C., Reips, U.-D., Stieger, S., Joinson, A., & Buchanan, T. (2007). Internet users’ 124 perceptions of “privacy concerns” and “privacy actions.” International Journal of Human-Computer Studies, 65(6), 526–536. Retrieved from http://doi.org/10.1016/j.ijhcs.2006.12.001 Panopoulou, E. (2009). eParticipation initiatives: How is Europe progressing. European Journal of eParticipation, 7(March). Retrieved February 7, 2014 from https://www.researchgate.net/publication/216694588_eParticipation_initiatives_How_is_Europe_progressing_European_Journal_of_ePractice Park, Y. J. (2011). Digital Literacy and Privacy Behavior Online. Communication Research, 40(2), 215–236. Retrieved from http://doi.org/10.1177/0093650211418338 Pfeil, U., Arjan, R., & Zaphiris, P. (2009). Age differences in online social networking – A study of user profiles and the social capital divide among teenagers and older users in MySpace. Computers in Human Behavior, 25(3), 643–654. Retrieved from http://doi.org/10.1016/j.chb.2008.08.015 Pike, W. (2005). Augmenting collaboration through situated representations of scientific knowledge. University of Pennsylvania. Retrieved March 12, 2014 from https://etda.libraries.psu.edu/paper/6585/1838 Pike, W., Yarnal, B., MacEachren, A. M., Gehan, M., & Yu, C. (2005). Retooling collaboration: a vision for environmental change research. Environment: Science and Policy for Sustainable Development, 47(2), 8–21. Retrieved from http://doi.org/10.3200/ENVT.47.2.8-21 PlaceSpeak. (n.d.). Retrieved from www.placespeak.com Preece, J., & Maloney-Krichmar, D. (2005). Online Communities: Design, Theory, and Practice. Journal of Computer-Mediated Communication, 10(4), 0. Retrieved from http://doi.org/10.1111/j.1083-6101.2005.tb00264.x Prell, C., Hubacek, K., & Reed, M. (2009). Stakeholder Analysis and Social Network Analysis in Natural Resource Management. Society & Natural Resources, 22(6), 501–518. Retrieved from http://doi.org/10.1080/08941920802199202 Prosser, W. L. (1960). Privacy. California Law Review, 48(3), 383–423. Redaelli, E. (2012). Cultural Planning in the United States: Toward Authentic Participation Using GIS. Urban Affairs Review, 48(5), 642–669. Retrieved from http://doi.org/10.1177/1078087412441158 Reed, M. S., Dougill, A. J., & Baker, T. R. (2008). Participatory indicator development: what can ecologists and local communities learn from each other? Ecological Applications : A Publication of the Ecological Society of America, 18(5), 1253–69. Retrieved April 18, 2016 from http://www.ncbi.nlm.nih.gov/pubmed/18686585 Reed, M. S., Graves, A., Dandy, N., Posthumus, H., Hubacek, K., Morris, J., … Stringer, L. C. (2009). Who’s in and why? A typology of stakeholder analysis methods for natural resource management. Journal of Environmental Management, 90(5), 1933–49. Retrieved from http://doi.org/10.1016/j.jenvman.2009.01.001 Regan, P., FitzGerald, G., & Balint, P. (2013). Generational views of information privacy? 125 Innovation: The European Journal of Social Science Research, (November), 37–41. Retrieved from http://doi.org/10.1080/13511610.2013.747650 Rhodes, S. D., Bowie, D. a, & Hergenrather, K. C. (2003). Collecting behavioural data using the world wide web: considerations for researchers. Journal of Epidemiology and Community Health, 57(1), 68–73. Retrieved from http://doi.org/10.1136/jech.57.1.68 Rivera-Sanchez, M. (2009). A multinational study on online privacy: global concerns and local responses. 
New Media & Society, 11(3), 395–416. Retrieved from http://doi.org/10.1177/1461444808101618 Rodrigo, D., & Andrés-Amo, P. (2006). Background Document on Public Consultation. Retrieved Jun 3, 2013 from http://www.oecd.org/mena/governance/36785341.pdf Rojas, H., & Puig-i-Abril, E. (2009). Mobilizers Mobilized: Information, Expression, Mobilization and Participation in the Digital Age. Journal of Computer-Mediated Communication, 14(4), 902–927. Retrieved from http://doi.org/10.1111/j.1083-6101.2009.01475.x Schaefer, D. R., & Dillman, D. A. (1998). Development of a standard e-mail methodology. Public Opinion Quarterly, 62(3), 378. Retrieved from http://doi.org/10.1086/297851 Scheufele, D. a. DA, & Nisbet, M. C. M. C. (2002). Being a Citizen Online: New Opportunities and Dead Ends. The Harvard International Journal of Press/Politics, 7(3), 55–75. Retrieved from http://doi.org/10.1177/1081180X0200700304 Schwartz, B., & Grice, D. J. D. (2013). Establishing a Legal Framework for E-Voting in Canada. Elections Canada. Retrieved April 10, 2016 from http://www.elections.ca/res/rec/tech/elfec/pdf/elfec_e.pdf Seel, N. M. (2012). Encyclopedia of Sciences of Learning. In N. M. Seel (Ed.), (p. 3536). Seltzer, E., & Mahmoudi, D. (2012). Citizen Participation, Open Innovation, and Crowdsourcing: Challenges and Opportunities for Planning. Journal of Planning Literature, 28(1), 3–18. Retrieved from http://doi.org/10.1177/0885412212469112 Seničar, V., Jerman-Blažič, B., & Klobučar, T. (2003). Privacy-Enhancing Technologies—approaches and development. Computer Standards & Interfaces, 25(2), 147–158. Retrieved from http://doi.org/10.1016/S0920-5489(03)00003-5 Shipley, R., & Utz, S. (2012). Making it Count: A Review of the Value and Techniques for Public Consultation. Journal of Planning Literature, 27(1), 22–42. Retrieved from http://doi.org/10.1177/0885412211413133 Solove, D. (2006). A taxonomy of privacy. University of Pennsylvania Law Review, (c), 1–91. Retrieved March 12, 2014 from http://www.jstor.org/stable/40041279 Spiekermann, S., & Cranor, L. (2009). Engineering privacy. IEEE Transactions on Software Engineering, 35(1), 67–83. Retrieved from http://doi.org/10.1109/TSE.2008.88 Tsahkna, A.-G. (2013). E-voting: lessons from Estonia. European View, 12(1), 59–66. Retrieved from http://doi.org/10.1007/s12290-013-0261-7 Uyesugi, J. L., & Shipley, R. (2005). Visioning diversity: Planning Vancouver’s 126 multicultural communities. International Planning Studies, 10(3-4), 305–322. Retrieved from http://doi.org/10.1080/13563470500378895 Van Aerschot, L., & Rodousakis, N. (2008). The link between socio-economic background and Internet use: barriers faced by low socio-economic status groups and possible solutions. Innovation: The European Journal of Social Science Research, 21(4), 317–351. Retrieved from http://doi.org/10.1080/13511610802576927 van Mierlo, T. (2014). The 1% rule in four digital health social networks: an observational study. Journal of Medical Internet Research, 16(2), e33. Retrieved from http://doi.org/10.2196/jmir.2966 Velasquez, A. (2012). Social media and online political discussion: The effect of cues and informational cascades on participation in online political communities. New Media & Society, 14(8), 1286–1303. Retrieved from http://doi.org/10.1177/1461444812445877 Verdegem, P., & Verleye, G. (2009). User-centered E-Government in practice: A comprehensive model for measuring user satisfaction. Government Information Quarterly, 26(3), 487–497. 
Retrieved from http://doi.org/10.1016/j.giq.2009.03.005 Walbridge, S. (1982). OCLC and government documents collections. Government Publications Review, 9, 277–287. Retrieved from http://doi.org/10.1016/0277-9390(82)90058-9 Wallace, K. A. (1999). Online Anonymity. Ethics and Information Technology, 23–25. Retrieved from http://doi.org/10.1002/9780470281819.ch7 Wang, H., Chung, J. E., Park, N., McLaughlin, M. L., & Fulk, J. (2011). Understanding Online Community Participation: A Technology Acceptance Perspective. Communication Research, 39(6), 781–801. Retrieved from http://doi.org/10.1177/0093650211408593 Wang, H., Chung, J. E., Park, N., McLaughlin, M. L., & Fulk, J. (2011). Understanding Online Community Participation: A Technology Acceptance Perspective. Communication Research, 39(6), 781–801. Retrieved from http://doi.org/10.1177/0093650211408593 Wang, X.-H., & Bryer, T. A. (2013). Assessing the Costs of Public Participation A Case Study of Two Online Participation Mechanisms. The American Review of Public Administration. Retrieved from http://doi.org/10.1177/0275074012438727 Warschauer, M. (2002). Reconceptualizing the digital divide. First Monday, 7(7). Retrieved from http://doi.org/http://dx.doi.org/10.5210%2Ffm.v7i7.967 Wellman, B., Haase,  a. Q. A., Witte, J., & Hampton, K. (2001). Does the Internet increase, decrease, or supplement social capital? Social networks, participation, and community commitment. American Behaviora Scientist, 45(3), 436–455. Retrieved from http://doi.org/10.1177/00027640121957286 Westin, A. F. (2003). Social and Political Dimensions of Privacy. Journal of Social Issues, 59(2), 431–453. Retrieved from http://doi.org/10.1111/1540-4560.00072 127 Wray-Lake, L., Flanagan, C. a, & Osgood, D. W. (2010). Examining Trends in Adolescent Environmental Attitudes, Beliefs, and Behaviors Across Three Decades. Environment and Behavior, 42(1), 61–85. Retrieved from http://doi.org/10.1177/0013916509335163 Zviran, M. (2008). User’s Perspectives on Privacy in Web-Based Applications. Journal of Computer Information Systems, 48(4), 97–105. Retrieved Oct 12, 2013 from http://scholar.google.com/scholar?hl=en&btnG=Search&q=intitle:User’s+perspectives+on+privacy+in+web-based+applications#0  128 Appendices Appendix A  Online survey  A.1 Email invite University of British Columbia IDEAL LAB 2239 - 2424 Main Mall, V6T1Z4 Department of Forest Resources Management Principal Researcher: Mike Meitner  Hi, My name is Claudia Castro and I am a graduate student from University of British Columbia. I’m writing to ask for your help. By filling out a survey about online participation experience you will help me to comply with the requirements to obtain my Master’s degree.  I have created a topic in PlaceSpeak’s website with a survey called “Participation in online consultation systems” and I would appreciate very much if you could help me by filling it out.  This is a research project but I am hoping that your community may benefit from this survey because I will provide the results to this organization.  This research aims to understand the relation and dynamics between users of online consultation platforms (like you), and the use of the Internet for participation. This survey follows the guidelines and has been reviewed by the Research Ethics Board of UBC.  The estimated time for the survey is 20 minutes. If you agree to participate, you will be assured complete confidentiality; the results will be only used in grouped form and no individual identity will be revealed. 
Your participation in this survey is completely voluntary and you can withdraw at any time. You must be 18 years of age or older and by completing the questionnaire you agree to participate in this research.  This research is carried under the guidance of Dr. Meitner, professor from the Faculty of Forestry at UBC. You can contact us for further questions at this number: ______________ from the Faculty of Forestry at UBC. To proceed to fill out the survey please click on the link below. 129  Thanks in advance for your help in this study. Yours truly and thankfully,  Claudia Castro  A.2 Consent Dear Participant,  Thank you for taking the time to complete this survey. Your feedback will help us understand your preferences and experiences regarding online public participation. This survey should take 20-30 minutes of your time. Your answers will be completely confidential. Please note that by completing this survey you consent to release the information you provide to the University of British Columbia to be used in a scientific publication. If you have any concerns or complaints about your rights as a research participant and/or your experiences while participating in this study, contact the Research Participant Complaint Line in the UBC Office of Research Services at _______________ or if long distance e-mail RSIL@ors.ubc.ca or call toll free _________________. Yours truly and thankfully, Claudia Castro   130 A.3 Survey page 1    131 A.4 Survey page 2   132 A.5 Survey page 3   133 A.6 Survey page 3 (continued)   134 A.7 Survey page 4  135 A.8 Survey page 5  136 A.9 Survey page 6      137 A.10 Survey page 6 (continued)  138 A.11 Survey page 7   139 A.12 Survey page 7 (continued 1)   140 A.13 Survey page 7 (continued 2)   141 A.14 Survey page 8    142 Appendix B  Interview B.1 Follow up interview protocol Participant number: _ _ _ _ _ _ _ _ _ Hello. This is Claudia Castro, and I am calling regarding the follow up interview about the survey on “Online Participation Systems”.  Is this still a good time to call? First of all, I would like to know if it is ok with you that I record this conversation. There would not be any personal information linked to the audio, and I would identify you with a participant number. Depending on the participant’s response, I began recording. ------------------------------ Thanks for your time.  I am with participant _ _ _ _ _ _ _ _   and today I will be asking you about three different topics, and just as a reminder: there are no right or wrong responses. So. just feel free to give your opinion or say pass if you don’t feel comfortable responding the question. 1) Let’s begin with Satisfaction of the use of Online Participatory Platforms and to set up a context: “There are lots of people that do not use online participation”.  In the survey that you completed for us we asked you to rate your general satisfaction with online participation software but we were not able to ask you why you were satisfied or not. Could you tell us a little about what things would make you more or less satisfied and why?   - If the participant mentions issues, ask What are the underlying goals you try to achieve if the issues you mention were improved? 
143 2) In the survey you completed, we asked you how you rated your Trade-off between Transparency and Privacy: Now we would like to know: In the context of using online public participation software; What are the issues that are important to consider when trading off your personal privacy with increased transparency of decision making in your community? From the issues you just mentioned, which do you see as most important and why? In case the participant asks about the definition of Transparency, use the following: “For “transparency”: we are referring to operating in such a way that it is easy for others to see what methods and data are being used to make decisions”. 3) In general, What do you think is the main benefit of public participation software for society? 4) Which of the following ideas Would you argue is more important for public participation software to focus on and why:  • to inform decision making processes OR • to help communities to create a more cohesive vision of the future. I have no more questions to ask besides if there is anything you would like to add on regards of the previous topics we talked about: satisfaction, transparency and privacy or the role of public consultation software. Thanks again for your time.  Bye 

