@prefix vivo: . @prefix edm: . @prefix ns0: . @prefix dcterms: . @prefix skos: . vivo:departmentOrSchool "Applied Science, Faculty of"@en, "Engineering, School of (Okanagan)"@en ; edm:dataProvider "DSpace"@en ; ns0:degreeCampus "UBCO"@en ; dcterms:creator "Haider, Husnain"@en ; dcterms:issued "2015-05-28T21:34:49Z"@en, "2015"@en ; vivo:relatedDegree "Doctor of Philosophy - PhD"@en ; ns0:degreeGrantor "University of British Columbia"@en ; dcterms:description "To ensure a safe and secure water supply, water utilities face the challenges of climate change, socio-economic viability, and a rapid rate of environmental degradation. The core of the water utility business is managing assets and services, which can be divided into functional components such as water resource management and environmental stewardship, operational practices, personnel productivity, physical infrastructure, customer service, public health security, socio-economic issues, and financial viability. For a water utility to be sustainable, the major impetus is to enhance the performance efficiency and effectiveness of these functional components to ensure a high level of customer satisfaction. With limited human and financial resources, small and medium-sized water utilities (SMWU) face even greater challenges in performance enhancement. To date, the participation of Canadian SMWU in the National Water and Wastewater Benchmarking Initiative (NWWBI) has been almost negligible. Consequently, such SMWU manage their functional components without knowing whether they are meeting their primary performance objectives. Hence, there is an urgent need for a comprehensive framework for adopting performance management in SMWU. In this research, an integrated performance management framework consisting of five models has been developed. 
The overall framework begins with the identification of performance indicators (PIs) based on a critical review, followed by a model using multicriteria decision analysis for the selection of PIs encompassing all the functional components. These PIs are then evaluated through an inter-utility performance benchmarking model (IU-PBM), which efficiently deals with the existing data limitations in SMWU. Based on the IU-PBM results, an intra-utility performance management model (In-UPM) has been developed to improve the performance of sub-components and different water supply systems within the utility for decision making under uncertainty. Finally, a risk-based model has been developed to improve customer satisfaction in SMWU. This research will help utility managers across Canada, and potentially in other parts of the world, to enhance performance management for SMWU. Utility managers can effectively implement this framework with available resources to achieve socio-economic benefits, as they can: i) identify underperforming functional components and take corrective actions rationally; and ii) manage customer satisfaction through efficient inventory management and data analyses."@en ; edm:aggregatedCHO "https://circle.library.ubc.ca/rest/handle/2429/53582?expand=metadata"@en ; skos:note "PERFORMANCE MANAGEMENT FRAMEWORK FOR SMALL TO MEDIUM SIZED WATER UTILITIES: CONCEPTUALIZATION TO DEVELOPMENT AND IMPLEMENTATION by Husnain Haider A THESIS SUBMITTED IN PARTIAL FULFILLMENT OF THE REQUIREMENTS FOR THE DEGREE OF DOCTOR OF PHILOSOPHY in THE COLLEGE OF GRADUATE STUDIES (Civil Engineering) THE UNIVERSITY OF BRITISH COLUMBIA (Okanagan) May 2015 © Husnain Haider, 2015 Abstract To ensure a safe and secure water supply, water utilities face the challenges of climate change, socio-economic viability, and a rapid rate of environmental degradation. 
The core of the water utility business is managing assets and services, which can be divided into functional components such as water resource management and environmental stewardship, operational practices, personnel productivity, physical infrastructure, customer service, public health security, socio-economic issues, and financial viability. For a water utility to be sustainable, the major impetus is to enhance the performance efficiency and effectiveness of these functional components to ensure a high level of customer satisfaction. With limited human and financial resources, small and medium-sized water utilities (SMWU) face even greater challenges in performance enhancement. To date, the participation of Canadian SMWU in the National Water and Wastewater Benchmarking Initiative (NWWBI) has been almost negligible. Consequently, such SMWU manage their functional components without knowing whether they are meeting their primary performance objectives. Hence, there is an urgent need for a comprehensive framework for adopting performance management in SMWU. In this research, an integrated performance management framework consisting of five models has been developed. The overall framework begins with the identification of performance indicators (PIs) based on a critical review, followed by a model using multicriteria decision analysis for the selection of PIs encompassing all the functional components. These PIs are then evaluated through an inter-utility performance benchmarking model (IU-PBM), which efficiently deals with the existing data limitations in SMWU. Based on the IU-PBM results, an intra-utility performance management model (In-UPM) has been developed to improve the performance of sub-components and different water supply systems within the utility for decision making under uncertainty. Finally, a risk-based model has been developed to improve customer satisfaction in SMWU. 
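The five-model pipeline described above ends in an aggregation step that rolls many weighted PIs into a single score per utility; the thesis names TOPSIS as one such method. As a minimal, hypothetical sketch (not the thesis's actual implementation), with all utility data, PI choices, and weights invented for illustration:

```python
import math

def topsis(matrix, weights, benefit):
    """Score each alternative (row) against weighted criteria (columns)
    by closeness to the ideal solution; benefit[j] is True when a larger
    value of criterion j is better."""
    m, n = len(matrix), len(matrix[0])
    # Vector-normalize each criterion column, then apply its weight.
    norms = [math.sqrt(sum(matrix[i][j] ** 2 for i in range(m))) for j in range(n)]
    v = [[weights[j] * matrix[i][j] / norms[j] for j in range(n)] for i in range(m)]
    # Ideal (best) and anti-ideal (worst) value per criterion.
    cols = list(zip(*v))
    best = [max(c) if benefit[j] else min(c) for j, c in enumerate(cols)]
    worst = [min(c) if benefit[j] else max(c) for j, c in enumerate(cols)]
    # Closeness coefficient in [0, 1]: 1 = ideal, 0 = anti-ideal.
    return [math.dist(row, worst) / (math.dist(row, best) + math.dist(row, worst))
            for row in v]

# Hypothetical PI values for three small utilities; columns are
# [real water losses (%), staff per 1000 connections, service coverage (%)].
utilities = [[25.0, 2.1, 88.0],
             [12.0, 3.4, 95.0],
             [18.0, 1.8, 91.0]]
weights = [0.5, 0.2, 0.3]        # assumed criterion weights (sum to 1)
benefit = [False, False, True]   # losses and staffing: lower is better
scores = topsis(utilities, weights, benefit)
```

The closeness score lies in [0, 1], with higher values indicating performance nearer the ideal across all weighted PIs; the thesis pairs such aggregation with Simos'-method weights and fuzzy techniques that this sketch omits.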
This research will help utility managers across Canada, and potentially in other parts of the world, to enhance performance management for SMWU. Utility managers can effectively implement this framework with available resources to achieve socio-economic benefits, as they can: i) identify underperforming functional components and take corrective actions rationally; and ii) manage customer satisfaction through efficient inventory management and data analyses. Preface I, Husnain Haider, conceived and developed all the contents of this thesis under the supervision of Dr. Rehan Sadiq. Dr. Solomon Tesfamariam, the third author of the articles from this research work, reviewed all the manuscripts and provided critical feedback for improving the manuscripts and the thesis. Most of the contents of this thesis have been published in, are under review at, or have been submitted to scientific journals and conferences.
- A version of Chapters 2 and 3 has been published in the NRC Research Press journal Environmental Reviews with the title “Performance Indicators for Small and Medium Sized Water Supply Systems: A Review” (Haider et al. 2014a).
- A version of Chapter 4 has been published in Urban Water Journal with the title “Selecting Performance Indicators for Small to Medium Sized Water Utilities: Multi-criteria Analysis using ELECTRE Method” (Haider et al. 2015a).
- A portion of Chapter 5 has been published in the proceedings of the Canadian Society for Civil Engineering (CSCE) General Conference (2014) with the title “Performance Assessment Framework for Small to Medium Sized Water Utilities – A Case for Okanagan Basin” (Haider et al. 2014b).
- A version of Chapter 5 has been published in the ASCE Journal of Water Resources Planning and Management with the title “Inter-utility Performance Benchmarking Model (IU-PBM) for Small to Medium Sized Water Utilities: Aggregated Performance Indices” (Haider et al. 
2015b).
- A version of Chapter 6 is under review in the Journal of Cleaner Production with the title “Intra-utility Performance Management Model (In-UPM) for the Sustainability of Small to Medium Sized Water Utilities: Conceptualization to Development” (Haider et al. 2015c).
- A version of Chapter 7 is under review in Risk Analysis with the title “Customer Satisfaction Management Framework for Small to Medium Sized Water Utilities: A Risk-based Approach” (Haider et al. 2015d).
- A research article presenting the overall integrated framework developed in this thesis is under review in the Canadian Journal of Civil Engineering with the title “Multilevel Performance Management Framework: A Case of Small to Medium Sized Water Utilities in BC, Canada” (Haider et al. 2015e).
I secured the approval of UBC's Behavioural Research Ethics Board (UBC BREB No. H14-00668) for one of my articles (Haider et al. 2015b). A copy of the approval and the relevant signed proformas are attached in Appendix B.
Table of Contents
Abstract ii
Preface iii
Table of Contents iv
List of Tables xi
List of Figures xiii
List of Abbreviations xvi
List of Symbols xix
Acknowledgements xxi
Chapter 1 Introduction 1
1.1 Background 1
1.2 Research Motivation 2
1.2.1 Performance Management 3
1.2.2 Decision Making at Strategic, Tactical, and Operational Levels 4
1.3 Research Gap 5
1.4 Objectives 6
1.5 Thesis Structure and Organization 6
1.6 Proposed Framework 8
1.6.1 Terminology Adopted in Research 8
1.6.2 Identification of Potential PIs for SMWU 8
1.6.3 Selection of Performance Indicators 10
1.6.4 Inter-utility Performance Benchmarking 10
1.6.5 Intra-utility Performance Management for SMWU 10
1.6.6 Managing Customer Satisfaction in SMWU 11
Chapter 2 Literature Review 12
2.1 Background 12
2.2 Classification of Water Supply Systems 13
2.3 Size-Based Classification of Water Supply Systems 14
2.3.1 Difference between L-WSS and SM-WSS 17
2.3.2 USEPA Drinking Water Infrastructure Needs Surveys – Case Study 17
2.3.3 Source Water 18
2.3.4 System Age and Pipe Size 18
2.3.5 Public Health Risk 20
2.3.6 Operation and Management Systems 21
2.3.7 Physical, Water Quality, and Environmental Sustainability 21
2.4 Literature Review of Performance Indicator Systems for Water Utilities 23
2.4.1 Terminology and Historical Background of Performance Indicators 23
2.4.2 IWA Manual of Best Practice 25
2.4.3 The World Bank 27
2.4.4 National Water Commission (NWC), Australia 29
2.4.5 American Water Works Association (AWWA) 30
2.4.6 Office of the Water Services (OFWAT) – UK and Wales 32
2.4.7 National Research Council (NRC) – Canada 33
2.4.8 Asian Development Bank (ADB) 35
2.4.9 Canadian Standards Association (CSA) 37
2.5 Evaluation of Performance Indicator Systems 38
2.6 Performance Assessment of Water Utilities with Limited Resources – Some Case Studies 42
2.6.1 South Asia - Bangladesh, India, and Pakistan 42
2.6.2 Eastern Europe, Caucasus and Central Asian (EECCA) Countries - Armenia 43
2.6.3 Arab Countries 44
2.6.4 Africa - Malawi and 134 Water Utilities 44
2.7 Selection of Performance Indicators 46
2.8 Performance Benchmarking for Water Utilities 47
2.9 Customer Satisfaction Assessment 49
Chapter 3 Identification of Suitable Performance Indicators 51
3.1 Background 51
3.2 Categorization of Performance Indicators for SMWU 52
3.2.1 Water Resources and Environmental Indicators 52
3.2.2 Personnel/Staffing Indicators 55
3.2.3 Physical Assets Indicators 56
3.2.4 Operational Indicators 57
3.2.5 Water Quality and Public Health Indicators 62
3.2.6 Quality of Service Indicators 64
3.2.7 Financial and Economic Indicators 66
3.3 Summary 68
Chapter 4 Selection of Performance Indicators 69
4.1 Background 69
4.2 Modeling Approach 70
4.2.1 Criteria for Selection of PIs and Ranking System 71
4.2.2 Multicriteria Decision Analysis (MCDA) 74
4.2.2.1 Analytic Hierarchy Process 74
4.2.2.2 Elimination and Choice Translating Reality 76
4.3 Application of MCDA – An Example of Water Resources and Environmental PIs 80
4.3.1 Estimation of Criteria Weights using AHP 80
4.3.2 Development of Outranking Relationships using ELECTRE 81
4.4 Development of Indicators 87
4.4.1 Water Resources and Environment (WE) Indicators 87
4.4.2 Personnel/Staffing (PE) Indicators 89
4.4.3 Physical (PH) Indicators 91
4.4.4 Operational (OP) Indicators 92
4.4.5 Water Quality and Public Health Indicators 95
4.4.6 Quality of Service (QS) Indicators 97
4.4.7 Financial and Economic (FE) Indicators 98
4.5 Final Ranking of Selected Indicators 100
4.6 Summary 103
Chapter 5 Inter-utility Performance Benchmarking Model (IU-PBM) 104
5.1 Background 104
5.2 Approach and Methodology 107
5.2.1 Performance Benchmarking Modeling Approach 107
5.2.2 Benchmarking Transformation Functions 107
5.2.3 Performance Aggregation Indices 109
5.2.4 Simos' Method for Estimating the Weights of PIs 109
5.2.5 Aggregating Performance Indicators using TOPSIS Method 109
5.3 Development of Benchmarking Transformation Functions 112
5.3.1 Water Resources and Environmental Sustainability 117
5.3.2 Personnel Adequacy 117
5.3.3 Physical Assets Efficacy 118
5.3.4 Operational Integrity 119
5.3.5 Water Quality and Public Health Safety 120
5.3.6 Quality of Service Reliability 121
5.3.7 Economic and Financial Stability 121
5.4 A Case Study of the Okanagan Basin 122
5.5 Summary 131
Chapter 6 Intra-utility Performance Management Model (In-UPM) 132
6.1 Background 132
6.2 Establishing Performance Assessment Criteria 133
6.2.1 Water Resources and Environmental Sustainability 134
6.2.2 Personnel Productivity 138
6.2.3 Physical Assets Efficacy 139
6.2.4 Operational Integrity 139
6.2.5 Provision of Safe Drinking Water 140
6.2.6 Quality of Service 142
6.2.7 Economic and Financial Viability 143
6.3 Modeling Approach 144
6.4 A Case Study of the Okanagan Basin 155
6.4.1 Okanagan Basin 155
6.4.2 Analysis and Results 156
6.4.3 Sensitivity Analysis 159
6.4.4 Performance Management using In-UPM 162
6.5 In-UPM for Complex Decision Making 163
6.6 Summary 165
Chapter 7 A Risk-based Customer Satisfaction Management Model 167
7.1 Background 167
7.2 Risk Assessment 168
7.2.1 Root Cause Analysis (RCA) 168
7.2.2 Failure Mode Effect Analysis (FMEA) 169
7.2.3 Fuzzy-FMEA 169
7.3 Modeling Approach 171
7.4 Okanagan Case Study 175
7.4.1 Study Area 175
7.4.2 Baseline Data Collection 175
7.4.3 Risk Identification 176
7.4.4 Root Cause Analysis 176
7.4.5 Fuzzy-FMEA 179
7.4.6 Risk Prioritization 179
7.4.7 Customer Satisfaction Management 184
7.4.8 Discussions 187
7.5 Summary 191
Chapter 8 Conclusions and Recommendations 192
8.1 Summary and Conclusions 192
8.2 Originality and Contribution 196
8.3 Limitations and Recommendations 196
References 198
Appendices 214
Appendix A: Summary of Performance Indicator Systems 214
Appendix A-1: IWA manual of performance indicators for water supply services (IWA 2006) 214
Appendix A-2: IBNET system of performance indicators (World Bank 2011) 215
Appendix A-3: Performance indicators developed by NWC, Australia (NWC 2012) 216
Appendix A-4: AWWA system of performance indicators (AWWA 2004) 217
Appendix A-5: OFWAT system of performance indicators (OFWAT 2012) 218
Appendix A-6: Performance indicator system proposed by NRC, Canada (NRC 2010) 219
Appendix A-7: Performance indicator system proposed by ADB (ADB 2012) 220
Appendix A-8: Performance indicator system proposed by ISO (CSA 2010) 221
Appendix B: Ethics Approval Certificate and Forms 222
Appendix B-1: Ethics approval certificate 222
Appendix B-2: Ranking proforma signed by Utility-I 223
Appendix B-3: Ranking proforma signed by Utility-II 226
Appendix B-4: Ranking proforma signed by Utility-III 229
Appendix B-5: Ranking proforma signed by Expert-I 232
Appendix B-6: Ranking proforma signed by Expert-II 235
Appendix B-7: Ranking proforma signed by Expert-III 238
Appendix C: Fuzzy Rules for Intra-Utility Performance Management Model (In-UPM) 241
Appendix C-1: Matrix defining fuzzy rules for ‘water resources and environmental sustainability’ 241
Appendix C-2: Matrix defining fuzzy rules for ‘environmental protection’ 242
Appendix C-3: Matrix defining fuzzy rules for ‘impact of flushing water’ 242
Appendix C-4: Matrix defining fuzzy rules for ‘water resources sustainability’ 242
Appendix C-5: Matrix defining fuzzy rules for ‘water resources management’ 243
Appendix C-6: Matrix defining fuzzy rules for ‘restrictions, conservation and management’ 244
Appendix C-7: Matrix defining fuzzy rules for the ‘catchment and treatment employees’ 245
Appendix C-8: Matrix defining fuzzy rules for the ‘metering and distribution employees’ 245
Appendix C-9: Matrix defining fuzzy rules for the ‘loss due to field accidents’ 245
Appendix C-10: Matrix defining fuzzy rules for the ‘personnel healthiness’ 246
Appendix C-11: Matrix defining fuzzy rules for the ‘overtime culture’ 246
Appendix C-12: Matrix defining fuzzy rules for the ‘productivity ratio’ 246
Appendix C-13: Matrix defining fuzzy rules for the ‘working environment efficacy’ 247
Appendix C-14: Matrix defining fuzzy rules for the ‘personnel adequacy’ 248
Appendix C-15: Matrix defining fuzzy rules for the ‘personnel health and safety’ 249
Appendix C-16: Matrix defining fuzzy rules for the ‘personnel productivity’ 250
Appendix C-17: Matrix defining fuzzy rules for the ‘rehabilitation and replacement of pipes’ 251
Appendix C-18: Matrix defining fuzzy rules for the ‘distribution system maintenance’ 252
Appendix C-19: Matrix defining fuzzy rules for the ‘delivery point maintenance’ 253
Appendix C-20: Matrix defining fuzzy rules for the ‘inspection and cleaning routine’ 253
Appendix C-21: Matrix defining fuzzy rules for the ‘distribution system integrity’ 254
Appendix C-22: Matrix defining fuzzy rules for the ‘distribution system failure’ 255
Appendix C-23: Matrix defining fuzzy rules for the ‘distribution system performance’ 255
Appendix C-24: Matrix defining fuzzy rules for the ‘distribution network productivity’ 255
Appendix C-25: Matrix defining fuzzy rules for the ‘operational integrity’ 256
Appendix C-26: Matrix defining fuzzy rules for the ‘public health safety’ 257
Appendix C-27: Matrix defining fuzzy rules for the ‘distribution systems water quality’ 258
Appendix C-28: Matrix defining fuzzy rules for the ‘treatment systems water quality’ 259
Appendix C-29: Matrix defining fuzzy rules for the ‘water quality compliance’ 260
Appendix C-30: Matrix defining fuzzy rules for the ‘water quality and public health safety’ 260
Appendix C-31: Matrix defining fuzzy rules for the ‘treated water storage capacity’ 260
Appendix C-32: Matrix defining fuzzy rules for the ‘storage capacity’ 261
Appendix C-33: Matrix defining fuzzy rules for the ‘storage and treatment systems capacity’ 261
Appendix C-34: Matrix defining fuzzy rules for the ‘monitoring systems integrity’ 261
Appendix C-35: Matrix defining fuzzy rules for the ‘physical systems efficacy’ 262
Appendix C-36: Matrix defining fuzzy rules for the ‘customers’ information level’ 262
Appendix C-37: Matrix defining fuzzy rules for the ‘water quality adequacy’ 263
Appendix C-38: Matrix defining fuzzy rules for the ‘customer service reliability’ 264
Appendix C-39: Matrix defining fuzzy rules for the ‘response to complaints’ 264
Appendix C-40: Matrix defining fuzzy rules for the ‘complaints related to system integrity’ 265
Appendix C-41: Matrix defining fuzzy rules for the ‘customer satisfaction level’ 266
Appendix C-42: Matrix defining fuzzy rules for the ‘service reliability and customer satisfaction’ 267
Appendix C-43: Matrix defining fuzzy rules for the ‘customer water affordability’ 267
Appendix C-44: Matrix defining fuzzy rules for the ‘operation and maintenance cost sustainability’ 267
Appendix C-45: Matrix defining fuzzy rules for the ‘economic stability’ 268
Appendix C-46: Matrix defining fuzzy rules for the ‘revenue collection efficacy’ 268
Appendix C-47: Matrix defining fuzzy rules for the ‘operational cost sustainability’ 268
Appendix C-48: Matrix defining fuzzy rules for the ‘economic and financial viability’ 269
List of Tables
Table 2.1 Various bases of size-based classification of water utilities 15
Table 2.2 Number of water supply performance indicators under different categories by various agencies 40
Table 2.3 Evaluation of different performance assessment systems for their applicability to SMWU 43
Table 2.4 Checklist of key performance indicators used in developing countries 45
Table 3.1 Proposed water resources and environmental indicators 53
Table 3.2 Proposed personnel/staffing indicators 56
Table 3.3 Proposed physical/asset indicators 57
Table 3.4 Proposed operational indicators 58
Table 3.5 Proposed water quality and public health indicators 63
Table 3.6 Proposed quality of service indicators 65
Table 3.7 Proposed financial/economic indicators 67
Table 4.1 Selected PIs through initial screening 73
Table 4.2 Scoring system and definition of criteria 74
Table 4.3 Evaluation scale used in pairwise comparison 75
Table 4.4 Random indices established by Saaty (1980) 75
Table 4.5 Pairwise comparison matrix for weight estimation using AHP 80
Table 4.6 Normalized comparison matrix for weight estimation using AHP 81
Table 4.7 The scoring matrix along with criteria weights 82
Table 4.8 The normalized weighted matrix 85
Table 4.9 Concordance and discordance interval sets for performance indicators in WE category 85
Table 4.10 Net outranking of selected indicators 87
Table 4.11 Final ranking of selected PIs under DMB for SMWU 101
Table 5.1 Benchmarking transformation functions developed for performance benchmarking for SMWU 
113 Table 5.2 Weight estimation using Simos’ Method ....................................................................... 125 Table 5.3 Description of performance levels with proposed actions ............................................. 130 Table 6.1 Performance objectives, performance measures (PMs), performance indicators (PIs), and data variables ........................................................... 135 Table 6.2 Universe of discourse (UOD) of performance indicators .............................................. 152 xii Table 7.1 Linguistically defined probability of occurrence (P), consequence (C), and detectability (D) .......................................................................... 180 Table 7.2 Results of fuzzy-FMEA ................................................................................................. 181 Table 7.3 Priority levels and RPN range ........................................................................................ 183 Table 7.4 Proposed mitigation actions based on risk assessment results ....................................... 185 xiii List of Figures Figure 1.1 Scope of Asset Management (modified from IIMM 2006) ............................................. 5 Figure 1.2 Thesis structure and organization .................................................................................... 7 Figure 1.3 Proposed research framework for performance management of SMWU………………………………………………………………………………….9 Figure 2.1 Framework showing approach adopted for critical review of performance indicators systems and performance assessment frameworks ........................................................ 14 Figure 2.2 Schematic schemes of water supply systems based on different types of sources; a) Fresh surface water source; b) Fresh groundwater sources; c) Saline groundwater source; d) Saline surface water source .......................................................................... 
16 Figure 2.3 Total 20-year financial need by system size and project/ component type (in billions of January 2007 dollars) (developed using USEPA 2009 data) ........................................ 19 Figure 2.4 Variation in 20-year distribution system investment needs by system size from 1995-2007, according to USEPA infrastructure needs survey and assessment for United States (developed using USEPA 1997, 2001, 2005, 2009 data).................................... 19 Figure 2.5 Percent of pipe system by age class in different system sizes (developed using AWWA 2007data) ....................................................................................................................... 20 Figure 2.6 Average miles of pipe per system by diameter in different system sizes (developed using AWWA 2007 data) .............................................................................................. 21 Figure 2.7 The concept of the Faria and Alegre (1996) level of service framework ...................... 25 Figure 2.8 Concept of layer pyramid used by Algere et al. (2000) for calculating performance indicators ....................................................................................................................... 26 Figure 2.9 An example of interaction between various indicators and sub-indicators in the NWC (2012) performance assessment framework “A partial reproduction of Fig 1 Interrelationship between NFP indicators, p. 26 of National Performance Framework: 2010-11 urban performance reporting indicators and definitions handbook” ............... 31 Figure 2.10 ADB (2012) Project Design and Monitoring framework (DMF) ................................. 36 Figure 2.11 Content and application of the ISO (2007).................................................................... 
38 Figure 3.1 Proposed system of PIs to start, proceed and improve the performance evaluation mechanism in SMWU ................................................................................................... 52 Figure 3.2 Components of water balance for calculation of water losses in water distribution defined by Farley and Trow (2003) .............................. Error! Bookmark not defined. Figure 3.3 The four basic methods of managing real losses (Source: Lambert et al. (1999) ......... 61 xiv Figure 4.1 Distribution of PIs in different categories by various agencies, a graphical representation of Table 2.2 ............................................................................................ 70 Figure 4.2 Modeling approach for selection of PIs for SMWU ..................................................... 72 Figure 4.3 Outranking relations of water resources and environmental PIs showing DMB .......... 80 Figure 4.4 Outranking relations of personnel PIs showing DMB .................................................. 90 Figure 4.5 Outranking relations of physical PIs showing DMB ..................................................... 92 Figure 4.6 Outranking relations of operational PIs showing DMB ................................................ 94 Figure 4.7 Outranking relations of water quality and public health PIs showing DMB................. 96 Figure 4.8 Outranking relations of quality of service PIs showing DMB ...................................... 98 Figure 4.9 Outranking relations of financial and economic PIs showing DMB ............................. 99 Figure 4.10 Net Concordance (C) and discordance (D) indexes for all seven categories of PIs; (a) Water resources and environment; (b) Personnel; (c) Physical; (d) Operational; (e) Water quality and Public Health; (f) Quality of Service; (g) Financial and Economic ........................................................................................ 
102 Figure 4.11 An example of cognitive map for estimation of water resources and environmental sustainability index ...................................................................................................... 103 Figure 5.1 Graphical description of Equation [5.1] showing misleading calculation of performance score ............................................................................................................................ 106 Figure 5.2 Relative performance of water utilities in terms of performance gap between the calculated PI values and benchmarks using performance score .................................. 106 Figure 5.3 Modeling approach of performance benchmarking model for SMWU ...................... 108 Figure 5.4 Examples of performance benchmarking relationships (a) per capita water consumption of residential consumers (WE2), a water resources indicator, (b) percentage of service connection repairs in a year (OP5), an operational indicator ...................................... 116 Figure 5.5 Aggregated performance indices for all the functional components, (a) performance indices of Utility A, (b) performance indices of Utility B........................................... 129 Figure 6.1 Methodology for performance management of SMWU ............................................. 145 Figure 6.2 A conceptual hierarchical structure for performance assessment of SMWU - An example of functional component of water resources and environmental sustainability........................................................................................ 146 Figure 6.3 Standard trapezoidal membership functions used in this study; e.g., b1, b2, b3 and b4 are used to define the ranges of fuzzy numbers for 'Medium' ..................................... 
151 Figure 6.4 Reported pressure, water quality and service connection complaints FY 2012 for different WSSs in the utility under study .................................................................... 157 xv Figure 6.5 In-UPM results for the utility for assessment year 2012 ............................................. 158 Figure 6.6 Results of primary and secondary level PMs for ‘quality of service’ component....... 158 Figure 6.7 Sensitivity analysis results for all the functional components ..................................... 161 Figure 6.8 Secondary level performance measures for service reliability and customer satisfaction in three systems within the water utility ...................................................................... 163 Figure 6.9 In-UPM results showing overall performance of the utility for year 2014 after the implementation of improvement action....................................................................... 164 Figure 7.1 Standard trapezoidal membership functions used in this study ................................... 170 Figure 7.2 Chen (1985) defuzzification method for trapezoidal fuzzy numbers .......................... 171 Figure 7.3 Risk-based modeling approach for assessment of customer satisfaction .................... 172 Figure 7.4 A vignette of customers’ complaints with respect to their causes ............................... 177 Figure 7.5 Root cause analysis (RCA) for customer complaints in SMWU ................................ 178 Figure 7.6 Number of failure modes with risk priority levels ...................................................... 184 Figure 7.7 Risk clustering for customers’ satisfaction assessment for the existing situation to take Action 1&2 ....................................................................................... 
185 Figure 7.8 Results of risk mitigation, A-1: Automation of booster stations, A-2: Source water change, A-3: Implementation of a routine service connections inspection program, A-4: Increasing level of water treatment…………………………………….……………. 186 Figure 7.9 Cumulative risk reduction vs. cumulative cost of mitigation actions.......................... 190 Figure 8.1 Integrated framework for performance management of SMWU ................................ 194 xvi List of Abbreviations ACWUA Arab Countries Water Utilities Association ADB Asian Development Bank AHP Analytical Hierarchical Process APRH Portuguese Association of Water Resources AWWA American Water Works Association AWWARF American Water Works Association Research Foundation BC British Columbia BCMOH British Columbia Ministry of Health MoE Ministry of Environment BTFs Benchmarking Transformation Functions C Consequence CAN Canadian CARL Current Annual Real Losses CCW Consumer Council of Water CI Consistency Index CR Consistency Ratio CSA Canadian Standards Association CS Customer Satisfaction CWW Columbus Water Works CWWA Canadian Water and Wastewater Association D Detectability DBPs Disinfection By Products DM Decision Maker DMB Decision Makers Boundary DMF Design and Monitoring Framework ELECTRE Elimination and Choice Translating Reality EPA Environmental Protection Agency FCM Federation of Canadian Municipalities FCs Fecal coliforms FDS Financial Debt Service FE Financial and Economic FMEA Failure Mode Effect Analysis FMs Failure Modes FRBM Fuzzy Rule Based Modeling FTEs Full Time Employees GHG Green House Gases GIS Geographic Information System HO Home Owner IBNET International Benchmarking Network for Water and Sanitation Utilities IIMM International Infrastructure Management Manual ILI Infrastructure Leakage Index In-UPM Intra-utility Performance Management Model IU-PBM Inter-utility Performance Benchmarking Model IWA International Water Association IWSA International Water Services Association KPIs Key 
Performance Indicators KWA Kiwa Water Research KWH Kilowatt Hour xvii LOS Level of Service L-WSS Large Water Supply Systems LWU Large water utilities MCDA Multicriteria Decision Analysis MISO Multi Input Single Output MHLS Ministry of Healthy Living and Sport MGD Million Gallons per Day MRR Maintenance, Rehabilitation and Renewal NCEL National Civil Engineering Laboratory NCR National Research Council NGO Non-governmental Organization NIS Negative-ideal Solution NRW Non-revenue Water NTU Nephelometric Turbidity Units NWC National Water Commission NWWBI National Water and Wastewater Benchmarking Initiative NWWBI-PR National Water and Wastewater Benchmarking Initiative Public Report OFWAT Office of Water Services OP Operational OPM Office of Public Management New South Wales O&M Operation and Maintenance P Probability of Occurrence PE Personnel PH Physical PIs Performance Indicators PIS Positive-ideal Solution PMs Performance Measures PPMS Project Performance Management System PRD Preservation, Renewal, and Decommissioning PRV Pressure release valve PSAB Public Sector Accounting Board QS Quality of Service QUU Queensland urban utilities RCA Root Cause Analysis RI Random Index RPN Risk Probability Number SA-WRC South Africa Water Research Council SC Service Connection SIM Service Intensive Mechanism SoSI Security of Water Supply Index SISO Single Input Single Output S-WSS Small Water Supply Systems SM-WSS Small to Medium Sized Water Supply Systems SMWU Small to Medium Sized Water Utilities THMs Trihalomethanes UARL Unavoidable Annual Real Losses UFW Unaccounted for Water UOD Universe of Discourse USEPA United National Environmental Protection Agency WB World Bank WDS Water Distribution System xviii WE Water Resources and Environment WHO World Health Organization WOP Water Operators Partnership Program WP Water Quality and Public Health WQ Water Quality WSP Water and Sanitation Program WSS Water supply system WSSC Washington Suburban Sanitary Commission WSSs Water 
Supply Systems xix List of Symbols A Pairwise comparison matrix (ELECTRE) A Linguistic constant (FRBM) Ap First alternative in a pair of alternatives Aq Second alternative in a pair of alternatives B Fuzzy output B’ Crisp output of B Bj Fuzzy subset C Quantitative composite measure C(p,q) Concordance set D(p,q) Discordance set C Average values of all Cpq CL Composite measure minimum CH Composite measure maximum D Average values of all Dpq HS Satisfaction maximum i Component j Child J1 Set of benefit attributes J2 Set of cost attributes k Parent l Generation Lmin Minimum possible RPN value Lmax Maximum possible RPN value LS Satisfaction minimum m Previous generation n number of criteria (ELECTRE) P Performance of a factor Pi* Performance index of each functional component rij Normalized criteria values R Fuzzy rule set Ri Rule number Rij Normalization matrix S Satisfaction score achieved from a qualitative customer survey UT(x) Defuzzified value of risk factor vpj Weighting normalized rating vij Weighted value of each performance score (TOPSIS) xx Vij Normalized weighted matrix WS Survey weighting w Criteria weights wij Corresponding weight of the indicator (TOPSIS) WC Quantitative composite measure weighting xij Assigned value to the degree of alternative Ai with respect to the criteria Xj xij Performance score (TOPSIS) X Criteria X Linguistic input (antecedent) variable (FRBM) X* Positive-ideal solution X- Negative-ideal solution Y Output (consequent) variable (FRBM) Yi* Distance from X* for each performance indicator Yi- Distance from X- for each performance indicator µa(x) Membership function µBk(y) Output fuzzy set λ Eigenvalue xxi Acknowledgements All praises be to the ALLAH Almighty who gave me the strength to materialize this research in the present form. I would like to express my sincere appreciation and thanks to my advisor Professor Dr. Rehan Sadiq - you have been a tremendous mentor for me during this journey. Dr. 
Sadiq is a very patient person, a high-quality researcher, and a dedicated teacher. I thank him for the encouragement that allowed me to grow as a researcher. Your advice throughout my studies, both on research and on my career, was priceless. I would like to thank Dr. Solomon Tesfamariam for his valuable guidance and critical comments throughout my research. I also thank Dr. Cigdem Eskicioglu for her suggestions for improving my work. I would also like to thank the staff of the various water utilities in the Okanagan Basin who helped me collect baseline data and shared their experiences with me. This research would have been impossible without the financial support of the Natural Sciences and Engineering Research Council Collaborative Research and Development (NSERC CRD). I also acknowledge the University of British Columbia for providing the International Partial Tuition Scholarship and a University Graduate Fellowship to cover part of my tuition fees. A special thanks to my family. Words cannot express how grateful I am to my mother, father, mother-in-law, father-in-law, my brothers (Ali and Hassan), my sister (Ayesha), and my nephews (Ahmed, Fatima, Ismail, and Umer) for the sacrifices that you have made on my behalf. Your prayers helped me get this far. I would also like to thank all of my friends who supported and encouraged me to strive towards my goal. Finally, I want to express my appreciation to my beloved wife, Nazish Haider, who always supported me in the moments when there was no one to answer my queries. Nazish, without your love and support, it would have been impossible to complete this work. Appreciation is also due to the faculty and staff at the University of British Columbia who supported me on several occasions during my studies. I would also like to thank my friends and colleagues Golam Kabir, Hassan Iqbal, Gizachew Demmissi, and Gyan Kumar Sharma, who were always there for me.
I would also like to thank Amanda Brobbel, who kindly guided me in thesis writing.

Dedication

Lovingly dedicated to my mother, Mrs. Saeeda Ziai.

Chapter 1 Introduction

1.1 Background

Access to safe drinking water, in sufficient quantity and at an affordable cost, is a basic human right, irrespective of the geographical location and size of the community (WHO 2012). Like all other infrastructure systems, water supply systems face a number of unique challenges in the 21st century, such as rapid population growth, uncertain climate, socio-environmental issues, limited water resources, and ongoing economic crises (Berg and Danilenko 2011). Water utilities need to perform proficiently in order to face these challenges, manage their assets, and increase customer satisfaction. These utilities consist of different functional components, such as water resource management and environmental stewardship, operational practices, personnel productivity, physical infrastructure, customer service, water quality and public health safety, socio-economic issues, and financial viability. Each of these components may consist of several sub-components; e.g., the component of personnel productivity may include staff adequacy, health and safety, working environment, etc. Moreover, a water utility may consist of one or more water supply systems (WSSs), each with its own geographical characteristics. A utility will attain its sustainability objectives only when all of its WSSs, functional components, and sub-components are performing efficiently. Water supply systems can be divided into vertical and linear components. The vertical components consist of treatment plants, pumping stations, and storage facilities, whereas the linear components are transmission mains and distribution system pipelines. Generally, the linear components are more expensive, and their value can be 60 to 80% of the overall cost of a WSS (Stone et al. 2002).
All these components may face a number of problems associated with continuous aging, including low pressure, water loss, and water quality deterioration (Alegre 1999; Alegre et al. 2006). As per the Canadian Infrastructure Report Card (2012), 15.4% of water mains are in fair to very poor condition. Moreover, 14.4% of vertical assets were found to be in fair to very poor condition. The estimated replacement cost of this infrastructure in Canada is CAN$25.9 billion (i.e., CAN$2,082 per household) (CIRC 2012). In view of the existing infrastructure condition and investment needs, the Civil Infrastructure Systems Technology Road Map 2003-2013 requested that the federal, provincial, territorial, and municipal governments and industry partners allocate funds to infrastructure research and development (CCPPP 2003). Alegre and Coelho (2012) define asset management for urban water utilities as "the set of processes that utilities need to have in place in order to ensure the performance of the asset in line with the service targets over time, that risks are adequately managed, and that the corresponding costs, in a lifetime cost perspective, are as low as possible". The first step towards effective asset management is assessing the performance of the above-stated components of a water utility. Subsequently, based on the performance assessment results, utility management can establish a desirable level of service (LOS) with acceptable risk and can develop future financial plans. Even smaller utilities can adopt sustainable asset management strategies to enhance their effective service life (Brown 2004). About 95% of the water supply systems in Canada are operated by small and medium sized water utilities (SMWU) serving populations of less than 50,000 (Statistics Canada 2009). The National Water and Wastewater Benchmarking Initiative (NWWBI) in Canada was established in 1997.
As per the most recent public report, published in 2013 (stating the performance of water, wastewater, and storm water utilities for FY 2011), wastewater and water utilities have been participating in the NWWBI since 2003 and 2005, respectively. However, most of them are large water utilities (LWU) with populations of more than 50,000, i.e., 50% of Canadian utilities, covering more than 60% of the population. So far, the participation of SMWU in the NWWBI has been almost negligible. The possible reasons seem to be: i) no well-structured performance assessment framework is available for such utilities that is comprehensive yet simple enough to be implemented under the given technical and financial constraints; and ii) owing to lower economies of scale, SMWU may avoid participating alongside large utilities, as such comparisons may make their performance appear deficient. Therefore, SMWU in Canada are managing their assets without knowing whether or not they are meeting their primary performance objectives.

1.2 Research Motivation

Over the last several years, water utilities have been encouraged to manage their assets effectively in response to several emerging factors, such as increasing numbers of customers and rising expectations, greater awareness of water resources conservation, emerging environmental impacts, climate change issues, lack of trained personnel, increasing energy and regulatory requirements, and increasing financial stresses. Based on Public Sector Accounting Board (PSAB) Canada guidelines, applicable from 1 January 2009, all municipalities should account for their tangible capital assets and amortize them in their financial statements regularly (Zoratti 2009). Subsequently, based on the remaining service life and existing condition of assets, maintenance, rehabilitation, and renewal (MRR) strategies for linear and vertical assets can be planned. Several studies have been conducted in the past on risk-based MRR for physical assets of WSSs (Francisque et al. 2014; Liu et al. 2010; Lounis et al.
2010; Loganathan et al. 2001). However, the primary objective of a water utility is much broader than just meeting its physical infrastructure needs. Comprehensive performance management can help a utility achieve its overall sustainability objectives, such as: i) optimization of human and financial resources; ii) conservation of water resources; iii) protection of the environment; iv) provision of a safe and productive working environment for personnel; v) protection of public health; vi) provision of safe drinking water for the community; and vii) earning customers' confidence through efficient operations and responses to their complaints. Therefore, the overall goal of this research is to develop a comprehensive and practical performance management framework for SMWU.

1.2.1 Performance Management

The performance improvement process in any water utility begins with effective performance assessment, i.e., comparing the utility's performance with that of other similar utilities (in size and geographical location) and with the standards established by various regulatory agencies (Marques and Witte 2010; Alegre et al. 2006). Various agencies around the world have developed systems for inter-utility performance assessment (benchmarking) based on performance indicators (PIs) (Coelho 1997; Alegre et al. 2006; Berg and Danilenko 2011; NWC 2012; AWWA 2004; OFWAT 2012; NRC 2010; ADB 2012). In general, larger utilities are older than SMWU and contain much larger and more expensive physical infrastructure, e.g., water mains, treatment plants, etc. They also have to satisfy a large number of concerned and responsive customers. The performance-related issues of larger utilities (e.g., extensive energy requirements, widespread environmental impacts, large pipe bursts, and loss of amenities during vandalism) were recognized decades ago (Stone et al. 2002). Consequently, most of these performance assessment systems were primarily developed for LWU.
SMWU differ from large utilities in several ways, including but not limited to: less efficient management information systems; fewer people served through smaller pipelines; less involvement in management activities; and less efficient working practices and utilization of technical, human, and financial resources (Ford et al. 2005; Braden and Mankin 2004; Brown 2004; Hamilton et al. 2004). Consequently, SMWU are facing several technical, socio-economic, and environmental challenges in meeting regulatory guidelines (Dziegielewski and Bik 2004). For instance, according to Water Canada (2013), water utilities in British Columbia have gone through the highest number of boil water advisories compared to other provinces, and most of them are SMWU with populations of less than 50,000. Interior Health Canada (2013) has reported various reasons for these advisories, such as source water contamination, flushing of hydrants, construction, repair and maintenance works, equipment failure, and inadequate treatment. SMWU have some advantages over LWU; for example, they have smaller, relatively newer, and less complex physical structures, and their simple organizational structures provide more opportunity for change management. Also, SMWU have fewer impacts on natural systems due to smaller withdrawals, and produce lower carbon emissions. Nevertheless, SMWU cannot adopt the above-mentioned systems of PIs as such, owing to data limitations and the intricacies of the existing systems. According to the European project COST Action C18 (Performance Assessment of Urban Infrastructure Services), there is an urgent need for comprehensive research to improve performance management in SMWU (Alegre 2010).

1.2.2 Decision Making at Strategic, Tactical, and Operational Levels

The scope of asset management shown in Figure 1.1 consists of three levels, i.e., strategic, tactical, and operational (Alegre and Coelho 2012).
Strategic level planning involves a review of the functional environment to ensure that all elements, such as corporate, community, environmental, financial, legislative, institutional, and regulatory factors, are appropriately considered in asset management. It further includes a clear statement of strategic objectives, policies, anticipated outcomes, and risk management plans (Haider 2007). Tactical planning is the application of detailed asset management procedures and standards in a cost-effective way to achieve strategic goals by meeting desired service levels. Moreover, the objectives related to technical and customer service standards and financial requirements are also set at this level (Foster et al. 2000). Asset management information systems, including condition and performance evaluation, capacity availability, and lifecycle costs, are also managed at this level (Karababas and Cather 1994). Operational plans at Level 3 include controls to confirm delivery of the asset management policies, strategies, legal requirements, objectives, and plans developed at Levels 1 and 2. Activities at this level include condition monitoring, process control, staff training, communication with stakeholders, information and data control, and emergency response (IIMM 2006). In SMWU, senior management is involved in both strategic and tactical level decision making simultaneously; similarly, technical management is responsible for operational level decisions along with its primary responsibility of tactical level decision making. Practically, from an engineering management point of view, the tactical level is the most critical one in overall planning, and thus is primarily addressed in this research.

Figure 1.1 Scope of Asset Management (modified from IIMM 2006)

1.3 Research Gap

In the existing NWWBI public report, the calculated values of different PIs are simply compared with the minimum, average, and maximum values of the participating utilities (i.e., LWU).
Such simple comparison of individual PIs does not provide information about the overall performance of a water utility. Secondly, all these benchmarks are available for larger utilities; owing to the inherently lower economies of scale in SMWU, applying these benchmarks for inter-utility benchmarking of SMWU requires extensive effort. The benchmarking process needs to be practical (i.e., based on measurable PIs), besides being comprehensive enough to cover all the functional components. When one or more functional components are underperforming, tactical level decision making can be improved by honing in on the sub-components. Such analyses need to be performed at the intra-utility level to evaluate the performance of the different WSSs operating within a utility. Moreover, research gaps exist in terms of addressing specific performance related issues (at the component level) in SMWU. For instance, customer satisfaction is a primary objective of a water utility providing reliable services, yet existing methods based on customer interviews might not be practically possible for smaller utilities with limited resources; therefore, the operational personnel strive hard to respond to complaints without any management strategy. As a result, no structured mechanism is available to evaluate the risk of customer dissatisfaction.
A comprehensive assessment of SMWU over their entire lifecycles (i.e., continuous benchmarking), answering the above stated research gaps and followed by effective asset management plans, can help the utilities attain sustainability. Several models, guidelines, and decision support tools have been proposed and developed by various agencies and organizations around the world to serve this purpose. Most of these tools are based on extensive, long-term, and expensive (requiring large human and financial resources) databases, which are presently not available for SMWU. Therefore, a substantial knowledge gap still exists on performance management, particularly for SMWU (NRC 2010, Alegre 2010).

1.4 Objectives

The primary goal of this research is to develop a comprehensive and practically applicable decision support framework for performance management of SMWU operating in Canada and other parts of the world. The specific objectives of this research are to:
1. identify potential performance indicators for SMWU through a screening process and a state-of-the-art literature review,
2. develop a model for the selection of the most suitable PIs for SMWU, based on their applicability, measurability, understandability, and comparability, using multicriteria decision analysis,
3. develop an inter-utility performance benchmarking model for SMWU,
4. develop an intra-utility performance management model for sustainability of SMWU,
5. develop a comprehensive risk based customer satisfaction management model for SMWU, and
6. apply all the models developed in this research to actual case studies for proof-of-concept.

1.5 Thesis Structure and Organization

Figure 1.2 illustrates the organization of the thesis, which contains eight chapters to achieve the above mentioned objectives. Chapter 2 includes a review of the literature on existing frameworks for PIs and performance assessment of water utilities.
In Chapter 3, potential PIs are identified; these are subsequently evaluated against different criteria using multicriteria analysis for the final selection of PIs in Chapter 4. An inter-utility performance benchmarking model is developed in Chapter 5. In Chapter 6, a detailed performance management model is developed for SMWU under uncertainty. The final model, for managing the risk of customer dissatisfaction in SMWU, is developed in Chapter 7. Finally, Chapter 8 contains a summary of the research outcomes and recommendations for future research.

Figure 1.2 Thesis structure and organization

1.6 Proposed Framework

In order to attain the objectives outlined above, a framework is presented in Figure 1.3. As described earlier, a water utility consists of different functional components and sub-components, which need to operate efficiently to attain overall sustainability. If one or more of these components and/or sub-components are underperforming (i.e., not meeting desired benchmarks, criteria, and standards), the utility managers need to improve the performance of the respective component. Moreover, all the WSSs operating within the utility should also perform efficiently.
The main purpose of this framework is to facilitate the management of SMWU in short-term and long-term decision making to achieve: i) the sustainability performance objectives for all the functional components of the utility as a whole and for its WSSs, and ii) customer satisfaction through a reliable quality of service. A brief description of the models developed in this research is given below; the details are provided in the following chapters.

1.6.1 Terminology Adopted in Research

Different models, techniques, and methods are used in this research. It is important to understand the terminology developed for this research in order to appreciate the integrated concept of performance management for SMWU. The term ‘framework’ is used for the holistic approach developed in Figure 1.3. The term ‘model’ is used for the components of this framework in which detailed modeling approaches are used to address the research gaps for performance assessment or management of SMWU, e.g., the model for selection of PIs, the intra-utility performance management model, etc. The term ‘method’ is used for the compensatory and non-compensatory MCDA methods used in this research. The term ‘technique’ is used for applied mathematical procedures, such as fuzzy set theory, failure mode and effect analysis, sensitivity analysis, etc.

1.6.2 Identification of Potential PIs for SMWU

Although a breadth of literature is available on performance indicators, performance benchmarking, and performance assessment for water utilities, a gap still exists with respect to the specific interests of SMWU. In Chapter 3, a comprehensive review of the literature has been carried out to rationally assess the suitability of reported performance evaluation systems for SMWU in terms of their simplicity (easy and simple data requirements) and comprehensiveness (i.e., coverage of all the components of a WSS) using expert judgment.
The review also evaluates the individual PIs with respect to their understandability, measurability, and comparability (i.e., within and across utility comparisons).

Figure 1.3 Proposed research framework for performance management of SMWU

On the basis of this detailed review, a conceptual performance evaluation system, consisting of a list of PIs grouped into their respective categories, has been developed in Chapter 3. The system provides a stepwise approach, starting the performance evaluation process with the most significant and easy to measure PIs, and moving to a relatively complex set of PIs depending on the availability of resources and specific operating conditions. However, this list of initially identified PIs needs to be investigated further for the final selection of PIs for SMWU.

1.6.3 Selection of Performance Indicators

In Chapters 2 and 3, potential PIs were identified for SMWU in the water resources and environment, personnel, operational, physical, water quality, quality of service, and financial categories through literature review and initial screening.
In Chapter 4, these PIs are evaluated against applicability, understandability, measurability, and comparability criteria using multicriteria decision analysis (MCDA), based on expert opinion and the experienced judgment of personnel involved in water utility management. The MCDA method adopted in this research provides the utility management an opportunity to select the most suitable PIs based on data availability and the specific needs of their utility.

1.6.4 Inter-utility Performance Benchmarking

In Chapter 5, the PIs finally selected in Chapter 4 are used to develop an inter-utility performance benchmarking model (IU-PBM) for SMWU. Different (linear, exponential, logarithmic, and polynomial) transformation functions have been established, based on literature, NWWBI reports, and expert judgment, to translate the calculated PIs into performance levels. The weights are estimated through group decision making, using rankings of PIs by different water utilities in the Okanagan Basin, British Columbia, Canada, and the opinion of experts working in water infrastructure management. Finally, performance indices have been established by aggregating the transformed performance levels. The IU-PBM results, presented in the form of a web diagram, demonstrate the utility's performance to top level management for pragmatic decision making. The proposed model has also been implemented for two SMWU operating in the Okanagan Basin to demonstrate its practicality.

1.6.5 Intra-utility Performance Management for SMWU

If the IU-PBM results show that one or more of the functional components are not meeting the desired LOS, detailed investigations are needed at the utility level. In Chapter 6, an intra-utility performance management model (In-UPM) is conceptualized and developed for effective decision making.
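The transformation-and-aggregation step of the IU-PBM described above can be sketched as follows. This is a minimal illustration, not the thesis's actual model: the benchmark values, curve shapes, function names, and weights below are invented for demonstration, whereas the real ones are derived from literature, NWWBI reports, and expert judgment.

```python
import math

def linear_level(x, worst, best):
    """Linearly interpolate a calculated PI between illustrative worst/best
    benchmark values, clipped to a 0-1 performance level."""
    level = (x - worst) / (best - worst)
    return min(max(level, 0.0), 1.0)

def log_level(x, worst, best):
    """Logarithmic transformation, an assumed form for PIs whose
    performance benefit saturates as the value grows."""
    if x <= worst:
        return 0.0
    level = math.log(x / worst) / math.log(best / worst)
    return min(level, 1.0)

def performance_index(levels, weights):
    """Aggregate transformed performance levels with normalized weights."""
    total = sum(weights)
    return sum(l * w for l, w in zip(levels, weights)) / total

# Illustrative example with made-up benchmarks and weights.
levels = [linear_level(82, worst=50, best=100),   # e.g., % compliant samples
          log_level(400, worst=100, best=1600)]   # e.g., persons served per staff
index = performance_index(levels, weights=[0.6, 0.4])
```

In practice each functional component would carry its own set of PIs, transformation curves, and expert-derived weights, and the resulting component indices would be plotted on the web diagram mentioned above.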
A hierarchical top-down approach is used, starting from the overall sustainability performance objectives of the functional components at the top, followed by primary and secondary performance measures of the sub-components, and indicators (the basic building blocks) receiving inputs from data/decision variables at the bottom. The issues related to data scarcity are addressed by utilizing benchmarking data from larger utilities, peer-reviewed literature, and expert elicitation from local municipalities. In-UPM is robust enough to deal with temporal and spatial variations, i.e., it can assess the performance of a water utility as a whole and/or of the different water supply systems operating within the utility for a given assessment period. System level assessment is required when one or more functional components or sub-components are performing at either ‘medium’ or ‘low’ levels. Sensitivity analyses are performed to rank the performance indicators based on their percent contribution to each functional component. In-UPM is implemented for a medium sized water utility containing three sub-systems in the Okanagan Basin (British Columbia, Canada).

1.6.6 Managing Customer Satisfaction in SMWU

The literature review reveals that conventional customer satisfaction (CS) assessment methods based on performance benchmarking and customer interviews might not be technically and financially sustainable for SMWU. A risk based model is therefore developed in Chapter 7 for managing CS in SMWU, based primarily on the evaluation of customer complaints. The proposed model also incorporates the experience of the operational staff to support decision making on effective improvement actions. Customer dissatisfaction is evaluated in terms of the risk of customer dissatisfaction, which begins when a customer reports a complaint to the utility; a complete evaluation of CS, however, depends on the duration between the time of the report and the response, up to the complete resolution of the complaint.
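One simple way to operationalize such a complaint-based risk evaluation is an FMEA-style scoring of complaint categories. The sketch below is a generic illustration under assumed 1-10 rating scales; the category names and ratings are invented, and the thesis's actual model combines complaint records, response durations, and operational staff experience in a more elaborate way.

```python
from dataclasses import dataclass

@dataclass
class ComplaintCategory:
    """Illustrative FMEA-style ratings for one category of customer complaints."""
    name: str
    occurrence: int   # 1-10: how frequently this complaint is reported
    severity: int     # 1-10: impact on customers (e.g., loss of supply vs. taste)
    resolution: int   # 1-10: how long it typically takes to fully resolve

def risk_priority(c: ComplaintCategory) -> int:
    """Classic FMEA risk priority number: product of the three ratings."""
    return c.occurrence * c.severity * c.resolution

# Hypothetical complaint categories with made-up ratings.
categories = [
    ComplaintCategory("water quality (taste/odour)", 6, 4, 3),
    ComplaintCategory("pressure/no water", 4, 8, 5),
    ComplaintCategory("billing", 7, 3, 2),
]
ranked = sorted(categories, key=risk_priority, reverse=True)
```

Ranking categories by such a score lets a small utility with few staff direct its improvement actions at the complaints posing the greatest dissatisfaction risk rather than responding ad hoc.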
Different categories of complaints are identified from an exhaustive record of customer complaints obtained from a medium sized utility in the Okanagan Basin, British Columbia, Canada. All possible failure modes are identified using root cause analysis for detailed risk assessment. The inherent uncertainties associated with data limitations and expert opinion have also been addressed.

Chapter 2 Literature Review

A part of this chapter has been published in Environmental Reviews, an NRC Research Press journal, as a review article titled “Performance Indicators for Small and Medium Sized Water Supply Systems: A Review” (Haider et al. 2014a). This chapter contains two main sections. With regard to the objectives defined in Chapter 1, the first section includes a state-of-the-art literature review of systems of PIs, selection of indicators, performance assessment models, and methods of assessing customer satisfaction for water utilities. Based on the review, a framework for the overall research, designed to practically achieve the research objectives, is presented in the last section of this chapter. A water utility may contain one or more WSSs. However, in this chapter the terms small to medium sized water utilities (SMWU) and small to medium sized water supply systems (SM-WSS) are used interchangeably depending on the context of discussion.

2.1 Background

A systematic approach is adopted to comprehensively review the reported PI systems, keeping in view the specific requirements of SMWU. The main components of the overall review process are shown in Figure 2.1. Discussion of the major differences between LWU and SMWU is followed by a review of the various PI systems proposed by different organizations. A PI system usually consists of general asset information, baseline data variables, the categorization of given PIs, the linkage between data variables and PIs, the formulae or equations to calculate the PIs, and the comparison with benchmarks.
Appropriate grouping of the selected PIs provides an effective mechanism to assess the category-wise performance of the utility, i.e., water resources and environment, personnel, operational, physical assets, water quality, quality of service, and financial. This categorization further aids decision makers in identifying and then rectifying the weaknesses of the utility in a more systematic manner. For example, a relatively new system may perform worse than an old system of the same size and socio-economic conditions due to staff and management inefficiencies; in this case, the utility may receive a large number of customer complaints, even though all the other categories might be working efficiently. An extremely complex or detailed PI system might not be feasible for SMWU. In this respect, the PI systems proposed by various agencies have been evaluated based on their simplicity, comprehensiveness (i.e., coverage of all aspects of a WSS), and their potential applicability to SMWU. Moreover, the data required to estimate the PIs are unavailable or incomplete in most situations. The suitability of PIs for SMWU is assessed on the basis of their understandability, the availability of existing data, and the data that can be collected frequently in the future. It is also important for utility operators and managers to establish the important PIs for which data are available or attainable with limited resources. This chapter is sub-divided into six sections. The following section consists of the size-based classification of water supply utilities and the main differences between SM-WSS and large WSSs (L-WSS). A state-of-the-art review of the available PI systems being utilized worldwide by different agencies is then carried out in Section 2.3. The review also evaluates each system in terms of the measurability and understandability of the PIs, and the PIs are grouped into various categories.
The review further explains the specific objectives and conditions under which these systems were developed, and thereby describes their limitations. In the next section, case studies from different countries having limited data resources for performance evaluation are presented to identify the simplest PIs (though not necessarily the most important ones). The following sections review the methods for selection of PIs and performance assessment of water utilities. In the last section, the overall framework of this research is presented.

2.2 Classification of Water Supply Systems

In order to evaluate the existing PI systems, it is important to understand the difference between SM-WSS and L-WSS, and how and on what basis these systems have been distinguished from each other in various countries. The components of a WSS greatly depend on the water source (i.e., fresh surface water, ground water, sea water, or saline ground water), as shown in Figure 2.2a-d. Figure 2.2 shows that the overall water supply scheme can be as simple as Figure 2.2b (the case of a fresh groundwater source) or as complex as Figure 2.2d (the case of a sea water source). To some extent, the selection of PIs (particularly water quality PIs) depends on the source type and water quality. A more detailed framework, including a larger number of PIs, might be required for a WSS of the same size with a different water source. Although the situation for private SM-WSS (e.g., in England and Wales) can be different, the review covers the performance evaluation systems adopted for private WSSs as well; however, the main focus is on public sector SM-WSS.

Figure 2.1 Framework showing approach adopted for critical review of performance indicators systems and performance assessment frameworks

2.3 Size Based Classification of Water Supply Systems

WSSs are categorized on the basis of their size to efficiently perform their organizational, financial, and operational activities.
The criteria for system size classification vary around the world (Ford et al. 2005) (Table 2.1). In most parts of the world, including Central and North America, utilities are commonly classified as small, medium, and large based on the volume of supplied water, the number of connections, and the population served (Corton and Berg 2009). For New Zealand's and South Africa's Water Research Councils, the basis of size classification is the number of connections (Lambert and Taylor 2010; Mckenzie and Lambert 2002).
Table 2.1 Various bases of size-based classification of water utilities

System size | USEPA (2002), population served | USEPA (2009), population served | WB (2002), population served | New Zealand (2010), service connections | SA-WRC (2002), service connections | AWWA (2004), flow (MGD)
Large  | >50,000        | >100,000        | >500,000          | >10,000        | >50,000         | >50
Medium | 3,300 - 50,000 | 3,300 - 100,000 | 125,000 - 500,000 | 2,500 - 10,000 | 10,000 - 50,000 | 5 - 50
Small  | <3,300         | <3,300          | <125,000          | <2,500         | <10,000         | <5

USEPA = United States Environmental Protection Agency; WB = World Bank; SA-WRC = South Africa Water Research Council; AWWA = American Water Works Association

According to the Irish EPA, a small system is one that serves fewer than 5,000 people (Ford et al. 2005). The Province of British Columbia, Canada, has a tiered classification for small water systems (WS) based on the number of connections, ranging from one connection for a restaurant or a resort (i.e., WS4) to more than 20,000 connections (i.e., WS1c) (MHLS 2010). Sometimes one utility operates two or more WSSs; in this case, the performance of each WSS should be evaluated separately if the water sources, geographical characteristics, and land uses are different. Each classification system presented in Table 2.1 has its own constraints. For example, the ratio of population served to number of connections is much larger in developing countries than in the developed world, due to high population densities and a larger number of persons per connection, particularly in urban areas (WSP 2009). Secondly, the variations in per capita water consumption, driven by the type of supply and the standard of living, are too large to relate community size to flow requirements. Every country or region should characterize the size of a municipality keeping in view all the relevant factors discussed above.
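As an illustration of how one column of Table 2.1 can be applied in practice, the sketch below classifies a utility using the USEPA (2009) population-served thresholds. The function name is ours; a real implementation would also need to handle the agency-specific bases (service connections, flow) noted above.

```python
def classify_usepa_2009(population_served: int) -> str:
    """Size class per the USEPA (2009) population-served thresholds in Table 2.1:
    small < 3,300; medium 3,300 - 100,000; large > 100,000."""
    if population_served < 3_300:
        return "small"
    if population_served <= 100_000:
        return "medium"
    return "large"

# Examples against the Table 2.1 thresholds.
sizes = [classify_usepa_2009(p) for p in (2_000, 50_000, 250_000)]
```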
In general, utilities having a population greater than 50,000, a number of connections greater than 10,000, or a demand higher than 50 million gallons per day (MGD) have been considered large.

Figure 2.2 Schematic layouts of water supply systems based on different types of sources: a) fresh surface water source; b) fresh groundwater source; c) saline groundwater source; d) saline surface water source

2.3.1 Difference between L-WSS and SM-WSS

There are several factors that make SM-WSS different from L-WSS, some of which are directly related to the size of the system. Factors specific to SM-WSS are as follows:
- relatively new additions to developments in the proximity of large cities;
- small pipe sizes (thus less costly) due to lower water demand;
- less impact on natural resources due to relatively small water withdrawals; and
- lower production of greenhouse gas (GHG) emissions from energy consumption.
In other cases, factors are site specific and might not be applicable to all SM-WSS, particularly in the case of privately operated, organized water supplies in developed countries. These factors may include:
- lack of financial resources;
- low capital and operation costs (if compared on the basis of the same type of water source);
- lack of technical staff, equipment, and vehicles;
- lack of awareness of and access to recent technologies (true for both public and private water suppliers); and
- less intention to manage and more to replace and/or renew the system components.
To qualitatively and quantitatively justify these differences, a detailed inventory of different types and sizes of utilities operating around the globe is required; unfortunately, such an inventory is not readily available in the literature, and may not have been reported due to the lack of attention given to performance evaluation of SM-WSS to date. A notable exception is the USEPA's periodic reporting on future drinking water infrastructure needs in the United States. Therefore, a brief review of these reports is presented below as a useful case study to understand the differences between L-WSS and SM-WSS.

2.3.2 USEPA Drinking Water Infrastructure Needs Surveys – Case Study

The United States Environmental Protection Agency (USEPA) has periodically published “Drinking water infrastructure needs survey and assessment for next 20 years” reports for the years 1995, 1999, 2003, and 2007, which were published in 1997, 2001, 2005, and 2009, respectively. Figure 2.3 shows the findings of the most recent report, published in 2009 (showing financial needs for the year 2007), for all types of systems. According to this report, the percentages of the population living in small WSS (S-WSS) and L-WSS are 9% and 45%, respectively. The figure also reveals that medium WSS (M-WSS) have proportionate total financial needs (i.e., 45%) for 45% of the total population (USEPA 2009), whereas S-WSS account for almost double the community water system financial needs (18%) per percentage of population (i.e., 9%) due to low economies of scale. Figure 2.3 also shows that the maximum financial needs (more than 60%) have been estimated for improvements of distribution systems, with the remaining 40% for treatment (filtration, disinfection, and corrosion control) and source (surface water intake structures, drilled wells, and spring collectors). A shift in the trend of distribution system investment needs for all sizes of WSS can be seen in Figure 2.4.
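The “almost double” observation above amounts to comparing each size class's share of total financial need with its share of population. A quick check using only the USEPA (2009) percentages quoted in the text:

```python
# Shares quoted from USEPA (2009): (percent of population, percent of 20-year need).
shares = {
    "S-WSS": (9, 18),   # small systems: 9% of population, 18% of needs
    "M-WSS": (45, 45),  # medium systems: proportionate needs
}

# Need intensity: share of financial need per unit share of population.
# A value of 1.0 means proportionate needs; 2.0 means double the needs
# per percentage of population.
intensity = {name: need / pop for name, (pop, need) in shares.items()}
```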
The reason for M-WSS having the highest investment needs could be the change in the population limit from 50,000 to 100,000 people; it is possible that some L-WSS moved into the M-WSS category in 2007. According to a USEPA evaluation study on S-WSS conducted from June 2005 to December 2005, these systems are facing many challenges (as described above) along with regulatory/compliance challenges. Massachusetts officials stated that if the required amount for a system improvement is less than $100,000, it is not cost-efficient for a small system to furnish a loan application (USEPA 2006). Moreover, S-WSS lack economies of scale (lower capital cost but higher O&M cost per household) compared to large systems, and thus are likely to be lacking in technical, financial, and management capacities (USEPA 2001, Dziegielewski and Bik 2004, Brown 2004, Maras 2004, Braden and Mankin 2004).

2.3.3 Source Water

The selection and application of different PIs related to the various components of a WSS depend on the type of source water, regardless of the size of the system. A small WSS relying on a surface water source (Figure 2.2a) has to consider PIs related to water storage and treatment facilities; on the other hand, the performance of a large WSS with a fresh ground water source (Figure 2.2b) can be evaluated without such PIs.

2.3.4 System Age and Pipe Size

The American Water Works Association (AWWA 2007) conducted a Community Water System Survey in 2000 on 1,806 systems of all population sizes. A summary of the percentage of pipe per system, by age, for each system size (population served) is shown in Figure 2.5. It can be observed that most of the pipes (more than 80%) in SM-WSS are less than 40 years old; the operational difficulties related to pipe age might therefore be fewer than in L-WSS.
Figure 2.3 Total 20-year financial need by system size and project/component type (in billions of January 2007 dollars) (developed using USEPA 2009 data)

Figure 2.4 Variation in 20-year distribution system investment needs by system size from 1995-2007, according to the USEPA infrastructure needs surveys and assessments for the United States (developed using USEPA 1997, 2001, 2005, 2009 data)

Figure 2.5 Percent of pipe system by age class in different system sizes (developed using AWWA 2007 data)

The AWWA's 2007 report also included results for the average length (miles) of each pipe size, from less than 6 in. to greater than 10 in., in different system sizes; these results are shown in Figure 2.6. The figure shows that water mains with diameters greater than 10 in. exist mainly in L-WSS. Most of the pipe diameters in SM-WSS are less than 6 in.; therefore, the conventional and expensive condition based assessment techniques applicable to L-WSS cannot be applied to SM-WSS. In order to make on-time and optimized decisions, rational PIs integrated with risk based methods can play an effective role for such systems.

2.3.5 Public Health Risk

According to Ford et al. (2005), the public health problems faced by S-WSS are different from those experienced by L-WSS. The following reasons for this have been reported in the literature (USEPA 2006, Hamilton et al.
2004):
- higher exposure to pathogens due to operational issues in chlorination practices;
- inadequacy of treatment systems to deal with outbreaks;
- limited financial resources for monitoring and mitigation measures;
- lack of trained laboratory staff;
- higher concentrations of nitrates and pesticides in source water due to more farming activities in rural proximity; and
- overall mismanagement, such as lack of preventive maintenance, poorly designed and constructed facilities, and inefficient operation and maintenance.
Meeting drinking water quality standards is most difficult for smaller systems. USEPA (2006) statistics over the years have shown that noncompliance with drinking water regulations increases as system size decreases, due to inadequate resources in terms of both equipment maintenance and qualified operators.

Figure 2.6 Average miles of pipe per system by diameter in different system sizes (developed using AWWA 2007 data)

2.3.6 Operation and Management Systems

Management capacity is one of the most important components of the successful operation of a WSS. The main problem, particularly with S-WSS, is a lack of fully qualified technical individuals (i.e., in many cases, drinking water operations are not their sole occupation). According to Braden and Mankin (2004), many small communities are struggling to evaluate required improvements, generate funds, and manage the more advanced systems required to meet drinking water standards. They also reported that most small systems cannot attract and/or retain officials with the required knowledge due to poor pay scales and low job recognition (Ford et al. 2005). Consequently, they operate with part-time technical personnel and few staff members to plan, oversee, and manage infrastructure improvements.
Moreover, most of these systems do not have effective leadership.

2.3.7 Physical, Water Quality, and Environmental Sustainability

In a distribution system, the common problems are associated with loss of water or pressure due to leakage and pipe breaks. There is always a trade-off between the cost of increasing water production and the cost of repair; however, this might not be practical for SM-WSS facing water scarcity issues or limited capacity at source waters. Another point of view is that it is cost-effective for SM-WSS to simply fix lines when they break. In Australia, the best risk management approach for S-WSS is to focus on responsiveness to failure, i.e., to reduce the cost per break repair and the time out of water (Cromwell et al. 2001). Detailed condition assessments of small mains having lower priority risks, using statistical analysis of raw data on parameters such as break trends segmented by pipe material and soil type to evaluate overall replacement needs, would not cost as much as other forms of actual condition assessment of specific lines (Sadiq et al. 2010). On the other hand, non-destructive testing (e.g., closed-circuit television (CCTV) cameras) is limited to larger diameter pipes (i.e., typically 24 in. diameter or greater in the US) due to the high costs involved in the testing procedures (Liu et al. 2012). Most condition assessment methods are expensive for SM-WSS; therefore, only reliable and effective performance assessment (PA) can inform decisions regarding their practical application. Performance indicators related to the environmental and socio-economic sustainability of SM-WSS should also be considered in the overall PA framework. In any WSS, energy is consumed in several operation and maintenance activities.
Moreover, higher water usage not only impacts water resources directly but also generates large wastewater volumes, which eventually lead to huge energy requirements for treatment, reuse, or final disposal. The GHGs generated by all these activities are responsible for climate and hydrological changes (Parfitt et al. 2012). The components of a WSS shown in Figure 2.2 are inter-dependent; inefficient working of any of these components leads to the loss of either water or energy, or both. Sometimes SM-WSS are located close to each other and rely on the same water source. In this case, sustainability indicators, including the impact of climate change on water availability at the source, should also be included in the PA framework, as a larger population can be affected in case of drought. Other environmental concerns, such as the impacts of chlorinated water on aquatic life due to leaking water mains and the flushing of pipes passing through or near natural water bodies, also need to be addressed. In general, the concentration of residual chlorine in water distribution systems ranges between 0.5 and 1 mg/L, while much smaller concentrations can significantly affect fish and other aquatic micro-organisms (Donald et al. 1977). PIs can be used to describe the overall LOS for various physical components (storage, treatment, distribution, etc.) as well as functional components (physical, customer services, environmental, financial, etc.) of a water utility. In the following sections, a state-of-the-art review is carried out on existing PI systems developed by various agencies around the globe.

2.4 Literature Review of Performance Indicator Systems for Water Utilities

Performance is the degree to which infrastructure provides the services to meet community expectations, and is a measure of effectiveness, reliability, and cost (NRC 1995).
The performance of a water utility depends on the efficient and reliable working of all of its functional components, including water resources, physical assets, operational, customer service, personnel, public health, environmental, and financial. The performance of a WSS is evaluated to indirectly estimate its condition and rehabilitation needs, in order to ensure the continuous and reliable working of all of its components during their entire service life, before the occurrence of a failure. Once a failure has occurred, the cost of corrective action is much higher than the planned preventive action would have been. This difference between the cost of planning and management and the cost of corrective action justifies the need for performance assessment. Several methods of performance evaluation for water utilities are given in the literature, e.g., PIs, total factor productivity indexes, production frontiers, etc. (Coelli et al. 2003). However, methods other than PIs require more sophisticated data, which are difficult to acquire or sometimes even missing for SM-WSS due to ongoing restructuring and expansion and inadequate information technology. The literature review in the following sections starts with a brief overview of commonly used terminology and the history of PIs, followed by a comprehensive review of recent developments.

2.4.1 Terminology and Historical Background of Performance Indicators

The Canadian Water and Wastewater Association (CWWA) (2009) briefly defined the terms PI, variable, benchmark, and target as follows:

Performance Indicator - A performance indicator is a parameter, or a value derived from other parameters, which provides information about the achievements of an activity, a process, or an organization, with a significance extending beyond that directly associated with the calculated value of the parameter itself. For example, the average number of litres of water supplied per person per day.
Indicators are typically expressed as commensurate or non-commensurate ratios between the variables.

Variables - Performance indicators involve the measurement of data variables generated by analysis of the service performed. A selected variable should be easy to understand; accurately measurable with available equipment, staff, and funds; easily reproducible or comparable; should refer to the geographical area and reference time of the study area; and be relevant to the indicator to be developed. Variables are basically the baseline data required to determine the associated value of a performance indicator, e.g., number of service connections, population served, total water main length, and annual costs.

Benchmark - This is a numerical point of reference, generally for the past or present. For example, in 2008, the average supply of water to residential customers was 350 L/p/d. Benchmark values established for the future should be considered as targets.

Target - A target is a value determined for the PI, which is to be achieved over time (in the future) through the conduct of a program. For example, a target for average water supply would be to reduce average demand to 300 L/p/d by the year 2012.

The International Water Supply Association (IWSA) selected the topic "Performance Indicators" for one of its world congresses during the early 1990s, but the concept could not garner much interest. However, three to four years later, this concept was highlighted by a number of senior members of water utilities. A good PI system is one that contains PIs that are few in number, clearly defined, non-overlapping, useful for global application (i.e., with wide applicability), and easily understandable; that refer to a certain time period (preferably one year); that address a well-defined geographical area; and that represent all the relevant aspects of water utility performance (Alegre 1999).
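As a minimal sketch of the CWWA terminology defined above, the hypothetical calculation below derives a PI (average supply in L/p/d) from two variables and compares it against the benchmark and target figures quoted in the examples; the variable values are illustrative, not data from any actual utility.

```python
# Sketch of the CWWA terminology: a performance indicator (PI) is derived
# from measured variables, then compared against a benchmark (a past or
# present reference point) and a target (a value to achieve in future).

def average_supply_lpcd(annual_volume_litres, population_served):
    """PI: average litres of water supplied per person per day."""
    return annual_volume_litres / (population_served * 365)

# Hypothetical variables (baseline data elements)
annual_volume = 127_750_000_000  # litres supplied during the year
population = 1_000_000           # population served

pi = average_supply_lpcd(annual_volume, population)
benchmark = 350  # L/p/d, e.g., the 2008 reference value cited above
target = 300     # L/p/d, the future value to be achieved

print(f"PI = {pi:.0f} L/p/d (benchmark {benchmark}, target {target})")
print("target met" if pi <= target else "demand reduction still needed")
```

The same pattern applies to any ratio-type PI: the variables carry the units, and the benchmark and target give the comparison context.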
The National Civil Engineering Laboratory (NCEL) in Portugal first proposed a hierarchical structure of PIs, comprising four main groups, in 1993. Each group consists of a number of PIs and a sub-set of indicators in the upper layer (Matos et al. 1993). Later, in 1994 and 1995, the Portuguese Association of Water Resources (APRH) and NCEL jointly proposed a hierarchical LOS structure with four categories (i.e., organization, engineering, environmental, and capital) (Defaria and Alegre 1996) (Figure 2.7). This structure was more efficient, as it provided a basis for assessing the performance of the organization's staff and also incorporated environmental impacts on water resources. Alegre (1999) carried out a comprehensive review of the development of the conceptual framework and PIs for WSSs. The work done by various agencies and researchers on the PA of WSSs before the year 2000 was covered in that review (National Civil Engineering Laboratory - NCEL, Portugal, Matos et al. 1993; American Water Works Association Research Foundation - AWWARF, Alegre 1999; Portuguese Association of Water Resources - APRH, Portugal, Defaria and Alegre 1996; Malaysian Water Association 1996; Water and Sanitation Division, World Bank - WB, Yepes and Dinderas 1996; Asian Development Bank - ADB, McIntosh and Ynoguez 1997; Dutch Contact Club for Water Companies and the 6-cities group of the Nordic Countries, van der Willigan 1997; and International Water Services Association - IWSA, Alegre 1999). The findings of this review reveal that water resources, water quality, and environmental indicators were not given sufficient importance until 1997. One reason could be that environmental issues were fewer due to relatively lower populations and water demands; moreover, awareness of the need to address environmental problems has accelerated during the 21st century.
State-of-the-art developments in PIs for WSSs by different international agencies in various parts of the world are presented in the following sections.

Figure 2.7 The concept of the Faria and Alegre (1996) level of service framework
[Figure content: a five-level hierarchy (L1: Global; L2: Main; L3: Pivotal; L4: Auxiliary; L5: Calculation level) spanning global, technical, environmental, and financial categories, with example indicators such as coverage, reliability, water quality, hydraulic performance, continuity of supply, complaints related to pressure, and minimum/maximum pressure violations]

2.4.2 IWA Manual of Best Practice

According to Alegre (1999), PIs are used to assess the performance of a WSS in terms of efficiency and effectiveness. Efficiency is a measure of the extent to which the resources of a WSS are utilized optimally to provide the service (i.e., the ratio between input consumed and output achieved), and effectiveness is a measure of the extent to which the targeted objectives (specifically and realistically defined) of the utility as a whole and of its management units are achieved. The first edition of the International Water Association (IWA) Manual of Best Practice was published in July 2000. The PI system was developed through close collaboration with experienced managers, practitioners, and researchers (Nurnberg 2001). This manual was applied for benchmarking water utilities particularly in Europe, including Austria and Germany (Theuretzbacher-Fritz et al. 2005). An overview of performance benchmarking studies (about 30 large-scale companies, 270 medium and small-scale utilities, and about 20 bulk supply companies), including the legal framework and technical standards, was presented at the IWA World Water Congress 2001 in Berlin, Germany. It was observed that all of them were using different approaches for their PA, and thus there was a lack of a standardized assessment system.
The use of the IWA Manual of Best Practice (Performance Indicators for Water Supply Systems) for standardized and comparable PA was recommended by the congress (Nurnberg 2001). The general concept of this manual is a layered pyramid structure, starting from raw data at the bottom and feeding the PIs in the layers above (Figure 2.8). This structure consists of a theme of indicators, sub-indicators, and variables. Alegre et al. (2000) structured more than 150 PIs in six categories (i.e., water resources, personnel, physical, operational, quality of service, and economic and financial). These groups help the user to identify the purpose of a specific indicator and are given a two-letter code (e.g., 'Pe' represents the "Personnel" group). Each group is sub-divided into sub-groups for further understanding (e.g., total personnel, personnel per main function, personnel qualification, etc.). These sub-groups consist of a number of PIs calculated from variables, where variables are the baseline data elements consisting of measured or recorded values in specific units. All these measured values are required to be used with certain accuracy and reliability bands to indicate the quality of the observed data (Alegre et al. 2000).

Figure 2.8 Concept of the layered pyramid used by Alegre et al. (2000) for calculating performance indicators
[Figure content: variables (data elements) at the base, feeding indicators/sub-indicators, indicator sub-groups, and indicator groups at the apex]

These PIs were further prioritized on the basis of their relative importance (level of priority) as Levels 1, 2, and 3. Level-1 is the first layer of indicators, providing a synthetic global overview of the efficiency and effectiveness of the water undertaking. Level-2 consists of additional indicators, which provide more detailed insight than the Level-1 indicators for users who need to go further in depth. Level-3 indicators provide a comprehensive global assessment of the undertaking (Alegre et al. 2000, 2006).
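The layered pyramid described above can be sketched as a simple data model, with variables at the base feeding a coded indicator that belongs to a sub-group and a two-letter-coded group. The 'Pe1' code, the accuracy bands, and the numeric values below are illustrative assumptions loosely following the manual's conventions, not entries taken from it.

```python
# Sketch of the IWA layered-pyramid idea: variables (raw data elements)
# feed indicators, which sit inside sub-groups and two-letter-coded groups.
# Names, codes, and values are illustrative, not the manual's own.

from dataclasses import dataclass

@dataclass
class Variable:
    name: str
    value: float
    accuracy_band: str   # IWA attaches accuracy/reliability bands to data

@dataclass
class Indicator:
    code: str            # e.g., a hypothetical "Pe1" in the "Pe" group
    group: str
    subgroup: str
    numerator: Variable
    denominator: Variable

    def compute(self) -> float:
        return self.numerator.value / self.denominator.value

staff = Variable("total personnel", 42, accuracy_band="A")
conns = Variable("service connections (thousands)", 12.0, accuracy_band="B")

pe1 = Indicator("Pe1", group="Personnel (Pe)", subgroup="total personnel",
                numerator=staff, denominator=conns)
print(f"{pe1.code}: {pe1.compute():.1f} employees per 1000 connections")
```

Keeping the accuracy band alongside each variable mirrors the manual's requirement that every computed PI carry an indication of the quality of its underlying data.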
This level-based structure can provide a basis for starting a performance evaluation process for SM-WSS with Level-1 indicators; in the case of SM-WSS, it is possible that Level-3 indicators are not important. The IWA published the 2nd edition of its manual in 2006. In this edition, the total number of PIs increased to 170; however, no changes were made to the categories of PIs (Alegre et al. 2006). In this review, this reference is cited as IWA (2006) for the purpose of comparison between different agencies. A summary of the PIs, along with the number of sub-indicators, is presented in Appendix A-1. For details of the sub-indicators and allocated levels, interested readers are referred to Alegre et al. (2006). It can be seen in Appendix A-1 that the system appears well balanced and covers all aspects of a WSS. Indicator sub-groups (in each category) provide a way to effectively interpret an indicator; for example, the sub-group of 'pressure and continuity' is placed under the category of quality of service. This sub-group contains 8 PIs related to pressure and continuity of supply, such as the adequacy of supply pressure at delivery points (as a percentage) and the number of water interruptions per thousand connections. The proposed system appears well structured, has wide applicability, and can accommodate new indicators as well. Water quality has also been given sufficient importance by describing aesthetic, physical-chemical, and biological indicators separately. However, more recently identified environmental indicators, such as greenhouse gas emissions, are missing. SigmaLite (2.0) software was developed on the basis of the framework proposed in the 2nd edition of the IWA manual; details can be seen at www.sigmalite.com. The software also generates charts showing comparisons between the calculated PIs and the benchmarks.
2.4.3 The World Bank

The International Benchmarking Network for Water and Sanitation Utilities (IBNET) was launched in 1996 under the water and sanitation program of the World Bank (WB). The IBNET was developed by the Energy and Water Department of the WB with the aim of providing access to comparative information and promoting best practices among water supply and sanitation providers worldwide. The IBNET provides standardized measurements (i.e., a set of tools) for water suppliers to assess their own operational and financial performance against the performance of similar utilities at national, regional, and global levels. Through the IBNET platform, utilities around the world, including in South Asia and Africa, found an opportunity to compile and share their operational performance and costs (Berg and Danilenko 2011). In this review, this reference is cited as WB (2011) for the purpose of comparison between different agencies. The IBNET is a toolkit comprising a set of financial, technical, and process indicators; it offers a gradual approach for utilities with little or no variables data for calculating the indicators, providing a "Start-up kit" with which to move slowly towards a more advanced benchmarking system. In this way, the kit could be a useful tool for SM-WSS and for utilities in developing countries with data limitations. However, the toolkit does not cover the larger set of indicators used by water supply agencies in developed countries, such as IWA (2006). The IBNET categorizes its 80 selected PIs into 12 groups, as presented in Appendix A-2 (Berg and Danilenko 2011). Along with the categories mentioned in Appendix A-2, the manual also addresses 8 normalizing factors, including operating cost, staff, revenue, system failure indicators, population, number of connections, volume of water used, and network length.
Service coverage is taken as a separate category in IBNET, whereas in IWA (2006) the coverage indicators are considered under the quality of service category. Water losses are taken up as non-revenue water. The IBNET system assesses network performance in terms of the number of breaks per km per year; this indicator, along with other water loss indicators, is considered under the operational category in the IWA manual. In IBNET, only metering is considered amongst all the physical components of the WDS, whereas other important indicators, including pumping, valves, hydrants, treatment, and storage, have not been included. Other categories mentioned in Appendix A-2 (i.e., operating costs and staff, billing and collections, financial performance, and assets) are mainly financial and/or economic indicators and could be addressed under one category as sub-components to simplify the overall structure of the system. However, water affordability is an important indicator for future investment planning in the water sector, particularly in developing countries and SMWU. Among water quality parameters, only residual chlorine is considered. Overall, the IBNET system appears suitable for cross-utility and cross-country performance comparisons in developing countries.

2.4.4 National Water Commission (NWC), Australia

The Australian Government's National Water Commission (NWC) has developed a national performance framework (NPF) for the performance evaluation of both urban and rural water utilities in Australia. The commission used only 32 selected PIs, grouped into four categories (i.e., characteristics, customer service level, environmental and water management, and financial), for comparative analysis of rural water utilities. For urban water utilities, with complex WSSs and large administrative structures, 116 PIs have been grouped into seven categories (i.e., water resources, asset data, customers, environment, public health, finance, and pricing).
The commission has been publishing annual reports on performance reporting indicators and definitions since 2005 (NWC 2012). Unlike in the IWA manual, not all the indicators here are calculated as ratios, divisions, or percentages. The reason could be that this framework is used to compare the performance of utilities operating specifically in Australia under similar environmental conditions. A summary of the PIs is presented in Appendix A-3. Dry conditions resulting in water shortages have led Australia to partially rely on 'rainfall independent supplies to enhance the water security in the country', such as desalination of sea water, recycled water, etc. (NWC 2012, Chartres and Williams 2006, Bari et al. 2005). Moreover, per capita water consumption in Australia is one of the highest in the world (Stoeckel and Abrahams 2007). Due to these issues, water resource indicators have been given primary importance in Appendix A-3. Twenty-three PIs of water supply provide detailed insight into all possible water resources (e.g., groundwater, surface water, desalinated marine water, desalinated groundwater); the types of supplies (e.g., residential, commercial, industrial, etc.); and the water quality (potable and non-potable). An example for the indicator "Total sourced water" is presented in Figure 2.9. The indicators selected in this system are not differentiated as data elements (variables), sub-indicators, or process indicators. For example, all the indicators given in Figure 2.9 are essentially data values (i.e., volumes of water sourced) (see comments in Appendix A-3). These values cannot provide the basis for cross-comparison with other similar WSSs at an international level; these PIs need to be calculated in terms of per thousand population served or similar units. Indicators like the percentage of recycled water can be compared with other WSSs practicing the same concept of reuse of supplied water (after applying the desired level of wastewater treatment).
The way environmental indicators are addressed in the NWC (2012) framework, in terms of GHG emissions, is worth mentioning. Most development activities generate various types of GHGs, such as CO2, CH4, and N2O, which eventually lead to global warming. For example, the use of vehicles for water quality monitoring, bill delivery and collection, and the transportation of construction materials and operational equipment, as well as the operation of pumps and motors to distribute water at desirable pressures, produce significant greenhouse gas emissions. Moreover, most of the supplied water is converted into wastewater after use, and thus requires a certain degree of wastewater treatment either for reuse (to meet desired landscape irrigation standards) or for final disposal (depending on the assimilative capacity of the receiving water body). In this regard, wastewater treatment plant operations also utilize energy and thus produce GHG emissions. The Commonwealth of Australia (CWA) (2011) expresses emission factors in terms of the quantity of a given GHG emitted per unit of energy or fuel (e.g., kg-CO2-e/GJ for energy; t-CH4/t-coal for fuel). These emission factors are used to calculate GHG emissions from activity data (e.g., kilolitres of petrol used multiplied by the energy density of petrol). The overall GHG emissions from each activity of a WSS are calculated as CO2-equivalents, the amounts of carbon dioxide that would have the same relative warming effect as the greenhouse gases actually emitted. It is desirable to consider indicators concerning global climate change for sustainable development in the water sector. The indicator system proposed by NWC (2012) could be very useful in areas facing water shortages or droughts. The framework shown in Figure 2.9 is indeed very comprehensive and provides deep insight into the interrelationships between different water resource indicators.
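The CO2-equivalent arithmetic described above can be sketched as follows: activity data are multiplied by per-gas emission factors, and each gas is converted to CO2-e via its global warming potential. The emission factors and the pumping-station scenario below are illustrative placeholders, not the CWA (2011) values; the GWPs are the commonly cited 100-year figures.

```python
# Sketch of a CO2-equivalent calculation: activity data (e.g., GJ of
# energy consumed) x emission factor per gas x global warming potential
# (GWP). Factors below are illustrative, not CWA (2011) values.

GWP = {"CO2": 1.0, "CH4": 25.0, "N2O": 298.0}  # commonly cited 100-yr GWPs

def co2_equivalent(activity_amount, factors_per_unit):
    """activity_amount: units of activity (e.g., GJ of electricity).
    factors_per_unit: kg of each gas emitted per unit of activity.
    Returns total emissions in kg CO2-e."""
    return sum(activity_amount * kg_gas * GWP[gas]
               for gas, kg_gas in factors_per_unit.items())

# Hypothetical: a pumping station consuming 500 GJ of grid electricity
pumping_factors = {"CO2": 90.0, "CH4": 0.002, "N2O": 0.001}  # kg/GJ
total_kg = co2_equivalent(500, pumping_factors)
print(f"{total_kg / 1000:.2f} t CO2-e")
```

Summing such terms over every activity of a WSS (vehicles, pumping, treatment) gives the utility-level CO2-e figure that the NWC framework reports.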
Similar interactions have been developed between indicators in the other groups; details can be seen in NWC (2012). However, certain very important indicators, such as personnel and operational indicators (i.e., pump, storage, network inspection, leakage inspection, etc.), have not been included. In the case of a very detailed organizational and administrative structure for water utilities, like the one in Australia, indicators related to personnel and operational efficiency might not be significant considerations. However, for SMWU around the world, where the availability of technical staff to solve day-to-day operational problems varies from system to system, these indicators are indispensable for the overall PA.

2.4.5 American Water Works Association (AWWA)

The American Water Works Association (AWWA), established in 1881, is the first scientific research organization for drinking water in North America. The AWWA started the performance evaluation of water utilities in 1995 under its utility quality service program. In one of its early documents, titled "Distribution System Performance Evaluation" and published in 1995, criteria consisting of adequacy (quantity and quality), dependability (interruptions), and efficiency (utilization of resources) were used to evaluate the performance of water distribution systems. The main focus of discussion was on distribution system components, not on the environmental and water resource components.

Figure 2.9 An example of interaction between various indicators and sub-indicators in the NWC (2012) performance assessment framework. "A partial reproduction of Fig 1 Interrelationship between NFP indicators, p.
26 of National Performance Framework: 2010-11 urban performance reporting indicators and definitions handbook"
[Figure 2.9 content: total sourced water comprising potable sources (groundwater, surface water, marine desalination, recycled water, volume of water received from a bulk supplier, stormwater used) and non-potable sources (recycled water, stormwater, received water)]

Later, in 2004, the AWWA launched the QualServe Benchmarking Program (originally named the utility quality service program), which proposed a set of high-level (most important) PIs with the goal of facilitating inter-utility comparisons and intra-utility trend analysis. In this benchmarking program, 22 key performance indicators were suggested for water and wastewater utilities. Amongst these, 17 indicators are applicable to water supply. Some of these indicators comprise a set of sub-indicators, which makes a total count of 35 indicators (Lafferty and Lauer 2005). The PIs have been grouped into four categories: organizational development, customer relations, business operations, and water operations. The indicator system, showing the groups of PIs along with the individual PIs, is presented in Appendix A-4. Using these PIs, the AWWA published a survey data and analysis report in 2005 benchmarking 202 water utilities across North America, including Canada. Amongst all the utilities participating in the survey, only 21% were SMWU serving populations of less than 100,000. It was also reported that SMWU operated at a higher residential cost (a customer relations indicator) than LWU. Conversely, water distribution pipeline renewal and replacement rates (a business operations indicator) were much lower in SMWU than in the larger utilities. Environmental and water resources indicators have also not been addressed.

2.4.6 Office of the Water Services (OFWAT) - UK and Wales

The technical evolution of the water industry gives insight into the history of the privatization of water utilities in the United Kingdom (UK).
In 1970, the British government decided to merge hundreds of medium and/or small sized water companies into 10 water authorities to solve the problems associated with a lack of human resources and equipment. Later, the water authorities found that the existing systems had become very old and required massive investments for improvement; therefore, it was decided in 1990 to privatize all water and wastewater services in England and Wales. Moreover, consumers had higher expectations due to increasing water rates; thus, the Office of the Water Services (OFWAT) monitors and compares the performance of these water companies to ensure that consumers are getting what they pay for, and also checks the companies' compliance with their legal obligations (OFWAT 2010). Recently, OFWAT reported a minimum set of 14 key PIs (grouped into 4 categories) to review prices and to check the regulatory compliance of water companies in the UK and Wales (OFWAT 2012, 2010, and 2010a). Information regarding these indicators is given in Appendix A-5, along with descriptions of some specific indicators such as the Service Incentive Mechanism (SIM) and the Security of Supply Index (SoSI). SIM is used to control water rates, whereas SoSI indicates the guarantee a supplier can give to ensure the level of service. Moreover, sufficient importance has been given to environmental indicators. The OFWAT indicator system appears suitable for a regulatory authority dealing with a number of utilities, irrespective of their size. At an individual level, companies might be using a much wider set of indicators (e.g., indicators covering the quality of supplied drinking water). The reason for considering such a small number of indicators is probably the overall performance monitoring of different companies, for which financial and customer service indicators are more important for ensuring customer satisfaction.
Some of these PIs include several variables and provide information regarding the performance of more than one indicator, qualitatively or quantitatively. For instance, SIM is a financial mechanism to incentivize optimum levels of customer service through the price control process. It comprises a quantitative indicator that measures complaints and unwanted contacts, and a qualitative indicator that measures how satisfied customers are with the quality of service they receive, based on a survey of consumers who have had direct contact with their water company. SIM can be calculated to assess the performance of the company for customer satisfaction using the following formula (OFWAT 2012 and 2010):

SIM = {[(S - LS) / (HS - LS)] x WS} + {[1 - ((C - CL) / (CH - CL))] x WC}   [2.1]

where the first term is the qualitative (survey) component and the second term is the quantitative (complaints) component; S is the satisfaction score achieved (qualitative survey); LS is the satisfaction minimum (1); HS is the satisfaction maximum (5); WS is the survey weighting (50); CL is the composite measure minimum (0); CH is the composite measure maximum (600); WC is the quantitative composite measure weighting (50); and C is the quantitative composite measure, which can be calculated using the following equation:

C = (all lines busy + calls abandoned + unwanted telephone contacts + (written complaints x 5) + (escalated written complaints x 100) + (Consumer Council for Water (CCW) investigated complaints x 1,000)) / (connected properties / 1,000)   [2.2]

Each of the elements in Equations 2.1 and 2.2 has been given a weight (i.e., the value by which the element is multiplied; for example, "all lines busy" weighs 1 and CCW-investigated complaints weigh 1,000) to reflect the increasing impact on consumers and the resulting cost to the supplier. In Equation 2.2, the weights appear to be established based on the relative importance of the elements and their frequency of occurrence.
For instance, the frequency of escalated written complaints is usually lower than that of written complaints, but their significance is higher; as a result, their weight of 100 is 20 times the weight assigned to written complaints (i.e., 5). The SIM is calculated annually by combining the two indicators (quantitative and qualitative) with equal weightings (i.e., 50 each). The value of SIM is assessed against "tolerance" bands designated with different colours: green (SIM > 50), amber (SIM = 40-50), and red (SIM < 40). A higher value of SIM indicates better performance (OFWAT 2012, 2010). According to OFWAT (2012), new small companies are entering the OFWAT system, and the indicator set proposed in Appendix A-5 is well established.

2.4.7 National Research Council (NRC) - Canada

NRC (2010) proposed a model framework composed of three building blocks: objectives, assessment criteria, and PIs. As per NRC, the six key objectives (i.e., public safety, public health, economy, environmental quality, social equity, and public security) can only be satisfied through a feasible and durable initial design, an efficient and continuous maintenance system, preservation activities to ensure acceptable physical condition, and acceptable levels of functionality during the entire life cycle of these important public assets (NRC 2010). In the framework proposed by NRC, the PIs are evaluated against eleven assessment criteria to provide the linkage between the PIs and the objectives of the WSS. These assessment criteria are also known as 'indices' or 'indicator domains'; they are the statements or principles used to determine whether the specified objectives have been met (either qualitatively or quantitatively). Under these objectives, 31 PIs, along with their associated groups, are presented in Appendix A-6.
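Equations 2.1 and 2.2 above can be combined into a short calculation, as sketched below. The weights and bounds are those quoted in the text; the complaint counts, survey score, and number of connected properties are hypothetical.

```python
# Sketch of the OFWAT SIM calculation (Equations 2.1 and 2.2). Bounds and
# weightings are those quoted in the text; the input data are hypothetical.

LS, HS, WS = 1, 5, 50    # survey min, max, weighting (qualitative term)
CL, CH, WC = 0, 600, 50  # composite min, max, weighting (quantitative term)

def composite(all_lines_busy, calls_abandoned, unwanted_contacts,
              written, escalated, ccw_investigated, connected_properties):
    """Equation 2.2: weighted complaint measure per 1,000 properties."""
    c = (all_lines_busy + calls_abandoned + unwanted_contacts
         + written * 5 + escalated * 100 + ccw_investigated * 1000)
    return c / (connected_properties / 1000)

def sim(satisfaction_score, c):
    """Equation 2.1: qualitative (survey) + quantitative (complaints)."""
    qualitative = (satisfaction_score - LS) / (HS - LS) * WS
    quantitative = (1 - (c - CL) / (CH - CL)) * WC
    return qualitative + quantitative

# Hypothetical annual figures for a company with 100,000 properties
c = composite(all_lines_busy=200, calls_abandoned=150,
              unwanted_contacts=5000, written=400, escalated=10,
              ccw_investigated=2, connected_properties=100_000)
score = sim(satisfaction_score=4.2, c=c)
band = "green" if score > 50 else ("amber" if score >= 40 else "red")
print(f"C = {c:.1f}, SIM = {score:.1f} ({band} band)")
```

Note how the 1,000-fold weight on CCW-investigated complaints dominates the composite even at a count of two, reflecting the escalating cost to the supplier described above.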
The allied assessment criteria for each group of indicators are defined as follows (NRC 2010):
- Safety Impacts: How water supply services support the reduction of incidents or accidents that result in death/injury and/or property loss.
- Health Impacts: The health impacts (both direct and indirect) which are beneficial or detrimental to consumers as well as to the general public.
- Security Impacts: The performance of the water supply service in terms of protecting the security of the users, operators, and public at large.
- Economic Impacts: The direct and indirect impacts (beneficial or detrimental) of the WSS on local, regional, and national economies.
- Environmental Impacts: The direct and indirect impacts of the water supply service on the natural environment (air, water, soil, fauna, and flora) and on climate change.
- Quality of Service: An assessment of how well the service meets established levels of service, regulatory requirements, industry standards, and customer satisfaction.
- Access to Service: The geographical coverage and affordability of infrastructure services, and the provision of access to people with disabilities.
- Adaptability: The capacity of the service to adapt to short- and long-term changes and pressures.
- Asset Preservation, Renewal and Decommissioning (Asset PRD): The management of water supply assets to keep the service operational at its intended level of service through inspection, routine maintenance, repair, rehabilitation, renewal, and ultimately, decommissioning.
- Reliability of Service: The ability of the WSS to perform its required function under stated conditions for specified periods of time.
- Capacity to Meet Demand: The capacity of the service to meet demand under current and future conditions, extreme events, and in emergency situations.

The indicators proposed by NRC (2010) in Appendix A-6 provide detailed information regarding environmental, public health, social, security, and economic performance.
On the other hand, PIs related to the personnel, physical, and operational aspects of a water utility have not been given the required importance. Of the 37 PIs listed in Appendix A-6, 18 are essentially service indicators; the remainder relate to both the services and the assets. For example, climate change impacts can be reduced by maintaining and improving the condition of pumps and vehicles (i.e., lowering the emissions from fuel burning and use of electricity), which will also improve the overall service of the WSS. Overall, the PIs seem more directly suitable in the context of asset management at the strategic level and need to be supported with additional PIs for practical performance assessment of SMWU at the tactical level.
2.4.8 Asian Development Bank (ADB)
The Asian Development Bank (ADB) has developed the project performance management system (PPMS), a results-based management approach focusing on service targets and outcomes. The concept emerged during the 1980s, when results-based management was gradually applied to public sector management in the course of public sector reforms. Results-based management needs a series of tools to carry out strategic planning, performance monitoring and assessment, and reporting. However, this management system has three prerequisites: i) support from the leadership, ii) a results-based organizational culture, and iii) improved support systems. Moreover, the approach covers the whole project life cycle, encompassing project identification, preparation, appraisal, loan negotiations and approval, implementation, and finally project evaluation (ADB 2012). As ADB is a funding agency, it is primarily concerned with the allocation and utilization of resources in a water supply project.
After the completion of the project, the ADB’s independent evaluation department appraises the project performance and shares the experiences and lessons learned for the planning and design of new projects. The PPMS developed by the ADB is based on a comprehensive project design and monitoring framework (DMF) shown in Figure 2.10. In the DMF, the cause-effect relationship between inputs, activities, outputs, outcomes, and impacts is established to determine the targets at the result level, and to select the PIs for gauging these targets. The selection of relevant indicators is carried out with the participation of all the stakeholders in the process of problem identification, target analysis, solution selection, formation of assumptions, and risk analysis. The first column in Figure 2.10 is a design summary, which outlines the elements of the project (i.e., inputs, outputs, outcome, and impact). The other three columns provide a framework for project performance and monitoring. Details can be seen in ADB (2012). The core components of the framework proposed by ADB (2012) are impacts, outcomes, outputs, activities, and inputs. Impacts are the goals or longer-term targets referred to as the sectorial, sub-sectorial, or in some cases national targets, or the social, economic, environmental, and policy changes brought about by the project. Outcomes are targets expected to be realized upon project completion, and they should explicitly describe the specific development issues to be addressed by the project. Outputs are the physical assets, tangible goods, and/or services delivered by the project, as well as the descriptions of the project scope. Activities consist of a series of tasks conducted for the realization of outputs, and inputs are the main resources necessary for engaging in activities and generating outputs, including staff, equipment, materials, consulting services, and operating funds.
Figure 2.10 ADB (2012) project design and monitoring framework (DMF): a design summary column (impact, outcome, outputs, activities with milestones, and inputs) set against performance targets/indicators, data sources/reporting, and assumptions (As)/risks (Rs).
In the DMF system, targets (outcomes) can be gauged both quantitatively and qualitatively. In situations where quantitative analysis is not possible, qualitative indicators are determined and then converted to quantifiable data using normalization methods to realize their gauging functionality. According to this system, a PI should be clear, relevant, economical, adequate, and monitorable (CREAM), where clear means precise and unambiguous; relevant means appropriate and timely; economical means available at reasonable cost; adequate means sufficient to assess performance; and monitorable means that the indicator can be independently verified. Overall, 15 PIs at the impact level and 39 at the outcome level for urban water supply systems are presented in Appendix A-7. Indicators given in Appendix A-7 are grouped based on specific targets established by the ADB. On the other hand, all these indicators also belong to various groups of indicators as described by other organizations (refer to the comments column of Appendix A-7) (IWA 2006, AWWA 2004, NWC 2012). The framework proposed by ADB seems relatively complex for practical application to SM-WSS. However, its contribution could be useful in order to identify the commonly used important PIs.
2.4.9 Canadian Standards Association (CSA)
The Canadian Standards Association (CSA) is a non-profit organization chartered in 1919. It was later accredited by the Standards Council of Canada in 1973. In 2007, the CSA Technical Committee reviewed and recommended the International Organization for Standardization (ISO) Standards guidelines (i.e., CAN/CSA-Z24510, CAN/CSA-Z24511, CAN/CSA-Z24512) for improvement of service to users for Canadian water utilities.
Amongst these standards, CAN/CSA-Z24510 is service-oriented, whereas both CAN/CSA-Z24511 and CAN/CSA-Z24512 are management-oriented. These guidelines are applicable to both publicly operated and privately owned water utilities. All these standards propose a step-by-step, loop-back approach to establish PIs (CSA 2010). According to CSA (2010), the main objective is to provide guidelines (consistent with the goals defined by the relevant authorities) to stakeholders for the management, assessment, and improvement of water utilities to ensure desirable service to the users. The contents of the international standards given in ISO 24512: 2007 (CAN/CSA-Z24512-10) are shown in Figure 2.11. The management components of a utility defined by CSA (2010) in CAN/CSA-Z24512 are activities and processes to achieve the principal objectives, including protection of public health, meeting users’ needs and expectations, provision of services under normal and emergency situations, sustainability, promotion of sustainable development of the community, and protection of the environment. In order to meet these objectives, the CSA (2010) framework provides guidelines at the organization, planning and construction, and O&M levels for efficient performance of all the management components. Each management component has its own hierarchical structure and specific requirements at all levels. For example, one objective associated with storage structures is the provision of emergency services; therefore, the guidelines applicable to this asset would be related to watershed characteristics. The next step in the CSA (2010) framework is defining the assessment criteria. One service criterion can be related to more than one objective. For example, the assessment criterion of source protection is associated with protection of public health, provision of services under normal and emergency situations, sustainability, and protection of the environment at the same time.
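The many-to-one relationship described above (one assessment criterion serving several objectives) can be sketched as a simple mapping. Only the source protection entry is taken from the text; the data structure itself is an illustrative assumption, not CSA’s published data model.

```python
# CSA (2010)-style linkage: one assessment criterion may serve several objectives.
# Only the "source protection" row is taken from the text; the rest is a sketch.
criteria_to_objectives = {
    "source protection": [
        "protection of public health",
        "provision of services under normal and emergency situations",
        "sustainability",
        "protection of the environment",
    ],
}

# Invert the mapping to list, for each objective, the criteria that assess it.
objectives_to_criteria = {}
for criterion, objectives in criteria_to_objectives.items():
    for objective in objectives:
        objectives_to_criteria.setdefault(objective, []).append(criterion)

print(objectives_to_criteria["sustainability"])  # ['source protection']
```

Inverting the mapping makes it easy to check, objective by objective, which criteria (and hence which PIs) cover it, and to spot objectives left without any criterion.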
The calculation of PIs proposed in this framework is similar to IWA (2006), based on context information and variables. Details of each component of the framework can be seen in CSA (2010). A practical application of the CSA (2010) system is the Canadian National Water & Wastewater Benchmarking Initiative (NWWBI). The 2012 public report prepared by AECOM (2012) summarizes the performance evaluation results of 41 water utilities for the year 2010. In this report, the performance of a utility is addressed for its three major physical components (i.e., utility, water distribution, and water treatment). A total of 62 PIs are reported against all the objectives/goals (Appendix A-8). CSA (2010) does not specifically discuss the performance assessment of SMWU. Although the framework seems to be well structured and comprehensive, its direct application to SMWU needs to be further investigated by involving stakeholders in the benchmarking process (CSA 2010).
Figure 2.11 Content and application of the ISO (2007) standards: identification of physical and management components; definition of service objectives; application of management guidelines for the water utility’s O&M; definition of assessment criteria; definition of performance indicators; and performance assessment against project objectives.
2.5 Evaluation of Performance Indicator Systems
The PIs are grouped in different categories by various agencies (CSA 2010, NWC 2012, ADB 2012, OFWAT 2012, WB 2011, NRC 2010, IWA 2006, AWWA 2004) as stated above. Different agencies have used different terminologies as per their specific organizational setups and operational requirements. For example, Alegre et al. (2006) included a water interruption indicator in the operational category, whereas the same PI was grouped into the customer relations category by AWWA (2004). Moreover, various PIs associated with the economic performance of a WSS are inter-related (e.g., finance, economic, and pricing).
Some agencies, such as NWC (2012), have included PIs of pricing and finance in separate categories, whereas others have grouped them into the same economic and finance category (AWWA 2004, ADB 2012). Generally, in the case of SMWU, whose financial structure is smaller than that of the LWU (for which the various categories were developed), the relevant financial PIs can be included in one category. A comparison of different categories in Table 2.2 presents the way different agencies grouped PIs in each category; most of the PIs are related to the finance, customer service, and operational categories. Data availability might not be consistent among different utilities, even within the same geographical region. For example, sufficient data for performance evaluation may be available in a small or medium-sized utility operating in the near vicinity of a large urban center, while within the same region a similar-sized utility working under the same or different operating conditions away from large cities (with less interaction with LWU) might not have similar data. Therefore, a set of important PIs should be established for cross-comparison by country or region, and each utility can then include additional PIs according to the availability of data, water source, and administrative setup for intra-utility performance management. The agencies that consider the environment as a separate category are mostly the ones responsible for combined PA of water supply and wastewater systems, because most of the indicators are related to discharge of wastewater and sludge disposal issues (NWC 2012, AWWA 2004). Furthermore, most of the agencies included impacts on water resources in the category of water resources rather than environment, and therefore only one parameter is left (i.e., GHG emissions) in the environment category, which is relatively new as well (CSA 2010).
Therefore, it is recommended that water resources be included in this category and that the category be renamed “Water Resources and Environment”. In addition, indicators related to the impact of residual chlorine on aquatic life (not addressed so far) should be included in this category. The PIs related to water quality and public health are considered either under the customer services category or the operational category. For example, IWA (2006) considered water quality in the operational category, whereas the WB (2011) considered residual chlorine the only water quality indicator under the category of service. These are important parameters to ensure the supply of safe drinking water to consumers. It is well established that water quality regulation compliance decreases as the size of the utility decreases (USEPA 2006a), meaning more water quality problems and resulting public health issues in SM-WSS as compared to L-WSS. Therefore, water quality and public health indicators are grouped into a separate category in Table 2.2 to emphasize their importance.
Table 2.2 Number of water supply performance indicators under different categories by various agencies (split entries show sub-category counts; bracketed numbers refer to the notes below)

PI category | WB (2011) | OFWAT (2012) | ADB (2012) | NWC (2011) | NRC (2010) | IWA (2006) | AWWA (2004) | CSA (2010)
Water resources/Environmental | 11(1) | 2 | 15 | 23+3(12) | 3(15)+2 | 4 | - | 5
Physical assets | 1 | - | 2 | 2(13) | - | 15 | - | 7
Personnel/Staffing | (6+5)(3) | - | 1 | - | - | 26 | 11(20) | 17
Water quality/Public health | 2(4) | - | 13 | 7 | 3 | 5 | 1(21) | 7
Operational | (3+1)(5) | 4(8) | 10 | 5 | 7+3(16) | 39(19) | 8(22) | 6
Quality of service/Customer satisfaction | (9+3+5)(6) | 3+1(9) | 2 | 12 | (4+1+3)(17) | 34 | 2 | 4
Economic/Financial/Pricing | 35(7) | 4(10) | 11 | 3(14)+18 | 7(18) | 47 | 9(23) | 16
Total | 81 | 14 | 54(11) | 73 | 33 | 170 | 31 | 62

Notes:
(1) IBNET includes these indicators under the water consumption category (Table 4 for details)
(2) IBNET considers only metering level (Table 4 for details)
(3) 6 under the process category and 5 under the operating cost and staff category (Table 4 for details)
(4) These indicators are considered under quality of service (Table 4 for details)
(5) 3 non-revenue water and 1 pipe breaks (Table 4 for details)
(6) 9 indicators from the process category, 3 from service coverage, and 5 from quality of service (Table 4 for details)
(7) 4 under process indicators, 6 operating costs & staff, 20 billing, 2 financial, 2 assets, and 1 affordability (Table 4 for details)
(8) OFWAT used the term reliability, availability and security for this category (Table 7 for details)
(9) The SIM includes additional indicators related to customer complaints, plus 1 hosepipe restrictions indicator from the reliability category (Table 7 for details)
(10) OFWAT assesses the financial performance of companies through additional indicators (Table 7 for details)
(11) Personnel and customer complaints have not been addressed in detail; due to the different structure, see Table 9 for details
(12) Environmental indicators were considered separately by NWC (2012)
(13) Indicators of water loss and pipe breaks (5 indicators) were included under the asset category (see Table 5 for details), now added to operational
(14) 3 pricing indicators and 18 finance (Table 5 for details)
(15) These water resources indicators were considered under environmental quality by NRC
(16) 6 indicators considered under public health and 3 under economy (see Table 8 for details)
(17) 4 public safety, 1 social equity, and 3 public security (see Table 8 for details)
(18) 3 under social equity and 4 under economy by NRC (2010)
(19) Water quality monitoring was considered under the operational category by IWA (2006) (see Table 3 for details)
(20) AWWA (2004) used the term organizational development (Table 6 for details)
(21) Drinking water compliance under the water operations category by AWWA (2004)
(22) Water disruptions under customer relations; system renewal rate under business operations; water loss and structural integrity under water operations
(23) Indicators were distributed amongst the customer relations, business operations, and water operations categories

Only IWA (2006) has given due importance to the numbers, skills, training, and qualification of personnel (staff) by developing 26 PIs under a specific personnel category. Some of these PIs are specific to particular components of SM-WSS. For instance, consider a small WSS having saline groundwater as the only available water source; the only possible option would be a tertiary-level, highly technical water treatment facility, such as reverse osmosis. In this case, the PIs related to the hiring of skilled operators, their regular training, and their salaries will be extremely important. Overall, the IWA (2006) system seems to be the most balanced, with the maximum number (170) of total PIs, followed by WB (2011), whose 81 indicators are relatively well distributed amongst all categories compared to the rest of the PI systems. The 62 PIs of CSA (2010) have also been distributed rationally to cover all physical and management components of a WSS.
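As a quick consistency check, the per-category counts quoted from Table 2.2 can be summed to reproduce each agency’s reported total (split entries such as 6+5 or 23+3 are entered as their sums):

```python
# Category counts per agency as read from Table 2.2, in row order:
# water resources/environmental, physical assets, personnel, water quality,
# operational, quality of service, economic/financial (split entries summed).
counts = {
    "WB (2011)":    [11, 1, 11, 2, 4, 17, 35],
    "OFWAT (2012)": [2, 0, 0, 0, 4, 4, 4],
    "ADB (2012)":   [15, 2, 1, 13, 10, 2, 11],
    "NWC (2011)":   [26, 2, 0, 7, 5, 12, 21],
    "NRC (2010)":   [5, 0, 0, 3, 10, 8, 7],
    "IWA (2006)":   [4, 15, 26, 5, 39, 34, 47],
    "AWWA (2004)":  [0, 0, 11, 1, 8, 2, 9],
    "CSA (2010)":   [5, 7, 17, 7, 6, 4, 16],
}
reported_totals = {"WB (2011)": 81, "OFWAT (2012)": 14, "ADB (2012)": 54,
                   "NWC (2011)": 73, "NRC (2010)": 33, "IWA (2006)": 170,
                   "AWWA (2004)": 31, "CSA (2010)": 62}

for agency, per_category in counts.items():
    assert sum(per_category) == reported_totals[agency], agency
print("all column totals match")  # prints "all column totals match"
```

Every column sums to its reported total, which supports the reading of the split entries adopted here.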
It is well recognized that PIs should be clearly defined, easy and economical to measure and verify, easily understandable, and relevant to a specific WSS. Moreover, the overall framework of the PA should be simple, well defined, comprehensive (i.e., covering all components of a water utility), and comparable with similar utilities at regional, national, and international levels. Various systems and PIs are reviewed in the above sections along with their strengths and limitations. Here an effort is made to evaluate these systems for their general application (not limited to a specific region) to SMWU. The findings of the literature review conducted above are summarized in Table 2.3. The PIs defined by each PA system are evaluated in the context of SMWU on the basis of the following criteria:
- Understandability – the indicator should be easily understandable to both the utility operators and the public;
- Measurability – the data required to calculate an indicator should be easy to measure, and the indicator’s calculation should be as simple as possible; and
- Comparability – the indicator should be comparable across similar utilities in the same region as well as for international comparisons.
The following criteria have been used for the evaluation of the PI frameworks proposed by these agencies, as presented in Table 2.3:
- Simplicity – How simple (i.e., in terms of the interrelationships between data variables, indicator groups, and PIs) is the framework to implement for SMWU?
- Comprehensiveness – How, and to what level of detail, does the framework consider the most important aspects (i.e., personnel, customer service, financial, and environmental) of the WSS?
- Overall Applicability – Is the framework applicable to SMWU in its original form with minimum modifications (i.e., by just selecting the relevant suitable PIs with respect to a specific WSS)?
In Table 2.3, the IWA (2006) seems to be the most suitable system for SMWU.
This system provides a wide range of PIs with a comprehensive classification system. Moreover, the categorization of PIs into different levels may also facilitate the managers of SMWU, who can start with Level-1 indicators and then include higher levels depending on data availability. The way the data variables (as m3, numbers, and cost) and the PIs (in terms of percentages and ratios) are distinguished from each other also provides an opportunity for regulatory agencies and utility managers to perform cross-comparisons with utilities of similar sizes and types. The PI systems developed by ADB (2012), NWC (2012), and CSA (2010) also seem to be suitable for SMWU with appropriate modifications. It is important to mention here that the review of the distribution of PIs by various agencies in the above sections was carried out to evaluate how the important PIs have been grouped in each category associated with the various components of the overall PA framework of SMWU. The purpose of the comparison between different PI systems is primarily to meet the above stated objective rather than to identify the strengths and/or limitations of these PI systems. It is understandable that each system of PIs has been developed to meet the specific objectives of a certain project within its defined geographical boundaries.
2.6 Performance Assessment of Water Utilities with Limited Resources – Some Case Studies
2.6.1 South Asia – Bangladesh, India, and Pakistan
Bangladesh is one of the most densely populated countries in the world, with over 146 million residents (WSP 2009). According to the Government of Bangladesh, the urban utilities in Bangladesh are not performing well due to a lack of effective management.
Under the Bangladesh benchmarking and performance improvement for water utilities project facilitated by the Water and Sanitation Program – South Asia (WSP-SA), the concept of performance benchmarking was introduced in 2005-06 for 11 utilities of all sizes (i.e., serving populations ranging from 21,000 to 10,000,000). The Government of Bangladesh took the initiative to introduce benchmarking and performance improvement programming (BM&PIP) tools (i.e., IB-Net) along with other stakeholders. In a similar study conducted by the World Bank, the performance of over 30 urban water utilities across Bangladesh, India, and Pakistan under WSP-SA was compared to evaluate the effectiveness of the program (WB 2010). Almost all of the utilities in these countries provide intermittent supply, with an average duration of 5 hours a day. The selected PIs used in this study are given in Table 2.4. As a result of the benchmarking process in Rajkot, India, a 48% increase in billing and a 31% increase in collection were achieved over the 3-year period between 2006 and 2009. Moreover, 20,000 unauthorized connections were also regularized in the same time frame (WB 2010). It can be observed in Table 2.4 that the benchmarking process is currently focused on meeting water demands and revenue collection to meet the financial and operational requirements in these South Asian countries. It is expected that, with time, water quality, personnel, environmental, and water resource indicators will also be included in the benchmarking process.
Table 2.3 Evaluation of different performance assessment systems for their applicability to SMWU (each system is rated high, medium, or low on the criteria listed above)
2.6.2 Eastern Europe, Caucasus and Central Asia (EECCA) Countries – Armenia
In the past, under the administration of the former Soviet Union (FSU), water was supplied to consumers by public sector suppliers at very low prices. The gap between service revenues and the cost of provision used to be filled by the government budget (Mitrich 1999).
Armenia is one of the 12 countries of Eastern Europe, the Caucasus, and Central Asia (EECCA). After the collapse of the old administrative system and the subsequent financial crises in the FSU, the WSSs completely deteriorated and were unable to meet the demand of residential, commercial, industrial, and institutional consumers in Armenia (Mkhitaryan 2009). In 2001, the water sector was decentralized and privatized to improve the situation, as part of institutional, legislative, and regulatory reforms in the country. In this system, priority was given to customer satisfaction due to the higher water rates. Therefore, it was decided that the performance of the water suppliers would be assessed and customer feedback would be sought to gauge their satisfaction. After reviewing the previous studies and the data availability, the PIs were finalized by the Public Services Regulatory Commission (PSRC), Armenia in 2005 and 2008 (PAGS 2008). These selected PIs are listed in Table 2.4. After implementing these PIs, substantial improvements were observed: an almost 50% reduction in energy use; a massive drop in per capita consumption from 250 to 87 litres/day due to an improved metering system; and a significant improvement in user fee collection efficiency, from 21% before privatization to 90% after (Mkhitaryan 2009).
2.6.3 Arab Countries
In July 2010, the first training course to establish key performance indicators (KPIs) and benchmarks for water utilities in the MENA/Arab region was held in Alexandria, Egypt.
The course was organized by the Arab Countries Water Utilities Association (ACWUA); InWEnt Capacity Building International, Cairo and Germany; and the Alexandria Water Company, Egypt. Representatives from six Arab countries, including Egypt, Jordan, Syria, Yemen, Palestine, and Morocco, participated in the course. One of the main objectives was to promote the use of common PIs within the MENA/Arab region. Four categories of PIs were proposed: personnel, quality of service, O&M, and finance and economics (ACWUA 2010a). These proposed PIs were further discussed at the 1st Arab Water Week held in Amman, Jordan in December 2010. This time, 65 participants from 13 countries (in addition to the above stated countries: Algeria, Tunisia, Lebanon, Bahrain, UAE, Kenya, and Albania) participated and proposed a set of PIs, given in Table 2.4, to start the benchmarking process in the region (ACWUA 2010b).
2.6.4 Africa – Malawi and 134 Water Utilities
Kalulu and Hoko (2010) carried out the PA of a public water utility in the city of Blantyre, Malawi. It is a commercial and industrial city with a population of 661,000 as per 2008 estimates. The baseline data (variables) were collected from legal and policy documents, and from Blantyre Water. The PIs used in their study are given in Table 2.4. For the calculation of water loss, the unaccounted-for-water (UFW) indicator was used instead of the non-revenue water (NRW) indicator used in other case studies of developing countries. The PIs were then compared to the best practice targets proposed by the WB in 2002 in the form of “A Water Scoreboard” (Tynan and Kingdom 2002). In this WB note, Tynan and Kingdom (2002) used data from 246 water utilities in 51 developed countries and proposed KPIs to establish best practice targets for developing countries (Table 2.4).
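The distinction between the two water-loss indicators mentioned above can be sketched with standard water-balance arithmetic. The volumes below are hypothetical, and exact definitions vary between agencies; this sketch follows the common convention that NRW nets out only billed consumption, while UFW also credits authorized unbilled uses.

```python
# Hypothetical annual water balance for a small system (volumes in 1000 m3/year).
system_input = 1000.0        # volume entering the distribution system
billed_consumption = 700.0   # authorized consumption that is billed
unbilled_authorized = 50.0   # e.g., firefighting, mains flushing (authorized, unbilled)

# Non-revenue water: everything that produces no revenue.
nrw_percent = 100.0 * (system_input - billed_consumption) / system_input

# Unaccounted-for water: also credits authorized unbilled uses, so it is smaller.
ufw_percent = 100.0 * (system_input - billed_consumption
                       - unbilled_authorized) / system_input

print(f"NRW: {nrw_percent:.1f}%")  # NRW: 30.0%
print(f"UFW: {ufw_percent:.1f}%")  # UFW: 25.0%
```

Because UFW excludes authorized unbilled volumes, it is always at most equal to NRW for the same balance, which is one reason cross-study comparisons must state which indicator was used.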
Table 2.4 Checklist of key performance indicators used in developing countries. The checklist compares indicator use across the Arab countries (2010), Africa (2009), South Asia (2009), Armenia (2009), and WB Scoreboard (2002) programs:
- Water Resources/Environmental: water sourced; water supplied; water balance; water consumption
- Personnel: number of staff per 1000 connections; personnel training; health and safety
- Quality of Service/Customer Service: supply coverage; customer complaints; response to complaints; water availability/supply duration
- Operational: non-revenue water/UFW; metering (new connections and maintenance); water treatment plant operation; water main breaks
- Water Quality/Public Health: water quality compliance
- Financial/Economic: revenue/financial inputs (bills); O&M cost; energy cost; water charges; billing efficiency; collection period; working ratio; tariff structure; connection charges
The Water Operators Partnership Program for Africa (WOP-Africa) was initiated in December 2006 with the Nairobi Workshop by the WB Water and Sanitation Program (WSP) to endorse the idea of involving a number of African utilities (WOP-Africa 2009). Under this program, a self-assessment process for 134 African utilities from 35 countries was started with a comprehensive utility self-assessment questionnaire (USAQ) adopted from the IB-NET assessment tool. The PIs selected from the standard IB-NET assessment tool are listed in Table 2.4. Table 2.4 indicates that the PIs related to water supplied, water sourced, and the overall water balance are the most commonly used indicators in the water resources category. The number of personnel per 1000 connections is also a very common indicator. Customer complaints were recorded in almost all the cases in Table 2.4. The NRW is the most commonly used indicator in the operational category; and the overall costs of O&M, energy, and water charges are the most commonly used financial PIs. The results of these case studies suggest that even in developing countries a set of easily measurable PIs can improve the performance of water utilities. Thus, these PIs also provide a guideline for SMWU with limited available data. A framework consisting of suitable PIs is devised to start, implement, and improve the performance evaluation process for SMWU in Chapter 3.
2.7 Selection of Performance Indicators
PIs should be selected against suitable multiple criteria such as adequacy, applicability, usefulness, measurability, attainability, understandability, relevancy, and comparability (ADB 2012, Lee 2010, Giff and Crompvoets 2008, Artley and Stroh 2001). In reported PA studies for water utilities, the PIs have either been selected primarily on the basis of data availability or to assess the performance of a specific component (e.g., water quality) (Zhang et al. 2012, Sadiq et al. 2010, Coulibaly and Rodriguez 2004). In a recent study by Shinde et al.
(2012), PIs were revised for small water utilities in Japan using principal component analysis based on data obtained from 177 utilities. Such an approach, which selects PIs primarily on the basis of available data, may overlook several important organizational components of a water utility. Toor and Ogunlana (2010) selected PIs for large-scale public sector projects by conducting a survey in which the suitability of PIs was ranked based on the professional judgment of the stakeholders. Wong et al. (2008) developed key indicators for intelligent building systems through a survey based on the suitability of each proposed indicator. Ugwu and Haupt (2007) evaluated PIs for the sustainability of infrastructure projects adopting a similar ranking approach by involving different types of respondents (i.e., contractors, architects, engineers, consultants, public and private clients, etc.). This approach does not cover other important selection criteria for an indicator, such as measurability, comparability, and understandability. Besides, participants in such surveys have conventionally been asked to score the attributes of an indicator on a five-point Likert scale (1 = not suitable to 5 = most suitable). These ordinal (qualitative) scales, with such a small rating range, generate significantly small differences in the final scores. Selecting a set of the most suitable PIs based on such scores might not be easy for the decision maker (DM). Secondly, defining a cut-off (for example, the top 40% or values greater than 3.5) for the list of ranked PIs limits the applicability of the selected PIs to the specific case and does not provide a planned opportunity to include additional PIs in the future as data management practices improve. A system of initially identified PIs is devised to start, implement, and improve the performance evaluation process for SMWU in Chapter 3.
The proposed system mainly consists of a list of the most simple and relevant PIs based on the review carried out above. For the final selection of PIs, a detailed model based on multicriteria analysis is developed in Chapter 4.
2.8 Performance Benchmarking for Water Utilities
Benchmarking has become an essential and continuous activity in several organizations and has gained strategic importance for improving performance in today’s competitive environment (Sun 2010). Conventionally, linear regression equations of PIs have also been used for the metric performance benchmarking process (Shinde et al. 2013, AWWA 1996). This approach does not appropriately address the relative performance of average-performing utilities; therefore, PA results based on such linear relationships might be misleading. A detailed argument on this is presented in Chapter 5. Lambert et al. (2014) reported 14 years of experience with best practices for water balance developed by the International Water Association (IWA) in 71 water utilities spread over 12 high-income European countries. Singh et al. (2014) used 4 PIs to assess the performance of 12 water utilities in India. Shinde et al. (2013) developed linear regression equations (obtaining data from 199 utilities) for performance benchmarking of small utilities in Japan. Rouxel et al. (2008) used contractual and commercial PIs for customer service management of three privately owned water utilities in Italy. Theuretzbacher-Fritz et al. (2008) discussed the use of different types of denominators for PIs, and their use in performance benchmarking of Austrian water utilities. Palme and Tillman (2008) studied the application of sustainable development indicators; in addition to the financial PIs, they also included environmental, operational, and social PIs for Swedish water utilities.
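The limitation of regression-style metric benchmarking noted above can be made concrete with a toy sketch (utility names and figures are hypothetical): a PI is regressed on a scale variable and each utility is judged only by its residual from the fitted line, so a mid-field utility near the line is deemed acceptable regardless of how far the whole sample sits from desirable performance.

```python
# Hypothetical metric benchmarking: regress O&M cost per connection on
# utility size, then judge each utility by its residual from the fitted line.
utilities = {            # name: (connections, O&M cost per connection in $)
    "A": (2000, 120.0),
    "B": (5000, 100.0),
    "C": (10000, 85.0),
    "D": (20000, 80.0),
}

xs = [x for x, _ in utilities.values()]
ys = [y for _, y in utilities.values()]
n = len(xs)
x_bar, y_bar = sum(xs) / n, sum(ys) / n
# Ordinary least squares, closed form for one predictor.
slope = (sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys))
         / sum((x - x_bar) ** 2 for x in xs))
intercept = y_bar - slope * x_bar

residuals = {name: y - (intercept + slope * x)
             for name, (x, y) in utilities.items()}
for name, r in residuals.items():
    verdict = "above the benchmark cost line" if r > 0 else "on or below it"
    print(f"{name}: residual {r:+.1f} -> {verdict}")
```

The residuals necessarily sum to zero, so roughly half the sample is always labelled “good” and half “bad” relative to the line, whatever the absolute performance level.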
In the past, Marques and Monteiro (2001) used 50 indicators, divided into five groups (structural, operational, quality of service, personnel, and economic), for performance management by developing regression equations using data obtained from 25 water utilities in Portugal. The most commonly used non-parametric methods for performance benchmarking of water utilities include data envelopment analysis, the Malmquist productivity index, and total factor productivity (Berg and Marques 2011). Marques et al. (2011) applied data envelopment analysis using more than 500 observations encompassing 1144 water utilities in Japan. Carvalho and Marques (2011) included exogenous variables in the efficiency assessment of 66 Portuguese water utilities using non-parametric data envelopment analysis. Corton and Berg (2007) used the total factor productivity index for benchmarking Central American water utilities. This method focuses on productivity changes over time and takes several inputs and outputs into account in the analysis. Stochastic frontier analysis has also been used for benchmarking economic indicators of water utilities (Antoniolli and Filippini 2001). Correia et al. (2008) applied stochastic frontier analysis to estimate cost functions, using data from 66 water utilities representing 60% of the Portuguese population. Alsharif et al. (2008) also used data envelopment analysis to evaluate the efficiency of water supply systems in Palestine, primarily in terms of water loss. Singh et al. (2014) compared the PI system and data envelopment analysis for performance benchmarking of 12 water utilities in India, and obtained similar results from both methods. Low economies of scale (e.g., higher operating expenses, low debt ratios, low cost recovery, etc.) and data scarcity are the major challenges for SMWU (Worthington and Higgs 2014; Rahill and Lall 2010).
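The intuition behind these efficiency-benchmarking methods can be sketched with a deliberately simplified output/input productivity ratio normalized to the best performer. Note that full DEA instead solves a linear program per utility with multiple inputs and outputs; the utility names and figures below are hypothetical.

```python
# Simplified efficiency scoring (hypothetical data): each utility's
# output/input ratio, normalized so the best performer scores 1.0.
# Full DEA would instead solve a linear program per decision-making unit.
utilities = {
    # name: (input: operating cost, output: volume of water delivered)
    "utility_A": (2.0, 10.0),
    "utility_B": (3.0, 12.0),
    "utility_C": (4.0, 12.0),
}

productivity = {u: out / inp for u, (inp, out) in utilities.items()}
best = max(productivity.values())
efficiency = {u: p / best for u, p in productivity.items()}

for u, e in efficiency.items():
    print(f"{u}: {e:.2f}")
```

With a single input and output this ratio scoring coincides with the simplest DEA frontier; the method's appeal for regulators is that efficiency is expressed relative to observed peers rather than an absolute standard.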
The number of PIs should be kept to a minimum, although balanced to cover all the functional components, to minimize the cost of data collection and validation of the performance assessment model (Alegre et al. 2009). The Inter-American Development Bank recently developed a universal rating system known as AquaRating for water and wastewater utilities (IDB 2014). The system normalizes the calculated values of 113 assessment elements without addressing the above stated challenges for SMWU. Moreover, existing systems aggregate the normalized PI scores with a simple weighted average, without considering to what extent each value is close to or far from the desirable performance (Figure 2). Performance evaluation results that ignore these issues may not rationally accommodate the sensitivities in the calculated values of the PIs for SMWU. Most of the above mentioned studies were conducted for inter-utility performance assessment based on comparison among utilities operating in the same geographical region. Such assessment methods rely on the involvement of similar water utilities over several years, which is not the case for Canadian SMWU. An inter-utility performance benchmarking model is developed in Chapter 5 to assess the performance of all the functional components of SMWU. A water utility may consist of more than one WSS, and each functional component (e.g., personnel, operational, etc.) may include several sub-components. Moreover, aggregating all the PIs to estimate the overall performance of a functional component (i.e., inter-utility benchmarking) can eclipse the underlying processes (sub-components). For an underperforming utility, it is important to identify the underperforming processes (within a functional component) for effective decision making. Studies on intra-utility performance management have rarely been reported in the literature. A comprehensive model addressing all these issues is developed in Chapter 6.
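The aggregation issue noted above can be sketched by normalizing each PI against a worst acceptable value and a desirable benchmark before taking a weighted average, so that the score reflects distance from the target. All PI values, bounds, and weights below are hypothetical.

```python
# Sketch of aggregating normalized PI scores (hypothetical values and weights).
# Each PI is normalized between a worst acceptable value and a desirable
# benchmark, so the score captures how far the utility is from the target,
# which a plain weighted average of raw values misses.
pis = {
    # name: (measured, worst_acceptable, desirable, weight)
    "non_revenue_water_pct": (28.0, 40.0, 10.0, 0.4),   # lower is better
    "metering_level_pct":    (65.0, 20.0, 100.0, 0.3),  # higher is better
    "training_hrs_per_emp":  (12.0, 0.0, 40.0, 0.3),    # higher is better
}

def normalize(measured, worst, desirable):
    """0 at the worst acceptable value, 1 at the desirable benchmark."""
    score = (measured - worst) / (desirable - worst)
    return max(0.0, min(1.0, score))

scores = {k: normalize(m, w, d) for k, (m, w, d, wt) in pis.items()}
overall = sum(scores[k] * pis[k][3] for k in pis)
print(scores, round(overall, 3))
```

Because the direction of "better" is encoded by the order of the bounds, indicators where lower values are desirable need no special casing.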
2.9 Customer Satisfaction Assessment

Traditionally, CS for water utilities has been assessed in two ways: i) performance benchmarking, and ii) interviews and surveys. The first type is conducted with the help of basic performance indicators, such as the number of customer complaints, the response to reported complaints, unplanned service interruptions, and water rates and billing mechanisms (USAID, 2008; Marques and Monteiro, 2001). Generally, these indicators are calculated as the number of reported complaints per specified number of customers (say, 1000) over a specific assessment period, and are then compared with those of other water utilities to establish performance benchmarks. In the second type, the utility investigates customer preferences and acceptance levels (willingness to bear the performance gap). Preference means that the option selected by the customers can only be compared with other available options through surveys on willingness to pay (KWR, 2008). Most of the performance assessment studies carried out in the past focused on the operational, personnel, water quality, and financial components of water utilities (e.g., Berg and Danilenko, 2011; Sadiq et al. 2010; Corton and Berg, 2009; El-Baroudy and Simonovic, 2006). The National Water Commission of Australia (NWC), the Canadian Standards Association (CSA), the American Water Works Association (AWWA), and the International Water Association (IWA) include complaints about water quality, water continuity, pressure, billing, and service; the duration and frequency of unplanned interruptions; an average targeted time of response to these complaints; and the cost of customer communication (NWC, 2012; CSA, 2010; AWWA, 2008; Alegre 2006). The actual root causes of the complaints, and the time taken to resolve them, have not been addressed in these benchmarking processes.
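The first, complaint-based type of CS indicator can be sketched as follows (all counts below are hypothetical):

```python
# Sketch of conventional complaint-based CS indicators (hypothetical counts).
complaints = 46          # complaints reported during the assessment period
connections = 2300       # customers served
responded_in_time = 39   # complaints answered within the target time

# Complaints per 1000 connections over the assessment period
complaints_per_1000 = complaints / (connections / 1000)

# Share of complaints responded to within the targeted time (%)
response_rate_pct = 100 * responded_in_time / complaints

print(f"{complaints_per_1000:.1f} complaints per 1000 connections")
print(f"{response_rate_pct:.1f}% responded within target time")
```

As the surrounding text notes, such rates say nothing about the root causes of the complaints or how long resolution actually took.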
Several agencies and other peer-reviewed studies have used customer interviews for the assessment of CS (Hanson and Murrill, 2013; CDM, 2011; KWR, 2008; USEPA, 2003). Recently in the United States, the effective utility management initiatives' strategic plan for FY 2013-2017 has proposed to promote and enhance community participation, environmental stewardship communications for awareness, collection of customers' feedback, and analysis of this feedback to prioritize customers' communication needs (CWW, 2013). In the UK and Europe, for privately owned water utilities, CS is assessed using water rates and the service incentive mechanism index, which evaluates CS both qualitatively and quantitatively. In the qualitative analysis, interviews are conducted, whereas the quantitative analysis uses records of telephonic and written complaints, complaints responded to or abandoned, and the time of response to complaints (WUG, 2014; OFWAT, 2012). Such extensive approaches might not be viable for SMWU on a periodic basis, where limited personnel expertise and financial constraints are the main challenges. A detailed risk-based model is developed in Chapter 7 for the assessment and management of customer satisfaction in SMWU, using a sustainable approach based on the record of customer complaints and the experience of field personnel.

Chapter 3 Identification of Suitable Performance Indicators

A part of this chapter has been published in Environmental Reviews, an NRC Research Press journal, as a review article titled "Performance Indicators for Small and Medium Sized Water Supply Systems: A Review" (Haider et al. 2014a). In this chapter, a system is devised based on the initially identified PIs to start, implement, and improve the performance evaluation process for SMWU. The proposed system primarily identifies the relevant PIs for each functional component based on the review carried out in Chapter 2; detailed selection is carried out in Chapter 4.
3.1 Background

The proposed system of PIs shown in Figure 3.1 provides a stepwise approach based on three levels of indicators (start-up, additional, and advanced PIs), depending on the availability of resources and the site-specific requirements of SMWU. Required data variables are presented for the calculation of the start-up PIs, whereas additional and advanced PIs are only listed, because their detailed data requirements cannot be covered here. Sufficient data sources for using the advanced PIs are provided in Appendix A for consultation. Moreover, the details of the selected PIs used for the IU-PBM and In-UPM are given in Chapters 4, 5 and 6. The utility can evaluate its existing data availability, and then select the level from which to start its PA process. A small water utility facing issues related to lack of funding and availability of trained staff in a developing country may start the performance evaluation process with a few of the most relevant start-up indicators. The start-up indicators might be different in the case of a small utility located near a larger city in a developed country, where, for example, it is easier for the municipality to hire and retain trained personnel; the number and types of PIs in the latter case would be much higher than in the former. Moreover, special care is required in the case of a medium sized water utility whose population is only slightly more than that of a small sized water utility. For example, if the maximum population limit for a small sized utility is 3300 persons, a medium sized utility with 5000 persons might face the same difficulties as smaller utilities.

Figure 3.1 Proposed system of PIs to start, proceed and improve the performance evaluation mechanism in SMWU

3.2 Categorization of Performance Indicators for SMWU

In the following sub-sections, the proposed PIs, along with their units and required data variables, are listed under each category.
The selected PI categories include water resources and environmental, personnel/ staffing, physical assets, operational, water quality and public health, quality of service, and financial/ economic indicators. The users of the selected PIs under each category are also mentioned in the respective tables. The users within a water utility are classified as technical personnel (T), managers (M), and policy/ decision makers (P), based on the nature and relative significance of the PI in the decision-making process. More than one user may be associated with a single PI.

3.2.1 Water Resources and Environmental Indicators

Proposed water resources and environmental indicators for SMWU are presented in Table 3.1. Some of the water resource indicators interact strongly with environmental ones. For the sustainable utilization of natural water resources, the amount of water sourced should not affect the existing designated use of the source (i.e., downstream use in the case of surface waters, and lowering of the groundwater table). In the case of groundwater, the source yield of a pump is usually given with the pump design, and/or in the findings of the hydrogeological investigations carried out during selection of the source.
On the other hand, hydrological analysis of drought (low-flow) conditions in the case of a surface water source can give information about the maximum allowable draw during low-flow periods. It is better to base assessments on a yearly average, but assessment periods of less than a year can also be adopted by using proportionate time periods (i.e., 365/assessment period) (IWA 2006). Availability of water resources in a sustainable way is an important water resource indicator.

Table 3.1 Proposed water resources and environmental indicators

1. Availability of water resources (%) [users(1): T/M/P]
- Calculation: [(volume of supplied water in a year)/(annual yield of water resources)] x 100
- Data variables: water supplied (metered volume); assessment of possible annual yield of the source (from surface or ground or both) as per regulations
- Additional indicators(3): efficiency of reused water supply; water license capacity; sector-wise availability of water resources (i.e., domestic, industrial, commercial)
- Advanced (long-term) indicators(4): sector-wise availability of water resources for both potable and non-potable water (if applicable)

2. Greenhouse gas emissions(2) (tonnes CO2-equivalent per 1000 connected water properties) [users: T/P]
- Calculation: GHG emissions = [quantity of electricity purchased (kWh) x (emission factor/1000 connections)]
- Data variables: number of connections; electricity consumption records; emission factors established at state level
- Additional indicators: GHG emissions from routine transport fuel use; disposal of backwash water (aquatic life should not be affected)
- Advanced indicators: GHG emissions from fuel consumption by stand-by pumps, construction equipment working during maintenance works, etc.

3. Days with restrictions to water service (hosepipe or sprinklers) (%) [users: T/M/P]
- Calculation: [(total number of days with water service restrictions during the year)/(days in a year)] x 100
- Data variables: record of the number of days with service restrictions during the whole year

4. Impact of residual chlorine on aquatic life due to leakage in mains passing near natural water bodies (additional indicator) [users: T/M]

5. Impact of residual chlorine on aquatic life due to flushing of mains (additional indicator) [users: T/M]

Notes: (1) T = technical personnel, M = management personnel, P = policy/decision makers; (2) might be applicable to medium sized utilities only, in countries where emission factors for consumption of purchased electricity from the grid have been established (CWA 2011); (3) need to be added within a year; (4) might be suitable for medium sized utilities in developed countries.

Every water source needs to be sustainably utilized by ensuring the continuity of its intended water uses. Integrated water (quality and quantity) management plans require estimation of the optimum degree of treatment of the wastewater discharged to receiving waters (also known as total maximum daily loads), based on the allowable threshold concentrations of the aquatic ecosystem, to avoid water quality problems. In this connection, a minimum amount of water in the water body is always required to provide a certain dilution.
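The start-up PIs of Table 3.1 can be sketched as follows. All figures are hypothetical, and the emission factor is assumed here to be expressed in kg CO2-e per kWh (the table leaves the unit to the state-level factor used):

```python
# Sketch of the start-up water resources/environmental PIs from Table 3.1
# (all input figures are hypothetical).
supplied_m3 = 450_000      # metered volume of water supplied in a year
annual_yield_m3 = 600_000  # assessed annual yield of the source
electricity_kwh = 120_000  # purchased electricity for the year
ef_kg_per_kwh = 0.67       # state-level emission factor (assumed kg CO2-e/kWh)
connections = 1500         # connected water properties
restricted_days = 18       # days with service restrictions during the year

# PI 1: availability of water resources (%)
availability_pct = 100 * supplied_m3 / annual_yield_m3

# PI 2: GHG emissions (tonnes CO2-e per 1000 connections)
ghg_t_per_1000 = electricity_kwh * ef_kg_per_kwh / 1000 / (connections / 1000)

# PI 3: days with restrictions to water service (%)
restriction_pct = 100 * restricted_days / 365

print(availability_pct, round(ghg_t_per_1000, 1), round(restriction_pct, 1))
```

Each PI needs only billing records, source-yield assessments, and an operations log, which is the point of the start-up level.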
Environmental protection agencies, in collaboration with water supply agencies, issue water licenses to water utilities based on their management plans. The utility is supposed to draw water within the allocated license capacity. To achieve this objective, the 'water license capacity' should be compared with the amount of water supplied as an indicator. This indicator also predicts future needs for water conservation, water reuse, and investigation of a new water source. In water-scarce areas, reuse of wastewater is commonly practiced (e.g., for agriculture, with treatment through waste stabilization ponds). If the amount of water reused cannot be calculated initially, these estimates can be made in the following years. The concept of water reuse can also be useful in situations where the main water use is agriculture. In this regard, there is a need for careful assessment using a water balance approach based on the amount of available reuse water, precipitation, and crop water requirements. This approach is environmentally sustainable with additional socio-economic benefits. Further detailed PIs can be added in later years (if relevant) for type of water (potable, non-potable) and type of service area (domestic, industrial, commercial, etc.), as mentioned in NWC (2010). GHG emissions are responsible for climate change and may lead to drought conditions in the future. These emissions can be considered at the start, since it is not difficult to collect data on electricity bills and the number of connections. However, not all countries have established emission factors for the consumption of purchased electricity covering CO2, CH4, and N2O emissions as CO2 equivalents. The values of the combined emission factors range between 0.3 and 1.21 across the Australian states and territories.
A value of 0.67 was recommended for those Australian territories in which emission factors have not been established (CWA 2011). Water restrictions indirectly describe the availability of water resources in the supply area and the effectiveness of the water conservation plan. Sprinkler water regulations should be implemented throughout the year, given the present scarcity of freshwater around the globe. It is also recommended to develop sustainable strategies to control water consumption even in areas where abundant water resources are available. Accepting higher water losses on the basis of financial analysis (i.e., a lower cost of water, in the case of plentiful ground or surface water, compared to the repair cost of leaking water mains or service connections) could be highly misleading for long-term sustainability. Consider a WSS with very high water consumption: more than 80% of the water is converted into wastewater, which will not only lead to higher wastewater treatment costs (and GHG emissions) but will also negatively impact the receiving water body (e.g., low dissolved oxygen, eutrophication, etc.). Therefore, sufficient attention should be given to the PIs associated with water resources and the environment. The impact of residual chlorine on aquatic life has not been addressed in the literature so far. In this regard, the following two indicators are proposed here (PIs No.
3 and 4, Table 3.1):

WE 3: Impact of residual chlorine on aquatic life due to mains leakage and breaks. This PI can be estimated in terms of the distance of the broken water main from the receiving water body and the ground slope.

WE 4: Impact of residual chlorine on aquatic life due to flushing of water mains = (flow rate of water in the receiving water body)/(flow rate of water drained from flushing of the main), OR the distance between the point of flushing and the receiving water body in terms of the length of the surface drain.

For both indicators, the impact can be considered negligible if high (i.e., 1:10) dilution is available in the receiving water body.

3.2.2 Personnel/ Staffing Indicators

Personnel and staffing indicators are important and strongly aligned with the performance of the other functional components as well. However, the review conducted in Chapter 2 shows that most agencies either have not given relative importance to this category or have ignored it completely. As in other organizations, human resources is an important department in a water utility; therefore skills, qualifications, experience, training, health and safety, and overtime culture should always be included in the performance evaluation process, irrespective of the size of the utility. The personnel indicators recommended by IWA (2006), arranged within the proposed staged framework for SMWU, are presented in Table 3.2. Most of these indicators are simple to calculate with data obtained from the human resources department of the utility under study. The number of O&M staff may increase with the age of the physical assets and their associated problems. If the hiring of additional staff is planned accordingly, utility performance will be maintained through prompt response to customer complaints. On the other hand, in the case of new and small utilities where customer complaints and operational failures are low, a higher number of O&M staff may result in low productivity.
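The start-up personnel PIs of Table 3.2 can be sketched as follows (all staffing data below are hypothetical):

```python
# Sketch of the start-up personnel/staffing PIs from Table 3.2
# (hypothetical staffing data).
employees = 12        # full-time employees
connections = 4800    # service connections
training_hours = 96   # total training hours during the year
major_accidents = 1   # major working accidents in the year

employees_per_1000_conn = employees / (connections / 1000)
training_hrs_per_employee = training_hours / employees
accidents_per_100_employees = major_accidents / (employees / 100)

print(round(employees_per_1000_conn, 1),
      round(training_hrs_per_employee, 1),
      round(accidents_per_100_employees, 1))
```

For smaller utilities the table suggests rescaling the denominators (employees per 100 connections, accidents per 10 employees) so that the values stay in a readable range.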
Additional PIs related to specific components, such as water resources, catchment, treatment, transmission, and distribution, can also be calculated for SMWU, as shown in Table 3.2. Personnel training for state-of-the-art equipment, operational methods, and new software is also important. According to Brown (2004), one of the most important indicators of successful SMWU in the United States is the training of operators and decision makers. Major accidents during operations, in which workers have been hospitalized, should always be recorded to indirectly measure the effectiveness of the health and safety procedures adopted by the utility.

Table 3.2 Proposed personnel/ staffing indicators

1. Employees per connection (No./1000 connections)(2) [users: T/M]
- Calculation: [(number of full-time employees)/(number of service connections/1000)]
- Data variables: total personnel; total number of service connections
- Additional indicators(1): employees per volume of water supplied

2. Management personnel (%); Technical personnel (%) [users: M]
- Calculation: [(number of full-time management (i.e., finance, human resources, marketing, customer services) employees)/(total number of full-time employees)] x 100; [(number of full-time technical (i.e., planning and construction, operation and maintenance) employees)/(total number of full-time employees)] x 100
- Data variables: number of connections; number of total management personnel; number of total technical personnel
- Additional indicators: water resources and catchment management employees; abstraction and treatment employees; transmission, storage and distribution employees; general management personnel; human resources management personnel; financial and commercial personnel; customer services personnel; planning and construction personnel; operation and maintenance personnel

3. Qualified personnel (%) [users: M/P]
- Calculation: [(number of full-time employees with university degree and basic education)/(total number of full-time employees)] x 100
- Data variables: number of personnel with university degree and high school education; total number of employees
- Additional indicators: university degree personnel; basic education personnel; other qualifications

4. Personnel training (hours/employee/year) [users: M/P]
- Calculation: [(number of training hours during a year)/(total number of employees)]
- Data variables: total number of training hours; total number of employees
- Additional indicators: internal trainings; external trainings

5. Working accidents (No./100 employees/year)(3) [users: T/M]
- Calculation: [(number of major working accidents in a year)/(total number of employees/100)]
- Data variables: number of major accidents in a year; total number of employees
- Additional indicators: absenteeism; overtime

6. Water quality monitoring personnel (No./100 tests/year) (additional indicator)

7. Metering management personnel (No./100 meters) (additional indicator)

Notes: (1) could be added during the next year; (2) for smaller utilities, units of employees/100 connections can be used; (3) for smaller utilities, units of No./10 employees/year can be used; (4) might be suitable for medium sized utilities in developed countries.

3.2.3 Physical Assets Indicators

Physical indicators are related to the performance and efficiency of various components (assets), such as storage, pumping, treatment, transmission, and distribution mains. Most of these components are designed to meet demand until the end of their design period (i.e., optimally equal to their structural life). Considering both the remaining capacity and the total structural life together, technically feasible and economically viable future planning can be done. The outcomes of these PIs are useful for asset management. Proposed indicators in this category are listed in Table 3.3. It can be seen in Table 3.3 that treatment plant utilization and level of metering are the most important PIs in this category.
Treatment plant capacity utilization indicates the need for additional treatment units in the future, and the metering level is important for estimating water losses. It is very common in SMWU to provide water at flat rates, particularly when the source water is ample. Such utilities may face several operational complications, including wastage of large volumes of water and difficulties in the estimation of non-revenue water. The estimates of water loss would be more accurate if based on data from the bills of metered connections. It has also been reported that a reason for higher water loss is a poor metering system (Corton and Berg 2009). Detailed PIs related to valves, hydrants, and automation and control could be important for MWU in developed countries. Some source water bodies, particularly rivers and streams, are influenced by large flow variations. Storage of water during high flows is necessary, with application of rainwater harvesting methods or diversion structures. Therefore, an indicator of raw water storage capacity is also included for SMWU in Table 3.3.

Table 3.3 Proposed physical/ asset indicators

1. Treatment plant capacity(1) (%) [users: M/P]
- Calculation: [(maximum volume of water treated per day)/(maximum daily design capacity)] x 100
- Data variables: treated water supplied from the water treatment plant; design capacity of the treatment plant
- Advanced (long-term) indicators(3): remaining capacity of the treatment plant

2. Raw water storage capacity (days) [users: M/P]
- Calculation: [(net capacity of raw water reservoir)/(volume of supplied water during the period of assessment)]
- Data variables: volume of the reservoir (can be monitored by keeping a record of the reservoir level); volume of supplied water (metered volume supplied); an assessment period of one year will give a reliable estimate
- Additional indicators(2): treated water storage (days); Advanced: remaining capacity of storage

3. Metering level (%) [users: T/M]
- Calculation: [(number of connections with meters installed)/(total number of connections)] x 100
- Data variables: number of metered connections; total number of connections
- Advanced: metering density (No./1000 service connections)

4. Additional indicators: pumping utilization (%); energy recovery (%); standardized energy consumption (see the IWA manual for details). Advanced: reactive energy; standardized energy consumption (see the IWA manual for details)

5. Advanced: valve density (No./km of main); hydrant density (No./km of main)

6. Advanced: degree of automation and remote control units

Notes: (1) applicable to utilities relying on surface water sources, or where saline groundwater is treated with reverse osmosis or marine water with thermal desalination; (2) can be added during the next year; (3) might be suitable for medium sized utilities in developed countries.

3.2.4 Operational Indicators

Operational indicators are essentially related to the inspection and maintenance of the physical assets discussed above. IWA (2006) and others have included water quality monitoring indicators in the same category, but in this research water quality indicators are proposed under a separate category. The recommended operational indicators are listed in Table 3.4. Periodic cleaning of storage tanks is mandatory to reduce water quality issues resulting from the formation of algae, particularly for surface water sources. For smaller diameter mains, it might not be economical to conduct condition assessment. Therefore, the percentage of mains subject to leakage during the year needs to be identified during the assessment period for assessing the structural integrity of the WDS. The length of rehabilitated mains also provides information regarding the operational efficacy of the utility, as well as the condition of the mains.

Table 3.4 Proposed operational indicators

1. Cleaning of storage tanks per year [users: T]
- Calculation: [(total volume of storage tanks cleaned during the assessment period)/(total volume of all storage tanks)]
- Data variables: sum of the volumes of storage tanks cleaned; total storage volume of tanks

2. Leakage (%/year) [users: T/M]
- Calculation: [(length of mains detected to leak in a year)/(total mains length)] x 100
- Data variables: number of breaks in a pipe (the total length of that pipe is known from design); pipe lengths
- Additional indicators(1): leakage detection and repairs

3. Rehabilitation, renewal or replacement of mains (%/year) [users: M/P]
- Calculation: [(length of mains rehabilitated, renewed or replaced during a year)/(total mains length)] x 100
- Data variables: number of repairs; pipe lengths
- Additional indicators: rehabilitated, renewed and replaced mains individually

4. Unaccounted for water (UFW) [users: T/M]
- Calculation: [(system input volume) - (billed and unbilled authorized consumption)]
- Data variables: system input volume; billed consumption data; unbilled authorized consumption data
- Advanced (long-term) indicators(2): apparent losses; real losses; infrastructure leakage index (ILI)

5. Main failures (No./100 km/year) [users: T/M]
- Calculation: [(number of main failures during the year, including valves and fittings)/(total main length/100)]
- Data variables: main failure data irrespective of type; total main length of the distribution system
- Additional indicators: pump failures; hydrant failures; power failures

6. Operational meters (%) [users: T/M]
- Calculation: [(number of direct customer meters installed that are operational)/(total number of meters installed)] x 100
- Data variables: total number of installed meters; complaints received for out-of-service meters, or meters found out-of-service during meter reading
- Additional indicators: customer meter reading efficiency

7. Additional indicators: refurbishment or replacement of pumps (%/year); frequency of inspection of pumps

8. Additional indicators: inspection of mains (valves, fittings and hydrants). Advanced: valves replaced; service connections replaced

9. Advanced: inspection and calibration of instruments (see IWA 2006 for details)

10. Advanced: degree of automation and remote control units

Notes: (1) can be added within a year; (2) might be suitable for medium sized utilities in developed countries.

The "best practice" proposed by the IWA Task Force, a water balance to determine losses in a water distribution network, is shown in Figure 3.2 (Alegre et al. 2000; Hirner and Lambert 2000). This is the only comprehensive water loss calculation framework that can be efficiently used for cross-utility comparisons at an international level (Lambert 2003). All the components of the water balance given in Figure 3.2 need to be determined in terms of volume of water (preferably for one year). The main components of apparent losses are illegal use (theft) and the errors associated with billing, data handling, and metering, as shown in Figure 3.2. Experience shows that apparent losses may range between 1 and 9% of the total system input volume (Lambert 2002). It has also been reported that the main component of apparent losses is inaccuracies in meters (Mutikanga et al. 2009; Criminisi et al. 2009). The other component of water losses is real losses (also known as physical losses) due to leakage from different components of a WSS. Recent studies have found that one third of the total water lost in urban areas is due to leaks and breaks of water mains (Kanakoudis and Tsitsifli 2010). The following real loss indicators have been reported in the literature (Sharma 2008; Radivojević et al. 2008; Hamilton et al. 2006):
- percentage of system input volume;
- per property per day;
- per length (km) of mains per day;
- per service connection per day;
- per service connection per day per metre of pressure;
- per length (km) per day per metre of pressure; and
- per length (length of mains + length of service connections up to meter locations) of system per day.

Hamilton et al.
(2006) developed a matrix to identify the limitations of the above mentioned real loss indicators in consideration of the key factors that affect real losses. According to them, none of the indicators takes all the key factors affecting real losses into account. Detailed discussions on the application of the above stated PIs of real losses can be found in the literature (Kanakoudis and Tsitsifli 2010; Radivojević et al. 2008; Hamilton et al. 2006; Lambert and Hirner 2000; Lambert and Morrison 1996; Arscott and Grimshaw 1996; Butler and West 1987). It is well recognized that real losses cannot be completely avoided economically, due to the continuous, unavoidable deterioration of the WDS (Radivojević et al. 2008). The IWA Task Force recommended a comparison between the Current Annual Real Losses (CARL) and the Unavoidable Annual Real Losses (UARL) (Hamilton et al. 2006). Lambert et al. (1999) developed the following empirical relationship to calculate UARL:

UARL (litres/day) = (18 x Lm + 0.8 x Nc + 25 x Lp) x P [3.1]

where Lm is the length of mains (km); Nc is the number of service connections; Lp is the length of private service pipes from the property boundary to the meter (km); and P is the average pressure (m). A value of zero for Lp can be used when a meter is installed at the boundary line.

Figure 3.2 Components of the water balance for calculation of water losses in water distribution, as defined by Farley and Trow (2003)

The infrastructure leakage index (ILI) has been well recognized as the most appropriate indicator for the assessment of real (physical) losses, reflecting the leakage management efficiency at a given operating pressure. Liemberger (2002) proposed the following formula to calculate the ILI as the ratio of CARL to UARL:

ILI = CARL / UARL [3.2]

The ILI calculated from Equation [3.2] is unitless, and is therefore more suitable for international cross-utility comparison.
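Equations [3.1] and [3.2] can be sketched as follows. The network data are hypothetical, and Lm and Lp are taken in km (the usual IWA convention), with the average pressure P in metres:

```python
# Sketch of Equations [3.1] and [3.2] with hypothetical network data.
def uarl_l_per_day(lm_km, n_conn, lp_km, p_m):
    """Unavoidable Annual Real Losses, Eq. [3.1], in litres/day."""
    return (18 * lm_km + 0.8 * n_conn + 25 * lp_km) * p_m

def ili(carl_l_per_day, uarl):
    """Infrastructure Leakage Index, Eq. [3.2] (unitless)."""
    return carl_l_per_day / uarl

# Meters at the property boundary -> Lp = 0, as noted in the text.
uarl = uarl_l_per_day(lm_km=40, n_conn=2000, lp_km=0, p_m=45)
print(uarl)                                        # litres/day
print(round(ili(carl_l_per_day=290_000, uarl=uarl), 2))
```

An ILI near 1 would mean the measured real losses are already close to the unavoidable minimum, so further leakage reduction is unlikely to be economic.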
The relationship between ILI, CARL, and UARL is best presented in Figure 3.3.

[Figure 3.2 shows the water balance components: the System Input Volume (Net Production) splits into Authorized Consumption and Water Losses. Authorized Consumption comprises Billed Authorized Consumption (billed metered and billed unmetered consumption) and Unbilled Authorized Consumption (unbilled metered and unbilled unmetered consumption). Water Losses comprise Apparent Losses (unauthorized consumption; metering inaccuracies and data handling errors) and Real Losses (leakage on transmission and distribution mains; leakage and overflow at storage tanks; leakage on service connections up to the point of consumer metering). Billed authorized consumption constitutes Revenue Water; the remaining components make up Non-revenue Water (NRW), within which the figure also marks unaccounted-for water (UFW).]

The outer rectangle in Figure 3.3 shows that CARL increases with aging of the WDS, and hence so does the ILI value. To reduce the volume of CARL, management methods (shown as arrows in Figure 3.3) to control the real losses need to be applied. Asset management, pushing the CARL rectangle from the bottom, includes selection, installation, maintenance, renewal, and replacement of deteriorating assets of a water supply system (Lambert and McKenzie 2002).

Figure 3.3 The four basic methods of managing real losses (Source: Lambert et al. 1999)

Liemberger (2002) presented a graphical visualization of ILI from 1 to 100, where a value of "1" is ideal but need not be set as the target value. Liemberger and McKenzie (2005) found that the ILI formula had limited applicability in developing countries, where data is often unavailable and/or inaccurate due to limited resources. According to them, the efficiency of the ILI depends to some extent on the accuracy of the UARL formula, but mainly on the annual volume of real losses (i.e., CARL), the average pressure, and the data related to the distribution network. It is uneconomical to completely control the leakage from all the reservoirs and pipe mains; NWC/DoE (1980) stated that there is always an economic level of leakage (ELL).
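The ELL idea lends itself to a small numerical illustration: pick the leakage level at which the combined cost of producing the lost water and of controlling leakage is lowest. The cost function and all figures below are invented purely for illustration:

```python
def total_cost(leakage_m3, water_cost_per_m3, control_cost):
    """Annual cost of tolerating a given leakage level: cost of producing
    the lost water plus the cost of holding leakage at that level."""
    return leakage_m3 * water_cost_per_m3 + control_cost(leakage_m3)

def economic_leakage_level(levels, water_cost_per_m3, control_cost):
    """Discrete search for the leakage level with minimum total cost (the ELL)."""
    return min(levels, key=lambda q: total_cost(q, water_cost_per_m3, control_cost))

# Illustrative assumption: control cost rises as the leakage target tightens.
control = lambda q: 1_000_000 / q        # $/year to hold leakage at q m3/year
levels = [500, 1000, 1500, 2000, 2500]   # candidate leakage targets (m3/year)
ell = economic_leakage_level(levels, 0.5, control)   # -> 1500 m3/year here
```

Below that level, saving one more cubic meter of leakage costs more than producing it.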
According to OFWAT (2003), the ELL is the level at which further reduction in leakage becomes more expensive than producing water from another source. The optimum or economic level can be determined by adding the cost of distributing treated water and the cost of reducing leakage.

[Figure 3.3 elements: the CARL rectangle encloses the UARL rectangle; the band between them is the Potentially Recoverable Real Losses, squeezed by the four management methods: speed and quality of repairs, active leakage control, pressure management, and pipeline and asset management.]

The water production cost will vary with the type of network and level of treatment. However, questions have been raised about the application of the UARL formula and the ILI approach for systems operating with fewer than 3000 service connections, a service connection density of less than 20 per km of main length, and an average pressure of less than 25 meters, which could be the case for SM-WSS. Due to the significant amount of unauthorized consumption from illegal connections, the concept of non-revenue water (NRW) is not a reliable estimate of real losses. However, NRW is still the most commonly used measure in developing countries and SM-WSS, as it is easy to calculate and can be useful as a financial indicator (Kanakoudis and Tsitsifli 2010; Lambert 2003; IWA 2006). It is recommended here as well for use as a financial indicator for SM-WSS.

3.2.5 Water Quality and Public Health Indicators

Most of the health problems in S-WSS are caused by pathogenic micro-organisms (MOs), which can be controlled by disinfection. The most common MOs in small systems are Escherichia coli (E. coli) and Campylobacter species (Ford et al. 2005). The type of pathogen depends on the water source and the geographical location of the area. However, water quality outbreaks are relatively easy to identify and rectify in small systems due to their smaller network size.
A well-known example, the Washington County Fair outbreak (New York, USA, 1999), illustrates the public health risk associated with a contaminated shallow well in small communities: 921 diarrhea cases were reported as a result of drinking non-chlorinated water (MMWR 1999). Chronic and acute chemical risks include arsenic, nitrates, pesticides, disinfection by-products (DBPs), iron, lead, pH, etc. Trihalomethanes (THMs) and haloacetic acids (HAAs) are the main DBPs where the source water contains sufficient organic matter and the disinfection method is chlorination. The most common aesthetic water quality aspects are taste, odour, and colour (WHO 2011). Detailed reviews on the fate and transport of various chemical constituents (fluoride, iron, nitrification, chlorine residual) in water supply systems, and their impacts on both human health and system integrity, have been frequently reported in the literature (Benson et al. 2011; Fisher et al. 2011; Zhang et al. 2009; Ayoob and Gupta 2007). Natural waters may contain some chemical elements that are naturally radioactive. According to the United Nations Scientific Committee on the Effects of Atomic Radiation (UNSCEAR 2008), the global average yearly dose from all environmental sources of radiation is about 3.0 mSv per person. However, the possibility of cancer risk through ingestion of drinking water has been reported for doses of 100 mSv over extended exposure times. It is not economically viable to identify individual radionuclides in the WSS, but a screening process is practical for identifying the total radioactivity without regard to specific types of radionuclides; details can be seen in WHO (2011). It is therefore important to conduct radioactivity tests in SM-WSS periodically. Proposed water quality and public health indicators are presented in Table 3.5 (IWA 2006; NWC 2011; NRC 2010).
Table 3.5 Proposed water quality and public health indicators (columns: use of the PI; basic (start-up) indicator with calculation and data variables; additional indicators(1); additional (long-term) indicators(2))

1. (T) Aesthetic water quality tests carried out (%)
Calculation: [(Number of treated water aesthetic tests carried out during a year)/(Number of treated water aesthetic tests required by applicable standards per year)] x 100
Data variables: record of aesthetic water quality tests (taste, colour and odour) of treated water; applicable water quality standards or regulations
Additional indicators(1): tests performed for taste; tests performed for odour; tests performed for colour

2. (T) Microbiological water quality tests carried out (%)
Calculation: [(Number of treated water microbiological tests carried out during a year)/(Number of treated water microbiological tests required by applicable standards per year)] x 100
Data variables: record of microbiological water quality tests of treated water; applicable water quality standards or regulations
Additional indicators(1): tests performed for pathogens; tests performed for viruses; tests performed for helminths

3. (T) Physico-chemical water quality tests carried out (%)
Calculation: [(Number of treated water chemical tests carried out during a year)/(Number of treated water chemical tests required by applicable standards per year)] x 100
Data variables: record of chemical water quality tests of treated water; applicable water quality standards or regulations
Additional indicators(1): tests performed for residual chlorine; tests performed for dissolved solids; tests performed for arsenic or other toxic chemicals
Additional (long-term) indicators(2): tests performed for chloramines; tests performed for THMs and HAAs; tests performed for dissolved organics and inorganics

4. (T) Radioactivity water quality tests carried out (%)
Calculation: [(Number of treated water radioactivity tests carried out during a year)/(Number of treated water radioactivity tests required by applicable standards per year)] x 100
Data variables: record of overall radioactivity water quality tests of treated water; applicable water quality standards or regulations

5. (T/M/P) Population days with boil water advisories (%)
Calculation: [(Number of days with boil water advisory during a year)/(Total days in a year)] x 100
Data variables: record of the days when boil water advisories were issued

6. Additional (long-term) indicator(2): reduction in the number of illnesses, injuries and deaths resulting from the performance improvement

7. Additional (long-term) indicator(2): risk-based drinking water management plan (Yes/No)

8. Additional (long-term) indicator(2): public disclosure of drinking water performance (Yes/No)

(1) can be added within a year
(2) might be suitable for medium sized utilities in developed countries

Sources of chemical constituents in surface waters are natural rocks; industrial and domestic activities; fertilizers and pesticides used in agricultural activities; and the use of specific chemicals (e.g., coagulants, polymers, and chloramines) in the treatment processes. Therefore, a detailed parameter-wise analysis is recommended at the time of source selection to choose the most suitable water quality parameters and their sampling frequencies. Moreover, extra care and frequent monitoring of the various aesthetic, chemical, and microbiological water quality aspects would be required for fresh surface water sources. For fresh ground water sources, residual chlorine and microbiological water quality aspects are more important, because the source is free from suspended and dissolved organic and inorganic matter and the main possibility of microbiological contamination is from cross-connections in cracked pipelines.
Turbidity and pathogens are the main problems of surface water sources, and are conventionally controlled through water treatment facilities (e.g., coagulation, sedimentation, filtration, disinfection). However, in SMWU such treatment facilities are often not installed and the utilities rely primarily on source water quality. Further, for surface water, higher dissolved oxygen (DO) concentrations from saturated water sources (rivers and lakes) may also exacerbate corrosion of metal pipes (WHO 2011). If the source is marine water, all types of chemical, biological, and aesthetic parameters need to be controlled and monitored. The treatment process in this case could be either "conventional treatment followed by reverse osmosis" or "thermal desalination". All these treatment facilities may operate at different efficiencies depending on the skills of the operators, the source water quality, and the structural condition and age of plant components. Thus the implementation of a well-structured water quality monitoring plan is always required to ensure the provision of safe drinking water to the consumer.

3.2.6 Quality of Service Indicators

Customer satisfaction is the most important objective of any utility, and agencies have used different PIs to measure a utility's efficiency in this context. Customers will only be satisfied when they get the best service for the price they pay for water. The cost of water may range from free (e.g., a public stand-post installed by an NGO in a small water utility of a developing country) to very high (e.g., desalinated water in a medium sized water utility of a developed country). Satisfaction can mainly be correlated with maximum coverage, adequate quantity (i.e., continuous supply at the required pressure), acceptable quality (i.e., compliance with water quality standards and guidelines), prompt response to customer complaints, and less time taken to install a new connection or meter.
As discussed earlier, in the case of privately owned WSSs (e.g., England and Wales), customers' expectations are high in proportion to the cost of water, and the indicators used there to assess customer satisfaction might not be practical in general. Depending on the level of response to written consumer complaints, water suppliers in England and Wales whose response efficiency was higher than 99% (within ten working days of receipt of a complaint) were given an incentive to increase their water charges (OFWAT 2009-2010). Therefore, the PIs of customer satisfaction need to be carefully selected for SMWU.

Table 3.6 Proposed quality of service indicators (columns: use of the PI; basic (start-up) indicator with calculation and data variables; additional indicators; additional (long-term) indicators(3))

1. (M/P) Population coverage (%)
Calculation: [(Resident population served by the water undertaking)/(Total population of the study area)] x 100
Data variables: record of the population served (based on expert opinion, demographic surveys and analysis); total population of the area
Additional indicators: building supply coverage (%); population coverage by service connections; household and business supply coverage (%)

2. (M/P) Population coverage by public stand-posts, developing countries (%)(1)
Calculation: [(Resident population served by the water undertaking through public stand-posts)/(Total population of the study area)] x 100
Data variables: record of the population served (based on expert opinion, demographic surveys and analysis); total population of the area
Additional indicators: population per public stand-post; per capita water consumption at public stand-posts

3. (M/P) Operational water points and stand-posts, developing countries (%)
Calculation: [(Number of water points that are operational)/(Total number of water points in the study area)] x 100
Data variables: record of the operational stand-posts; total number of stand-posts installed

4. (T/M) Adequacy of supply pressure (%)
Calculation: [(Number of service connections at which pressure is equal to or higher than the target pressure)/(Total number of service connections)] x 100
Data variables: record of complaints regarding low-pressure points; results of pressure monitoring surveys; total number of connections

5. (T/M) Continuity of supply (%)
Calculation: [(Number of hours when the system is pressurized during a year)/(Total hours in a year)] x 100
Data variables: record of the hours when the system was not pressurized

6. (T/M) Water interruptions (%)
Calculation: [(Number of hours when the system is not pressurized during a year)/(Total hours in a year)] x 100
Data variables: record of the hours when the system was not pressurized

7. (T/M) Average frequency of unplanned interruptions (No./100 connections)(2)
Calculation: [(Total number of unplanned interruptions during the year)/(Number of service connections)] x 100
Data variables: record of the number of unplanned interruptions during the whole year; number of connections

8. (M/P) Water quality compliance of supplied water (%)
Calculation: [(Total number of treated water samples complying with standards in a year)/(Total number of tests performed in a year)] x 100
Data variables: record of water samples analyzed
Additional indicators: microbiological tests compliance; chemical tests compliance; aesthetic tests compliance; radioactivity tests compliance

9. (M) Total complaints per connection (No./100 connections/year)(2)
Calculation: [(Total number of complaints during the year)/(Number of service connections)] x 100
Data variables: record of the number of complaints during the whole year; number of connections
Additional (long-term) indicators(3): pressure complaints; continuity complaints; water quality complaints; interruptions complaints; billing and queries

10. (M) Total response to written complaints (%)
Calculation: [(Total number of responses to written complaints)/(Total number of written complaints in a year)] x 100
Data variables: record of the written complaints in a year; record of complaints responded to
Additional (long-term) indicators(3): pressure complaints; continuity complaints; water quality complaints; interruptions complaints; billing and queries

11. Additional (long-term) indicators(3): total telephonic complaints; percentage of calls answered within 30 s by an operator

12. Additional (long-term) indicators(3): new connection efficiency; time to install a customer meter; connection repair time

(1) specific to developing countries only
(2) for small systems only; otherwise (No./1000 connections) should be used
(3) might be suitable for medium sized utilities in developed countries

A proposed set of quality of service PIs is listed in Table 3.6. Aspects related to coverage and customer complaints are addressed in most of the indicator systems. Complaints are easy to record but might not reflect the actual performance of the water utility, because some customers do not complain about problems they experience. IWA (2006), OFWAT (2012), NWC (2011) and ADB (2012) have proposed other PIs as well, for example water restrictions, call response duration, supply pressure, and efficiency of connection and meter installations. A lower number of complaints is an indirect measure of an efficient utility. There are two main reasons for the relatively larger number of PIs in this category in Table 3.6. Firstly, these are the most important PIs for SMWU: such utilities usually have fewer routine maintenance staff and vehicles, and customers' complaints are the most efficient source of problem identification. Secondly, the data required to measure these PIs needs only good recordkeeping instead of extensive data collection and analysis exercises. Through simple analyses and comparisons with similar utilities, sound conclusions can be drawn on the overall performance of the utility under study.
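Because these indicators need only good recordkeeping, their computation is trivial; a sketch using the per-100-connection form of the Table 3.6 headings (all counts hypothetical):

```python
def per_100_connections(events: int, connections: int) -> float:
    """Rate per 100 service connections (Table 3.6, PIs 7 and 9);
    per 1000 connections would be used for larger systems."""
    return 100.0 * events / connections

def written_response_rate(responded: int, received: int) -> float:
    """Table 3.6, PI 10: percentage of written complaints that received a response."""
    return 100.0 * responded / received

# Hypothetical small utility: 1500 connections, 45 complaints in the year,
# 28 of 30 written complaints answered.
complaints = per_100_connections(45, 1500)   # 3.0 per 100 connections/year
responses = written_response_rate(28, 30)    # about 93.3%
```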
SMWU frequently deal with agricultural customers. Connections to the agricultural sector are widespread, and sometimes even the customer is unable to identify the problem location. Sometimes the customer is not using any water at all, based on crop water requirements, yet there is a pipe break somewhere in the fields. In such cases the customer receives a bill much higher than expected, resulting in an indirect complaint about bill inaccuracy that is in reality due to the pipe break. The PA team should relate the actual cause of the complaint to the relevant component of the WSS. This can be done through a well-developed customer complaint work order. The response to the complaint should be indicated on the proforma, along with whether the actual problem was the same as recorded by the customer or something different (e.g., an in-house plumbing issue). Moreover, the work order should record how many visits were required to completely resolve the problem, how much distance was travelled, and so on. These data are useful for estimating the cost of each response or repair activity. Later, on the basis of such analyses, the management of a SMWU can take improvement actions.

3.2.7 Financial and Economic Indicators

Different organizations have split the financial indicators into billing, pricing, asset, operating cost, and other categories (see Table 2.2 for details). For a small municipality it is more important to know whether or not its operating costs are being met by the revenues it is generating, and whether or not it is servicing its debts (WB 2011). IWA (2006) has provided a list of 46 financial indicators by splitting the cost according to the type of cost (i.e., main functions of the water utility, technical functions, etc.), which might not be practical for SMWU. An effort is made in Table 3.7 to identify the important financial PIs.
Utilities can also select additional and advanced indicators as per their requirements and data availability. Running costs of a water utility consist of overall O&M costs and the cost of permanent manpower, whereas the capital costs include net interest and depreciation during the assessment period.

Table 3.7 Proposed financial/economic indicators (columns: use of the PI; basic (start-up) indicator with calculation and data variables; additional indicators; additional (long-term) indicators(1))

1. (M/P) Revenue per unit volume of supplied water ($/m3)
Calculation: [(Operating revenues - capitalized costs of the constructed assets)/(Authorized consumption during the year)]
Data variables: operating revenue during the year; authorized consumption during the year
Additional (long-term) indicators(1): sales revenues; other (if applicable) revenues

2. (T/M/P) Non-revenue water (NRW)
Calculation: [(Cost of the system input volume) - (Cost of billed authorized consumption)]
Data variables: system input volume; data of billed consumption; unit cost of water

3. (M/P) Unit total costs ($/m3)
Calculation: [(Total costs, including running costs and capital costs)/(Authorized consumption during the year)]
Data variables: running costs; capital costs; authorized consumption during the year
Additional (long-term) indicators(1): unit running costs; unit capital costs

4. (M/P) Unit investment ($/m3)
Calculation: [(Cost of investments (expenditures for plant and equipment))/(Authorized consumption during the year)]
Data variables: cost of total expenditures for plant and equipment; authorized consumption during the year

5. (M) Average water charges ($/m3)
Calculation: [(Water sales revenue from all types of customers)/(Total authorized consumption during the year)]
Data variables: total revenue from total water sold; authorized consumption during the year

6. (M) Operating cost coverage ratio
Calculation: [(Total annual operational revenues)/(Total annual operating costs)]
Data variables: total operational revenue from total water sold; total annual operating costs
Additional (long-term) indicators(1): delays in accounts receivable; investment ratio; average depreciation ratio; late payment ratio

7. (M/P) Debt service ratio (%)
Calculation: [(Cash income)/(Financial debt service, FDS)] x 100
Data variables: total annual net income; FDS comprises interest expenses, the cost of loans, and the principal repayment of debt instruments
Additional (long-term) indicators(1): debt equity ratio

8. (M/P) Liquidity (current ratio)
Calculation: [(Current assets)/(Current liabilities)]
Data variables: current assets include cash in hand, accounts receivable, inventories and prepaid expenses; current liabilities include accounts payable, current liabilities and the current portion of remaining long-term liabilities

9. (T/M/P) Underground infrastructure renewed or rehabilitated (%)
Calculation: [(Underground infrastructure renewed or rehabilitated annually)/(Total underground infrastructure)] x 100
Data variables: lengths of underground water mains renewed or rehabilitated in a year; total lengths of mains
Additional indicators: value of horizontal components of infrastructure renewed or rehabilitated (%); value of vertical components of infrastructure renewed or rehabilitated (%)

10. Additional (long-term) indicators(1): manpower cost; electrical energy costs

11. Additional (long-term) indicators(1): management functions cost; financial and commercial functions costs; customer service functions costs; technical service functions cost

12. Additional (long-term) indicators(1): water resources and catchment management costs; abstraction and treatment costs; transmission, storage and distribution costs; water quality monitoring cost

(1) might be suitable for medium sized utilities in developed countries

IWA (2006) recommended calculating the financial indicators for one year; however, assessment periods of less than a year can also be used with the necessary explanation. Wyatt (2010) developed a financial model to optimally manage NRW in developing countries. He stated that bill collection in developing countries is not as efficient as in developed countries; thus, there is a need to differentiate between the water that is billed and the actual revenue collected. The situation could be worse in the case of SMWU in developing countries.
In such circumstances, the indicator of revenue per unit volume of supplied water can provide a more rational basis for cross-comparison than NRW based on billed authorized consumption (refer to PIs 1 and 2 in Table 3.7). The PIs of unit total cost and unit investment given in Table 3.7 will facilitate life cycle costing and long-term asset management by utility managers and decision makers. The indicator of average water charges provides a rational basis for setting water charges based on the type and number of customers of each use (residential, agricultural, industrial, etc.). Operating cost coverage is an important PI for defining operational efficiency in financial terms. For private organizations, or a water utility developed with loans, the debt service ratio could be an important indicator even for a S-WSS. Indicators of investment ratio, depreciation ratio, and late payment ratio might not be of significance for SMWU, except for high value, medium sized utilities in developed countries. Therefore, the additional PIs given in Table 3.7 can be selected on a long-term basis for such MWUs.

3.3 Summary

It is well recognized that PIs would be different for developing countries than for developed ones due to variations and limitations associated with data availability. In this regard, suitable PIs for SMWU have been identified in three stages (levels) after the detailed review in Chapter 2 of the PIs used by different organizations in both developed and developing countries. Start-up PIs are proposed for both developing and developed countries, requiring limited data to initiate the PA process; additional PIs are proposed for developed countries, and for developing countries if the data can be collected; and advanced PIs are proposed for MWU (having sufficient resources) in developed countries. The PIs identified in this chapter are further evaluated for final selection in Chapter 4.
Chapter 4 Selection of Performance Indicators

A part of this chapter has been published in Urban Water Journal as a research article titled "Selecting Performance Indicators for Small to Medium Sized Water Utilities: Multi-criteria Analysis using ELECTRE Method" (Haider et al. 2015a). The PIs identified in Chapter 3 are further evaluated in this chapter using MCDA to arrive at a concise list of suitable PIs covering all the functional components of SMWU.

4.1 Background

A PI is used to measure the performance of a program in terms of a percentage or an index; it is monitored at defined intervals and can be compared to one or more criteria or standards (OPM 1990). In Chapter 2, the existing systems of PIs in the literature were reviewed in the context of SMWU. A summary of the distribution of PIs grouped into different categories (covering various organizational components of a water utility) by the above mentioned agencies was presented in Table 2.2; a graphical representation is shown in Figure 4.1. It can be seen that the major categories are the operational, quality of service, and financial PIs; however, the other categories also need to be given relative importance. Although SMWU participate less in benchmarking processes and have inadequate and inaccurate data available to calculate PIs, such utilities in developed countries have the potential to improve their performance assessment process, given their ability to shift to the latest technologies and an eagerness (as well as the structure) to improve their operational and monitoring data inventories. This situation stresses the need to identify and select appropriate and simple PIs for SMWU. The literature review in Section 2.7 (Chapter 2) revealed that selecting suitable PIs using ordinal (qualitative) scales with a small rating range (1 to 5) generates significantly small differences in final scores, which might not be easy for the decision maker (DM) to work with.
Performance assessment is a continuous process, and defining a cut-off for the ranked PIs may limit the applicability of the selected PIs to a specific case; it also does not allow additional PIs to be included as future needs arise. Therefore, the method for selecting the PIs should adequately address these issues.

Figure 4.1 Distribution of PIs in different categories by various agencies, a graphical representation of Table 2.2

MCDA outranking methods such as Elimination and Choice Translating Reality (ELECTRE), which are based on pairwise comparisons of alternatives, are suitable for qualitative attributes and for cases where the differences between evaluations are small (Kabir et al. 2013; Figueira et al. 2005). In this research, the ELECTRE method is used for three main reasons. Firstly, by accumulating the small scoring differences between alternatives (PIs) under each criterion, distinct outranking relations between PIs can be established. Secondly, the network diagrams established from the outranking relations between all the PIs included in the evaluation provide an opportunity to start the PA process with the most important PIs and to add more PIs at later stages. In this way, the PIs that might not be important for a specific utility, or in the view of decision makers, are still available in the network diagrams; this is not possible if a discrete cut-off is used. Thirdly, the final ranking based on the overall dominance structure obtained through the ELECTRE method can be used to allocate importance weights to the PIs during the detailed PA process for developing performance indices.

4.2 Modeling Approach

The modeling approach used for the selection of PIs for SMWU is shown in Figure 4.2. Initially, potential PIs (grouped into the 7 most commonly used categories) were screened from the existing PIs available in the literature in Chapter 3 using a simple checklist process.
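Before detailing those steps, the core outranking computation can be previewed with a toy ELECTRE I concordance matrix. The three "PIs", their 1-to-5 scores under four criteria, and the weights below are all hypothetical; only the concordance rule (sum the weights of the criteria on which alternative a scores at least as well as b) follows the method:

```python
# Hypothetical scores (rows: alternatives; columns: criteria C1..C4) and weights.
scores = {
    "PI-A": [5, 3, 4, 3],
    "PI-B": [3, 5, 3, 3],
    "PI-C": [3, 3, 3, 1],
}
weights = [0.4, 0.2, 0.3, 0.1]  # e.g. AHP-derived; must sum to 1

def concordance(a: str, b: str) -> float:
    """Concordance index c(a, b): total weight of criteria where a >= b."""
    return sum(w for sa, sb, w in zip(scores[a], scores[b], weights) if sa >= sb)

# Pairwise concordance matrix; in ELECTRE I, a outranks b when c(a, b) meets a
# chosen threshold and the (omitted here) discordance check also passes.
C = {(a, b): concordance(a, b) for a in scores for b in scores if a != b}
# C[("PI-A", "PI-C")] is 1.0: PI-A scores at least as well on every criterion.
```

Accumulating many such small score differences is what lets ELECTRE separate alternatives whose aggregate scores would otherwise be nearly tied.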
A set of 4 criteria has been established to evaluate the suitability of an indicator using multicriteria analysis. Weights for each criterion have been assigned using the Analytic Hierarchy Process (AHP) through a group decision making process. A matrix between PIs and evaluation criteria was generated by scoring each indicator against each criterion. The ELECTRE method was used to develop the outranking relationships between the indicators under each category and to establish the final preferences. The details of each step of the framework shown in Figure 4.2 are presented below.

4.2.1 Criteria for Selection of PIs and Ranking System

From the initial screening in Chapter 3, the 114 potential PIs presented in Table 4.1 were identified for further evaluation using MCDA. The following criteria have been used to evaluate the most suitable PIs for SMWU:

C1 – Applicability: how applicable and relevant is an indicator for the performance assessment of SMWU? It relates to the overall technical, environmental, and socio-economic relevance of the indicator.
C2 – Understandability: how understandable is an indicator to both the public and the utility personnel? It relates to the type of data the PI involves and the interpretability of the indicator.
C3 – Measurability: how measurable is the indicator? It relates to the availability, accuracy, and frequency of the monitoring data required to calculate the indicator.
C4 – Comparability: how comparable is the calculated value of the indicator with those of other similar utilities in the region and/or at the national or international level?

It is important to define the ranking system (as much as possible) to facilitate the scoring process.
In this connection, an attempt has been made in this study by defining all the ranks under each criterion in Table 4.2. The selected criteria were defined on ordinal scales and ranked on a 5-point scale: 1 for 'Very Low'; 2 for 'Low'; 3 for 'Average'; 4 for 'High'; and 5 for 'Very High'. Applicability and measurability span all five ranks with an interval of 1, whereas the understandability and comparability criteria were ranked as 1, 3, or 5. Consistent variability is assumed in the ranking process for all the criteria. For the comparability criterion, higher ranks have been given to the indicators included in the National Water and Wastewater Benchmarking Initiative (NWWBI), which evaluated the performance of 41 water utilities (generally LWU with populations greater than 50,000) for the year 2010. The minimum, average, and maximum values of the PIs given in the public report published in 2012 by AECOM provide an opportunity for utilities to compare the values of their PIs with those of other utilities in the region.

Figure 4.2 Modeling approach for selection of PIs for SMWU
[Figure 4.2 elements: initial screening through a literature review of available PI systems for WSS (Chapter 1): CSA (2012) 62*, OFWAT (2012) 14, ADB (2012) 54, WB (2011) 81, NWC (2011) 73, NRC (2010) 33, IWA (2006) 170, AWWA (2004) 31; selection of 114 short-listed PIs for SMWU (Chapter 2): WE (13), PE (22), PH (12), OP (22), WP (12), QS (16), EF (16); defining criteria for evaluation of PIs (applicability, understandability, measurability, comparability); determination of criteria weights using AHP; evaluation of selected PIs with MCDA (scoring matrix); selection and ranking of suitable PIs for SMWU with the ELECTRE I method.]

Table 4.1 Selected PIs through initial screening

WE WATER RESOURCES AND ENVIRONMENTAL
WE-1 No. of days of water restriction (%)
WE-2 Average daily per capita domestic water consumption
WE-3 Average day demand / existing water license capacity
WE-4 Energy consumption in KWH (D&T)
WE-5 Impact of pipe flushing on aquatic life
WE-6 Disposal of backwash water (% residuals)
WE-7 Sector-wise availability of water resources (domestic, industrial, etc.)
WE-8 GHG emissions from routine transport fuel emissions
WE-9 Per capita water consumption (overall)

PE PERSONNEL (STAFFING)
PE-1 Number of in-house metering field FTEs1/ 1000 meters
PE-2 Water quality monitoring personnel (No/ 1000 tests/ year)
PE-3 Water resources and catchment management employees (No/ 10^6 m3/ year)
PE-4 Number of field FTEs*/ 100 km length
PE-5 Number of field FTEs*/ 1000 ML treated water
PE-6 No. of lost hours due to field accidents/ 1000 field labour hours – (D)2
PE-7 No. of lost hours due to field accidents/ 1000 field labour hours – (T)3
PE-8 No. of sick days taken per field employee – (D)
PE-9 No. of sick days taken per field employee – (T)
PE-10 Total overtime field hours/ total paid field hours – (D)
PE-11 Total overtime field hours/ total paid field hours – (T)
PE-12 Personnel training (hours/ employee/ year)
PE-13 Working accidents (No/ 100 employees/ year)
PE-14 No. of field accidents with lost time/ 1000 field labour hours – (D)
PE-15 No. of field accidents with lost time/ 1000 field labour hours – (T)
PE-16 Total available field hours/ total paid field hours – (D)
PE-17 Total available field hours/ total paid field hours – (T)
PE-18 % of field employees eligible for retirement per year – (D)
PE-19 % of field employees eligible for retirement per year – (T)
PE-20 Average work experience ratio
PE-21 Employees per connection (No/ 1000 connections)
PE-22 Employees per volume of water supplied

FE FINANCIAL AND ECONOMIC
FE-1 O&M cost ('000)/ km length ($/km)
FE-2 O&M cost of water treatment ($/ million liters of treated water)
FE-3 Revenue per unit volume of supplied water ($/m3)
FE-4 Water rate for a typical size residential connection using 250 m3/year
FE-5 Operating cost coverage ratio
FE-6 Debt service ratio (%)
FE-7 NRW by volume

PH PHYSICAL
PH-1 Metering level (%)
PH-2 Degree of automation (%)
PH-3 Raw water storage capacity (days)
PH-4 Treated water storage capacity at ADD (hrs)
PH-5 Treatment plant capacity
PH-6 Pumping utilization (%)
PH-7 Remote control degree (%)
PH-8 Pump station energy consumed KWH/ total pump station HP
PH-9 Hydrant density (No/km)
PH-10 Valve density (No/km)
PH-11 Metering density (No/ 1000 service connections)
PH-12 Treatment plant capacity (%)

OP OPERATIONAL
OP-1 Service connection rehabilitation (%)
OP-2 Replaced valves (%/year)
OP-3 Mains replaced (%/year)
OP-4 Mains rehabilitation/ renovation* (%/year)
OP-5 Hydrant inspection (per year)
OP-6 Leakage (%/year)
OP-7 Cleaning of storage tanks (per year)
OP-8 Non-revenue water (L/ connection/ day)
OP-9 No. of main breaks (No./ 100 km)
OP-10 Inoperable or leaking hydrants (%)
OP-11 Residential customer reading efficiency
OP-12 Operational meters
OP-13 Infrastructure Leakage Index (ILI)
OP-14 Network inspection (per year)
OP-15 Pump inspection (per year)
OP-16 Apparent losses per connection
OP-17 Apparent losses per system input volume
OP-18 Real losses per connection (L/ connection/ day w.s.p.)
OP-19 Real losses per main length (L/ km/ day w.s.p.)
OP-20 % of inoperable or leaking valves
OP-21 Customer reading efficiency
OP-22 Power failure

WP WATER QUALITY AND PUBLIC HEALTH
WP-1 No. of boil-water advisory days
WP-2 Cumulative length cleaned as % of system length
WP-3 Average value of turbidity in WDS (NTU)
WP-4 No of Total
Coliform Occurrences in WDS
WP-5 THMs in water distribution system (mg/L)
WP-6 Residual chlorine in distribution system (mg/L)
WP-7 Turbidity of treated water (NTU)
WP-8 No of total coliform occurrences in Treated water
WP-9 Concentration of Nitrates in treated water (mg/L)
WP-10 Aesthetic water quality tests carried out (%)
WP-11 Microbiological water quality tests carried out (%)
WP-12 Chemical water quality tests carried out (%)
FE-8 Liquidity (Current ratio)
FE-9 5 year running average capital reinvestment/ replacement value – (D)
FE-10 Cost of O&M of fire hydrants/ total number of fire hydrants
FE-11 Metering O&M cost
FE-12 Pump station O&M cost ('000)/ total pump station horsepower
FE-13 Cost of Customer Communication/ Population Served
FE-14 Cost of water quality monitoring/ population served ($/ person)
FE-15 Chemical Cost / ML Treated ($/ million liters of treated water)
FE-16 Water Revenue per employee
QS QUALITY OF SERVICE
QS-1 Billing complaints (%)
QS-2 Other complaints and queries (%) – Service connection/ leakage
QS-3 Number of water pressure complaints/ 1000 people served
QS-4 Number of water quality complaints/ 1000 people served
QS-5 Total response to reported complaints (%)
QS-6 Number of Unplanned System Interruptions/ 100 Km main length
QS-7 Unplanned Maintenance Hours/ Total maintenance hours (%)
QS-8 Population coverage (%)
QS-9 Quality of water supplied
QS-10 Number of water quality complaints by reason/ 1000 served
QS-11 Total complaints per connection (No/1000/year)**
QS-12 Continuity of supply (%)
QS-13 Aesthetic test compliance
QS-14 Microbiological test compliance
QS-15 Physical-chemical test compliance
QS-16 Radioactive test compliance
1 Full-time employees; 2 Distribution system; 3 Treatment

Table 4.2 Scoring system and definition of criteria

C1 Applicability/ Relevance
1 (Very Low)  Seems to be irrelevant for SMWU.
2 (Low)       Has low relevance to SMWU.
3 (Average)   Average applicability for SMWU.
4 (High)      Highly applicable for performance assessment of SMWU.
5 (Very High) Has to be included; extremely important for SMWU.

C2 Understandability
1 (Low)       PI is difficult for everyone to understand and interpret.
3 (Average)   PI is understandable to utility personnel but might not be understandable to the public.
5 (High)      PI is understandable to both the public and the utility personnel.

C3 Measurability
1 (Very Low)  Both of the data variables are measured at very low frequency.
2 (Low)       Both of the variables are measured at low frequencies.
3 (Average)   Some of the variables are absolute and some are monitored at low frequency.
4 (High)      Some of the variables are absolute and some are monitored at high frequency.
5 (Very High) Values of all the variables are known absolutely, e.g., fixed physical assets.

C4 Comparability
1 (Low)       PI has rarely been used by similar utilities.
3 (Average)   PI has been used by water utilities outside the region.
5 (High)      PI has been used by utilities in the region.

4.2.2 Multicriteria Decision Analysis (MCDA)

4.2.2.1 Analytic Hierarchy Process

The analytic hierarchy process (AHP), developed by Saaty (1980), formalizes the human intuitive understanding of a complex problem with the help of a hierarchical structure. In this research, rather than being applied to a complex decision-making problem, AHP is used to determine the weights of the above-mentioned criteria for the selection of PIs. The step-by-step approach is described below:

Step 1: Pairwise comparison matrix
The first step is to set the preferences concerning the four selected evaluation criteria and develop the pairwise comparison matrix 'A'. The nine-point intensity scale established by Saaty and Vargas (1991), shown in Table 4.3, varies between 1 (equal importance) and 9 (extreme importance) and is used for the pairwise comparisons. In the present research, all the criteria have their specific importance for the selection of PIs; therefore, the even numbers are also used.
Table 4.3 Evaluation scale used in pairwise comparison

Scale    Degree of preference
1        Equal importance
3        Moderate importance of one factor over another
5        Strong or essential importance
7        Very strong importance
9        Extreme importance
2,4,6,8  If a compromise judgment is required

Step 2: Normalized comparison matrix
In this step, the normalized comparison matrix is formed. Normalization is done by dividing each value of the pairwise comparison matrix by the sum of its corresponding column. The corresponding ratings (weights 'w') are determined by averaging the values in each row of the normalized matrix.

Step 3: Consistency analysis
The next step is to check the consistency of the original preference ratings. First, the maximum eigenvalue needs to be calculated using the following equation:

\lambda_{\max} = \frac{1}{n} \sum_{i=1}^{n} \frac{(Aw)_i}{w_i}   [4.1]

and then the consistency index (CI) can be calculated as:

CI = \frac{\lambda_{\max} - n}{n - 1}   [4.2]

The numerator in equation [4.2] is a measure of the deviation of an inconsistent matrix from a consistent comparison matrix, and provides the basis for determining CI, which shows the consistency of the experts' estimates (Cabała 2010). Next, the random index (RI) is obtained, which is essentially the CI of a randomly generated pairwise comparison matrix. The RI values proposed by Saaty (1980) are given in Table 4.4. The order of the matrix 'n' is four in this study, and thus RI is 0.9.

Table 4.4 Random indices established by Saaty (1980)

n1:  1    2    3     4    5     6     7     8     9     10
RI:  0.0  0.0  0.58  0.9  1.12  1.24  1.32  1.41  1.46  1.49
1n = order of matrix

Finally, the consistency ratio (CR) is calculated as:

CR = \frac{CI}{RI}   [4.3]

Generally, a value of CR equal to or less than 0.1 is considered acceptable. An example of the application of the above method to determine the weights of relevance, comparability, measurability, and understandability for the selection of PIs is enclosed as Appendix-B.
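As a worked sketch of Steps 1 to 3 (an illustrative reconstruction, not the author's code), the Table 4.5 matrix can be processed as follows. The exact CI and CR reported later in Section 4.3.1 reflect the decision makers' averaged judgments and rounding, so the values computed here can differ slightly.

```python
import numpy as np

# Pairwise comparison matrix 'A' from Table 4.5 (rows/columns: relevance,
# comparability, measurability, understandability)
A = np.array([
    [1.0, 2.0, 4.0, 5.0],
    [1/2, 1.0, 2.0, 3.0],
    [1/3, 1/2, 1.0, 2.0],
    [1/5, 1/3, 1/2, 1.0],
])

# Step 2: normalize each column, then average across each row to get weights
w = (A / A.sum(axis=0)).mean(axis=1)

# Step 3: consistency analysis, equations [4.1]-[4.3]
n = A.shape[0]
lam_max = ((A @ w) / w).mean()       # lambda_max, eq. [4.1]
CI = (lam_max - n) / (n - 1)         # consistency index, eq. [4.2]
RI = 0.9                             # Saaty's random index for n = 4 (Table 4.4)
CR = CI / RI                         # consistency ratio, eq. [4.3]; acceptable when <= 0.1
```

With this matrix the weights come out near 0.50, 0.26, 0.15, and 0.09, close to the 0.5, 0.25, 0.15, and 0.1 reported in the text, and the CR stays below the 0.1 threshold.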
4.2.2.2 Elimination and Choice Translating Reality (ELECTRE)
Trade-offs among criteria values are not desirable here, given the criteria's relative importance in the overall decision-making framework (Figure 4.2); therefore, a non-compensatory method is required to solve this type of decision-making problem. ELECTRE 1, a non-compensatory method based on outranking relation theory, has been used here as a suitable MCDA method. The method was first introduced by Benayoun et al. (1966) and has since been used in several decision-making problems, such as water resources management, infrastructure management, material selection, and transportation (Zardari et al. 2010; Coutinho-Rodrigues et al. 2011; Pang et al. 2011; Anton and Grau 2004). Other variants, such as ELECTRE III, IV, and IS, are based on a fuzzy ranking approach (Figueira et al. 2005); in this research, however, average values of the ranks have been used for simplicity. ELECTRE 1 provides the opportunity to develop visual network maps of the outranking relationships between different alternatives (PIs), which is convenient for the decision makers in the final selection of PIs. Moreover, unlike compensatory methods, in the ELECTRE 1 method the weights are coefficients of importance and not criteria substitution rates (Milani et al. 2006). Secondly, all the evaluation criteria are defined on ordinal scales in Table 4.2. In such a situation, substantial preferences between alternatives (PIs in this study) cannot be established from small differences in evaluations, although the accumulation of numerous small differences may become significant. The preference structure in the ELECTRE 1 method can be generated using discrimination thresholds (indifference and preference) (Figueira et al. 2005). In this study, the outranking relationships between the alternatives (PIs) are distributed at different levels, and the decision maker (DM) can encompass the desired (most important) levels with the help of a decision makers' boundary (DMB).
Moreover, PIs can be finally ranked according to their preferences based on net concordance and net discordance indexes.

In the ELECTRE 1 method, concordance and discordance indexes (two types of pairwise comparison indices) are formulated to form outranking relationships between alternatives (A_p and A_q, where p, q = 1, 2, ..., m and p ≠ q). The indexes can be considered a measure of the satisfaction and dissatisfaction of a DM in giving preference to a specific alternative over the others. If X_1, X_2, ..., X_n are the criteria for the evaluation of alternatives, then x_ij is the value assigned to alternative A_i with respect to criterion X_j. The following steps were followed after determining the weights (w_1, w_2, ..., w_n) of the criteria using the AHP method.

Step 1: Normalization of the weighted matrix
Depending on the type of criterion (i.e., benefit or cost; for benefit, the higher the better, and vice versa), develop the normalized matrix R_ij. In the case of mixed criteria, the values of the cost criteria are inverted.

R_{ij} = \frac{x_{ij}}{\sqrt{\sum_{i=1}^{m} x_{ij}^2}},  i = 1, 2, ..., m and j = 1, 2, ..., n   [4.4]

Each value in the normalized matrix is then multiplied by the corresponding criterion weight, and the weighted normalized matrix attains the following form:

V = [v_{ij}] = [w_j r_{ij}] = \begin{bmatrix} w_1 r_{11} & w_2 r_{12} & \cdots & w_n r_{1n} \\ \vdots & \vdots & \ddots & \vdots \\ w_1 r_{m1} & w_2 r_{m2} & \cdots & w_n r_{mn} \end{bmatrix}   [4.5]

where all the values in the V matrix range between 0 and 1.

Step 2: Develop concordance and discordance sets
As described earlier, the set of criteria is divided into concordance and discordance sets for the pairwise comparison of any two alternatives (A_p and A_q). The resultant concordance interval set C(p,q) consists of all the attributes for which A_p is preferred to A_q, and can be presented as:

C(p,q) = \{ j \mid v_{pj} \ge v_{qj} \}   [4.6]

where v_pj and v_qj are the weighted normalized ratings (from the matrix given in equation [4.5]) of the alternatives A_p and A_q, respectively, with respect to the jth attribute.
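Equations [4.4] and [4.5] can be sketched as follows, using the WE scores of Table 4.7 and the rounded weights given there; the code is an illustrative assumption, not the computations performed in the study.

```python
import numpy as np

# WE-1..WE-9 scores against C1..C4 (Table 4.7); all criteria are benefit-type here
X = np.array([
    [5, 5, 5, 5], [5, 5, 4, 5], [4, 3, 4, 5], [3, 3, 4, 3], [4, 3, 3, 3],
    [4, 4, 4, 5], [3, 2, 4, 2], [2, 4, 4, 2], [2, 3, 4, 3],
], dtype=float)
w = np.array([0.48, 0.25, 0.16, 0.11])   # rounded AHP weights from Table 4.7

R = X / np.sqrt((X ** 2).sum(axis=0))    # vector normalization, eq. [4.4]
V = R * w                                # weighted normalized matrix, eq. [4.5]
```

Each column of R then has unit Euclidean norm, and the first row of V comes out close to the WE-1 entries reported for relevance and measurability in Table 4.8 (small differences stem from weight rounding).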
The discordance set D(p,q), the complement of C(p,q), consists of all the attributes for which A_p is worse than A_q, and can be written as:

D(p,q) = \{ j \mid v_{pj} < v_{qj} \}   [4.7]

Step 3: Calculate concordance and discordance indexes
The concordance index C_pq shows the relative power of each concordance set by describing the degree of confidence in the pairwise decision A_p → A_q (i.e., A_p outranks A_q), and can be defined as:

C_{pq} = \sum_{j \in C(p,q)} w_j   [4.8]

Therefore, C_pq is essentially the sum of the weights of the attributes in equation [4.6]. The power of the discordance set can be quantified by the discordance index D_pq, which shows the degree of disagreement in the decision A_p → A_q, and can be written as:

D_{pq} = \frac{\sum_{j \in D(p,q)} \lvert v_{pj} - v_{qj} \rvert}{\sum_{j=1}^{n} \lvert v_{pj} - v_{qj} \rvert}   [4.9]

The numerator in equation [4.9] contains the attributes identified by equation [4.7].

Step 4: Defining the outranking relationships
The dominance of alternative A_p over A_q is stronger for higher C_pq and lower D_pq. The results of the ELECTRE 1 method in this study of selecting PIs for SMWU are summarized as outranking relationships between pairs of indicators. The calculated concordance and discordance indexes are compared with the means of these indexes as:

C_{pq} \ge \bar{C}   [4.10a]

and

D_{pq} \le \bar{D}   [4.10b]

where \bar{C} and \bar{D} are the averages of all the C_pq and D_pq, respectively. The outranking relationship between two alternatives holds only when both of the relations given in equations [4.10a] and [4.10b] are satisfied. The PIs may have different types of relationships with each other based on these conditions: if both equations hold, alternative p is better than (outranks) q; if neither holds, alternative p is indifferent to q; and if one holds and the other does not, p is incomparable to q. Figure 4.3 shows an example of these outranking relationships. The decision makers' boundary has been established based on the existing needs and data availability in the utilities participating in this study.
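Assembled end to end over the WE scores of Table 4.7, the steps above, together with the net indexes of equations [4.11] and [4.12], can be sketched as follows. This is an illustrative reconstruction with the rounded weights 0.48, 0.25, 0.16, 0.11, not the computer code used in the study, so entries can differ slightly from the reported matrices.

```python
import numpy as np

# WE-1..WE-9 scores against C1..C4 (Table 4.7) and rounded AHP weights
X = np.array([
    [5, 5, 5, 5], [5, 5, 4, 5], [4, 3, 4, 5], [3, 3, 4, 3], [4, 3, 3, 3],
    [4, 4, 4, 5], [3, 2, 4, 2], [2, 4, 4, 2], [2, 3, 4, 3],
], dtype=float)
w = np.array([0.48, 0.25, 0.16, 0.11])

V = X / np.sqrt((X ** 2).sum(axis=0)) * w        # eqs. [4.4]-[4.5]
m = len(V)
C = np.zeros((m, m))                             # concordance indexes, eq. [4.8]
D = np.zeros((m, m))                             # discordance indexes, eq. [4.9]
for p in range(m):
    for q in range(m):
        if p == q:
            continue
        conc = V[p] >= V[q]                      # concordance set, eq. [4.6]
        C[p, q] = w[conc].sum()
        diff = np.abs(V[p] - V[q])
        D[p, q] = diff[~conc].sum() / diff.sum() if diff.sum() > 0 else 0.0

# Outranking (eqs. [4.10a,b]): p outranks q when C_pq >= mean(C) and D_pq <= mean(D)
off_diag = ~np.eye(m, dtype=bool)
outranks = (C >= C[off_diag].mean()) & (D <= D[off_diag].mean()) & off_diag

# Net indexes (eqs. [4.11]-[4.12]): row sums minus column sums
Cp = C.sum(axis=1) - C.sum(axis=0)
Dp = D.sum(axis=1) - D.sum(axis=0)
```

Because WE-1 scores highest on every criterion, its concordance row is all ones and its discordance row all zeros, so it outranks every other WE indicator and tops both net indexes, consistent with its first rank in Table 4.10.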
Later, with improvements in the benchmarking process, the SMWU can include more PIs by extending the DMB. Conversely, smaller utilities can initiate the performance assessment process by encompassing fewer (top-level) PIs within the DMB.

Step 5: Overall ranking of performance indicators
The net outranking relationships can be developed by calculating a net concordance index (C_p) and a net discordance index (D_p) for each alternative. C_p measures the degree to which the dominance of an alternative over the other alternatives exceeds the dominance of the other (competing) alternatives over that alternative (i.e., A_p), and can be calculated as:

C_p = \sum_{k=1, k \ne p}^{m} C_{pk} - \sum_{k=1, k \ne p}^{m} C_{kp}   [4.11]

In the same way, D_p estimates the relative weakness of A_p relative to the other alternatives and can be written as:

D_p = \sum_{k=1, k \ne p}^{m} D_{pk} - \sum_{k=1, k \ne p}^{m} D_{kp}   [4.12]

The overall preferences (ranks) are established from the C_p and D_p values of each alternative: higher C_p and lower D_p values receive higher ranks. The final ranking is based on the rankings of the net concordance and net discordance values estimated from equations [4.11] and [4.12].

Figure 4.3 Outranking relations of water resources and environmental PIs showing DMB

4.3 Application of MCDA – An Example of Water Resources and Environmental PIs

4.3.1 Estimation of Criteria Weights using AHP

The weights of the selected criteria were determined with the help of AHP. The pairwise comparison matrix for the estimation of weights is shown in Table 4.5, using the rating scheme given in Table 4.3. The normalized comparison matrix is presented in Table 4.6. The weights of relevance, comparability, measurability, and understandability came out to be 0.5, 0.25, 0.15, and 0.1, respectively. The values of the consistency index (CI) and consistency ratio (CR) were found to be 0.007 and 0.01, respectively; a CR value of less than 10% affirms the consistency check.
Table 4.5 Pairwise comparison matrix for weight estimation using AHP

                       Relevance  Comparability  Measurability  Understandability
Relevance/ Importance  1          2              4              5
Comparability          1/2        1              2              3
Measurability          1/3        1/2            1              2
Understandability      1/5        1/3            1/2            1
Note: Consistent variability assumption; i.e., the values in this table are the averages of the rating values given by the decision makers.

[Figure 4.3 content: nodes WE 1 to WE 9 grouped at levels, with WE 1, WE 2, WE 3, WE 5, and WE 6 above the decision maker's boundary (DMB) and WE 4, WE 7, WE 8, and WE 9 below it; all the PIs below the DMB are outranked by all the PIs above it. Legend: an arrow from WE 2 to WE 1 means "WE 1 is better than WE 2"; other link styles mean "WE 1 is indifferent to WE 2" and "WE 1 is incomparable to WE 2". The same legend is applicable to all groups of PIs.]

Table 4.6 Normalized comparison matrix for weight estimation using AHP

                       Relevance  Comparability  Measurability  Understandability
Relevance/ Importance  0.51       0.52           0.53           0.45
Comparability          0.26       0.26           0.27           0.27
Measurability          0.13       0.13           0.13           0.18
Understandability      0.10       0.09           0.07           0.09

4.3.2 Development of Outranking Relationships using ELECTRE

Initially, 9 PIs were identified for the functional component of water resources and environment (WE). The scoring matrix (developed with the scoring system given in Table 4.2), based on expert judgment for all the categories, is presented in Table 4.7. However, as an example, only the scores for the WE category are considered in the following calculations. All the values are higher-the-better. The normalized weighted matrix obtained using equations [4.4] and [4.5] is presented in Table 4.8. Applying equations [4.6] and [4.7] to the values given in Table 4.8 yields the concordance and discordance interval sets presented in Table 4.9.

Table 4.7 The scoring matrix along with criteria weights (DM ranking; columns: (C1) Relevance/ Importance, (C2) Comparability, (C3) Measurability, (C4) Understandability; weights 0.48, 0.25, 0.16, 0.11)
WE WATER RESOURCES AND ENVIRONMENTAL WE-1 No.
of days of water restriction (%) 5 5 5 5 WE-2 Average Daily Per Capita Domestic Water Consumption 5 5 4 5 WE-3 Average Day Demand / Existing Water License Capacity 4 3 4 5 WE-4 Availability of water resources (%) 3 3 4 3 WE-5 Impact of pipe flushing on aquatic life 4 3 3 3 WE-6 Disposal of backwash water (% Residuals) 4 4 4 5 WE-7 Sector vise availability of water resources (domestic, industrial, etc.) 3 2 4 2 WE-8 GHG emissions from routine transport fuel emissions 2 4 4 2 WE-9 Per Capita Water Consumption (Overall) 2 3 4 3 PE PERSONNEL PE-1 Number of in house metering field FTEs1 / 1000 meters 4 4 4 5 PE-2 Water quality monitoring personnel (No/ 1000 tests/ year) 4 3 4 3 PE-3 Water resources and catchment management employee (No/106m3/year) 4 3 5 3 PE-4 Number of field FTEs*/ 100km length 4 3 5 5 PE-5 Number of field FTEs*/ 1000 ML treated water 4 3 4 5 PE-6 No of Lost Hours due to Field accidents/ 1000 field labour hours – (D)2 4 2 5 5 PE-7 No of Lost Hours due to Field accidents/ 1000 field labour hours – (T)3 4 2 5 5 PE-8 No. of sick days taken per field employee- (D) 4 3 5 5 PE-9 No. 
of sick days taken per field employee- (T) 4 3 5 5 PE-10 Total overtime field hours/ Total Paid field hours – (D) 4 3 4 5 PE-11 Total overtime field hours/ Total Paid field hours – (T) 4 3 4 5 PE-12 Personnel Training (Hours/employee/year) 4 3 5 3 PE-13 Working accidents (No/100 employee/year) 2 3 5 3 PE-14 No of Field accidents with lost time/ 1000 field labour hours – (D) 2 3 5 5 PE-15 No of Field accidents with lost time/ 1000 field labour hours – (T) 2 3 5 5 PE-16 Total available field hours/ total paid field hours – (D) 2 3 4 5 PE-17 Total available field hours/ total paid field hours – (T) 2 3 4 5 PE-18 % of field employee eligible for retirement per year – (D) 2 2 4 5 PE-19 % of field employee eligible for retirement per year – (T) 2 2 4 5 PE-20 Average Work Experience ratio PE-21 Employees per connection (No/1000 connections) 2 3 5 3 PE-22 Employees per volume of water supplied 2 3 5 3 PH PHYSICAL ASSETS PH-1 Metering level (%) 5 4 5 4 PH-2 Degree of automation (%) 5 3 5 3 PH-3 Raw water storage capacity (days) 5 4 4 3 PH-4 Treated water storage capacity at ADD (hrs) 4 3 4 5 PH-5 No of days treatment plant operated greater than 90% of its total capacity 4 3 4 5 PH-6 Pumping Utilization (%) 4 4 4 2 83 Table 4.7(Cont’d) The scoring matrix along with criteria weights DM Ranking Performance Indicator (PI) (C1) Relevance/ Importance (C2) Comparability (C3) Measurability (C4) Understandability Weights 0.48 0.25 0.16 0.11 PH-7 Remote control degree (%) 4 3 5 3 PH-8 Pump Station Energy Consumed KWH/ Total Pump Station HP 2 3 3 5 PH-9 Hydrant Density (No/Km) 3 3 5 2 PH-10 Valve density (No/Km) 3 2 5 2 PH-11 Metering density (No/ 1000 service connections) 2 3 5 3 PH-12 Treatment plant capacity (%) 3 5 4 2 OP OPERATIONAL OP-1 Service connection rehabilitation (%) 5 5 4 5 OP-2 Replaced valves (%/year) 5 3 5 3 OP-3 Mains Replaced (%/year) 5 3 5 3 OP-4 Mains Rehabilitation/ Renovation* (%/year) 5 3 3 3 OP-5 Hydrant Inspection (per year) 5 3 5 3 OP-6 Leakage (%/ year) 5 5 
4 3 OP-7 Cleaning of storage tanks (per year) 4 5 4 3 OP-8 Non-Revenue Water (L/ connection/ day) 5 3 3 5 OP-9 No of Main Breaks (No./ 100Km) 5 3 5 5 OP-10 Inoperable or leaking hydrants (%) 2 3 3 5 OP-11 Residential Customer Reading Efficiency 4 5 4 3 OP-12 Operational Meters 4 5 4 3 OP-13 Infrastructure Leakage Index (ILI) 5 1 1 3 OP-14 Network Inspection (per year) 3 3 3 3 OP-15 Pump Inspection (per year) 3 3 3 3 OP-16 Apparent losses per connection 3 3 1 3 OP-17 Apparent losses per system input volume 3 3 1 3 OP-18 Real losses per connection (l/ connection/ day) 3 3 1 3 OP-19 Real losses per main length (l/ Km/ day) 3 3 1 3 OP-20 % of Inoperable of Leaking Valves 2 3 3 5 OP-21 Customer reading efficiency 2 3 4 3 OP-22 Power Failure 2 3 4 2 WP WATER QUALITY AND PUBLIC HEALTH WP-1 No of Boil-Water Advisory Days 5 5 5 4 WP-2 Cumulative Length Cleaned as % of System Length 4 4 4 5 WP-3 Average Value of Turbidity in WDS (NTU) 5 3 4 4 WP-4 No of Total Coliform Occurrences in WDS 5 3 4 4 WP-5 THMs in water distribution system (mg/L) 5 3 3 4 WP-6 Residual chlorine in distribution system (mg/L) 5 3 4 4 WP-7 Turbidity of treated water (NTU) 5 3 4 4 WP-8 No of total coliform occurrences in Treated water 5 3 4 4 WP-9 Concentration of Nitrates in treated water (mg/L) 5 3 3 4 WP-10 Aesthetic water quality tests carried out (%) 3 3 3 2 WP-11 Microbiological water quality tests carried out (%) 3 3 3 2 WP-12 Chemical water quality tests carried out (%) 3 3 3 2 QS QUALITY OF SERVICE QS-1 Billing complaints / 1000 connections 5 5 4 3 QS-2 Service connection complaints/ 1000 people served 4 3 4 3 QS-3 Number of water pressure complaints/ 1000 people served 4 5 4 5 84 Table 4.7(Cont’d) The scoring matrix along with criteria weights DM Ranking Performance Indicator (PI) (C1) Relevance/ Importance (C2) Comparability (C3) Measurability (C4) Understandability Weights 0.48 0.25 0.16 0.11 QS-4 Number of water quality complaints/ 1000 people served 4 5 4 5 QS-5 Total response to reported 
complaints (%) 4 5 4 3 QS-6 Number of Unplanned System Interruptions/ 100 Km main length 4 3 4 5 QS-7 Unplanned Maintenance Hours/ Total maintenance hours (%) 4 3 3 5 QS-8 Population coverage (%) 4 5 3 4 QS-9 Quality of water supplied 4 5 4 3 QS-10 Number of water quality complaints by reason/ 1000 served 2 2 4 5 QS-11 Total complaints per connection (No/1000/year)** 3 4 4 2 QS-12 Continuity of supply (%) 2 5 4 3 QS-13 Aesthetic test compliance 3 3 3 3 QS-14 Microbiological test compliance 3 3 4 3 QS-15 Physical-chemical test compliance 3 3 3 3 QS-16 Radioactive test compliance 2 2 2 2 FE FINANCIAL AND ECONOMIC FE-1 O&M Cost ('000)/ Km Length ($/Km) 5 3 4 5 FE-2 O&M cost of water treatment ($/ Million liters of treated water) 4 4 4 5 FE-3 Revenue per unit volume of supplied water ($/m3) 5 4 4 3 FE-4 Water rate for a typical size residential connection using 250 m3/year 3 4 5 5 FE-5 Operating cost coverage ratio 4 3 4 4 FE-6 Debt service ratio (%) 4 3 4 4 FE-7 NRW by volume 4 3 4 4 FE-8 Liquidity (Current ratio) 2 3 4 4 FE-9 5 year running average capital reinvestment/ replacement value – (D) 2 3 4 5 FE-10 Cost of O&M of fire hydrants/ total number of fire hydrants 2 3 4 5 FE-11 Metering O&M cost 2 3 4 5 FE-12 Pump station O&M cost ('000)/ total pump station horsepower 2 3 4 5 FE-13 Cost of Customer Communication/ Population Served 2 3 3 5 FE-14 Cost of water quality monitoring/ population served ($/ person) 2 4 4 5 FE-15 Chemical Cost / ML Treated ($/ million liters of treated water) 2 4 4 5 FE-16 Water Revenue per employee 2 2 4 3 85 Table 4.8 The normalized weighted matrix DM Ranking Performance Indicator (PI) (C1) Relevance/ Importance (C2) Comparability (C3) Measurability (C4) Understandability Weights 0.48 0.25 0.16 0.11 WE-1 No. 
of days of water restriction (%) 0.214 0.049 0.068 0.108
WE-2 Average Daily Per Capita Domestic Water Consumption 0.214 0.049 0.054 0.108
WE-3 Average Day Demand / Existing Water License Capacity 0.171 0.030 0.054 0.108
WE-4 Availability of water resources (%) 0.128 0.030 0.054 0.065
WE-5 Impact of pipe flushing on aquatic life 0.171 0.030 0.041 0.065
WE-6 Disposal of backwash water (% Residuals) 0.171 0.040 0.054 0.108
WE-7 Sector-wise availability of water resources (domestic, industrial, etc.) 0.128 0.020 0.054 0.043
WE-8 GHG emissions from routine transport fuel emissions 0.085 0.040 0.054 0.043
WE-9 Per Capita Water Consumption (Overall) 0.085 0.030 0.054 0.065

Table 4.9 Concordance and discordance interval sets for performance indicators in WE category

Concordance interval sets:
C(1,2) = {1,2,3,4}  C(1,3) = {1,2,3,4}  C(1,4) = {1,2,3,4}  C(1,5) = {1,2,3,4}  C(1,6) = {1,2,3,4}  C(1,7) = {1,2,3,4}  C(1,8) = {1,2,3,4}  C(1,9) = {1,2,3,4}
C(2,1) = {1,2,4}  C(2,3) = {1,2,3,4}  C(2,4) = {1,2,3,4}  C(2,5) = {1,2,3,4}  C(2,6) = {1,2,3,4}  C(2,7) = {1,2,3,4}  C(2,8) = {1,2,3,4}  C(2,9) = {1,2,3,4}
C(3,1) = {4}  C(3,2) = {3,4}  C(3,4) = {1,3,4}  C(3,5) = {1,3,4}  C(3,6) = {1,3,4}  C(3,7) = {1,2,3,4}  C(3,8) = {1,3,4}  C(3,9) = {1,2,3,4}
C(4,1) = 0  C(4,2) = 0  C(4,3) = {1,2}  C(4,5) = {1,2,4}  C(4,6) = {1,2}  C(4,7) = {1,2,4}  C(4,8) = {1,4}  C(4,9) = {1,2,4}
C(5,1) = 0  C(5,2) = 0  C(5,3) = {1,2}  C(5,4) = {1,2,3,4}  C(5,6) = {1,2}  C(5,7) = {1,2,4}  C(5,8) = {1,2,4}  C(5,9) = {1,2,4}
C(6,1) = {4}  C(6,2) = {3,4}  C(6,3) = {1,2,3,4}  C(6,4) = {1,2,3,4}  C(6,5) = {1,2,4}  C(6,7) = {1,2,3,4}  C(6,8) = {1,2,3,4}  C(6,9) = {1,2,3,4}
C(7,1) = 0  C(7,2) = {3}  C(7,3) = {3}  C(7,4) = {3}  C(7,5) = {3}  C(7,6) = {3}  C(7,8) = {1,3,4}  C(7,9) = {1,3}
C(8,1) = 0  C(8,2) = {3}  C(8,3) = {2,3}  C(8,4) = {2,3}  C(8,5) = {2,3}  C(8,6) = {2,3}  C(8,7) = {2,3,4}  C(8,9) = {1,2,3}
C(9,1) = 0  C(9,2) = {3}  C(9,3) = {2,3}  C(9,4) = {3,4}  C(9,5) = {3,4}  C(9,6) = {3}  C(9,7) =
{2,3,4}  C(9,8) = {1,3,4}

Discordance interval sets:
D(1,2) = 0  D(1,3) = 0  D(1,4) = 0  D(1,5) = 0  D(1,6) = 0  D(1,7) = 0  D(1,8) = 0  D(1,9) = 0
D(2,1) = {3}  D(2,3) = 0  D(2,4) = 0  D(2,5) = 0  D(2,6) = 0  D(2,7) = 0  D(2,8) = 0  D(2,9) = 0
D(3,1) = {1,2,3}  D(3,2) = {1,2}  D(3,4) = {2}  D(3,5) = {2}  D(3,6) = {2}  D(3,7) = 0  D(3,8) = {2}  D(3,9) = 0
D(4,1) = {1,2,3,4}  D(4,2) = {1,2,3,4}  D(4,3) = {3,4}  D(4,5) = {3}  D(4,6) = {3,4}  D(4,7) = {3}  D(4,8) = {2,3}  D(4,9) = {3}
D(5,1) = {1,2,3,4}  D(5,2) = {1,2,3,4}  D(5,3) = {3,4}  D(5,4) = 0  D(5,6) = {3,4}  D(5,7) = {3}  D(5,8) = {3}  D(5,9) = {3}
D(6,1) = {1,2,3}  D(6,2) = {1,2}  D(6,3) = 0  D(6,4) = 0  D(6,5) = {3}  D(6,7) = 0  D(6,8) = 0  D(6,9) = 0
D(7,1) = {1,2,3,4}  D(7,2) = {1,2,4}  D(7,3) = {1,2,4}  D(7,4) = {1,2,4}  D(7,5) = {1,2,4}  D(7,6) = {1,2,4}  D(7,8) = {2}  D(7,9) = {2,4}
D(8,1) = {1,2,3,4}  D(8,2) = {1,2,4}  D(8,3) = {1,4}  D(8,4) = {1,4}  D(8,5) = {1,4}  D(8,6) = {1,4}  D(8,7) = {1}  D(8,9) = {4}
D(9,1) = {1,2,3,4}  D(9,2) = {1,2,4}  D(9,3) = {1,4}  D(9,4) = {1,2}  D(9,5) = {1,2}  D(9,6) = {1,2,4}  D(9,7) = {1}  D(9,8) = {2}

Using equations [4.8] and [4.9], the concordance and discordance indexes are calculated and presented as the following matrices (rows and columns ordered WE-1 to WE-9; '-' marks the diagonal):

C =
  -      1.000  1.000  1.000  1.000  1.000  1.000  1.000  1.000
  0.837  -      1.000  1.000  1.000  1.000  1.000  1.000  1.000
  0.252  0.415  -      0.891  0.891  0.891  1.000  0.891  1.000
  0.000  0.000  0.585  -      0.837  0.585  0.837  0.272  0.837
  0.000  0.000  0.585  1.000  -      0.585  0.837  0.837  0.837
  0.252  0.415  1.000  1.000  0.837  -      1.000  1.000  1.000
  0.000  0.163  0.163  0.163  0.163  0.163  -      0.891  0.639
  0.000  0.163  0.273  0.273  0.273  0.273  0.524  -      0.748
  0.000  0.163  0.273  0.415  0.415  0.163  0.524  0.891  -

D =
  -      0.000  0.000  0.000  0.000  0.000  0.000  0.000  0.000
  1.000  -      0.000  0.000  0.000  0.000  0.000  0.000  0.000
  1.000  1.000  -      0.120  0.140  1.000  0.000  0.06   0.00
  1.000  1.000  0.880  -      1.00   1.00   0.26   0.31   0.23
  1.000  1.000  0.860  0.00   -      1.00   0.15   0.12   0.13
  1.000  1.000  0.000  0.00   0.70   -      0.00   0.00   0.00
  1.000  1.000  1.000  0.74   0.85   1.00   -      0.31   0.43
  1.000  1.000  0.94   0.79   0.88   1.00   0.69   -      0.70
  1.000  1.000  1.00   0.77   0.87   1.00   0.57   0.30   -

The calculated concordance and discordance indexes presented above were compared with the means of these indexes, whose values were found to be 0.630 and 0.476, respectively. The outranking relationships for each alternative (indicator) were found by comparing the C_pq and D_pq values in the above matrices with the mean values using equations [4.10a] and [4.10b]. The results are shown in Figure 4.3.

The net outranking relationships were developed by calculating the net concordance index (C_p) and net discordance index (D_p) for the selected indicators within the DMB using equations [4.11] and [4.12]. The results are presented in Table 4.10. The final ranking of the alternatives can also be established by plotting the net concordance against the net discordance.

Table 4.10 Net outranking of selected indicators

Indicator  Net concordance  Net discordance  Rank (net concordance)  Rank (net discordance)  Final rank
WE-1        2.660           -4.000           1                       1                       1
WE-2        2.007           -2.000           2                       2                       2
WE-3       -1.137            2.000           4                       4                       4
WE-5       -2.557            3.212           5                       5                       5
WE-6       -0.973            0.788           3                       3                       3

4.4 Development of Indicators

The results of the MCDA for each category of PIs are described in the following sections. The discussion is limited to the selected PIs confined within the DMB in this study. Details of the remaining PIs outside the DMB are given in Chapters 2 and 3.

4.4.1 Water Resources and Environment (WE) Indicators

The results of the MCDA showing the outranking relations amongst the different WE indicators are presented in Figure 4.3.
Liters per capita per day (lpcd) is a very important indicator for assessing the present and future water requirements of any WSS, but it needs to be used carefully. The 'lpcd' calculated on the basis of total water requirement would be misleading. For example, in one of the utilities that took part in this study, 95% of all connections belong to single-family residences, whereas 23% of the total water consumption is used to meet the requirements of irrigation users, which make up just 1.4% of the total connections. Similar water consumption patterns are common in utilities operating in the Okanagan Basin, where a small percentage of irrigation users consumes a substantial portion of the total water. Therefore, 'lpcd for domestic users' was found to be a more appropriate indicator for assessing per capita water consumption. According to the NWWBI 2010 public report, per capita water consumption for residential consumers varies between 168 and 593 lpcd in Canada (AECOM 2012). Water restrictions (i.e., sprinkler regulations) of 0 to 365 days in a year (depending on rainfall intensity and the availability of water resources) have been promulgated by numerous water utilities around the world to conserve limited freshwater resources. These restrictions need to be effectively implemented even in regions with ample water resources in order to achieve the long-term sustainability of water resources. The water license issued by the concerned regulatory authority is another important indicator for ensuring the sustainable use of water resources. This indicator compares the average daily demand with the allocated water license capacity (in terms of % of utilized capacity) to foresee future needs. Canadian water utilities have been using from less than 1% to 75% of their allocated capacity (AECOM 2012). Utilities with higher values need short-term or long-term planning to improve their source water availability, along with the implementation of their water conservation strategies.
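The distortion described above can be illustrated with a small calculation; the supply volume and population below are assumed round numbers for illustration, not the participating utility's actual data, while the 23% irrigation share is taken from the text.

```python
# Hypothetical illustration of how a small irrigation share inflates the
# overall per-capita consumption figure (assumed numbers, not utility data)
total_supply_m3_per_day = 10_000       # assumed total system input volume
irrigation_share = 0.23                # irrigation share of consumption (from the text)
population_served = 20_000             # assumed residential population

overall_lpcd = total_supply_m3_per_day * 1_000 / population_served
domestic_lpcd = total_supply_m3_per_day * (1 - irrigation_share) * 1_000 / population_served
# overall is 500 lpcd, while the domestic figure is roughly 385 lpcd
```

Dividing total supply by population attributes the irrigation volume to every resident, which is why the report's 'lpcd for domestic users' strips it out first.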
SMWU may have lower overall water demands and population growth rates than LWU. Despite this, a utility approaching its maximum allocated license capacity should evaluate all possible source water alternatives to meet its future water requirements. Selecting a new source may increase personnel requirements for catchment management and entail more complex water treatment operations (if the new source's water quality is inferior to that of the existing source). Consequently, utility management would need to include additional PIs under other categories as well (i.e., personnel, operational, quality of service, water quality). Two indicators for environmental sustainability have been included in the DMB shown in Figure 4.3. The first is the disposal of backwash water, i.e., the percentage of water treatment plant residuals discharged into the receiving water body. Residuals (waste) generated from water treatment facilities may contain toxic substances such as aluminum and manganese, due to the use of coagulants in such facilities, which can adversely impact aquatic life. Therefore, to avoid such adverse impacts, these residuals should be properly monitored and managed using this indicator (ENR 1987). The second indicator, developed in this study, is the impact of pipe flushing on aquatic life. Periodic flushing is an important activity to keep the mains healthy and to provide safe water to the community by removing biofilm and corrosion tubercles. However, the flushed water may carry high chlorine concentrations or other pollutants. The indicator is based on a field observation by the operational staff of a participating utility, in which chlorinated water used for flushing of mains entered natural surface drains supporting aquatic ecosystems. Utilities should avoid routine flushing programs during dry seasons, when water volumes in receiving freshwater bodies are low.
According to the 2011 Water Research Foundation report "Energy Efficiency Best Practices for North American Drinking Water Utilities", SMWU might not be aware of their potential to manage their energy budget. In general, 80 to 90% of the energy is used in the transmission and distribution of water from the treatment plant or source to the consumer. SMWU should therefore carefully monitor the efficiency of their pumping units for energy management. To implement an efficient energy management plan, the first step is to keep a record of energy consumption for transmission, distribution, and treatment in kilowatt hours (kWh). This indicator is also included in the DMB under the WE category in Figure 4.3. However, the data for this PI is presently not available for performance assessment.

4.4.2 Personnel/Staffing (PE) Indicators

The human resources of a water utility, just like those of any other organization, play an important role in meeting performance objectives. As SMWU have relatively smaller spatial boundaries than LWU, operational staff may sometimes be allocated to more than one specific task. For example, the field full-time employees (FTEs) responsible for service connection repairs will also look into issues related to hydrant leakage, main breaks, and catchment management. Therefore, indicators in this category must be selected carefully, avoiding too many detailed (activity-specific) indicators on one extreme, and the lumping of too many into one on the other. Metering is one of the most important operational components of any water utility; it is directly associated with customer satisfaction, reduction and control of water loss, and an efficient billing system. FTEs working in the field to ensure an efficient metering system came out as the top-level indicator in this category (Figure 4.4).
At the subsequent level, "FTEs per 100 km of pipe length" was found to be a useful indicator for comparing total field staffing strength with other utilities. According to the NWWBI 2012 public report, the value of this indicator ranged between 1.5 and 10.4 for the year 2010 in large Canadian utilities. This large variation might be due to higher numbers of customer complaints and operational issues in older systems. The other indicators at this level, according to the summary of the multicriteria analysis results shown in Figure 4.4, are the numbers of working hours lost to field accidents and to sick leave. However, these indicators need to be considered separately for distribution and treatment, as some SMWU operate without a conventional treatment facility, primarily due to the availability of either freshwater sources with acceptably low turbidities (e.g., WSSs in the Okanagan Basin) or groundwater supplies. Secondly, SMWU also face financial challenges in installing conventional treatment for all of their WSSs, which can increase water rates considerably. In such cases, the utility requires fewer field personnel, needed only to maintain residual chlorine levels. Assessing a utility's performance and/or comparing it with other utilities on the combined basis of such indicators might be misleading.

Figure 4.4 Outranking relations of personnel PIs showing DMB

Indicators of personnel allocated to water quality monitoring, water treatment, and management of water resources and catchment were ranked at the third level in Figure 4.4. In the case of surface water sources (such as those of the water utilities operating in the Okanagan Basin), catchment management is an important activity; the personnel are responsible for the maintenance of drainage channels and natural surface slopes to ensure efficient runoff collection. The number of water quality monitoring personnel also depends on the type and level of treatment facilities, source water quality, and the age of the distribution mains.
Therefore, performance comparison amongst utilities should be carried out carefully. Indicators of overtime hours and training of personnel were found at level 4 as per the MCDA results shown in Figure 4.4. Training of personnel in SMWU is an extreme necessity, as highly skilled and qualified personnel are difficult to hire and retain in smaller towns. Therefore, locals (established in the utility area) can be hired and trained to obtain long-term benefits from them. This indicator can be compared with other SMWU in terms of the number of training hours attended per employee during the assessment year.

4.4.3 Physical (PH) Indicators

This category of indicators reports on the performance of the physical assets of the water utility. The outranking relations in this category, distributed at different levels circumscribed by the DMB, are shown in Figure 4.5. The indicators of level of metering and automation were found to be the most important ones. The metering level (i.e., percentage of metering) plays a significant role in assessing NRW, and is also a step towards sustainability through the implementation of scale-based water pricing defined by water consumption. Automation of various components removes manual operation from a WSS; therefore, a higher degree of automation means less need for operational staff. This indicator is primarily important for developed countries, where even SMWU possess a high degree of automation. For example, both utilities from the Okanagan Basin that participated in this study revealed that almost all physical components (i.e., pumps, motors, flow measuring devices, treatment units, etc.) of their water systems are fully automated. The PIs found at level 2 in Figure 4.5 are associated with the capacities of different components of a WSS.
The PIs include the capacity of raw water storage reservoirs and the capacity of treated water reservoirs. No outranking relationships were observed between these indicators, which means they are not mutually comparable and are thus equally important. The use of these indicators for SMWU (relying on clear surface water sources) also needs care in defining the type of reservoirs. For example, for a water source with primary chlorination as the only treatment, the storage reservoirs downstream of chlorination should be counted as treated water storage reservoirs, even though the water has not passed through conventional treatment units. The third indicator included at level 2 is the remaining capacity of the water treatment facility. All these indicators are important for long-term planning to enhance the capacities of these physical assets with population growth and the resulting increase in water demands by various sectors, including domestic, industrial, commercial, public, and agricultural. The remaining water treatment plant capacity, in particular, needs to be assessed: due to maximum daily demand or peak hourly demand factors higher than those used in the design of the treatment facility, this indicator may reveal the need for an additional unit much earlier than the planned year. According to NWWBI (2010) data, treatment plant capacity is measured in terms of the number of days the plant operated at more than 90% of its capacity during a year. The values ranged between 0 and 213 days, with a median of around 8; these results show that most Canadian utilities have excess treatment capacity to meet future needs.

Figure 4.5 Outranking relations of physical PIs showing DMB

The indicators found at the third level in Figure 4.5 are the degree of remote control and the utilization of pumps.
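The NWWBI-style utilization measure described above (days per year operating above 90% of rated capacity) can be sketched directly. The function name and the flow series in the test are hypothetical, not utility data:

```python
def days_above_90pct_capacity(daily_flows_ml, rated_capacity_ml, threshold=0.9):
    """Count the days in the assessment year on which the treatment plant
    processed more than `threshold` of its rated capacity (0-365). Low
    counts, like the Canadian median of about 8 days, indicate spare
    capacity for future demand growth."""
    return sum(1 for flow in daily_flows_ml if flow > threshold * rated_capacity_ml)
```

A utility whose count drifts upward year over year is approaching the point where an additional treatment unit is needed, possibly earlier than the planned year.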
Degree of remote control was given importance in the decision-making process because the utility managers thought that new equipment installed in the future might be remote controlled. Low pressure zones are not uncommon in hilly or rolling terrains such as the Okanagan Basin under study. Therefore, pumping utilization is perceptibly an important indicator for pressure management in low pressure zones and for meeting demand on days of high consumption. However, the data for these PIs is not currently available for PA.

4.4.4 Operational (OP) Indicators

The indicators in this category are amongst the most important PIs for ensuring satisfactory performance of a water utility. The results of the MCDA, in the form of outranking relationships for the PIs in the operational category, are summarized in Figure 4.6. The number of main breaks per 100 km of main length and the percentage of service connections that have gone through the rehabilitation process were found to be the top-level indicators. Both indicators are important for the control of water loss and for ensuring a desirable quality of service to consumers. In Canadian utilities, the number of breaks ranges from 1 to 20 per 100 km of pipe length; these results reflect large variations in pipe conditions around the country (AECOM 2012). A major portion of the supplied water is lost through leaking service connections; the participating utilities also observed this, with large numbers of complaints and consequent repairs of service connections. One important finding, however, needs to be mentioned here: it is important to differentiate between a service connection repair and a complaint originating from an in-house plumbing problem. Field staff of the participating utilities revealed that most of the complaints were associated with the latter rather than actual service connection leakage.
At the subsequent level, the PIs of mains subjected to leakage, inoperable hydrants, and non-revenue water are placed in Figure 4.6. Non-revenue water (NRW) is calculated in units of liters/connection/day, based on the perception that most water loss occurs at service connections (Hamilton et al. 2006). NRW might not be a suitable indicator of water loss where unauthorized water consumption is high, but it can be considered a useful financial indicator (Kanakoudis and Tsitsifli 2010). In this study, all the participants agreed on the usefulness of the Infrastructure Leakage Index (ILI), but expressed reservations about the limitations associated with estimating apparent and real water losses. According to them, no monitoring structure is presently available to estimate accurate values of such detailed water losses, due to metering inaccuracies, data handling errors, leakage from transmission and distribution mains, leakage and overflow from storage reservoirs, etc. Therefore, the indicators of ILI, apparent losses, and real losses came out at lower levels in the MCDA in Figure 4.6, mainly due to the measurability criterion. At level 3, the indicators of the percentages of mains replaced, rehabilitated, and renovated during the assessment period were found. This is essentially a stepwise asset management process of prioritizing water mains depending on their condition, age, and the environmental conditions around them. For an individual water main, the process starts with renovation, i.e., when the pipe is found structurally sound and the limited damage can be repaired with simple methods such as sealing. In SMWU, the condition of smaller diameter pipes is mostly worse due to cracked joints or hydrogen sulfide corrosion, and such pipes need rehabilitation, such as slip-lining with a new pipe, interior lining with cement, or installing cured-in-place liners, among others (NRC 2003).
Figure 4.6 Outranking relations of operational PIs showing DMB

Finally, pipe replacement is recommended in the case of even worse pipe conditions. Therefore, it is important to consider these indicators separately even in the case of SMWU; however, it was decided to join the two indicators of rehabilitation and renovation into one, based on the observation that smaller-sized mains mainly undergo renovation. The other two PIs observed at this level are the percentage of valves replaced and hydrant inspection. The hydrant inspection indicator is calculated as the percentage of hydrants inspected during a year; a hydrant inspected more than once is counted as many times as it was inspected, so the value can exceed 100%. A wide range of values, between 6% and 500%, has been reported for Canadian water utilities in the NWWBI (2010) public report. Indicators found at level 4 in Figure 4.6 are cleaning of storage tanks, meter reading efficiency, and percentage of operational meters. The first is important from a public health security point of view, whereas the second ensures accurate billing and the physical working of meters. The third indicator is important for identifying the structural condition of meters by estimating the out-of-order meters during the assessment period.

4.4.5 Water Quality and Public Health Indicators

A comparatively short list of PIs was selected through initial screening, covering the major health-related indicators. The number of boil water advisories during the assessment period was found to be the top-level indicator, outranking all others in Figure 4.7. This indicator is an indirect measure of the water quality status of a WSS. According to Water Canada (2013), the WSSs in British Columbia have gone through the maximum number of boil water advisories compared to other provinces.
Moreover, most of the water utilities in British Columbia are SMWU with populations of less than 50,000. The reasons for such advisories in the Okanagan Basin (the study area), as reported by Interior Health Canada (2013), include source water contamination, flushing of hydrants, construction, repair and maintenance works, equipment failure, and inadequate treatment. Turbidity and total coliforms in both the treated water and the distribution system, and residual chlorine in the distribution system, were placed at level 2 (Figure 4.7). All five indicators at this level were found indifferent from each other, which shows their equal importance in this category. Both turbidity and fecal coliforms (FCs) are defined as pollution indicators by USEPA (2013); their presence in surface water is unavoidable. They are removed using conventional water treatment processes, including coagulation, sedimentation, filtration, and disinfection. Average turbidity ranges between 0.01 NTU and 1.38 NTU in Canada, which shows overall acceptable aesthetic water quality; however, as many as 25 days with total coliforms were reported in the NWWBI (2010) public report. Residual chlorine is added to avoid any possibility of recontamination through cross-connections within the distribution system. Moreover, in the case of higher turbidity levels at the source, the chlorine dose is increased. Higher concentrations of residual chlorine may react with naturally occurring organic matter and consequently increase the possibility of disinfection by-products (DBPs), including trihalomethanes (THMs). Higher concentrations of THMs have negative impacts on human health, including cancer. USEPA has limited the total THM concentration to less than 80 parts per billion (ppb) in treated water (USEPA 2013). Nitrates are also associated with human health risks, and their maximum allowable concentration should be less than 45 mg/L in drinking water (Health Canada 2013).
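The two limits just cited can be wrapped in a simple per-sample compliance check. A sketch only; the constant and function names are illustrative:

```python
THM_LIMIT_PPB = 80.0       # USEPA limit on total THMs in treated water
NITRATE_LIMIT_MG_L = 45.0  # Health Canada maximum for nitrate in drinking water

def sample_compliant(total_thm_ppb, nitrate_mg_l):
    """True when a treated-water sample meets both limits cited above."""
    return total_thm_ppb < THM_LIMIT_PPB and nitrate_mg_l < NITRATE_LIMIT_MG_L
```

Counting the fraction of samples that fail such a check over the assessment period is one way to turn the concentration limits into a reportable PI.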
The PIs covering the concentrations of THMs and nitrates, along with an important indicator for protecting public health (i.e., cleaning/flushing of mains), came out at level 3 in the water quality and public health indicators, as shown in Figure 4.7.

Figure 4.7 Outranking relations of water quality and public health PIs showing DMB

It is important to mention here that none of the indicators in this category can be considered less important. They outrank each other due to higher or lower monitoring frequency (higher in the case of turbidity, lower in the case of THMs and nitrates), but all are important for reducing or eliminating public health risk from drinking water. The indicators proposed by IWA (2006) in terms of the percentage of aesthetic, microbiological, or chemical tests carried out (level 4 indicators, outside the DMB) and their compliance with standards do not provide a clear picture of water quality. For example, under the aesthetic tests category, if pH is monitored at a higher frequency than water colour and always complies with standards, the result will show an overall satisfactory picture of water quality, even though colour might not always comply. These indicators may be suitable where there are several sampling locations and higher monitoring frequencies, as in LWU.

4.4.6 Quality of Service (QS) Indicators

The quality of service provided to the public should be efficient, and it needs to be maintained through the service period to ensure customer satisfaction. The outranking relationships between different PIs in this category are presented in Figure 4.8. The top Level-1 indicators in this category were found to be customer complaints regarding low pressure, deteriorated water quality, and billing-related issues.
It is important to mention a field observation from the participating water utilities: in most cases, pressure and water quality complaints originate from plumbing issues within the building line (beyond the service connection), and not from distribution system inefficiencies. Therefore, special care is needed to include only the complaints generated by distribution system failures in the performance assessment process. In this connection, a comprehensive customer complaints work order considering all possible types of complaints could be extremely helpful. Based on the observations from work orders, a detailed model to manage the risk of customer satisfaction is developed in Chapter 7. Unplanned interruptions and unplanned maintenance hours were placed at Level-2 in the outranking process by the decision makers. People in developed countries understand the need for maintenance of their water systems; nevertheless, maintenance hours should be planned and customers must be informed well before the start of maintenance operations. A higher number of unplanned maintenance hours may lead to a larger number of complaints, which will affect the performance of the utility in the benchmarking process. Response to customer complaints and the quality of water supplied were found at Level-3 in the outranking process by the decision makers. The important parameters in this regard are the type of complaint, the percentage of complaints responded to, the duration to resolve the complaints, the actual reason for the complaint at the site, and the percentage of complaints resolved. Customer satisfaction is directly associated with an efficient response to complaints. In the case of privately owned water systems, the cost of water is usually very high, as in England and Wales (OFWAT 2012); in such systems, an intensive mechanism is usually adopted to ensure the quickest possible response to complaints.
In the case of SMWU, higher response times are common; however, almost 100% of complaints were responded to in the participating utilities to ensure customer satisfaction. Quality of water supplied is an indicator (QS9 at Level-3) that can be calculated by checking the compliance of the water quality (aesthetic, microbiological, and chemical) with the applicable standards. If the data is available, QS9 can be replaced with the detailed PIs (QS10, 11, and 12).

Figure 4.8 Outranking relations of quality of service PIs showing DMB

4.4.7 Financial and Economic (FE) Indicators

According to ADB (1997), any project is financially sustainable if sufficient funds are available (from user charges and/or budget sources) to meet all its resource and financial obligations. According to WB (2012), it is more important for a smaller water utility to compare its operating costs with the revenues it is generating, and to check whether or not it is servicing its debts. A list of 46 financial indicators has been provided by IWA (2006), splitting the cost by type (i.e., main functions of the water utility, technical functions, etc.), which might not be essential for SMWU in assessing financial sustainability. The DMB encompassing 7 important financial indicators is shown in Figure 4.9. Operation and maintenance (O&M) cost distributed over total main length (for benchmarking) and water rates were found to be the top Level-1 indicators as per the MCDA results. In the case of SMWU, the denominator might be misleading, due to the relatively shorter overall main length on one hand and the higher O&M cost resulting from lower economies of scale on the other. The situation is similar for the next two indicators at Level-2: the O&M cost per million liters of treated water, and the revenue per unit volume of supplied water.
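The unit-cost indicators above reduce to simple ratios; a sketch with hypothetical figures follows. The caution about economies of scale applies to the interpretation of the results, not the arithmetic:

```python
def om_cost_per_km(annual_om_cost, total_main_length_km):
    """Level-1 FE indicator: annual O&M cost distributed over total
    main length. Short networks inflate this ratio for SMWU."""
    return annual_om_cost / total_main_length_km

def om_cost_per_ml_treated(annual_treatment_om_cost, treated_volume_ml):
    """Level-2 FE indicator: treatment O&M cost per megaliter of
    treated water."""
    return annual_treatment_om_cost / treated_volume_ml
```

Because both denominators are small for SMWU, these values should be benchmarked against utilities of similar size rather than against LWU directly.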
Comparison with larger utilities for these PIs should be done carefully for performance benchmarking.

Figure 4.9 Outranking relations of financial and economic PIs showing DMB

Indicators at Level-3 are essentially different types of ratios between expenditures, revenues, and debts (Figure 4.9). Operating cost coverage, the ratio between total annual operational revenues and total annual operating cost, provides information about the financial performance of utility operations. The other two PIs included at Level-3 are the debt service ratio and the liquidity ratio. The ability to pay debt service has become critical for any utility, as debt has become an important instrument for capitalizing utility operations. The liquidity ratio is the ratio between current assets and current liabilities. Finally, NRW as a percentage of total input volume was also included, as an indicator of the economic loss of water. NRW may not present a realistic picture of water losses in the distribution system; however, it provides information on the volume of water that was lost without being charged.

4.5 Final Ranking of Selected Indicators

It can be seen from Figures 4.2 to 4.8 that preferences amongst different PIs were defined within the levels (dashed boundaries) and then within the overall DMB for each category. The concept of net outranking relationships using complementary ELECTRE analysis has been explained above in step 5, Section 2.4, and in the calculation example of the water resources and environmental indicators in the attached annexure. The relative dominance and relative weaknesses of each PI with respect to the other PIs in the same functional component have been established in terms of concordance and discordance indices using equations [4.11] and [4.12], respectively. The final preferences for all the categories of PIs are shown in Figure 4.10 (a-g).
The solid line in the plots shown in Figure 4.10 is drawn for the final ranking of the PIs, i.e., PIs with almost similar concordance and discordance indices can be ranked based on their relative distance from this line. The final ranks of the selected PIs (within the DMB in Figures 4.2 to 4.8) are presented in Table 4.11. For performance assessment, the ranking of these PIs can be re-evaluated for weight estimation based on the relative importance (given by the decision makers) of the PIs within the functional component. Instead, the rankings shown in Table 4.11 have been developed from the results of the ELECTRE method, where other criteria such as comparability, measurability, and understandability have also been considered for the selection of PIs.

4.6 Utilization of Selected Indicators

By adopting good record keeping practices (which are not very rare in the water utilities under study) and effectively utilizing this data, the selected PIs in Table 4.3 can be calculated. However, conventional methods of performance benchmarking, such as regression analysis (based on ample historical data for each indicator), cannot be used at the outset for SMWU. Therefore, for the indicators for which benchmarking data is available, the performance of each indicator can be compared and assigned a relative rank. For the others, the utility can establish its own benchmarks using literature or expert knowledge. Using suitable methods, the weights of the PIs under each category can be determined.
Table 4.11 Final ranking of selected PIs under DMB for SMWU (PIs listed in rank order; a description of each indicator can be seen in Table 4.1)

Water Resources & Environmental: WE1, WE2, WE6, WE3, WE5, WE4
Personnel: PE1, PE4, PE6, PE8, PE9, PE7, PE5, PE3, PE11, PE10, PE2, PE12
Physical: PH1, PH2, PH3, PH5, PH4, PH7, PH6
Operational: OP9, OP1, OP10, OP8, OP6, OP2, OP3, OP5, OP4, OP7, OP11, OP12
Water Quality & Public Health: WP1, WP3, WP4, WP6, WP7, WP8, WP5, WP9, WP2
Quality of Service: QS1, QS3, QS4, QS6, QS7, QS8, QS5, QS9, QS2
Financial & Economic: FE4, FE1, FE3, FE2, FE5, FE6, FE7

A conceptual cognitive map, given as an example for estimating the water resources and environmental sustainability index, is shown in Figure 4.11. The figure shows that an increase in some PIs has a positive impact on the WEI (e.g., WE1, number of days of water restrictions), whereas some PIs need to be reduced or controlled (e.g., WE4, energy consumption) to improve sustainability. The different types of data variables required to calculate the PIs are also shown in Figure 4.11. Limiting the PIs to the most essential ones in this study reduces the data collection efforts of SMWU. However, assigning weights requires great care: one important indicator with poor performance can significantly reduce the index value. On the other hand, if a relatively small weight is allocated to an important indicator and the resulting index shows a higher value (acceptable performance), the results could be misleading. This situation should primarily be avoided in the case of water quality indicators, where all the PIs should meet water quality standards. A detailed description of the use of these selected PIs for performance benchmarking of SMWU is given in Chapter 5.
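The weighting caveats above can be made concrete with a polarity-aware weighted sum of the kind the cognitive map implies. This is a sketch only; the normalization and sign handling are assumptions, not the thesis formulation:

```python
def category_index(scores, weights, polarity):
    """Aggregate normalized PI scores (each in 0-1) into a category index.
    polarity[i] = +1 when a larger score improves the index (e.g., WE1,
    days of water restrictions); -1 when the PI should be minimized
    (e.g., WE4, energy consumption). Weights are normalized to sum to 1."""
    total_w = float(sum(weights))
    return sum((w / total_w) * (s if p > 0 else 1.0 - s)
               for s, w, p in zip(scores, weights, polarity))
```

Note the failure mode warned about above: an important PI assigned a small weight can leave the index high even when that PI performs poorly, so for water quality PIs a hard compliance check should precede any aggregation.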
Figure 4.10 Net concordance (C) and discordance (D) indices for all seven categories of PIs: (a) Water resources and environment; (b) Personnel; (c) Physical; (d) Operational; (e) Water quality and public health; (f) Quality of service; (g) Financial and economic

Figure 4.11 An example of a cognitive map for estimation of the water resources and environmental sustainability index

4.7 Summary

Existing performance indicator (PI) systems developed for large water utilities need to be re-evaluated for SMWU. In Chapter 3, 114 potential PIs were identified in the water resources and environment, personnel, operational, physical, water quality, quality of service, and financial categories. These PIs were evaluated against applicability, understandability, measurability, and comparability criteria using the Elimination and Choice Translating Reality (ELECTRE) outranking method for multicriteria decision analysis. The criteria weights and the scoring of the PIs were done through group decision making. The results revealed that ELECTRE is a suitable method when preferences between various alternatives cannot be established on the basis of small differences in evaluations. The network maps based on the outranking results provide the utility management with an opportunity to select the most suitable PIs based on data availability and the specific needs of their utility.
Chapter 5 Inter-utility Performance Benchmarking Model (IU-PBM)

A part of this chapter is published in ASCE's Journal of Water Resources Planning and Management as an original research article titled "Inter-utility Performance Benchmarking Model (IU-PBM) for Small to Medium Sized Water Utilities: Aggregated Performance Indices" (Haider et al. 2015b).

5.1 Background

The first step towards sustainable performance is to evaluate the existing performance of all the functional components of the water utility using suitable PIs. The general concept of performance benchmarking of a water utility is to compare its performance with benchmarks (or guidelines and standards) established by the regulatory agencies, and through cross-comparison with other utilities (Marques and Witte 2008, Alegre et al. 2006). Based on the results of performance benchmarking, rational decisions for effective asset management can be taken. Nevertheless, performance benchmarking of water utilities has always been a daunting task for water utility management. The literature review in Section 2.8 of Chapter 2 reveals that substantial data is required for most benchmarking methods, which is only possible where several utilities participate over a long time; this is certainly not the case for Canadian SMWU. Conventionally, a particular utility's performance has been compared with the best and worst performing utilities to calculate a performance score for each PI as (Stahre et al. 2008):

\[
\text{Score(PI)} = \frac{PI_{\text{actual value}} - PI_{\text{worst value}}}{PI_{\text{best value}} - PI_{\text{worst value}}} \times 90 + 10 \tag{5.1}
\]

Equation [5.1] produces a performance score ranging between 10 and 100, where 10 is for the worst performing utility and 100 is for the best performing utility.
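Equation [5.1] translates directly into code. An illustrative function; for PIs where a smaller raw value is better, "best" and "worst" are simply the corresponding extreme values, so the same formula applies:

```python
def pi_score(actual, best, worst):
    """Linear performance score per equation [5.1]: maps the worst
    participating utility's PI value to 10 and the best's to 100."""
    return (actual - worst) / (best - worst) * 90.0 + 10.0
```

Because the mapping is linear between the two participating extremes, it says nothing about how either extreme relates to a desirable standard, which is one of the limitations addressed by the IU-PBM.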
Equation [5.1] calculates a minimum score of '10' for the worst performing utility; this might reflect the fact that even such an underperforming utility is operational and performing its routine functions. The equation does not compare the performance of a utility with desirable standards, which could lie outside the range of the performance scores of the participating utilities. When the comparison is made for a few utilities, with the possibility that none of them is performing satisfactorily, the score from equation [5.1] could be misleading. Moreover, equation [5.1] follows a straight line, which can produce an erroneous PI score for an average performing utility. For example, in Figure 5.1 the performance score using equation [5.1] for the water utility with an average performance (for a particular PI) comes out to be 30 instead of 50. Firstly, it is important to consider the relative performance of the utilities by calculating the performance gap from the benchmark in terms of a performance level (or score). This concept is explained in Figure 5.2. A utility performing better than another, but with a PI value slightly below the benchmark, could be motivated to further improve its performance in the coming year. Likewise, the best performing participating utility should also be rationally compared with the benchmark; it is possible that the best one is itself only approaching the benchmark. On the other hand, a best utility that exceeds the benchmark will need to maintain its performance at that higher level. This type of comparison can only be made with a benchmarking approach (working with limited data) that covers the entire variation of performance shown in Figure 5.2. Secondly, in establishing the BTFs, the data from larger water utilities reported in NWWBI public reports is used, keeping in view the lower economies of scale in SMWU.
Thirdly, the model calculates aggregated performance indices, for top-level management, which rationally reflect the performance of each functional component in terms of its closeness to the most desirable performance and remoteness from the least desirable performance. Finally, for proof-of-concept, the proposed model has been validated with a case study of two medium sized water utilities in the Okanagan Basin, British Columbia, Canada. The primary objective of this chapter is to develop a simple (with the most relevant selected PIs), albeit comprehensive (covering all the performance categories), inter-utility performance benchmarking model (IU-PBM) for SMWU in Canada. The model adequately addresses the existing research gaps: it calculates aggregated performance indices that rationally reflect the performance of each category in terms of its closeness to the most desirable solution and remoteness from the least desirable solution. The proposed model has been applied to a case study of two medium sized (10,000 < population < 50,000) water utilities in the Okanagan Basin, British Columbia, Canada.
Figure 5.1 Graphical description of Equation [5.1] showing the misleading calculation of the performance score. [Figure: the straight-line behaviour of Equation [5.1] assigns a participating utility with average performance a score of 30, whereas it should receive a score of 50 on the 10 (worst) to 100 (best) scale.]

Figure 5.2 Relative performance of water utilities in terms of the performance gap between calculated PI values and benchmarks, using the performance score. [Figure: along the 10-100 performance-score scale, a utility performing better than the established benchmark has additional tolerance available (maintain); one performing slightly worse than the benchmark is in a critical condition with a small performance gap (improve); one performing much worse has a large performance gap (major improvements required).]

5.2 Approach and Methodology

5.2.1 Performance Benchmarking Modeling Approach
A modeling approach for the proposed IU-PBM for SMWU is shown in Figure 5.3. For performance benchmarking, out of the 62 PIs selected in Chapter 4, a set of 47 PIs (based on existing data availability) was grouped into seven performance categories covering all the essential functional components of a water utility, i.e., water resources and environment, personnel, physical, operational, water quality and public health, quality of service, and economics and finance. The performance indicators have either commensurate or non-commensurate values (i.e., percentages or ratios), which are calculated from data variables. These calculated PI values are then compared with the performance benchmarks using the benchmarking transformation functions (BTFs) to assess the performance gap in terms of a performance level. These performance levels are then aggregated to develop performance indices.
Finally, the proposed model is implemented in a case study from the Okanagan Basin, British Columbia (BC), Canada. The details of the different components of the modeling approach shown in Figure 5.3 are presented in the following sections.

5.2.2 Benchmarking Transformation Functions
Due to the lack of historical performance data for SMWU in Canada, their performance cannot be assessed using a conventional metric benchmarking method (i.e., one based on cross-comparison amongst several utilities). Consequently, an effort has been made to develop different BTFs (linear, polynomial, logarithmic, and exponential) based on the NWWBI public report (NWWBI-PR) (AECOM 2012), literature values (for the PIs not included in the NWWBI), and expert opinion. These BTFs transform the calculated values of all 47 PIs into performance levels between 10 and 100, with 10 being very poor and 100 being very good. For some PIs an increasing trend (benefit criterion) is desirable, whereas for others a decreasing trend (cost criterion) is desirable. This approach accommodates all possibilities for utilities: i) performing much worse than the established benchmarks; ii) performing close to (e.g., slightly higher or lower than) the benchmark; and iii) performing equal to or better than the benchmarks (Figure 5.2).
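As an illustration, the linear BTF for per-capita consumption (WE2) from Table 5.1 can be applied in a few lines of Python. The function name is hypothetical, and clipping the result to the 10-100 range is an assumption consistent with the stated bounds of the performance levels.

```python
def btf_we2(consumption_lpcd):
    """Linear BTF for per-capita domestic consumption (WE2), Table 5.1:
    PL = 126 - 0.12 * WE2, clipped (assumption) to the 10-100
    performance-level range used throughout the model."""
    pl = 126.0 - 0.12 * consumption_lpcd
    return max(10.0, min(100.0, pl))

print(btf_we2(300))   # ~90: moderate consumption, high performance level
print(btf_we2(1000))  # floor of 10 for very high consumption
```

Since WE2 is a cost-type indicator (lower consumption is better), the BTF slopes downward; a benefit-type PI would use an increasing function instead.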
Figure 5.3 Modeling approach of the performance benchmarking model for SMWU. [Figure: flowchart from literature review and initial screening of PIs (Chapters 2 and 3, drawing on CSA 2012, OFWAT 2012, ADB 2012, WB 2011, NWC 2011, NRC 2010, IWA 2006, and AWWA 2004), through selection of 47 PIs by multicriteria analysis (Chapter 4), calculation of the PIs from participating utilities' data variables (water resources and environmental, personnel, physical assets, operational, demographic, service, and financial), transformation into performance levels (10-100) via BTFs (linear, polynomial, exponential, power; based on NWWBI public reports, literature, and expert judgment), Simos'-method weighting, and TOPSIS aggregation into the indices WEI, PEI, PHI, OPI, WPI, QSI, and FEI.]

5.2.3 Performance Aggregation Indices
The performance levels of individual PIs obtained from BTFs might not be what senior managers and decision makers want to work with. In general, utility managers are more interested in composite indices, which save the time and effort required to evaluate individual PIs (Galar et al. 2014). A performance index combines the information obtained by calculating several PIs into one final score; it consists of a weighting process and an aggregation process. The weighting process determines the importance weights of all PIs under each category, and the aggregation process then combines the performance levels with their respective weights.

5.2.4 Simos' Method for Estimating the Weights of PIs
In order to develop the aggregated performance indices, the PIs have to be weighted between 0 and 1 depending on their relative importance under each category.
In this research, Simos’ method is used for this purpose due to its simple and easily interpretable procedure (Marzouk et al. 2015). The methods initiates with the ranking of PIs by the decision makers (i.e., utility managers and experts). For analysing these ranks, a table containing seven columns (C1 to C7) can be constructed. C1 corresponds to number of PIs in each of category; while the PIs are described in second column. C3 lists the average frequency of the importance ranks scored by the decision makers from least important to the most important PI in ascending order. Subsequently, the PI with the maximum average frequency is given the higher Simos’ rank in C4. PIs with the same average frequency are allocated the same rank. C5 shows the number of PIs, and the non-normalized weights of the PIs are presented in C6. The weights of all the PIs, which are essentially the positions of Simos’ rank already listed in C4 will be estimated in C7. The responses are attached in Appendix B. 5.2.5 Aggregating Performance Indicators using TOPSIS Method In this research, the following performance indices are used to state the performance of the functional components of SMWU.  Water resources and environmental sustainability index (WEI)  Personnel adequacy index (PEI)  Physical assets efficacy index (PHI)  Operational integrity index (OPI) 110  Water quality and public health safety index (WPI)  Quality of service reliability index (QSI)  Financial and economic stability index (FEI) In order to appreciate the concept proposed in Figure 5.2 based on the relative closeness of the calculated performance score of individual PIs to the most desirable performance (100) and its remoteness from least desirable performance (10), there is a need of an aggregation approach based on the synthesizing criterion (Figueira et al. 2005). 
Therefore, the Technique for Order Preference by Similarity to Ideal Solution (TOPSIS) method is used to aggregate the performance levels of the PIs into the above-mentioned performance indices for SMWU. The indices developed by TOPSIS are based on the concept of similarity (i.e., relative closeness) to the positive-ideal solution (PIS) and remoteness from the negative-ideal solution (NIS) (Yoon and Hwang 1995; Hwang and Yoon 1981). The method assumes that the performance level of each indicator is either a monotonically increasing or decreasing function, which means a higher value of the index corresponds to higher performance. A step-by-step procedure of the TOPSIS method is given below:

Step 1: Estimation of the weights of performance indicators under each performance category. This step involves the application of Simos' method, described in the section above.

Step 2: Checking the need for normalization. As the performance levels of all the PIs range between 10 and 100 with no units, there is no need for normalization in the present study.

Step 3: Development of the weighted matrix. The weighted value of each indicator is calculated as:

v_ij = w_j * x_ij   [5.2]

where v_ij is the weighted value of each performance score, w_j is the corresponding weight of that indicator, and x_ij is the performance score obtained from the benchmarking relationships.

Step 4: Identify the positive-ideal and negative-ideal solutions. X* and X- are defined as the PIS and NIS, respectively, in terms of the weighted performance scores:

X* = {v_1*, v_2*, ..., v_j*, ..., v_n*} = {(max_i v_ij | j in J1), (min_i v_ij | j in J2), i = 1, ..., m}   [5.3]

X- = {v_1-, v_2-, ..., v_j-, ..., v_n-} = {(min_i v_ij | j in J1), (max_i v_ij | j in J2), i = 1, ..., m}   [5.4]

where J1 is the set of benefit attributes and J2 is the set of cost attributes.
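Steps 1-4 above, together with the distance and closeness calculations that complete the method, can be sketched in Python. The performance levels and weights in the usage line are hypothetical; since all performance levels here are benefit-type and already on the common 10-100 scale, the normalization step is skipped, as in the thesis.

```python
import math

def topsis_index(levels, weights):
    """TOPSIS aggregation sketch for m utilities x n performance levels
    (all benefit-type, already on a common 10-100 scale). Returns one
    0-100 index per utility. Assumes at least two distinct utilities so
    the closeness denominator is non-zero."""
    # Step 3: weighted matrix v_ij = w_j * x_ij
    v = [[w * x for w, x in zip(weights, row)] for row in levels]
    # Step 4: positive- and negative-ideal solutions per indicator
    pis = [max(col) for col in zip(*v)]
    nis = [min(col) for col in zip(*v)]
    indices = []
    for row in v:
        # Step 5: Euclidean distances to PIS and NIS
        d_pos = math.sqrt(sum((a - b) ** 2 for a, b in zip(row, pis)))
        d_neg = math.sqrt(sum((a - b) ** 2 for a, b in zip(row, nis)))
        # Step 6: relative closeness, scaled to a 0-100 index
        indices.append(100.0 * d_neg / (d_pos + d_neg))
    return indices

# Hypothetical performance levels for two utilities over three PIs:
print(topsis_index([[90, 60, 80], [40, 70, 55]], [0.5, 0.3, 0.2]))
```

A utility whose weighted levels coincide with the PIS in every indicator gets an index of 100; one coinciding with the NIS everywhere gets 0.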
Step 5: Calculate the distance of each water utility from the PIS and NIS. In this step, the distances of all the performance levels in a performance category (for each participating utility) are measured by the n-dimensional Euclidean distance. The separation of each PI from the PIS is calculated as:

Y_i* = sqrt( sum_{j=1}^{n} (v_ij - v_j*)^2 ),  i = 1, ..., m   [5.5]

and the distance of each performance indicator from the NIS is calculated as:

Y_i- = sqrt( sum_{j=1}^{n} (v_ij - v_j-)^2 ),  i = 1, ..., m   [5.6]

Step 6: Develop aggregate performance indices by calculating similarities to the PIS. The overall performance index of each performance category (functional component) is calculated as:

P_i* = Y_i- / (Y_i* + Y_i-),  i = 1, ..., m   [5.7]

As the result of equation [5.7] is a ratio, it is multiplied by 100 in order to translate it into a global performance index.

5.3 Development of Benchmarking Transformation Functions
The overall performance benchmarking process commences with calculating the PIs (selected in Chapter 4 under each functional component) using the required data variables. The BTFs developed for each PI are presented in Table 5.1. The performance levels are established between 10 and 100, because an initial value of '0' does not seem justifiable for an operational water utility. The adjustment of the values reported in the NWWBI-PR and literature is explained with two examples, per capita water consumption (WE2, a water resources indicator) and percentage of service connection repairs (OP5, an operational indicator), in Figures 5.4a and 5.4b, respectively. In Figure 5.4a, the maximum value of WE2 reported in the NWWBI-PR for large water utilities, '593', could be an average value in SMWU; the value of WE2 in SMWU can go up to 900 L/capita/day (AECOM 2014).
Therefore, the values are extrapolated, keeping in view the relatively lower rates and higher water consumption (due to lower treatment levels, higher water availability, smaller populations, less awareness about water conservation, etc.) in SMWU. In the case of the second example, OP5, shown in Figure 5.4b, the minimum, median, and maximum values reported in the NWWBI 2012 public report were found to be convincing for SMWU as well. The BTFs for the rest of the PIs have been developed in the same way (Table 5.1). Some of the selected PIs were not found in the NWWBI 2012 report; nevertheless, these indicators have to be included for performance benchmarking of SMWU. For instance, the service connection rehabilitation rate is an important indicator with respect to customer satisfaction and water loss control, but it has not been considered in the NWWBI-PR. Hence, literature values or expert judgment have been used to develop BTFs for such PIs (refer to the last column of Table 5.1). The rationale behind the development of BTFs under each performance category is described in the following sub-sections. However, the discussion is limited to the PIs for which either the NWWBI values were adjusted or the relationships were developed on the basis of expert opinion and literature. Details about the other PIs can be seen in Chapters 3 and 4.

Table 5.1 Benchmarking transformation functions developed for performance benchmarking for SMWU
(Columns: PI # | Description of PI | Formula / data variables | Units | Function type | Benchmarking transformation function (BTF) | R² | Sources)

1.
WATER RESOURCES AND ENVIRONMENT
WE1 | No. of days with water restrictions | [days with water restrictions] | days | Linear | (PL)WE1 = 0.26(WE1) + 10 | R² 0.99 | NWWBI
WE2 | Per capita domestic water consumption | [(Total volume of water supplied to residential consumers in the year)/(Population served x 365)] | L/person/day | Linear | (PL)WE2 = 126 - 0.12(WE2) | R² 0.99 | NWWBI EOS
WE3 | Remaining annual water license capacity (WLC) | [(Annual water demand)/(Existing annual WLC)] x 100 | % | Linear | (PL)WE3 = 100 - 1.13(WE3) | R² 0.99 | NWWBI EOS
WE4 | Impact of residual chlorine in flushing water on aquatic life | [Distance between the discharge point and the receiving water body] | metres | Polynomial | (PL)WE4 = 4.8 + 0.088(WE4) - 0.00002(WE4)² | R² 0.99 | EO + Literature

2. PERSONNEL
PE1 | Field full-time equivalents (FTEs) | [(Number of FTE employees working in the field for distribution)/(Total main length/100)] | No./100 km | Linear | (PL)PE1 = 30.3(PE1) - 14 if 1 < PE1 < 4; (PL)PE1 = 167 - 16.3(PE1) if PE1 > 4 | R² 0.97, 0.98 | NWWBI
PE2 | Field FTEs - metering | [(Number of FTE employees working in the field for metering)/(Number of meters/1000)] | No./1000 meters | Linear | (PL)PE2 = 172(PE2) + 10 if 0 < PE2 < 0.05; (PL)PE2 = 193 - 1883(PE2) if PE2 > 0.05 | R² 0.98, 0.99 | NWWBI
PE3 | Field accidents | [(Number of lost working hours due to field accidents in a year)/(Total number of field labour hours/1000)] | lost hours/1000 field labour hours/year | Polynomial | (PL)PE3 = 0.073(PE3)² - 5(PE3) + 100 | R² 0.99 | NWWBI EOS
PE4 | Sick days taken by FTEs | [(Total number of sick leaves in a year)/(Total number of employees)] | days | Linear | (PL)PE4 = 100 - 5(PE4) | R² 0.99 | NWWBI
PE5 | Water resources and catchment management employees | [(Number of FTEs working in WR and catchment planning)/(total input volume/10^6)] | No./million m³/year | Linear | (PL)PE5 = 172(PE5) + 10 if 0 < PE5 < 0.05; (PL)PE5 = 193 - 1883(PE5) if PE5 > 0.05 | R² 0.98, 0.99 | EO
PE6 | Overtime of FTEs | [(Total overtime field hours in a year)/(Total paid field hours in a year)] x 100 | % | Polynomial | (PL)PE6 = 0.08(PE6)² - 5.3(PE6) + 100 | R² 0.99 | NWWBI EOS
PE7 | Personnel training hours | [(Number of training hours during a year)/(Total number of employees)] | hours/employee/year | Linear | (PL)PE7 = PE7 | R² 1.0 | EO

3. PHYSICAL ASSETS
PH1 | Metering level | [(number of connections with meters installed)/(total number of connections)] x 100 | % | Linear | (PL)PH1 = PH1 | R² 1.0 | EO
PH2 | Degree of automation | [Number of automated control units/number of control units] x 100 | % | Linear | (PL)PH2 = PH2 | R² 1.0 | EO
PH3 | Raw water storage capacity | [(Net capacity of raw water reservoir)/(volume of supplied water during the year)] x 365 | days | Linear | (PL)PH3 = 0.3(PH3) + 8 | R² 0.99 | EO
PH4 | Treated water storage capacity | [(Volume of treated water reservoir)/(Average daily demand/24)] | hours | Polynomial | (PL)PH4 = 10 - 0.011(PH4)² + 2(PH4) | R² 0.98 | NWWBI

4. OPERATIONAL
OP1 | Number of main breaks | [(Number of main breaks)/(Total length of mains in km/100)] | No./100 km | Exponential | (PL)OP1 = 100 e^(-0.114(OP1)) | R² 0.99 | NWWBI
OP2 | Mains replacement | [(Length of mains replaced during the year)/(total mains length)] x 100 | % | Polynomial | (PL)OP2 = 11 - 153(OP2)² + 260(OP2) | R² 0.99 | EO + Literature
OP3 | Mains rehabilitation/renovation | [(Length of transmission and distribution mains rehabilitated/renovated during the year)/(total mains length)] x 100 | % | Polynomial | (PL)OP3 = 11 - 153(OP3)² + 260(OP3) | R² 0.98 | EO + Literature
OP4 | Non-revenue water | [(System input volume) - (Annual billed consumption)]/Number of connections | L/connection/day | Exponential | (PL)OP4 = 100 e^(-0.002(OP4)) | R² 0.99 | NWWBI
OP5 | Service connection rehabilitation | [(Number of connections replaced or renovated during the year)/(total number of connections)] x 100 | % | Logarithmic | (PL)OP5 = 20 ln(OP5) + 93 | R² 0.97 | NWWBI
OP6 | Inoperable or leaking hydrants | [(Number of inoperable or leaking hydrants during the year)/(total number of hydrants)] x 100 | % | Exponential | (PL)OP6 = 100 e^(-1.14(OP6)) | R² 0.99 | NWWBI EO
OP7 | Valves replacement | [(Number of mains valves replaced during the assessment period x 365/assessment period)/(total number of mains valves)] x 100 | % | Polynomial | (PL)OP7 = 8 - 587(OP7)² + 475(OP7) | R² 0.99 | EO
OP8 | Hydrant inspection | [(Number of hydrants inspected during the assessment year)/(total number of hydrants)] x 100 | % | Polynomial | (PL)OP8 = 2 - 0.014(OP8)² + 2.3(OP8) | R² 0.95 | NWWBI EO
OP9 | Cleaning of treated water storage tanks | [(total volume of storage tanks cleaned during the year)/(total volume of all storage tanks)] x 100 | % | Linear | (PL)OP9 = 0.97(OP9) + 10 | R² 0.99 | EO

5. WATER QUALITY AND PUBLIC HEALTH
WP1 | Days with boil water advisories | [(No. of boil-water advisory days) x (persons affected)]/(Population served) | days | Linear | (PL)WP1 = 100 - 48(WP1) | R² 0.99 | NWWBI + EO
WP2 | Average turbidity | [Average turbidity in distribution system] | NTU | Polynomial | (PL)WP2 = 2.6(WP2)² - 31(WP2) + 100 | R² 0.99 | NWWBI EO
WP3 | Total coliform occurrences | [No. of total coliform occurrences] | No. | Linear | (PL)WP3 = 100 - 2.3(WP3) | R² 0.99 | NWWBI EO
WP4 | Residual chlorine | [Average residual chlorine in distribution system] | mg/L | Polynomial | (PL)WP4 = 1.83(WP4)² - 30(WP4) + 102 | R² 0.99 | EO
WP5 | Average THMs | [Average concentration of THMs in distribution system] | mg/L | Exponential | (PL)WP5 = 100 e^(-6.8(WP5)) | R² 0.97 | NWWBI EO
WP6 | Length of mains cleaned | [(Length of mains cleaned during the assessment period x 365/assessment period)/(total mains length)] x 100 | % | Polynomial | (PL)WP6 = -0.011(WP6)² + 1.9(WP6) + 10 | R² 0.99 | NWWBI EO

6. QUALITY OF SERVICE
QS1 | Billing complaints | [(Number of billing complaints during the AP)/(number of registered customers)] | No./1000 connections | Linear | (PL)QS1 = 100 - 20(QS1) | R² 1.0 | EO
QS2 | Pressure complaints | [(Number of water pressure complaints during the AP)/(total population served/1000)] | No./1000 persons | Polynomial | (PL)QS2 = 5.7(QS2)² - 47(QS2) + 100 | R² 0.95 | NWWBI EO
QS3 | Water quality complaints | [(Number of water quality complaints during the AP x 365/AP)/(Population served/1000)] | No./1000 persons | Linear | (PL)QS3 = 100 - 20(QS3) | R² 1.0 | NWWBI EO
QS4 | Unplanned interruptions | [(Number of unplanned interruptions during the AP)/(total mains length/100)] | No./100 km | Exponential | (PL)QS4 = 100 e^(-0.053(QS4)) | R² 0.98 | NWWBI
QS5 | Unplanned maintenance hours | [(Total unplanned maintenance hours during the AP)/(Total maintenance hours in the AP)] x 100 | % | Exponential | (PL)QS5 = 100 e^(-0.023(QS5)) | R² 0.99 | NWWBI
QS6 | Population coverage | [(Population served by the utility)/(Total population of the area under the utility)] x 100 | % | Linear | (PL)QS6 = 2.5(QS6) - 140 | R² 1.0 | EO + Literature
QS7 | Total response to complaints | [(Total number of responses to reported complaints)/(total number of complaints)] x 100 | % | Polynomial | (PL)QS7 = 0.056(QS7)² - 6.6(QS7) + 194 | R² 0.99 | EO
QS8 | Service connection complaints | [(Number of other complaints and queries during the assessment period x 365/assessment period)/(total population served/10000)] | No./1000 persons | Linear | (PL)QS8 = 100 - 10(QS8) | R² 1.0 | EO
QS9 | Aesthetic tests compliance | [(Number of aesthetic tests complying with the applicable standards during the AP)/(total number of aesthetic tests carried out during the AP)] x 100 | % | Linear | (PL)QS9 = 2.6(QS9) - 161 | R² 0.99 | EO
QS10 | Microbiological tests compliance | [(Number of microbiological tests complying with the standards during the AP)/(total number of microbiological tests carried out during the AP)] x 100 | % | Linear | (PL)QS10 = 2.6(QS10) - 161 | R² 0.99 | EO
QS11 | Physico-chemical tests compliance | [(Number of physico-chemical tests complying with the standards during the AP)/(total number of physico-chemical tests carried out during the AP)] x 100 | % | Linear | (PL)QS11 = 2.6(QS11) - 161 | R² 0.99 | EO

7. ECONOMICS AND FINANCE
FE1 | Water rates | [Water rates for a typical size residential connection using 250 m³/year] | $ | Polynomial | (PL)FE1 = 134 - 0.17(FE1) | R² 0.99 | NWWBI EOS
FE2 | Operation and maintenance cost | [(Total O&M cost/1000)/(Main length in km)] | $('000)/km | Linear | (PL)FE2 = 112 - 3(FE2) | R² 0.99 | NWWBI EOS
FE3 | Revenue per unit of supplied water | [(Operating revenues - capitalized costs of the constructed assets)/(Billed consumption during the assessment year)] | $/m³ | Polynomial | (PL)FE3 = 250(FE3) - 50 | R² 0.99 | EO + Literature
FE4 | Operating cost coverage ratio | [(Total annual operational revenues)/(Total annual operating costs)] | ratio | Polynomial | (PL)FE4 = 97(FE4)² - 43(FE4) + 10 | R² 0.99 | EO + Literature
FE5 | Debt service ratio | [(Cash income)/(Financial debt service "FDS")] | ratio | Polynomial | (PL)FE5 = 110(FE5)³ - 510(FE5)² + 813(FE5) - 369 | R² 0.99 | EO + Literature
FE6 | NRW by volume | [(Non-revenue water)/(system input volume during the assessment period)] x 100 | % | Linear | (PL)FE6 = 100 - 2(FE6) | R² 0.99 | EO + Literature

Figure 5.4 Examples of performance benchmarking relationships: (a) per capita water consumption of residential consumers (WE2), a water resources indicator; (b) percentage of service connection repairs in a year (OP5), an operational indicator. [Figure: panel (a) shows the linear fit (PL)WE2 = 126 - 0.12(WE2), R² = 0.99, through knowledge-based points extending the NWWBI minimum, median, and maximum; panel (b) shows the logarithmic fit (PL)OP5 = 20 ln(OP5) + 93, R² = 0.98, through the NWWBI minimum, median, and maximum.]
5.3.1 Water Resources and Environmental Sustainability
Except for WE4, "flushing of water mains", all other PIs in this category were included in the NWWBI-PR and are adjusted in Table 5.1 for SMWU. Periodic flushing is an important activity to keep the water mains healthy and to provide safe water to the community by removing biofilm and corrosion tubercles. The flushed water may have high chlorine concentrations or other pollutants. As per the Canadian water quality guidelines, the sum of all reactive chlorine species concentrations should be less than 0.5 mg/L for the protection of aquatic life in freshwaters (CCME 1999). According to the USEPA (1984) ambient water quality criteria, the acute toxicity values for fish species vary from 50 µg/L to 250 µg/L. Water mains in smaller utilities are usually flushed with 1500 to 2000 gallons/min of flow containing 1.5 to 2.5 mg/L of chlorine residual. The flushing periods vary between 2 and 6 hours, once or twice a year, depending on the location of the water mains, raw water quality, and level of water treatment. The flushing water is usually conveyed through stormwater drains to the receiving freshwater body (i.e., a creek or stream). Chlorine decays while moving in surface water (Gang et al. 2003). A practical approach has been adopted to develop the BTF for this indicator (WE4) by considering the acute toxicity limit of 250 µg/L. A linear relationship is developed for this indicator, mapped over a collection drain length of 100 m to 1500 m. This range has been established on the assumption that the chlorine concentration will reduce to half of its initial concentration (following first-order kinetics) while flowing through a 1000 m collector stormwater drain. Thus, field and technical personnel should avoid planned flushing programs during low-flow (dry) periods to minimize the impact on aquatic life. However, spot flushing for shorter durations can be done on customers' requests.
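The halving-per-1000 m assumption stated above implies a first-order decay constant, from which the drain length needed to reach the toxicity limit can be worked out. A minimal sketch, with the function name and the 2.0 mg/L starting residual chosen for illustration (the residual lies within the 1.5-2.5 mg/L range quoted in the text):

```python
import math

def decay_distance(c0_mg_l, target_mg_l, half_distance_m=1000.0):
    """Drain length needed for first-order chlorine decay from c0 to
    target, assuming (as in the text) the residual halves every 1000 m
    of travel: C(x) = c0 * exp(-k x), with k = ln(2)/1000 per metre."""
    k = math.log(2.0) / half_distance_m
    return math.log(c0_mg_l / target_mg_l) / k

# Flushing water at 2.0 mg/L residual decaying to the 250 ug/L
# (0.25 mg/L) acute-toxicity limit used for the WE4 BTF:
print(round(decay_distance(2.0, 0.25)))  # 3000 m (three half-distances)
```

This kind of back-calculation suggests why the BTF rewards longer collection drains: each additional 1000 m of travel halves the residual reaching the creek.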
5.3.2 Personnel Adequacy
Largely, in Canadian SMWU, outsourcing is almost negligible; thus, the full-time equivalents (FTEs) are distributed amongst permanent employees. Due to the small number of personnel, some operational personnel need to multitask. In this regard, special care has to be taken while calculating FTEs for the various indicators in this category. Secondly, in large utilities, the personnel indicators are calculated to check staff productivity, i.e., to maintain the minimum number of employees per km of water mains (or per number of connections). In the case of SMWU, the problem is more intricate: the number of employees should be adequate to efficiently perform routine operations on one side, while staff productivity should be kept optimal on the other. To deal with this non-monotonic (increasing and then decreasing) behaviour, two linear benchmarking relationships have been developed for PE1, PE2, PE5, and PE8 to cover the entire range of performance levels between 10 and 100 (refer to Table 5.1). In order to develop the non-monotonic function for these PIs, the function starts by increasing from the minimum to the median value and then continues as a decreasing function up to the maximum value in the NWWBI-PR. For example, in the case of field FTEs per 100 km of water mains (PE1), a linear function is established between the minimum, median, and maximum reported values of 1.5, 4, and 10, respectively, to calculate the performance level. It can be seen in Table 5.1 that most of the PIs in this category have been included in the NWWBI-PR. Two very important PIs are additionally included in this study, i.e., water resources and catchment management employees (PE8) and personnel training hours (PE11). It was found during personal communication with the participating water utilities that water resources and catchment management is an important activity for them, as most of the WSSs rely on source water quality.
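The two-branch BTF for PE1 described above can be sketched as a piecewise function using the coefficients in Table 5.1; the function name is hypothetical, and clipping to the 10-100 range is an assumption consistent with the stated bounds of the performance levels.

```python
def btf_pe1(ftes_per_100km):
    """Non-monotonic (piecewise linear) BTF for field FTEs per 100 km of
    mains (PE1), per Table 5.1: performance rises toward the median
    reported staffing level (~4 FTEs/100 km) and falls beyond it.
    Clipped (assumption) to the 10-100 performance-level range."""
    if ftes_per_100km < 4.0:
        pl = 30.3 * ftes_per_100km - 14.0   # rising branch, 1 < PE1 < 4
    else:
        pl = 167.0 - 16.3 * ftes_per_100km  # falling branch, PE1 > 4
    return max(10.0, min(100.0, pl))

print(btf_pe1(2.0))   # rising branch: 46.6
print(btf_pe1(8.0))   # falling branch: 36.6
```

Understaffing and overstaffing are thus both penalized, which is exactly the adequacy-versus-productivity trade-off the text describes for SMWU.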
Training of personnel (PE11) is of supreme importance due to the fact that highly skilled and qualified personnel are difficult to hire and retain in SMWU. Therefore, locals (preferably residing in the utility's service area) can be hired and trained to develop a long-term association with them. To obtain the performance level for this indicator, the BTF (based on expert opinion) varies from 10 to 100 hours per employee per year.

5.3.3 Physical Assets Efficacy
This performance category evaluates the efficacy of physical assets, including raw and treated water reservoirs, metering level, and degree of automation. Based on the data obtained from the NWWBI-PR, a second-order polynomial function with a high R² value of 0.98 is established for PH4. The participating utilities (similar to other SMWU in Canada) primarily rely on source water quality, and the source water has been treated with primary chlorination as the only treatment. In this case, the storage reservoirs receiving this chlorinated water (without complete filtration) should be included under treated water storage reservoirs. Other important indicators not included in the NWWBI-PR are metering level (PH1), degree of automation (PH2), and raw water storage capacity (PH3). The significance of metering level is indubitable (it is also included in the IWA 2006 set of PIs), being one of the most important indicators required to estimate non-revenue water. It is important to include these PIs because, unlike in larger utilities, not all connections are metered, nor are all control units automated in SMWU. Linear BTFs, presented in Table 5.1, are developed for these indicators using literature values and expert opinion. In smaller water supply systems, the catchment areas of surface water sources (e.g., creeks in the Okanagan Basin) are sometimes small and have a limited raw water storage capacity, which should be monitored using PH3 to meet the future water requirements of a growing population.
Based on expert opinion, the relationship for PH3 is mapped over 10 to more than 300 days.

5.3.4 Operational Integrity
Operational indicators are extremely important for the development of rehabilitation, renewal, and replacement plans for the different components of a utility's physical infrastructure for effective asset management. In this category, the PIs not included in the NWWBI are: percentage of mains replaced (OP2), mains rehabilitated (OP3), valves replaced (OP7), and cleaning of storage tanks (OP9). Theuretzbacher-Fritz et al. (2013) reported the annual rehabilitation rates of water mains as a percentage of mains length (i.e., OP3) for around 1300 water utilities of all sizes that participated in the Trans-National Water Supply Benchmarking Project in Austria and Germany. The value of this PI ranged from 0.1% to 2.0% for SMWU (with water supplied from less than 2 Mm³ to 8.0 Mm³). It is important to mention here that values higher than 1.0 were observed for privately owned smaller systems (generating higher revenues) with a water intake of less than 2 Mm³. Consequently, literature values ranging from 0.1 to 1.0 (%) are used against performance levels from 10 to 100, respectively. A similar approach of obtaining guidelines from literature has been adopted for OP2 and OP7. Cleaning of treated water storage tanks (OP9) is indispensable in order to maintain desirable drinking water quality in the water supply. The total volume of the reservoirs (related to PH4) cleaned during the assessment year is used to calculate this indicator. For the estimation of water loss or non-revenue water (NRW), units of 'liters/connection/day' have been used, based on the observation that water losses primarily occur at service connections (Hamilton et al. 2006). Recently, Lambert et al. (2014) reported that more than half of system leakage (i.e., 50 to 500 liters/service connection/day) in water utilities occurs at service connections.
Unauthorized water consumption has not been highlighted as a major issue in the participating utilities. At the start of the benchmarking process in Canada (primarily for larger utilities) in 2007, NRW ranged between 88 and 663 liter/connection/day; this range improved to 32 to 383 liter/connection/day by the year 2011 (NWWBI 2013). Keeping in view that Canadian SMWU have a less developed monitoring structure for water loss estimation and other O&M issues, the BTF for water loss varies from 25 to 800 liter/connection/day (OP4, Table 5.1). 5.3.5 Water Quality and Public Health Safety A moderately concise list of PIs was developed for SMWU in the prior phase of this research, based on availability of data, health related significance, and comparability with other utilities. Most of these PIs are included in the NWWBI-PR, except residual chlorine in the distribution system (WP4). Residual chlorine is maintained to avoid any possibility of recontamination as a result of cross-connections within the distribution system. In addition, due to lower treatment levels in SMWU, the turbidity of water distributed to the community is sometimes higher than 1 NTU. Therefore, to improve microbiological water quality, higher chlorine doses are required, which result in taste and odor problems. A range from 0.1 to 4 mg/L of residual chlorine concentration in the distribution system is used to map the BTF for this indicator (WP4, Table 5.1). The population affected by boil water advisories (WP1) is an indirect measure of the water quality status of any water utility. The main causes of a higher number of boil water advisories in SMWU are source water contamination, flushing of hydrants, construction, repair and maintenance works, equipment failure, and inadequate treatment. Unlike SMWU, as per the NWWBI 2013 public report, all of the participating utilities reported zero (nil) days with boil water notices (AECOM 2013).
Consequently, based on expert opinion, the BTF has been mapped over values from 0 to 2 days (WP1, Table 5.1). According to the Health Canada Guidelines for Drinking Water Quality, surface waters should be treated to achieve a turbidity of less than 0.1 NTU at all times. Where this is not possible (for chemically assisted filtration), a turbidity of 0.3 NTU should be maintained 95% of the time, remaining below 1 NTU at all times. The minimum requirement in the case of slow sand filtration is 1 NTU 95% of the time, never exceeding 3.0 NTU (Health Canada 2012). The British Columbia Ministry of Environment established 5 NTU as the upper limit for drinking water at the consumer’s tap; however, for unfiltered water supplies, a boil water notice should be issued when the turbidity exceeds 1 NTU (BCMoE 1997). As per the WHO (2011) drinking water quality guidelines, in small water supplies with limited resources, turbidity should not exceed 5 NTU and should, if at all possible, remain below 1 NTU. Considering the above guidelines and limits reported in the literature, the BTF has been established for a turbidity range from 0.1 to 5 NTU mapped over performance levels from 10 to 100 (WP2, Table 5.1). Health Canada (2013) established a maximum allowable concentration of nitrates of 45 mg/L for drinking water; higher concentrations can cause blue baby disease. For the BTF that assesses the performance level of this indicator (WP8, Table 5.1) from 10 to 100, a range of nitrates between 0.1 and 45 mg/L has been used. Health Canada (2012) has established a maximum allowable concentration of 0.1 mg/L (100 μg/L) for THMs in drinking water supplies. Reported minimum, median, and maximum values of 0, 0.025, and 0.245, respectively, in the NWWBI public report have been used to develop an exponential relationship for calculating the performance level of this indicator (WP7, Table 5.1).
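The two-part turbidity criteria quoted above can be checked mechanically against a monitoring record. A minimal sketch follows; the function name and defaults are illustrative assumptions, with the defaults reflecting the chemically assisted filtration criteria (at most 5% of readings above 0.3 NTU, none reaching 1 NTU):

```python
def meets_turbidity_guideline(readings, p95_limit=0.3, max_limit=1.0):
    """Check turbidity readings (NTU) against a two-part guideline:
    at least 95% of readings at or below p95_limit, and every reading
    strictly below max_limit.

    Defaults reflect the chemically assisted filtration criteria; for
    slow sand filtration, pass p95_limit=1.0 and max_limit=3.0.
    """
    within_p95 = sum(1 for r in readings if r <= p95_limit) / len(readings)
    return within_p95 >= 0.95 and max(readings) < max_limit

# 95 clean readings plus 5 moderate excursions still pass the 95% rule
print(meets_turbidity_guideline([0.2] * 95 + [0.5] * 5))  # -> True
```

A single reading at or above the absolute limit fails the check regardless of the 95th-percentile result.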
5.3.6 Quality of Service Reliability The PIs included in the IU-PBM, in addition to those in the NWWBI-PR, are billing complaints (QS1), population coverage (QS6), total response to complaints (QS7), service connection complaints (QS8), and aesthetic, microbiological, and physico-chemical test compliance (QS9, QS10, and QS11). The BTF for billing complaints covers 0 to 4.5 billing complaints per 1000 connections, mapped over performance levels from 10 to 100. The performance level for pressure complaints is plotted for 0 to 10 complaints per 1000 persons served. An efficient response to reported complaints (QS7) is extremely important to ensure customer satisfaction. This indicator includes the complaints that were resolved with an acceptable customer satisfaction level. For now, the total (percentage of) complaint responses mapped over 70 to 100% have been included in the BTF for this indicator (QS7, Table 5.1). Overall compliance of the tested water samples with the promulgated water quality standards is a simple measure for public reporting. These PIs (QS9, QS10, and QS11, also recommended in the IWA Manual of Best Practice 2006) cover aesthetic, microbiological, and physico-chemical water quality parameters (Alegre 2006), and are plotted from 65 to 100% compliance against performance levels from 10 to 100. 5.3.7 Economic and Financial Stability In this category, the BTFs for revenue per unit of supplied water (FE3), operating cost coverage ratio (FE5), debt service ratio (FE6), and NRW by volume (FE7) have been established based on the reported literature and expert opinion. Revenue per unit volume supplied (FE3) has been mapped over 0.3 to 0.75 $/m³ to calculate the performance score of FE3 (Lange and Hassan 2006). The reported range of O&M cost per km of water mains (FE2) is adjusted for the smaller economies of scale (i.e., smaller denominator) in SMWU; therefore, a linear BTF for this indicator has been developed for a range from 4 to 35 $/km.
Similarly, water rates (FE1) have also been adjusted, between 200 and 700 $/year, to establish a second order polynomial BTF for a typical residential connection consuming 250 m³/year of water (as included in the NWWBI-PR). Operating cost coverage (FE5) is the ratio between total annual operating revenues and total annual operating cost, and describes the financial performance of a water utility’s operations. The minimum desirable value of 1.0 shows that the utility has enough operating revenues to comfortably cover its operating expenses (UNC 2013). A second order polynomial BTF has been established for this indicator over a range between 0.5 and 1.2, versus performance levels varying from 10 to 100. The ability to pay debt service (FE6) has become critical for any utility, as debt has become an important instrument to capitalize utility operations. The debt service ratio (DSR) should not be less than 1.0, with a recommended ratio between 1.25 and 2 to cover the possible risks due to changes in input costs (UNC 2013, WSP 2012). As a result of these recommendations, a BTF has been established for a DSR range from 0.8 to 2. Finally, NRW (FE7), as a percentage of total input volume, was also included as an indicator of the economic loss of water. NRW may not depict a realistic picture of water losses in the transmission and distribution systems of a water utility, but it provides information on the volume of water lost without being charged. Nevertheless, NRW can be considered a useful financial indicator (Kanakoudis & Tsitsifli 2010). The BTF for this indicator has been developed for a range between 0 and 50% of the system input volume. 5.4 A Case Study of the Okanagan Basin To evaluate the practicality of the IU-PBM, the framework shown in Figure 5.3 has been implemented for two medium sized water utilities in the Okanagan Basin, British Columbia, Canada.
The land use is diverse and almost the same in both participating utilities, including residential, agricultural, commercial, public, and industrial uses, with little site specific variation in the areal distribution of each land use type. The topography of the Okanagan area is rolling and hilly with medium to steep grades. Utility-A serves 31,000 persons through 10,900 connections, primarily residential. The total length of water mains is 276 km. Five water supply systems (distinguished by raw water source) operate under Utility-A, with both domestic and agricultural customers ranging between 150 and 5000 per system. All of these systems draw water from surface water sources, including lakes and creeks. Only one of the water supply systems (i.e., the relatively larger one) has a conventional water treatment plant with a full filtration system; in the remaining four systems, only chlorinated water is supplied to the consumers. Utility-B is comparatively smaller, with 16,000 residents, 6400 domestic and agricultural connections, three water supply systems, and 152 km of water mains. All three water supply systems (based on raw water source) supply chlorinated water to the community without conventional surface water treatment. Both utilities have linear assets of different pipe materials, including steel, plastic, and cementitious materials. However, the water mains in Utility-A are on average older than those in Utility-B. Details of the data variables have not been provided in this research. Data variables for calculating the PIs were collected from the utilities’ technical and financial management through personal communication, hydraulic models of the water distribution systems, GIS maps, customer complaint forms and responses, water quality monitoring stations, financial inventories, and the utilities’ master plans. The step-by-step implementation of the proposed benchmarking methodology is described below.
Step 1: Evaluation of weights of performance indicators under each performance category Weights of the PIs in each performance category are calculated using Simos’ method, as described in Section 5.2. The list of selected PIs was sent to three participating utilities in the Okanagan Basin, including Utility-A and Utility-B. Utility managers and technicians ranked the PIs according to their importance from 1 to n, where n is the number of PIs in each category. Likewise, professionals working in asset management of water utilities also ranked the PIs. Results of the Simos’ application are presented in Table 5.2. The first column of Table 5.2 gives the PI number within each performance category, while the indicators are described in the second column. The next columns list the importance ranks scored by each decision maker, followed by the average frequency of these ranks. Subsequently, the indicator with the maximum average frequency is given the highest Simos’ rank; PIs with the same average frequency are allocated the same rank. The following columns show the number of PIs at each rank and the non-normalized weights of the indicators, which are essentially the positions of the Simos’ ranks. The final, normalized weights for all the performance categories are listed in the last column of Table 5.2. These final weights are used to calculate aggregated performance indices for both participating utilities using the TOPSIS method described in Section 5.2. Due to space limitations, only the details of the quality of service index are described as an example in the following steps; nevertheless, final results for the other indices are included. Step 2: Checking the need for normalization In the present study, the BTFs have already been established as benefit criteria.
The performance levels of all the PIs range from 10 to 100; thus, there is no need for normalization. The non-normalized performance scores of the PIs in the ‘quality of service’ category are calculated using the BTFs presented in Appendix B. For example, the indicator “QS4 - number of unplanned interruptions per 100 km” (due to water main breaks, pump failures, hydrant failures, or valve failures) is calculated using the BTF developed in Table 5.1 as:

(PL)QS4 = 100 × e^(−0.053 × QS4)

The values of QS4 were found to be 0.36 and 3.49 for Utility A and Utility B, respectively. Substituting these values into the above equation, the performance levels for QS4 come out to be:

(PL)QS4 for Utility A = 98
(PL)QS4 for Utility B = 87

Likewise, the performance levels for all the remaining PIs in the quality of service category are calculated and arranged in the form of the following matrix:

          QS1   QS2   QS3   QS4   QS5    QS6    QS7    QS8   QS9    QS10  QS11
Utility A 54.0  59.0  73.0  98.0  100.0  69.0   100.0  92.0  100.0  93.0  92.0
Utility B 57.0  91.0  68.0  87.0  70.6   100.0  100.0  49.0  10.0   28.0  99.0

Table 5.2 Weight estimation using Simos’ method (U = Utility; E = Expert)

PI No. | Performance Indicator (PI) | U-A | U-B | U-C | E-A | E-B | E-C | Average of PI frequency | Simos’ rank | Number of PIs | Non-normalized weight | Normalized weight
WE - WATER RESOURCES AND ENVIRONMENT (4 PIs; Σ non-normalized weights = 10)
WE-1 | No. of days of water restriction | 2 | 2 | 1 | 3 | 3 | 4 | 2.5 | 2 | 1 | 2 | 0.20
WE-2 | Per capita water consumption – Domestic users | 3 | 4 | 3 | 4 | 4 | 3 | 3.5 | 4 | 1 | 4 | 0.40
WE-3 | Existing annual water license capacity - Utilized | 4 | 3 | 2 | 2 | 3 | 2 | 2.7 | 3 | 1 | 3 | 0.30
WE-4 | Impact of residual chlorine in flushing water on aquatic life | 1 | 1 | 4 | 1 | 2 | 1 | 1.7 | 1 | 1 | 1 | 0.10
PE - PERSONNEL (7 PIs; Σ non-normalized weights = 28)
PE-1 | Field FTEs - Distribution | 3 | 7 | 7 | 7 | 6 | 7 | 6.2 | 7 | 1 | 7 | 0.25
PE-2 | Field FTEs - Metering | 2 | 1 | 4 | 5 | 3 | 1 | 2.7 | 2 | 1 | 2 | 0.07
PE-3 | Field accidents | 7 | 5 | 2 | 4 | 5 | 5 | 4.8 | 5 | 1 | 6 | 0.21
PE-4 | Sick days taken by FTEs | 6 | 4 | 5 | 6 | 7 | 6 | 5.5 | 6 | 1 | 5 | 0.18
PE-8 | Water resources and catchment management employees | 1 | 2 | 1 | 2 | 2 | 3 | 1.8 | 1 | 1 | 1 | 0.04
PE-9 | Overtime of FTEs | 4 | 6 | 6 | 1 | 4 | 2 | 3.8 | 4 | 1 | 4 | 0.14
PE-11 | Personnel training hours | 5 | 3 | 3 | 3 | 1 | 4 | 3.2 | 3 | 1 | 3 | 0.11
PH - PHYSICAL ASSETS (4 PIs; Σ non-normalized weights = 10)
PH-1 | Metering level | 2 | 2 | 2 | 3 | 2 | 2 | 2.0 | 2 | 1 | 2 | 0.20
PH-2 | Degree of automation | 1 | 1 | 3 | 2 | 1 | 1 | 1.0 | 1 | 1 | 1 | 0.10
PH-3 | Raw water storage capacity | 4 | 3 | 1 | 1 | 3 | 3 | 2.75 | 3 | 1 | 3 | 0.30
PH-4 | Treated water storage capacity | 3 | 4 | 4 | 4 | 4 | 4 | 3.75 | 4 | 1 | 4 | 0.40
OP - OPERATIONAL (9 PIs; Σ non-normalized weights = 37)
OP-1 | No. of main breaks | 9 | 9 | 9 | 9 | 9 | 8 | 8.8 | 8 | 1 | 8 | 0.216
OP-2 | Mains replaced | 5 | 8 | 1 | 5 | 1 | 9 | 4.8 | 5 | 1 | 5 | 0.135
OP-3 | Mains rehabilitation/renovation | 4 | 6 | 3 | 2 | 6 | 7 | 4.7 | 4 | 1 | 4 | 0.108
OP-4 | Non-revenue water | 7 | 7 | 2 | 7 | 7 | 6 | 6.0 | 6 | 1 | 6 | 0.162
OP-5 | Service connection rehabilitation | 2 | 3 | 5 | 6 | 3 | 4 | 3.8 | 2 | 1 | 2 | 0.054
OP-6 | Inoperable or leaking hydrants | 8 | 4 | 8 | 8 | 8 | 5 | 6.8 | 7 | 1 | 7 | 0.189
OP-7 | Replaced valves | 1 | 5 | 4 | 1 | 5 | 2 | 3.0 | 1 | 1 | 1 | 0.027
OP-8 | Hydrant inspection | 6 | 2 | 6 | 3 | 4 | 3 | 4.0 | 3 | 1 | 3 | 0.081
OP-9 | Cleaning of treated water storage tanks | 3 | 1 | 7 | 4 | 2 | 1 | 3.0 | 1 | 1 | 1 | 0.027
WP - WATER QUALITY AND PUBLIC HEALTH (6 PIs; Σ non-normalized weights = 21)
WP-1 | Days with boil-water advisory | 5 | 6 | 3 | 4 | 6 | 5 | 4.8 | 5 | 1 | 5 | 0.24
WP-2 | Turbidity - Distribution | 3 | 4 | 4 | 5 | 5 | 4 | 4.2 | 4 | 1 | 4 | 0.19
WP-3 | Total coliforms - Distribution | 6 | 5 | 6 | 6 | 4 | 6 | 5.5 | 6 | 1 | 6 | 0.29
WP-4 | Residual chlorine - Distribution | 4 | 2 | 5 | 3 | 3 | 3 | 3.3 | 3 | 1 | 3 | 0.14
WP-5 | THMs - Distribution | 2 | 3 | 1 | 1 | 2 | 1 | 1.8 | 2 | 1 | 2 | 0.10
WP-6 | Cumulative length of mains cleaned | 1 | 1 | 2 | 2 | 1 | 2 | 1.3 | 1 | 1 | 1 | 0.05
QS - QUALITY OF SERVICE (11 PIs; Σ non-normalized weights = 62)
QS-1 | Billing complaints | 3 | 7 | 3 | 2 | 1 | 1 | 2.8 | 2 | 1 | 2 | 0.03
QS-2 | Pressure complaints | 2 | 6 | 2 | 5 | 4 | 10 | 4.8 | 4 | 1 | 4 | 0.06
QS-3 | Water quality complaints | 5 | 9 | 6 | 9 | 8 | 8 | 7.5 | 7 | 1 | 7 | 0.13
QS-4 | Unplanned interruptions | 9 | 8 | 9 | 11 | 10 | 11 | 9.7 | 10 | 1 | 10 | 0.16
QS-5 | Unplanned maintenance hours | 8 | 11 | 8 | 8 | 9 | 7 | 8.5 | 9 | 1 | 9 | 0.15
QS-6 | Population coverage | 1 | 5 | 1 | 1 | 2 | 2 | 2.0 | 1 | 1 | 1 | 0.02
QS-7 | Total response to reported complaints | 7 | 2 | 7 | 6 | 3 | 6 | 5.2 | 5 | 1 | 5 | 0.08
QS-8 | Service connection complaints | 4 | 10 | 5 | 4 | 7 | 4 | 5.7 | 6 | 1 | 6 | 0.10
QS-9 | Aesthetic test compliance - Distribution | 6 | 4 | 4 | 3 | 5 | 3 | 4.2 | 3 | 1 | 3 | 0.05
QS-10 | Microbiological test compliance - Distribution | 11 | 3 | 11 | 10 | 11 | 5 | 8.5 | 8 | 1 | 8 | 0.14
QS-11 | Physico-chemical test compliance - Distribution | 10 | 1 | 10 | 7 | 6 | 6 | 7.2 | 7 | 1 | 7 | 0.11
FE - FINANCE AND ECONOMIC (6 PIs; Σ non-normalized weights = 21)
FE-1 | Water rates | 3 | 5 | 6 | 6 | 4 | 6 | 5.0 | 6 | 1 | 6 | 0.29
FE-2 | O&M cost per km of water mains | 5 | 3 | 3 | 5 | 5 | 4 | 4.2 | 4 | 1 | 4 | 0.19
FE-3 | Revenue per unit volume of supplied water | 2 | 6 | 4 | 4 | 6 | 2 | 4.0 | 3 | 1 | 3 | 0.14
FE-4 | Operating cost coverage ratio | 1 | 2 | 2 | 2 | 1 | 3 | 1.8 | 2 | 1 | 2 | 0.10
FE-5 | Debt service ratio | 6 | 4 | 5 | 3 | 3 | 5 | 4.3 | 5 | 1 | 5 | 0.24
FE-6 | NRW by volume | 4 | 1 | 1 | 1 | 2 | 1 | 1.7 | 1 | 1 | 1 | 0.05

Step 3: Development of Weighted Matrix From the weights of the PIs listed in Appendix C, the weighted matrix is developed using equation [5.2] by multiplying the performance levels calculated in Step 2 with their corresponding weights:

PIs       QS1   QS2   QS3   QS4    QS5    QS6   QS7   QS8   QS9   QS10   QS11
Weights   0.03  0.06  0.11  0.16   0.15   0.02  0.08  0.10  0.05  0.13   0.11
Utility A 1.62  3.54  8.03  15.68  15.00  1.38  8.00  9.20  5.00  12.09  10.12
Utility B 1.71  5.46  7.48  13.92  10.59  2.00  8.00  4.90  0.50  3.64   10.89

Step 4: Identify positive-ideal and negative-ideal solutions The X* and X⁻ are defined as the PIS (performance level 100) and the NIS (performance level 10) in terms of weighted performance levels, using equations [5.3] and [5.4].
For example, the PIS and NIS for QS1 are calculated as:

PIS for QS1 = 100 × 0.03 = 3.0
NIS for QS1 = 10 × 0.03 = 0.3

In the same way, the weighted PIS and NIS are calculated for all the PIs in the quality of service category and presented in the following matrix:

PIs  QS1  QS2  QS3  QS4  QS5  QS6  QS7  QS8  QS9  QS10  QS11
PIS  3.0  6.0  11   16   15   2    8    10   5    13    11
NIS  0.3  0.6  1.1  1.6  1.5  0.2  0.8  1.0  0.5  1.3   1.1

Step 5: Calculating the distance of each water utility from PIS and NIS The distances of the performance scores in the quality of service category for each water utility are measured by the n-dimensional Euclidean distance. The combined distances of all the PIs from the weighted PIS values shown in Step 4 are calculated using equation [5.5] for each utility. For instance, YA* is estimated as:

YA* = √((1.62 − 3.0)² + (3.54 − 6.0)² + ⋯ + (10.12 − 11)²) = 4.4

Similarly, the distance from the PIS for Utility B is calculated, and the final results are:

YA* = 4.4
YB* = 13.1

The distances of each water utility from the weighted NIS are calculated from equation [5.6], with the results:

YA⁻ = 27.9
YB⁻ = 21.7

Step 6: Develop aggregate performance index by calculating similarities to PIS The final performance indices of the quality of service category are calculated for both utilities by using equation [5.7] and multiplying the result by 100. For example, the quality of service index (QSI) for Utility A is calculated as:

QSIA = (YA⁻ / (YA* + YA⁻)) × 100 = (27.9 / (4.4 + 27.9)) × 100 = 86.4

QSIB is calculated in the same way, and the final indices are:

QSIA = 86.4
QSIB = 62.4

Similarly, the remaining performance indices WEI, PEI, PHI, OPI, WPI, and FEI have been calculated for both water utilities for the assessment year 2012, and the results are shown in Figures 5.5a and 5.5b. The proposed management actions against an estimated performance index are presented in Table 5.3.
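Steps 2 through 6 can be reproduced with a short script using the weights and performance levels from the worked matrices. With unrounded distances, the closeness index evaluates to about 86.3 for Utility A and 62.4 for Utility B (hand calculations with the rounded distances 4.4 and 27.9 differ slightly); the variable names below are illustrative:

```python
import math

# Weights and performance levels for the quality-of-service category
# (QS1..QS11), taken from the Step 2 and Step 3 matrices.
weights = [0.03, 0.06, 0.11, 0.16, 0.15, 0.02, 0.08, 0.10, 0.05, 0.13, 0.11]
levels = {
    "A": [54, 59, 73, 98, 100, 69, 100, 92, 100, 93, 92],
    "B": [57, 91, 68, 87, 70.6, 100, 100, 49, 10, 28, 99],
}

pis = [100 * w for w in weights]  # positive-ideal solution (level 100)
nis = [10 * w for w in weights]   # negative-ideal solution (level 10)

def topsis_index(perf):
    """Aggregate weighted performance levels into a 0-100 index by
    relative closeness to the positive-ideal solution (TOPSIS)."""
    weighted = [p * w for p, w in zip(perf, weights)]
    d_pos = math.sqrt(sum((x - i) ** 2 for x, i in zip(weighted, pis)))
    d_neg = math.sqrt(sum((x - i) ** 2 for x, i in zip(weighted, nis)))
    return 100 * d_neg / (d_pos + d_neg)

qsi = {utility: topsis_index(perf) for utility, perf in levels.items()}
```

Because every PI is already expressed on the common 10 to 100 scale, the vector normalization step of standard TOPSIS is skipped, exactly as argued in Step 2.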
Figure 5.5 shows that the overall performance of both utilities is satisfactory, with performance indices lying in the ‘Medium’ or ‘High’ ranges (i.e., index values higher than 50), except for the water quality and public health safety category in the case of Utility-B (Figure 5.5b). This is due to the relatively objectionable source water quality in one of its water supply systems. The utility changed the source in FY 2014, which has improved its WPI. The same issue has also affected the QSI of Utility-B, through the lower performance levels for QS9 and QS10.

Figure 5.5 Aggregated performance indices for all the functional components: (a) performance indices of Utility A, (b) performance indices of Utility B [web diagrams over the seven categories - WR & environmental sustainability, personnel adequacy, physical assets efficacy, operational integrity, WQ & PH safety, quality of service, and financial stability - plotted on a 0 to 100 scale]

Table 5.3 Description of performance levels with proposed actions
Index | Performance | Proposed action
10–30 | Very low | Urgent and detailed improvement required for several PIs in this category
30–50 | Low | Detailed investigation required for underperforming PIs at the intra-utility level
50–80 | Medium | Careful investigation required at the inter-/intra-utility level to identify lacking sub-components
>80 | High | Satisfactory performance needs to be maintained at the inter-utility level

Utility-A is showing relatively better performance overall. However, apart from the quality of service and personnel adequacy categories, the management of Utility-A needs to investigate the underperforming PIs to further improve the performance of the utility (Figure 5.5a). Conversely, Utility-B needs to do the same exercise for all of the performance categories.
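The bands of Table 5.3 translate directly into a lookup from an index value to a performance label and proposed action. A minimal sketch (names are illustrative, not from the model):

```python
# Bands and proposed actions per Table 5.3; upper bounds are inclusive,
# matching the 10-100 index scale used by the model (>80 is 'High').
BANDS = [
    (30, "Very low", "Urgent and detailed improvement required for several PIs"),
    (50, "Low", "Detailed investigation of underperforming PIs at intra-utility level"),
    (80, "Medium", "Careful inter-/intra-utility investigation of lacking sub-components"),
    (100, "High", "Satisfactory performance to be maintained at inter-utility level"),
]

def classify_index(index):
    """Return the (performance label, proposed action) for a 10-100 index."""
    for upper, label, action in BANDS:
        if index <= upper:
            return label, action
    raise ValueError("index outside the 10-100 scale")

print(classify_index(62.4)[0])  # a quality-of-service index in the Medium band
```

Encoding the bands once keeps the reporting of all seven category indices consistent.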
Furthermore, indices just approaching or slightly exceeding the good performance zone (i.e., around 60) also need detailed investigation for future performance management. Performance indices developed using the above approach describe the condition and efficiency of each functional component of a SMWU. Based on the existing efficiency, utility managers can make decisions about essential improvements, e.g., hiring additional personnel, increasing the main replacement rate, increasing coverage, improving the metering level, implementing a water conservation plan, etc. Performance lower than ‘Medium’ certainly needs an intra-utility performance assessment of the respective functional component, which is described in Chapter 6. The IU-PBM in this study has been developed through a comprehensive review of reported performance assessment studies of utilities around the world. Therefore, the model can be effectively used for performance benchmarking of SMWU, with populations from fewer than 10,000 up to 50,000 (using a minimum number of PIs), in any region. However, the IU-PBM is a data driven model based on piecewise continuous functions, and it works most efficiently within the applicable ranges of the PIs given in Appendix A. Managers can import the methodology and, if required, tweak the BTFs for site specific socio-economic and geographical conditions and for the actual benchmarking data of their utilities. Likewise, the weights of the PIs can be re-evaluated following the approach used in the present study. Uncertainties always exist in the available data; this issue can be more significant in the case of SMWU due to limited resources and inefficient data management practices. Presently, the IU-PBM cannot handle these uncertainties and needs to be enhanced in the future. 5.5 Summary In this chapter, an inter-utility performance benchmarking model (IU-PBM) is developed for SMWU.
The model considers 47 performance indicators (PIs) across the different performance categories: water resources and environment, personnel, physical assets, operation, quality of service, water quality and public health, and finance. Calculating performance levels by simply comparing the calculated value of a PI against the best and worst performing utilities in a benchmarking exercise can be misleading, because this approach does not consider the average performing utilities in the evaluation process. The non-linear approach used in the IU-PBM addresses this issue. Accordingly, 47 benchmarking transformation functions (linear, exponential, logarithmic, and polynomial), based on the literature, NWWBI reports, and expert judgment, have been established to translate the calculated PIs into performance levels between 10 and 100. The weights are estimated using Simos’ method from the rankings of the PIs by different water utilities in the Okanagan Basin, British Columbia, Canada, and by experts working in water infrastructure management. The proposed approach accommodates wide variations in the calculated values of the PIs, being mindful of the smaller economies of scale in SMWU as compared to larger water utilities. Finally, performance indices have been established by aggregating the transformed performance levels using the TOPSIS method (i.e., based on the concept of relative closeness to the most desirable solution and remoteness from the least desirable solution). The IU-PBM results, presented in the form of a web diagram, demonstrate the utility’s performance to top level management for pragmatic decision making. The proposed model has also been implemented for two SMWU operating in the Okanagan Basin to demonstrate its practicality.
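The simplified Simos' procedure summarized above (average the decision makers' ranks, assign ranks in ascending order of average with ties sharing a rank, and normalize the rank positions) can be sketched as follows; the example rankings are the six decision makers' scores for the water resources and environment category, and the result reproduces the normalized weights in Table 5.2 (function name is illustrative):

```python
def simos_weights(rankings):
    """Simplified Simos' weighting: average the importance ranks given
    by each decision maker (1 = least important), assign Simos ranks in
    ascending order of average (ties share a rank), and normalize the
    rank positions into weights summing to 1.

    rankings maps a PI id to the list of ranks from all decision makers.
    """
    avg = {pi: sum(r) / len(r) for pi, r in rankings.items()}
    ordered = sorted(set(avg.values()))          # distinct averages, ascending
    rank = {pi: ordered.index(a) + 1 for pi, a in avg.items()}
    total = sum(rank.values())
    return {pi: rank[pi] / total for pi in rankings}

# Ranks of the WE indicators by three utilities and three experts
we = simos_weights({
    "WE1": [2, 2, 1, 3, 3, 4],
    "WE2": [3, 4, 3, 4, 4, 3],
    "WE3": [4, 3, 2, 2, 3, 2],
    "WE4": [1, 1, 4, 1, 2, 1],
})
```

With these inputs the averages order as WE4 < WE1 < WE3 < WE2, giving weights 0.1, 0.2, 0.3, and 0.4 respectively, as listed for this category in Table 5.2.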
Chapter 6 Intra-utility Performance Management Model (In-UPM) A part of this chapter is under review in the Journal of Cleaner Production as an original research article titled “Intra-utility Performance Management Model (In-UPM) for the Sustainability of Small to Medium Sized Water Utilities: Conceptualization to Development” (Haider et al. 2015c). If one or more functional components are not meeting the desired LOS (i.e., not performing ‘High’) according to the IU-PBM results, utility managers can use the model developed in this chapter for performance management of SMWU. 6.1 Background A water utility consists of different functional components (or processes): water resources and environment, personnel, physical assets, operation, quality of service, water quality and public health, and economics and finance. Each of these components consists of sub-components; for example, ‘personnel’ may comprise staff health and safety, overtime culture, training hours, etc. For a sustainable water utility, all the functional components and their sub-components need to meet the desired performance objectives. The Federation of Canadian Municipalities (FCM) and the National Research Council (NRC), Canada described benchmarking as the mapping of one’s own process and the subsequent comparison of that process with those of other companies showing exemplary performance in a similar process (FCM/NRC 2005). Based on inter-utility performance benchmarking results, a utility can hone in on the performance of the different sub-components within a functional component in order to identify the key areas for improvement; this process can be defined as intra-utility performance assessment. Furthermore, a water utility might operate more than one water supply system (WSS) at a time because of geographical limitations and the availability of source water. A WSS may have a separate water source, transmission, treatment, and distribution network.
It is also useful for the utility to evaluate the performance of each WSS individually when prioritizing its short- and long-term investments. In this regard, a less addressed issue for intra-utility performance assessment, so far, is the identification of the underperforming WSSs within a utility. Performance indicators (PIs) are typically used to measure the performance of a program in terms of percentages or an index (score), which can be monitored at pre-defined intervals and compared to one or more criteria or standards (Office of Public Management New South Wales [OPM] 1990). A comprehensive literature review of PIs for water utilities has been conducted in Chapter 2. Most of the existing performance assessment methods involve similar water utilities over several years, which is not the case for Canadian SMWU. Moreover, aggregating all the PIs to estimate the overall performance of a functional component can eclipse the underlying processes (sub-components). In the absence of benchmarking data, an inter-utility performance benchmarking model for SMWU was developed in Chapter 5. However, at this point, there is no model or study available for intra-utility performance assessment of SMWU which can:
- provide a systematic approach to identify the underperforming functional components, giving the utility’s managers an opportunity to make rational and timely decisions;
- handle the uncertainties in the data variables/inputs and the knowledge base used to evaluate PIs and performance measures at the sub-component and component levels; and
- differentiate under- and over-performing WSSs within a water utility to plan short-term and long-term investments for overall performance improvement.
The overall objective of this chapter is to develop an intra-utility performance management model (In-UPM) for the sustainability of SMWU which can address the above issues.
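The component/sub-component hierarchy that the In-UPM evaluates can be represented as a nested data structure. The sketch below arranges a few of the ‘personnel’ factors from Table 6.1 for illustration only; the particular grouping shown is an assumption, not the thesis’s exact tree:

```python
# Illustrative hierarchy for one functional component, from the
# performance objective down to the performance indicators (PIs).
# Grouping of indicators under the two primary measures is assumed.
personnel = {
    "objective": "Personnel productivity",
    "primary_measures": {
        "Personnel adequacy": {
            "secondary_measures": ["Catchment and treatment employees",
                                   "Metering and distribution employees"],
            "indicators": ["PE1", "PE2", "PE5", "PE8"],
        },
        "Personnel health and safety": {
            "secondary_measures": ["Loss due to field accidents",
                                   "Personnel healthiness"],
            "indicators": ["PE3", "PE4", "PE6", "PE7"],
        },
    },
    # Each indicator is in turn computed from data variables, e.g.
    # PE1 (field FTEs - distribution) from pipe length and field FTEs.
}

def all_indicators(component):
    """Collect every PI referenced anywhere in a component's hierarchy."""
    return sorted(pi for pm in component["primary_measures"].values()
                  for pi in pm["indicators"])
```

Traversing such a structure bottom-up (data variables to PIs to PMs to objective) mirrors the top-down assessment criteria described in Section 6.2.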
6.2 Establishing Performance Assessment Criteria In this research, the sustainability criterion for SMWU is defined as: “all the functional components of a water utility are desired to meet their respective performance objectives”. In order to evaluate the performance of each functional component, a hierarchical, top-down approach consisting of different performance factors is proposed. These factors include the performance objective of the functional component at the top, followed by primary and secondary performance measures (PMs) assessing the performance of the sub-components. The performance measures are derived from performance indicators (PIs), and the PIs are estimated from the data/decision variables at the bottom level. All of these performance factors are listed in Table 6.1. The first four columns of Table 6.1 contain the PMs and PIs for the seven functional components, while the last two columns list the data and decision variables, respectively. The objective and the primary and secondary performance factors are designated in italics, whereas the PIs (when first described in the text) are given with their corresponding numbers presented in the fourth column of Table 6.1. Details of the performance assessment criteria for each functional component are described in the following sections. 6.2.1 Water Resources and Environmental Sustainability The functional component of ‘water resources and environmental sustainability’ is evaluated with the help of two primary level PMs, ‘source water conservation’ and ‘environmental protection’, and a PI of water license capacity (WE3), which directly feeds into the top level of the hierarchy (Table 6.1). Based on increasing water requirements and the implementation level of water conservation measures, utility managers should renew the water licenses of their WSSs.
The first primary level PM is ‘source water conservation’, which can be evaluated by comparing the existing ‘water resources management’ practices and the implementation level of a water conservation plan (WCP). The existing ‘water resources management’ practices are assessed from the PIs of water consumption (WE2), water restrictions (WE1), watershed management employees (PE8), and the water loss indicator (FE7) (refer to Table 6.1). In general, per capita water consumption decreases as the water price increases, particularly in the case of consumption based billing (Whitcomb 2005). This type of billing can conserve limited water resources and rationally recover operation and maintenance costs as well. Usually, people living in expensive homes use more water for several reasons, e.g., landscape irrigation, swimming pools, fixtures with higher flow rates, etc. SMWU do not fully implement a well-structured WCP, which generally contains: i) a planned water loss and leakage control program; ii) consumption based metering and billing; iii) reduction of water wastage by eliminating single-pass cooling; iv) reuse of non-contact cooling water and installation of low-flow toilets; v) building codes that mandate minimum water efficiency requirements for fixtures; and vi) public education and awareness programs. As per some recent studies, a significant amount of water can be conserved with high efficiency appliances for domestic use (Gurung et al. 2015). However, in some SMWU, the WCPs are at some stage of implementation (i.e., developed but not yet implemented, or at the initial stages of implementation).
Table 6.1 Performance objectives, performance measures (PMs), performance indicators (PIs), and data variables

Performance objective (Generation 1): P_{1,1,0}^{1,0} – Water resources and environmental sustainability
Primary PMs (Generation 2): P_{1,1,1}^{2,1} – Source water protection; P_{1,2,1}^{2,1} – Environmental protection
Secondary PMs (Generations 3 & 4): P_{1,1,1}^{3,2} – Water resources management; P_{1,1,1}^{4,3} – Restrictions, consumption, and management; P_{1,2,2}^{3,2} – Impact of flushing water
Performance indicators (Generations 5 & 6): P_{1,1,1}^{5,4} – WE1: Water restrictions (A6); P_{1,2,1}^{5,4} – WE2: Residential water consumption (E1,A1); P_{1,4,1}^{5,1} – WE3: Existing water license capacity (A1,A2); P_{1,5,2}^{5,2} – WE4: Discharge of WTP residuals (A4,A5); P_{1,7,2}^{5,3} – WE5: Effect of flushing water on aquatic life (A7,A8); P_{1,3,1}^{5,2} – WE6: Implementation of water conservation plan; P_{1,6,2}^{5,3} – WE7: Distance between flushing point and natural drain – length of storm-water drain (A12); P_{1,1,2}^{6,5} – FE1: Water rates (L2) (G7); P_{1,2,1}^{6,4} – PE8: Water resources and catchment management personnel (L2) (A1,B2); P_{1,3,1}^{6,3} – FE7: Non-revenue water (NRW) by volume (L2) (A1,A10)
Data variables: A1: Average annual demand; A2: Existing annual water license capacity (WLC); A4: Amount of WTP residuals discharged into natural environment; A5: Total residuals from WTP; A6: Days with sprinkler regulations; A7: Water volume in water body; A8: Amount of flushing water; A10: Revenue water; A12: Distance between the flushing point and water body; B2: Water resources and catchment management personnel; E1: Total resident population; G7: Water rate for typical residential connection
Decision actions: A6: Increase days of water restrictions; A2: Apply for new water licenses; A4: Reduce or eliminate discharge of water treatment plant residuals; A7: Perform flushing when flows in natural water bodies are high; A8: Reduce amount of flushing water by optimizing flushing durations; A12: Select flushing points located away from the natural water bodies as much as reasonably possible; G7: Rationally increase water rates keeping affordability in consideration; A13: Increase implementation level of WCP

Performance objective (Generation 1): P_{2,1,0}^{1,0} – Personnel productivity
Primary PMs (Generation 2): P_{2,1,1}^{2,1} – Personnel adequacy; P_{2,2,1}^{2,1} – Personnel health and safety; P_{2,3,1}^{2,1} – Working environment efficacy
Secondary PMs (Generations 3 & 4): P_{2,1,1}^{3,2} – Catchment and treatment employees; P_{2,2,1}^{3,2} – Productivity ratio; P_{2,3,1}^{3,2} – Metering and distribution employees; P_{2,4,2}^{3,2} – Loss due to field accidents; P_{2,5,2}^{3,2} – Personnel healthiness; P_{2,6,3}^{3,2} – Overtime culture
Performance indicators (Generations 5 & 6): P_{2,4,3}^{4,3} – PE1: Field FTEs – Distribution (D) (C1,B3); P_{2,5,3}^{4,3} – PE2: Field FTEs – Metering (M) (B5,C2); P_{2,6,4}^{4,3} – PE3: Lost hours due to field accidents (D) (B8,B9); P_{2,9,5}^{4,3} – PE4: Sick days per employee (D) (B1,B3); P_{2,1,1}^{4,3} – PE5: Field FTEs – Treatment (T) (A9,B4); P_{2,7,4}^{4,3} – PE6: Lost hours due to field accidents (T) (B1,B10); P_{2,10,5}^{4,3} – PE7: Sick days per employee (T) (B1,B4); P_{2,2,1}^{4,3} – PE8: Field FTEs – Water resources and catchment management (A1,B2); P_{2,11,6}^{4,3} – PE9: Overtime hours (D) (B1,B8); P_{2,12,6}^{4,3} – PE10: Overtime hours (T) (B1,B10); P_{2,13,3}^{4,2} – PE11: Personnel training (B1,B7); P_{2,3,2}^{4,3} – PE12: Staff productivity; P_{2,8,2}^{4,3} – PE13: Implementation of health and safety plan; P_{2,1,2}^{6,3} – PH2: Degree of automation (L2) (C9,C10)
Data variables: A1: Average annual demand; A9: Treated water supplied; B1: Total personnel; B2: Water resources and catchment personnel; B3: Field FTEs (D); B4: Field FTEs – Treatment (T); B5: Field FTEs – Metering (M); B7: Total training hours (D); B8: Field labour hours (D); B9: Lost hours due to accidents (D); B10: Field labour hours (T); B11: Lost hours due to accidents (T); B12: Sick leaves (D); B13: Sick leaves (T); B14: Overtime hours (D); B15: Overtime hours (T); C1: Pipes’ length; C2: Meters installed; C9: Total control units; C10: Automated control units
Decision actions: B2: Optimize field FTEs for water resources and catchment management; B3: Optimize field FTEs for distribution system operations; B4: Optimize field FTEs for treatment plant operations; B5: Optimize field FTEs for metering operations; B7: Increase personnel training hours in a year; B9: Reduce or eliminate time lost due to field accidents during distribution system operations; B11: Reduce or eliminate time lost due to field accidents during treatment plant operations; B14: Reduce overtime hours for distribution system operations by optimizing the staff; B15: Reduce or eliminate overtime hours for treatment plant operations by optimizing the staff; C10: Convert or replace the un-automated control units with automated ones. Note: Optimization of FTEs means increasing or decreasing the number of personnel to improve staff productivity (PE12)

Performance objective (Generation 1): P_{3,1,0}^{1,0} – Physical systems efficacy
Primary PMs (Generation 2): P_{3,1,1}^{2,1} – Storage and treatment systems capacity; P_{3,2,1}^{2,1} – Monitoring system integrity
Secondary PMs (Generations 3 & 4): P_{3,1,1}^{3,2} – Storage capacity
Performance indicators (Generations 5 & 6): P_{3,4,2}^{4,2} – PH1: Metering level (C2,C6); P_{3,5,2}^{4,2} – PH2: Degree of automation (C9,C10); P_{3,1,1}^{4,3} – PH3: Raw water storage capacity (A1,C4); P_{3,3,1}^{4,2} – PH4: Treatment plant capacity (D1); P_{3,2,1}^{4,3} – PH5: Treated water storage capacity (A1,C5); P_{3,1,2}^{5,4} – WE2: Residential water consumption (L2) (A1,E1); P_{3,2,2}^{5,4} – WE6: Implementation level of WCP (L2); P_{3,1,1}^{6,5} – FE1: Water rates (L3) (G7)
Data variables: A1: Average annual demand; A3: Average daily demand; C2: Meters installed; C4: Capacity of raw water reservoirs; C5: Capacity of treated water storage reservoirs; C6: Total number of service connections; C9: Total number of control units; C10: Automated control units; D1: Days WTP operated greater than 90% capacity; G7: Water rate for typical residential connection
Decision actions: C2: Increase metering; C10: Convert or replace the un-automated control units with automated ones; C4: Increase capacity of raw water storage; D1: Increase treatment plant capacity by adding additional units (careful judgment is required to evaluate maximum demand); G7: Rationally increase water rates keeping affordability in consideration to reduce water consumption; A13: Increase implementation level of WCP to improve the remaining storage capacity for future needs

Performance objective (Generation 1): P_{4,1,0}^{1,0} – Operational integrity
Primary PMs (Generation 2): P_{4,1,1}^{2,1} – Distribution system integrity; P_{4,2,1}^{2,1} – Distribution system performance; P_{4,3,1}^{2,1} – Distribution network productivity
Secondary PMs (Generations 3 & 4): P_{4,1,1}^{3,2} – Distribution system maintenance; P_{4,2,1}^{3,2} – Delivery point maintenance; P_{4,3,1}^{3,2} – Inspection and cleaning routine; P_{4,4,2}^{3,2} – Distribution system failure; P_{4,1,1}^{4,3} – Rehabilitation and replacement of pipes
Performance indicators (Generations 5 & 6): P_{4,10,4}^{5,3} – OP1: Pipe breaks (C1,D12); P_{4,2,1}^{5,4} – OP2: Pipes replaced (C1,D8); P_{4,3,1}^{5,4} – OP3: Pipes rehabilitated (C1,D7); P_{4,12,2}^{5,2} – OP4: Non-revenue water (A1,C6,D11); P_{4,6,2}^{5,3} – OP5: Service connection rehabilitation (C6,D10); P_{4,11,4}^{5,3} – OP6: Inoperable hydrants and valves (C12,D13); P_{4,4,1}^{5,3} – OP7: Valves replaced (C13,D9); P_{4,8,3}^{5,3} – OP8: Hydrants inspection (C12,D6); P_{4,9,3}^{5,3} – OP9: Cleaning of storage tanks (C5,D4); P_{4,7,2}^{5,3} – OP10: Operational meters (C6,D14); P_{4,13,3}^{5,2} – OP11: Network efficiency (A3,C1); P_{4,14,3}^{5,2} – OP12: Customer density (C1,E1); P_{4,1,1}^{5,4} – OP13: Average pipe age (C14); P_{4,5,1}^{5,3} – OP14: Implementation level of risk based pipes’ rehabilitation and replacement plan
Data variables: A1: Average annual demand; A3: Average daily demand; C5: Total capacity of treated water storage reservoirs; C6: Total number of service connections; C12: Total number of hydrants; C13: Total number of valves; C14: Average pipe age; D4: Volume of the treated water reservoirs cleaned; D6: Hydrants inspected during the assessment period; D7: Lengths of mains rehabilitated; D8: Lengths of mains replaced; D9: Number of replaced valves; D10: Number of service connections repaired; D11: Annual billed metered consumption; D12: Mains failures/breaks; D13: Number of leaking hydrants; D14: Operational meters
Decision actions: D4: Increase frequency of cleaning the treated water reservoirs to at least once a year; D6: Increase hydrant inspection frequency; D7: Optimally increase the rehabilitation rate of water mains; D8: Optimally increase the replacement rate of water mains; D9: Increase replacement rate of faulty or leaking valves to increase distribution system integrity; D10: Increase service connection inspection rate to detect minor or major repairs in order to improve system integrity and reduce water loss; D14: Increase implementation level of risk based pipes’ rehabilitation and replacement plan. Note: Optimization of rehabilitation and replacement of water pipes means that it should be done with risk planning, keeping in view the pipe age and the hydraulic and structural integrity.

Performance objective (Generation 1): P_{5,1,0}^{1,0} – Safe drinking water provision
Primary PMs (Generation 2): P_{5,1,1}^{2,1} – Public health safety; P_{5,2,1}^{2,1} – Water quality adequacy
Secondary PMs (Generations 3 & 4): P_{5,1,2}^{3,2} – Water quality of distribution systems; P_{5,2,2}^{3,2} – Water quality of treatment systems
Performance indicators (Generations 5 & 6): P_{5,1,1}^{4,2} – WP1: Boil water advisories (D16,D17,E1); P_{5,4,1}^{4,3} – WP2: Average turbidity in distribution (D18); P_{5,5,1}^{4,3} – WP3: Total coliforms occurrences in distribution system (D19); P_{5,3,1}^{4,2} – WP4: Residual chlorine at consumer’s end (D21); P_{5,7,2}^{4,3} – WP5: Average turbidity of treated water (D22); P_{5,8,2}^{4,3} – WP6: Total coliforms occurrences in treated water (D23); P_{5,6,1}^{4,3} – WP7: Average trihalomethanes in distribution system (D20); P_{5,9,2}^{4,3} – WP8: Nitrates in treated water (D24); P_{5,2,1}^{4,2} – WP9: Length of mains cleaned (C1,D15)
Data variables: C1: Main length; E1: Total resident population; D15: Mains length cleaned; D16: Number of days with BWAs; D17: Population affected by BWAs; D18: Turbidity in distribution system; D19: Number of total coliform occurrences; D20: Concentration of THMs in distribution system; D21: Residual chlorine in distribution system; D22: Turbidity in treated water; D23: Total coliform occurrences in treated water; D24: Nitrates in treated water
Decision actions: D15: Increase flushing practice of pipes; D16: Reduce or eliminate boil water advisory days by increasing level of treatment or changing source water; D18: Reduce turbidity in distribution systems by increasing level of treatment or improving/changing source water; D19: Reduce or eliminate number of total coliforms in distribution systems by optimizing chlorine dose; D20: Reduce or eliminate THMs in distribution systems by optimizing chlorine dose and increasing level of treatment; D21: Optimize chlorine dose in distribution systems; D22, D23, D24: Improve treatment plant operations, i.e., efficient O&M, timely backwashing, etc.

Performance objective (Generation 1): P_{6,1,0}^{1,0} – Quality of service
Primary PMs (Generation 2): P_{6,1,1}^{2,1} – Customer service reliability; P_{6,2,1}^{2,1} – Customer satisfaction level
Secondary PMs (Generations 3 & 4): P_{6,1,1}^{3,2} – Customers information level; P_{6,2,1}^{3,2} – Water quality compliance; P_{6,3,1}^{3,2} – Response to complaints; P_{6,4,1}^{3,2} – Complaints related to system integrity
Performance indicators (Generations 5 & 6): P_{6,8,2}^{4,2} – QS1: Billing complaints (E2,F2); P_{6,9,4}^{4,3} – QS2: Pressure complaints (F1,F3); P_{6,10,4}^{4,3} – QS3: Water quality complaints (F1,F4); P_{6,1,1}^{4,3} – QS4: Unplanned interruptions (C1,D25); P_{6,2,1}^{4,3} – QS5: Unplanned maintenance hours (D26,D27); P_{6,7,3}^{4,3} – QS6: Time of response to complaints (F8); P_{6,6,3}^{4,3} – QS7: Total response to reported complaints (F6,F7); P_{6,11,4}^{4,3} – QS8: Service connections complaints (F1,F5); P_{6,3,2}^{4,3} – QS9: Aesthetic tests compliance (D29,D30); P_{6,4,2}^{4,3} – QS10: Microbiological tests compliance (D31,D32); P_{6,5,2}^{4,3} – QS11: Physico-chemical tests compliance (D33,D34)
Data variables: D25: Unplanned interruptions; D26: Unplanned maintenance hours; D27: Total maintenance hours; D29: Aesthetic tests carried out; D30: Compliance of aesthetic tests with standards; D31: Microbiological tests carried out; D32: Compliance of microbiological tests; D33: Physico-chemical tests carried out; D34: Compliance of physico-chemical tests; E2: Total registered customers; F1: Resident population served by the utility; F2: Billing complaints and queries; F3: Pressure complaints; F4: Water quality complaints; F5: Service connection complaints; F6: Response to complaints; F7: Total number of complaints
Decision actions: D25: Duration of unplanned interruptions should be reduced by efficiently resolving the cause of the interruption; D26: Customers should be well informed (at least 48 hours) before maintenance activities; D30, D32, D34: Meet the water quality objective that at least 97% of all samples should comply with applicable water quality standards, by improving source water quality or improving the level of treatment; F2: Efficient response to billing complaints; F3, F4, F5: Efficient response to reported complaints after careful identification of the root cause of the complaint (problems on the homeowner’s side should also be identified and customers should be guided to solve the issue); F6: It should be ensured that every reported complaint is responded to within the due time frame

Performance objective (Generation 1): P_{7,1,0}^{1,0} – Economic and financial viability
Primary PMs (Generation 2): P_{7,1,1}^{2,1} – Economic stability; P_{7,2,1}^{2,1} – Revenue collection efficiency; P_{7,3,1}^{2,1} – Operational cost compliance
Secondary PMs (Generations 3 & 4): P_{7,1,1}^{3,2} – Customer water affordability; P_{7,2,3}^{3,2} – Operation and maintenance cost sustainability
Performance indicators (Generations 5 & 6): P_{7,2,1}^{4,3} – FE1: Water rates (G7); P_{7,7,2}^{4,3} – FE2: O&M cost (D) per km of water mains (C1,G2); P_{7,4,2}^{4,2} – FE3: Revenue per unit of water supplied (D11,G1); P_{7,8,2}^{4,3} – FE4: O&M cost (T) per million litres (A9,G3); P_{7,6,3}^{4,2} – FE5: Operating cost coverage ratio (G2,G4); P_{7,1,1}^{4,2} – FE6: Debt service ratio (G5,G6); P_{7,5,2}^{4,2} – FE7: Non-revenue water (%) (A1,A10); P_{7,3,1}^{4,3} – FE8: Affordability (E1,G4,G8)
Data variables: A1: Average annual demand; A9: Treated water supplied; A10: Volume of NRW; C1: Main length; D11: Annual billed water consumption; E1: Total resident population; G1: Operating revenues; G2: Total O&M cost during the assessment period; G3: Total O&M cost of water treatment; G4: Total annual operational revenues; G5: Total annual net income; G6: Financial debt service; G7: Water rate for a typical residential connection
Decision actions: G1: Increase operating revenues by increasing water rates rationally, reducing NRW, and implementing the WCP; G2: Reduce O&M cost of water distribution by optimizing energy use and implementing the WCP; G3: Reduce O&M cost of water treatment by optimizing energy use and implementing the WCP; G4: Increase operational revenues by reducing NRW; G5: Increase total annual net income by increasing population coverage, customer density, network efficiency, efficient metering, etc.; G7: Increase or decrease water rates based on water conservation strategies and customers’ affordability

Notes: The letters used for data/decision variables are: A – water resources and environmental data; B – personnel data; C – physical assets data; D – operational and monitoring data; E – demographic data; F – data related to customer services; G – financial data. The data variables used to calculate each PI are given within brackets at the end of the PI; for detailed calculations, interested readers are referred to Haider et al. (2014a & c).

The other primary level PM is ‘environmental protection’, which is derived from a secondary level PM, i.e., ‘impact of flushing water’, and a PI for the discharge of water treatment plant residuals (WE4). Periodic flushing is an important activity to keep the water mains healthy and to provide safe water to the community by removing biofilm and corrosion tubercles. However, the flushed water may have high chlorine concentrations or other pollutants. As per the Canadian water quality guidelines, the sum of the concentrations of all reactive chlorine species should be less than 0.5 mg/L for the protection of aquatic life in freshwaters (Canadian Council of Ministers of the Environment [CCME] 1999). 
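As a rough illustration of how a utility might check this guideline, the sketch below applies a first order chlorine decay model to flushing water travelling along a storm-water drain. The decay constant, flow velocity, and initial concentration are illustrative assumptions, not measured values from this study.

```python
import math

# First-order chlorine decay along a storm-water drain: C(x) = C0 * exp(-k * t),
# with travel time t = length / velocity. The decay constant k (1/h) and the
# velocity (m/h) below are illustrative assumptions only.
def chlorine_at_outfall(c0_mg_l, length_m, k_per_h=2.0, velocity_m_per_h=1800.0):
    travel_time_h = length_m / velocity_m_per_h
    return c0_mg_l * math.exp(-k_per_h * travel_time_h)

CCME_LIMIT = 0.5  # mg/L, total reactive chlorine for freshwater aquatic life (CCME 1999)

c_out = chlorine_at_outfall(c0_mg_l=1.2, length_m=1000)
print(f"Concentration at outfall: {c_out:.2f} mg/L, "
      f"{'meets' if c_out < CCME_LIMIT else 'exceeds'} CCME guideline")
```

With these assumed parameters, a 1000 m drain brings a 1.2 mg/L release below the 0.5 mg/L guideline, which is consistent with the low-risk assumption for long drains discussed next.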
In this study, the impact of flushing water is evaluated on two bases: i) the distance between the flushing point and the water body, and ii) the dilution available in the natural water body. Details of these PIs are discussed in Chapter 5. Chlorine naturally decays (following first order kinetics) while moving in surface water (Gang et al. 2003). It is assumed that a storm water conduit or drain longer than 1000 meters poses low risk. Whenever possible, the field and technical personnel should avoid planned flushing programs during low flow (dry) periods to minimize the impact on aquatic life. However, spot flushing for shorter durations can be carried out upon customers’ requests.

6.2.2 Personnel Productivity

The functional component of ‘personnel productivity’ is evaluated from three primary level PMs: ‘personnel adequacy’, ‘personnel health and safety’, and ‘working environment efficacy’. The first PM, ‘personnel adequacy’, is estimated from the secondary level PMs of ‘catchment and treatment employees’, ‘metering and distribution employees’, and the ‘personnel productivity ratio’ (PPR). Because personnel in SMWU are generally responsible for more than one specific task, the individual PIs need to be carefully calculated based on man-hours. Conventionally, PPR is calculated on the basis of million gallons of water consumed per full-time equivalent (FTE) employee. Finding and retaining skilled staff in SMWU is difficult compared to large utilities. Adequate staff is essential for any water utility to perform routine operation and maintenance activities and to satisfy the customers. However, staff strength needs to be optimized to achieve a desirable PPR. The other primary level PM is ‘personnel health and safety’, which is derived from two secondary level PMs, ‘personnel healthiness’ and ‘loss due to field accidents’, using the relevant PIs (PE3, PE4, PE6 and PE7) (refer to Table 6.1). 
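Two of these personnel PIs can be sketched as simple ratios. The formulas and units below (megalitres supplied per field FTE for staff productivity, and lost hours as a percentage of field labour hours) are illustrative assumptions, since the thesis computes its PIs from the Table 6.1 data variables with man-hour adjustments not reproduced here.

```python
# Hedged sketch of two personnel PIs. Units and exact formulas are assumptions:
# staff productivity (PE12-style) as ML of water supplied per field FTE, and
# lost hours due to field accidents (PE3-style) as a % of field labour hours.
def staff_productivity_ml_per_fte(ml_supplied_per_year, total_field_ftes):
    return ml_supplied_per_year / total_field_ftes

def lost_hours_percent(lost_hours, field_labour_hours):
    return 100.0 * lost_hours / field_labour_hours

print(round(staff_productivity_ml_per_fte(1100.0, 8), 1))  # ML/FTE/year
print(round(lost_hours_percent(24.0, 16_000.0), 2))        # % of labour hours
```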
In addition, it includes an important PI, the implementation level of a comprehensive health and safety plan (PE13). The third primary level PM is ‘working environment efficacy’, which depends on the secondary level PM of ‘overtime culture’ and on annual training hours (PE11) (Table 6.1). The ‘overtime culture’ should be discouraged by increasing the number of pertinent personnel. According to Dembe et al. (2005), evidence in the United States revealed that overtime and extended work schedules affect the health and wellbeing of workers. Moreover, personnel training is imperative for SMWU, because it is difficult first to find trained staff and then to retain them, as employees tend to migrate to larger cities.

6.2.3 Physical Assets Efficacy

The functional component of ‘physical systems efficacy’ is appraised from two primary level PMs: ‘storage and treatment systems capacity’ and ‘monitoring systems integrity’. ‘Storage capacity’ is estimated from the storage capacities of the raw water and treated water reservoirs and the remaining capacity of the treatment facilities, using the relevant PIs (PH3, PH4, and PH5). The second primary level PM, ‘monitoring systems integrity’, depends on two PIs, metering level (PH1) and degree of automation (PH2) (Table 6.1). The metering level is one of the most important indicators for accurate assessment of NRW, and is an effective tool for consumption based billing. During the past several years, water utilities in developed countries, including SMWU, have been automating their control units, such as pressure releasing valves, booster pumps, chlorinators, pumping stations, and treatment facilities. A higher degree of automation increases the operational reliability of these control units, improves emergency response, and enhances staff productivity. 
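Both monitoring-system PIs follow directly from the data variable pairs listed in Table 6.1; expressing them as percentages, as in the sketch below, is an assumption about presentation rather than the thesis's exact scoring form.

```python
# Monitoring-system PIs from the Table 6.1 data variable pairs:
# PH1 (metering level) uses C2 (meters installed) and C6 (service connections);
# PH2 (degree of automation) uses C10 (automated units) and C9 (total units).
# Percentage form is assumed for illustration.
def metering_level(c2_meters, c6_connections):
    return 100.0 * c2_meters / c6_connections

def degree_of_automation(c10_automated, c9_total):
    return 100.0 * c10_automated / c9_total

print(metering_level(2800, 3500))   # hypothetical utility: 80.0 %
print(degree_of_automation(6, 10))  # hypothetical utility: 60.0 %
```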
6.2.4 Operational Integrity

The functional component of ‘operational integrity’ evaluates the maintenance, performance, and productivity of the water supply system using three primary level PMs. The first PM, ‘distribution system integrity’, depends on the performance of three sub-components: ‘distribution system maintenance’, ‘delivery point maintenance’, and ‘inspection and cleaning routine’ (Table 6.1). The first sub-component assesses the integrity of assets using the indicators of rehabilitation and replacement (R/R) of water mains (OP2 & OP3), replaced valves (OP7), and the implementation level of a risk based R/R plan (OP14). Although utility managers in SMWU have been conducting some level of asset management based on available resources and general observations, they may not have adequately implemented risk based renewal and replacement plans. Moreover, the age of water mains (OP13) also contributes to ‘system integrity’, as younger pipes may not need to be replaced early in their service life. The second sub-component, the secondary level PM of ‘delivery point maintenance’, is associated with the percentages of service connections rehabilitated (OP5) and of operational meters (OP10); leaking service connections are one of the major sources of water loss in a WSS (Hamilton et al. 2006), and operational meters ensure accurate water loss measurement and billing. The second primary level PM, ‘distribution system performance’, depends on a secondary level PM of ‘distribution system rate of failure’ and an indicator of NRW (OP4). The former is assessed from the PIs for the number of main breaks per 100 km (OP1) and the percentage of inoperable or leaking hydrants (OP6) (Table 6.1). 
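NRW (OP4) is computed in Table 6.1 from A1 (system input), D11 (annual billed metered consumption), and C6 (service connections). The sketch below expresses it both as a percentage of input volume and in litres per connection per day; the megalitre units and the input figures are illustrative assumptions.

```python
# Non-revenue water (OP4) from Table 6.1 data variables. Inputs in ML/year are
# an assumed unit convention; the example utility figures are hypothetical.
def nrw_percent(a1_input_ml, d11_billed_ml):
    return 100.0 * (a1_input_ml - d11_billed_ml) / a1_input_ml

def nrw_l_per_connection_per_day(a1_input_ml, d11_billed_ml, c6_connections):
    nrw_litres = (a1_input_ml - d11_billed_ml) * 1e6  # ML -> litres
    return nrw_litres / c6_connections / 365.0

a1, d11, c6 = 1200.0, 1020.0, 3500  # hypothetical small utility
print(round(nrw_percent(a1, d11), 1))                    # % of input volume
print(round(nrw_l_per_connection_per_day(a1, d11, c6)))  # L/connection/day
```

The litres-per-connection-per-day form matches the units of the NWWBI benchmark ranges quoted below.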
Large utilities have been improving their NRW while participating in the NWWBI; at the beginning of the benchmarking process in 2007, NRW ranged between 88 and 663 litres/connection/day, and this range improved to 32 to 383 litres/connection/day for the year 2011 (AECOM 2013). A similar benchmarking approach is required for SMWU. The third primary level PM, ‘distribution network productivity’, is an indirect measure and is estimated from two exogenous PIs, network efficiency (OP11) and customer density (OP12). OP11 is the ratio between the average daily demand and the total length of water mains, while OP12 is the number of customers per km of water main length. Higher values of both PIs result in higher distribution network productivity. Marques and Monteiro (2001) reported a wide variation among water utilities in Portugal, with customer density ranging from 5 to 250 customers per km and network efficiency between 2 and 136 m3/km/day. Based on the data obtained from SMWU in the Okanagan Basin, the following relationship is developed in this study:

Network efficiency = 4.865 (Customer density)^0.593    [6.1]

These PIs (OP11 & OP12) relate to the planning of the utility’s WSSs and cannot be controlled by the utility managers during operations. However, these considerations can be taken into account for future extensions to improve the production process of a water utility.

6.2.5 Provision of Safe Drinking Water

The functional component of ‘public health protection’ is derived from ‘water quality adequacy’ and ‘public health safety’ as primary level PMs. ‘Public health safety’ is assessed based on three PIs: persons affected by boil water advisories (WP1), the average concentration of residual chlorine in the water distribution system (WP4), and the frequency of flushing of water mains (WP9) (Table 6.1). 
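As a quick numerical check of the empirical power-law relationship in Eq. [6.1], the sketch below evaluates it across the span of customer densities reported by Marques and Monteiro (2001); the three density values are illustrative sample points, not fitted data.

```python
# Eq. 6.1, fitted to Okanagan Basin SMWU data:
# network efficiency (m3/km/day) = 4.865 * (customer density, customers/km) ** 0.593
def network_efficiency(customer_density):
    return 4.865 * customer_density ** 0.593

for density in (5, 50, 250):  # span of densities reported for Portugal
    print(density, round(network_efficiency(density), 1))
```

At 250 customers/km the predicted efficiency (~129 m3/km/day) sits near the upper end of the 2 to 136 m3/km/day range reported for the Portuguese utilities.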
The primary reasons for boil water advisories (the highest number amongst all provinces) in SMWU in British Columbia include source water contamination, flushing of hydrants, construction works, repair and maintenance works, equipment failure, and inadequate treatment (Interior Health Canada 2013). Due to lower treatment levels in SMWU and higher source water turbidity (i.e., higher than 1 NTU), higher chlorine doses are required, which result in taste and odor problems. Flushing of water mains (WP9) is a routine practice in SMWU, primarily to remove sediments settled in pipes that could not be removed at the source due to lower treatment levels. The second primary level PM, ‘water quality adequacy’, is assessed based on two sub-components for the distribution and treatment systems, respectively. It is important to assess the performance of the treatment and distribution systems separately in order to identify the underperforming WSS in the utility. There are situations where not all the WSSs in a SMWU have full scale filtration systems, while others have varying (i.e., poor, fair, and good) source water quality. These PMs are estimated from different water quality parameters, such as turbidity, occurrences of total coliforms, trihalomethanes, and nitrates. The British Columbia Ministry of Environment established 5 NTU as the upper limit for drinking water at the consumer’s tap; however, for unfiltered water supplies, a boil water notice should be issued when the turbidity exceeds 1 NTU (British Columbia Ministry of Environment [BCMoE] 1997). Turbidity higher than 1 NTU is not an exception in SMWU; therefore, WP2 is mapped between 0 and 5 NTU to accommodate variations in source water quality. The water quality of treatment systems is assessed from three PIs: average turbidity (WP5), occurrence of total coliforms (WP6), and average concentration of nitrates (WP8) in treated water. 
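The mapping of WP2 over 0 to 5 NTU can be sketched as a transformation function. The thesis establishes its transformation functions from literature, benchmarking reports, and expert opinion; the simple linear shape below (0 NTU scoring 10, 5 NTU or more scoring 1) is an assumption for illustration only.

```python
# Hedged sketch of a WP2 transformation function: distribution turbidity mapped
# over 0-5 NTU to a 1-10 performance score. The linear shape is an assumption;
# the thesis's actual transformation functions are not reproduced here.
def wp2_score(turbidity_ntu, worst_ntu=5.0, best_score=10.0, worst_score=1.0):
    t = min(max(turbidity_ntu, 0.0), worst_ntu)  # clamp to the mapped range
    return best_score - (best_score - worst_score) * t / worst_ntu

print(wp2_score(0.3))  # well-treated supply -> high score
print(wp2_score(4.0))  # near the BC 5 NTU tap-water limit -> low score
```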
As per the Health Canada Guidelines for Drinking Water Quality, surface waters should be treated to achieve turbidity of less than 0.1 NTU at all times; where this is not possible for chemically assisted filtration, 0.3 NTU should be maintained 95% of the time, with less than 1 NTU at all times. For slow sand filtration, the minimum requirement is 1 NTU (for 95% of the time), and turbidity should always be less than 3.0 NTU (Health Canada 2012). The reported occurrences of total coliforms in treated water vary from 0 to 7 in the NWWBI public report for 2012; thus, WP6 is mapped over 0 to 10. A maximum allowable concentration for nitrates of 45 mg/L (i.e., 10 mg/L as nitrate-nitrogen) has been established for drinking water by Health Canada (2013). Higher concentrations can cause blue baby syndrome in infants younger than 6 months. In general, the concentration of nitrates in most undisturbed groundwater in British Columbia is as low as 1 mg/L; consequently, higher concentrations (above 3 mg/L) indicate the presence of human activities. As per a study conducted by the Ministry of Environment during 1977-1993, the concentration of nitrates was higher than 45 mg/L in 1.5% of the groundwater samples collected from British Columbia (BCGWA 2007).

6.2.6 Quality of Service

One of the primary sustainability objectives of any water utility is to provide a reliable, responsive, and affordable service to its customers. Therefore, this functional component, ‘quality of service’, is evaluated based on two primary level PMs: ‘customer service reliability’ and ‘customer satisfaction level’. ‘Customer service reliability’ depends on the ‘customer information level’ and ‘water quality compliance’. Continuity of water supply is extremely important to achieve customer satisfaction. 
Therefore, during maintenance and rehabilitation activities, customers should be informed in advance about any possible discontinuity so that they can plan their activities accordingly. The ‘customer information level’ is estimated using two indicators: unplanned interruptions (QS4) and unplanned maintenance hours (QS5). An unplanned interruption is any event in which the customer is left without water without 48 hours’ notification; this includes situations where the duration of a planned interruption exceeds the initially notified duration (Queensland Urban Utilities [QUU] 2012). Such interruptions can occur due to all kinds of system failures, such as a water main break, an inoperative booster pump station, or a treatment plant failure. Maintenance activities such as flushing of water mains and hydrant repairs should also be notified in advance to ensure service reliability. For the second sub-component, Marques and Monteiro (2001) reported ‘water quality compliance’ criteria for water utilities in Portugal, where violations in more than 3% of the total samples analyzed indicate weak performance. In the present study, the relevant PIs are obtained from the IWA Manual of Best Practice (Alegre et al. 2006) to assess aesthetic, microbiological, and physico-chemical water quality compliance (QS9, QS10, and QS11). The second primary level PM, ‘customer satisfaction level’, depends on the different types of complaints and on how and when they were responded to. Once reported, all complaints should be resolved as early as possible, or at scheduled times as per customer convenience. In this regard, ‘customer satisfaction level’ is assessed in terms of ‘complaints related to system integrity’, the ‘response to the reported complaints’, and billing complaints (QS1) (Table 6.1). The ‘complaints related to system integrity’ include pressure complaints (QS2), water quality complaints (QS3), and complaints related to service connections (QS8). 
Furthermore, to ensure customer satisfaction, the response to complaints is evaluated in terms of: i) the percentage of calls responded to (QS7), and ii) efficient response time (QS6). QS6 can be described as the time within which complaints are resolved to an acceptable customer satisfaction level, depending on the nature and extent of the problem. For example, the response to an emergency leak should be immediate (within 2 to 4 hours), whereas a non-emergency leak should be addressed within 24 hours, and an on/off request for a plumbing repair is expected to be completed at a scheduled time.

6.2.7 Economic and Financial Viability

The functional component of ‘economic and financial viability’ is evaluated with the help of three primary level PMs: ‘economic stability’, ‘revenue collection efficiency’, and ‘operational cost compliance’. The first sub-component, ‘economic stability’, depends on the utility’s ability to cover the repayments and principal on debt while keeping water rates affordable for the customers. Thus, it is estimated from ‘customer water affordability’ (a secondary level PM) and the PI of debt service ratio (DSR) (FE6). Due to the growing deficit in infrastructure needs in Canada, local governments are struggling with their existing asset management practices; as a result, water rates are rising (Stinchcombe 2013). This trend may impact lower income customers, particularly in SMWU. As per Water Canada (2013), an average household spends 0.9 percent of its total expenditures on water, which is lower than the international standards for affordable water supplies. However, the situation for low income households is different, because the share of income spent on water increases as income declines (Bodimeade and Renzetti 2013). Affordability is defined as the average revenues per capita (in the water utility) as a percentage of per capita gross national income (GNI) (Berg and Danilenko 2011). 
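The affordability definition just given can be sketched directly. The per capita income figure is the Statistics Canada (2013) value used in this chapter; the utility's operational revenues and population served are illustrative assumptions.

```python
# Affordability (FE8) per Berg and Danilenko (2011): operational revenues per
# capita as a percentage of per capita income. $36,323 is the Statistics Canada
# (2013) figure cited in the text; the example utility numbers are hypothetical.
PER_CAPITA_INCOME_CAD = 36_323.0

def affordability_percent(annual_operational_revenues, population_served):
    revenues_per_capita = annual_operational_revenues / population_served
    return 100.0 * revenues_per_capita / PER_CAPITA_INCOME_CAD

print(round(affordability_percent(3_600_000, 12_000), 2))  # % of per capita income
```

For this hypothetical utility the result (~0.8%) is of the same order as the 0.9 percent household expenditure share reported by Water Canada (2013).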
According to Statistics Canada (2013), the average per capita income is $36,323 per annum, and the population of British Columbia was 4.61 million in 2014. Knowing the operational revenues per capita for a water utility, affordability can be determined. The second primary level PM, ‘revenue collection efficiency’, depends on two PIs: revenues per unit of water supplied (FE3) and the percentage of NRW (FE7) (Table 6.1). The revenues collected by a water utility are important to ensure reliable water supply through effective asset management and to meet the financial requirements of the utility. NRW, as a percentage of total input volume, needs to be minimized in order to increase operational revenues. According to Kanakoudis and Tsitsifli (2010), NRW may not depict a realistic picture of water losses in the transmission and distribution systems, but it can be considered a useful financial indicator. The third primary level PM, ‘operational cost compliance’, is evaluated from a secondary level PM of ‘operation and maintenance cost sustainability’ and an indicator of operating cost coverage ratio (FE5). The former separately evaluates the distribution and treatment costs. The latter, operating cost coverage, is the ratio between the total annual operational revenues and the total annual operating cost; it reflects financial viability, and a value higher than 1.0 indicates that there are enough operating revenues available for the utility’s operations (UNC 2013).

6.3 Modeling Approach

The proposed modeling approach for the development of the In-UPM is presented in Figure 6.1. The step-by-step procedure of the In-UPM for performance management of SMWU is as follows:

Step 1: Performance criteria and development of conceptual hierarchical structure

A conceptual hierarchy based top-down approach is used to assess the performance of all the functional components using the different performance factors explained in Section 6.2. 
The basic building blocks of the conceptual model are shown in Figure 6.2. A ‘family’ consists of a performance factor ‘parent’ (a component or sub-component) and the contributing factors, its ‘children’ (sub-components and PIs). A performance factor with no children is called a ‘basic performance factor’, which in this case is a PI. One family may consist of one, two, or three children, and each of these children can in turn have children in the next generation. Moreover, a parent may have children in multiple generations, for example, when an important PI feeds directly into higher generations in the hierarchy (Figure 6.2). In Figure 6.2, the notation P_{i,j,k}^{l,m} represents a performance factor, where i is the number of the component (e.g., personnel, operational, etc.), j is the ordinal number of the performance factor in the current generation, k is the ordinal number of the parent (in the previous generation), and l and m are the orders of the performance factor in the current and previous generations, respectively. Data variables are the inputs used to calculate the PIs. In some cases, the data variables are also decision variables, e.g., response time to complaints, frequency of flushing of water mains, etc. (refer to the last column of Table 6.1). Some data variables are fixed parameters and remain constant for a specific assessment period, such as pipe length, population, and service connections. The decision (and data) variables are represented with a triangle ‘∆’, whereas data-only variables are designated by a circle ‘○’ (Figure 6.2). Level-2 PIs included in a functional component are those which are originally Level-1 PIs of another component but can influence the performance of this component as well. 
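The family/parent/child structure described above can be sketched as a small data structure. The class and field names below are illustrative assumptions; only the five-index notation P_{i,j,k}^{l,m} and the "basic performance factor has no children" rule come from the text.

```python
from dataclasses import dataclass, field

# Minimal sketch of the Figure 6.2 building blocks. A performance factor
# P_{i,j,k}^{l,m} is identified by (i, j, k, l, m); a factor with no children
# is a basic performance factor (a PI). Names here are illustrative only.
@dataclass
class PerformanceFactor:
    i: int          # functional component number
    j: int          # ordinal number in the current generation
    k: int          # ordinal number of the parent (previous generation)
    l: int          # current generation
    m: int          # previous generation
    name: str = ""
    children: list = field(default_factory=list)

    def is_basic(self):
        return not self.children  # PIs have no children

# Example family: the first functional component and its two primary PMs
root = PerformanceFactor(1, 1, 0, 1, 0, "Water resources and environmental sustainability")
root.children.append(PerformanceFactor(1, 1, 1, 2, 1, "Source water protection"))
root.children.append(PerformanceFactor(1, 2, 1, 2, 1, "Environmental protection"))
print(root.is_basic(), root.children[0].is_basic())
```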
Figure 6.1 Methodology for performance management of SMWU (flow chart: Step 1, selection of PIs and development of the hierarchical structure; Step 2, establishing performance evaluation criteria; Step 3, knowledge acquisition and translation of calculated PIs into performance scores between 1 and 10 over a Low/Medium/High universe of discourse; Step 4, performance assessment of each functional component and sub-component by fuzzy rule-based inference, where ‘Low’ and ‘Medium’ denote performance below the established benchmarks and standards and ‘High’ denotes performance equal to or above them; Step 5, performance management at the utility and system level, with all the functional components and sub-components evaluated for each WSS in the utility)
Figure 6.2 A conceptual hierarchical structure for performance assessment of SMWU – an example for the functional component of water resources and environmental sustainability (families of performance factors P_{i,j,k}^{l,m}: Generation 1, performance objective; Generation 2, primary performance measures; Generations 3 and 4, secondary performance measures; Generations 5 and 6, performance indicators (Levels I and II) and data/decision variables)

Step 2: Preparing performance evaluation criteria

The second step is establishing the performance assessment criteria (such as Table 6.1) for all the functional components, which are already described in detail in Section 6.2.

Step 3: Knowledge acquisition

The PIs are estimated using the data variables mentioned in Table 6.1. These values need to be compared with established benchmarks and standards (ideally obtained through a well-organized benchmarking process for SMWU) to translate them into performance scores ranging between 1 and 10. In the absence of benchmarking data for SMWU, the transformation functions for all the PIs have been established through knowledge acquired from the literature, NWWBI public reports for large utilities, and experts’ opinion. The reference system of the universe of discourse (UOD) for all the PIs is given in Table 6.2, where each PI is fuzzified into three linguistic constants, Low, Medium, and High, for the corresponding performance score ranges of 0 to 4, 3 to 7, and 6 to 10, respectively. For performance assessment, the desired benchmarks or standards are mapped at performance scores of ‘7’ or higher.
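The Step 3 mapping of a performance score into the Low/Medium/High UOD can be sketched with trapezoidal membership functions. The breakpoints below follow the overlapping score ranges just described (Low 0–4, Medium 3–7, High 6–10); the plateau edges are my own assumption, not thesis values:

```python
# Trapezoidal membership over the 0-10 performance-score scale,
# fuzzifying a crisp score into Low / Medium / High.

def trapezoid(x, a, b, c, d):
    """Membership of x in a trapezoidal fuzzy set with corners (a, b, c, d)."""
    if x <= a or x >= d:
        return 0.0
    if b <= x <= c:
        return 1.0
    return (x - a) / (b - a) if x < b else (d - x) / (d - c)

# illustrative UOD for performance scores; plateaus chosen by assumption,
# overlap pattern (0-4, 3-7, 6-10) taken from the text
UOD = {
    "Low":    (-1, 0, 3, 4),   # full membership up to 3, fades out by 4
    "Medium": (3, 4, 6, 7),
    "High":   (6, 7, 10, 11),  # benchmark mapped at 7 and above
}

def fuzzify(score):
    return {label: trapezoid(score, *p) for label, p in UOD.items()}

print(fuzzify(3.5))  # {'Low': 0.5, 'Medium': 0.5, 'High': 0.0}
```

A score of 3.5 sits in the Low/Medium overlap, which is exactly the partial-membership situation the fuzzy rule base is designed to handle.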
The performance objective of a functional component at the top of the hierarchy (described in Figure 6.2 and Table 6.1) varies between 10 and 100.

Step 4: Performance assessment of each functional component

Step 4 is the performance assessment (Figure 6.1), which follows the calculation of the PIs from the data variables (columns 4 and 5 in Table 6.1). A deductive approach is used to infer the performance of the functional components based on the UOD defined for each PI in Table 6.2 and the hierarchical structure explained in Figure 6.2 and Table 6.1.

Step 4.1: Inferencing using the Fuzzy Rule-Based Modeling (FRBM) Technique

As described earlier, due to the limited performance benchmarking data, the UODs for the PIs are established using data from NWWBI public reports, the published literature, and expert knowledge. The resulting uncertainties are addressed in the development of In-UPM with the help of fuzzy set theory. Fuzzy-based techniques are commonly used where the inputs are subjective human expertise, judgment, and intuition (Sadiq et al. 2009). For performance inference at the different component and sub-component levels, the fuzzy rule-based modeling (FRBM) technique is used in this research. Fuzzy set theory was first developed by Zadeh (1978) to methodically incorporate human reasoning in decision making. A linguistically defined model can deal with qualitative and imprecise/uncertain knowledge in the form of if-then rules (Mamdani 1977). In FRBM, the fuzzy rules are established as ‘if-then rules’, such as ‘If antecedent proposition, then consequent proposition’. Typically, the expression is articulated as an inference such that if the fact (premise, hypothesis, antecedent) is known, then the conclusion (consequent) can be inferred (Ross 2004). The antecedent proposition is a fuzzy proposition of the type ‘X is A’, where X is a linguistic variable and A is a linguistic constant, e.g., low, medium, or high.
In contrast to classical set theory, where the elements of a set may have only ‘0’ or ‘1’ membership, fuzzy sets allow one to define membership values as real numbers in the interval [0, 1]. For details, the reader is referred to Ross (2004). The linguistically defined model can deal with qualitative and imprecise/uncertain knowledge in the form of if-then rules such as (Mamdani 1977):

R_i: If X is A_i then Y is B_j,  i = 1, 2, …, L; j = 1, 2, …, N  [6.2]

where R_i is rule number i, X is the input (antecedent) fuzzy variable, A_i is a fuzzy subset corresponding to an antecedent linguistic constant (one of L in set A), Y is the output (consequent) fuzzy variable, and B_j is a fuzzy subset corresponding to a consequent linguistic constant (one of N in set B). Due to the above-mentioned uncertainties in the present work, X might not be exactly equal to, say, ‘medium’; instead it may have a membership of, for example, μ_A2(x) = 0.5 to low and μ_A3(x) = 0.5 to medium. The full relationship between X and Y, according to rule i, can be computed either by using fuzzy implications or fuzzy conjunctions. In the proposed framework, the Mamdani method is applied, where the conjunction A ∧ B is computed by a minimum (‘and’ type, t-norm, or conjunctive) operator, which elucidates that “it is true that A and B simultaneously hold”. Each rule defined in equation [6.2] corresponds to a fuzzy relation R_i: (X × Y) → [0, 1]:

R_i = A_i × B_j,  i.e.,  μ_Ri(x, y) = μ_Ai(x) ∧ μ_Bj(y)  [6.3]

The minimum (AND) operator of equation [6.3] can be applied to the Cartesian product space for all the possible pairs of X and Y.
The union of all the fuzzy relations R_i encompasses the entire model; it is given by the disjunction A ∨ B (union, maximum, ‘or’ type, s-norm) operator over the L individual rules:

R = ∪_{i=1,…,L} R_i,  i.e.,  μ_R(x, y) = max_{i=1,…,L} [μ_Ai(x) ∧ μ_Bj(y)]  [6.4]

The relationship R_i is symmetrical and can be inverted; therefore, the entire rule set can be arranged as the fuzzy rule set R. Thus, equation [6.4] can also be stated as:

y = x ∘ R  [6.5]

where the output can be computed by applying the operator ‘∘’ (max-min composition) to an antecedent proposition. If A′ is an input fuzzy number, which is mapped on fuzzy subset A, and B′ is an output fuzzy number, which is mapped on fuzzy subset B, then:

μ_B′(y) = max_{x∈X} [μ_A′(x) ∧ μ_R(x, y)]  [6.6]

By substituting μ_R(x, y) from equation [6.4], equation [6.6] can be written as:

μ_B′(y) = max_{i=1,…,L} [ max_{x∈X} (μ_A′(x) ∧ μ_Ai(x)) ∧ μ_Bj(y) ]  [6.7]

If β_i = max_{x∈X} [μ_A′(x) ∧ μ_Ai(x)] is defined as the degree of fulfillment (of the antecedent) of the i-th rule, the output fuzzy set of the linguistic model presented in equation [6.2] can be stated as:

μ_B′(y) = max_{i=1,…,L} [β_i ∧ μ_Bj(y)]  [6.8]

Equations [6.3] to [6.8] represent the algorithm known as Mamdani inference for a single input, single output (SISO) model, which can be further extended to a multiple input, single output (MISO) model. For instance, a MISO model with two inputs can be written as:

R_{i,j}: If X1 is A_i and X2 is C_j then Y is B_k,  i = 1, 2, …, L; j = 1, 2, …, M; k = 1, 2, …, N  [6.9]

In effect, the above model is an extended case of the SISO model, where the antecedent proposition is obtained as the Cartesian product of the fuzzy sets A and C; hence the degree of fulfillment can be stated as:

β_{i,j} = max_{x1∈X1} [μ_A′(x1) ∧ μ_Ai(x1)] ∧ max_{x2∈X2} [μ_C′(x2) ∧ μ_Cj(x2)]  [6.10]

The MISO model can be easily extended to Q antecedents. The most commonly used fuzzy rule-based systems are the Mamdani (1977) and Takagi-Sugeno-Kang (TSK) (1985) models.
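The Mamdani max-min scheme of equations [6.2]–[6.10] can be illustrated numerically. The rule base and membership values below are invented for illustration only: minimum is the conjunction operator and maximum aggregates the rules, as in the equations above:

```python
# Two-input (MISO) Mamdani max-min inference on crisp inputs.
# Memberships of the crisp inputs in the antecedent sets; for crisp
# inputs the degree of fulfillment beta of eq. [6.10] reduces to the
# minimum of these memberships.
x1_memb = {"Low": 0.0, "Medium": 0.5, "High": 0.5}   # input X1
x2_memb = {"Low": 0.2, "Medium": 0.8, "High": 0.0}   # input X2

# rule base of the form 'If X1 is A and X2 is C then Y is B' (eq. [6.9]);
# the rules themselves are illustrative, not from Appendix C
rules = [
    ("High",   "High",   "Low"),
    ("Medium", "Medium", "Medium"),
    ("Medium", "Low",    "Medium"),
    ("High",   "Medium", "Low"),
]

# beta = min(mu_A(x1), mu_C(x2)) per rule [6.10];
# mu_B'(y) = max over rules of (beta ∧ mu_B(y)) [6.8], with each
# consequent label treated as a unit-height set (so beta clips it)
output = {"Low": 0.0, "Medium": 0.0, "High": 0.0}
for a, c, b in rules:
    beta = min(x1_memb[a], x2_memb[c])
    output[b] = max(output[b], beta)

print(output)  # {'Low': 0.5, 'Medium': 0.5, 'High': 0.0}
```

Rules 2 and 4 each fire with degree 0.5, so the aggregated output set has equal support for ‘Low’ and ‘Medium’; defuzzification (next) turns this into a crisp score.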
The Mamdani model describes the consequent part using linguistic variables, while in the TSK model the consequents are defined as linear combinations of the input variables (Sadiq et al. 2009). A typical fuzzy rule-based system consists of four components: a fuzzifier, a rule base, an inference engine, and a defuzzifier. The fuzzifier governs the degree of membership of a crisp input in a fuzzy set through functions known as membership functions. The rule base describes the fuzzy relationships between the input and the output variables; subsequently, the output is determined based on the degrees of membership specified by the fuzzifier. The inference engine evaluates the consequents of the rules using the membership functions. Finally, the defuzzifier converts fuzzy outputs into crisp values. Although the inputs are crisp, the approximation of the outcome depends upon the accuracy of the rule set, the inference technique, and the selection of the membership functions. The defuzzifier generates a discrete (crisp) output of B, i.e., B′, described in equations [6.2] and [6.9]. This crisp output approximates the deterministic characteristics of the fuzzy reasoning process based on the output fuzzy set μ_Bk(y), and it helps to solve real-world problems by converting the uncertainty into an applicable action. In the In-UPM model, the center of area (COA) defuzzification method is used:

Defuz B′ = y′ = ∫ y · μ_Bk(y) dy / ∫ μ_Bk(y) dy  [6.11]

For simplicity, fuzzy trapezoidal membership functions are used to define the UOD under uncertainties. The elements (a, b, c, and d) defining the UOD of the membership functions shown in Figure 6.3, for all the PIs, are given in Table 6.2. As all the PIs and PMs are independent and not mutually exclusive, the fuzzy operator ‘AND’ is used in this research.
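The centre-of-area defuzzification of equation [6.11] can be approximated by sampling the score axis. The trapezoid parameters and clipping heights below are assumed values for illustration, not thesis data:

```python
# Discrete approximation of centre-of-area (COA) defuzzification:
# y' = sum(y * mu(y)) / sum(mu(y)) over a sampled 0-10 score axis.

def trapezoid(x, a, b, c, d):
    """Membership of x in a trapezoidal fuzzy set with corners (a, b, c, d)."""
    if x <= a or x >= d:
        return 0.0
    if b <= x <= c:
        return 1.0
    return (x - a) / (b - a) if x < b else (d - x) / (d - c)

def defuzzify_coa(clipped, step=0.01):
    """clipped: list of (height, corners) pairs for Mamdani-clipped sets."""
    num = den = 0.0
    for i in range(int(10.0 / step) + 1):
        y = i * step
        # aggregated output set: max over the clipped consequent sets
        mu = max(min(h, trapezoid(y, *p)) for h, p in clipped)
        num += y * mu
        den += mu
    return num / den

# e.g., an inference outcome: 'Medium' clipped at 0.5, 'High' at 0.2
sets = [(0.5, (3, 4, 6, 7)), (0.2, (6, 7, 10, 11))]
print(round(defuzzify_coa(sets), 2))
```

The crisp score lands between the centroids of the two clipped sets, weighted toward the more strongly fired ‘Medium’ set.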
An example of the desired rules for the functional component of environmental protection (P_{1,2,1}^{2,1}), a primary level PM of the first component (Table 6.1), is set as: ‘If the discharge of treatment plant residuals (P_{1,5,2}^{5,2}) is high and the impact of flushing water (P_{1,2,2}^{3,2}) is high, then environmental protection (P_{1,2,1}^{2,1}) is low’. A total of 720 rules have been established (Appendix C) for all the functional components and sub-components to develop In-UPM. Simulink in MATLAB is used for inferring the performance of the functional components using FRBM.

Figure 6.3 Standard trapezoidal membership functions used in this study; e.g., b1, b2, b3, and b4 are used to define the ranges of fuzzy numbers for ‘Medium’ (overlapping Low, Medium, and High trapezoids over the performance-score axis)

Table 6.2 Universe of discourse (UOD) of performance indicators

PI No. | Performance Indicator (PI) | Units | Low | Medium | High

WE – WATER RESOURCES AND ENVIRONMENTAL
WE1 | Water restrictions | Days | 0 to 100 | 80 to 220 | 180 to 360
WE2 | Per capita water consumption – residential | Liter/day | <200 to 450 | 350 to 750 | 650 to >1000
WE3 | Existing water license capacity (exhausted) | % | 0 to 35 | 25 to 70 | 60 to 100
WE4 | Discharge of water treatment plant residuals | % | 0 to 8 | 5 to 18 | 15 to >25
WE5 | Effect of flushing water on aquatic life – available dilution in receiving water body | Ratio | 0 to 6 | 4 to 10 | 8 to >20
WE6 | Implementation level of WCP | Linguistic | 0 to 0.4 | 0.3 to 0.75 | 0.6 to 1
WE7 | Distance between flushing point and natural drain – length of storm-water drain | meter | 0 to 600 | 400 to 1000 | 800 to >1500

PE – PERSONNEL
PE1 | Number of field FTEs – Distribution | No. of FTEs/100 km | 0 to 3 | 2 to 5 | 4 to >10
PE2 | Number of field FTEs – Metering | No. of FTEs/1000 meters | 0 to 0.03 | 0.02 to 0.06 | 0.05 to >0.1
PE3 | No. of lost hours due to field accidents – Distribution | No. of lost hours/1000 field labour hours | 0 to 9 | 6 to 18 | 15 to >30
PE4 | No. of sick days taken per field employee – Distribution | Days | 0 to 8 | 6 to 14 | 12 to >20
PE5 | Number of field FTEs – Treatment | No. of FTEs/1000 ML | 0 to 0.3 | 0.2 to 0.6 | 0.5 to >1
PE6 | No. of lost hours due to field accidents – Treatment | No. of lost hours/1000 field labour hours | 0 to 9 | 6 to 18 | 15 to >30
PE7 | No. of sick days taken per field employee – Treatment | Days | 0 to 7 | 5 to 13 | 11 to >18
PE8 | Water resources and catchment management employees | No./10^6 m3/year | 0 to 0.035 | 0.025 to 0.07 | 0.06 to >0.1
PE9 | Total overtime field hours – Distribution | % | 0 to 8 | 6 to 15 | 12 to >30
PE10 | Total overtime field hours – Treatment | % | 0 to 8 | 6 to 15 | 12 to >30
PE11 | Personnel training | Hours/employee/year | 0 to 40 | 30 to 70 | 60 to >100
PE12 | Staff productivity | MG/FTE | <60 to 100 | 90 to 140 | 120 to >200

PH – PHYSICAL ASSETS
PH1 | Metering level | % | 0 to 30 | 20 to 60 | 50 to 100
PH2 | Degree of automation | % | 0 to 40 | 30 to 70 | 60 to 100
PH3 | Raw water storage capacity | Days | 0 to 120 | 80 to 220 | 180 to >300
PH4 | Treatment plant capacity (exceeding 90%) | Days | 0 to 20 | 15 to 50 | 40 to >100
PH5 | Treated water storage capacity | Hours | 0 to 12 | 8 to 30 | 20 to >40

OP – OPERATIONAL
OP1 | Main breaks | No./100 km | 0 to 4 | 2 to 10 | 8 to >20
OP2 | Mains replaced | % | 0 to 0.25 | 0.15 to 0.45 | 0.3 to >1
OP3 | Mains rehabilitated | % | 0 to 0.25 | 0.15 to 0.45 | 0.3 to >1
OP4 | Non-revenue water (NRW) | Liter/connection/day | 0 to 200 | 100 to 500 | 400 to >1000
OP5 | Service connection rehabilitations | % | 0 to 0.2 | 0.1 to 0.4 | 0.3 to >1.2
OP6 | Inoperable or leaking hydrants | % | 0 to 0.4 | 0.2 to 1.0 | 0.8 to >2
OP7 | Valves replaced | % | 0 to 0.1 | 0.075 to 0.2 | 0.15 to >0.5
OP8 | Hydrants inspection | % | 0 to 50 | 30 to 100 | 80 to >200
OP9 | Cleaning of storage tanks | % | 0 to 30 | 20 to 70 | 60 to 100
OP10 | Operational meters | % | 0 to 40 | 30 to 70 | 60 to 100
OP11 | Network efficiency | m3/km/day | <40 to 70 | 60 to 110 | 100 to >150
OP12 | Customers density | persons/km | <50 to 80 | 70 to 100 | 90 to >120
OP13 | Average pipe age | years | 0 to 30 | 20 to 60 | 45 to >80
OP14 | Implementation level of water mains rehabilitation/replacement plan | Linguistic | 0 to 0.4 | 0.25 to 0.75 | 0.6 to 1

WP – WATER QUALITY AND PUBLIC HEALTH
WP1 | Boil water advisories | Days | 0 to 4 | 3 to 7 | 6 to >10
WP2 | Average turbidity in distribution system | NTU | 0 to 1.5 | 1 to 3 | 2.5 to >5
WP3 | Total coliform occurrences in distribution system | No. | 0 to 15 | 10 to 25 | 20 to >40
WP4 | Average residual chlorine in distribution system | mg/L | 0 to 1 | 0.75 to 2.5 | 2 to >4
WP5 | Average turbidity of treated water | NTU | 0 to 0.5 | 0.3 to 1 | 0.8 to >1.5
WP6 | Total coliform occurrences in treated water | No. | 0 to 4 | 3 to 7 | 6 to >10
WP7 | Average trihalomethanes (THMs) | mg/L | 0 to 0.1 | 0.075 to 0.15 | 0.125 to >0.3
WP8 | Average nitrates in treated water | mg/L as N | 0 to 5 | 3 to 10 | 8 to >10
WP9 | Length of pipes cleaned (flushed) | % | 0 to 40 | 30 to 70 | 60 to >100

QS – QUALITY OF SERVICE
QS1 | Billing complaints | No./1000 customers | 0 to 2 | 1.5 to 3.5 | 3 to >5
QS2 | Pressure complaints | No./1000 persons | 0 to 1 | 0.7 to 2 | 1.5 to >3
QS3 | Water quality complaints | No./1000 persons | 0 to 2 | 1.5 to 3.5 | 3 to >5
QS4 | Unplanned interruptions | No./100 km | 0 to 10 | 7 to 20 | 18 to >40
QS5 | Unplanned maintenance hours | % | 0 to 30 | 20 to 50 | 40 to >60
QS6 | Average time of response to complaints | Hours | <4 to 8 | 8 to 24 | 24 to >48
QS7 | Total response to reported complaints | % | <70 to 85 | 80 to 95 | 90 to 100
QS8 | Service connection complaints | No./1000 persons | 0 to 4 | 3 to 7 | 6 to >10
QS9 | Aesthetic water quality tests compliance | % | <75 to 85 | 80 to 95 | 90 to 100
QS10 | Microbiological water quality tests compliance | % | <75 to 85 | 80 to 95 | 90 to 100
QS11 | Physico-chemical water quality tests compliance | % | <75 to 85 | 80 to 95 | 90 to 100

FE – FINANCIAL AND ECONOMIC
FE1 | Water rates for a typical residential customer | $ | <200 to 450 | 350 to 600 | 550 to >700
FE2 | Operation and maintenance cost of distribution | 000’$/km | <5 to 18 | 15 to 25 | 22.5 to >35
FE3 | Revenue per unit of water sold | $/m3 | <0.3 to 0.45 | 0.4 to 0.6 | 0.55 to >0.75
FE4 | Operation and maintenance cost of treatment | $/ML | <70 to 150 | 100 to 250 | 200 to >400
FE5 | Operating cost coverage ratio | Ratio | <0.5 to 0.8 | 0.75 to 1.05 | 1 to >1.2
FE6 | Debt service ratio | Ratio | <1 to 1.3 | 1.2 to 1.6 | 1.5 to >2
FE7 | Non-revenue water | % | 0 to 12 | 10 to 20 | 18 to >30
FE8 | Affordability (% of income spent on water bills) | % | 0 to 0.8 | 0.5 to 1.3 | 1 to >1.4

Step 4.2: Sensitivity Analysis

Sensitivity generally refers to the variability in a model’s output with changes in the model’s input(s). Sensitivity analysis ranks the inputs based on their relative contributions to the variability and uncertainty of the output (United States Environmental Protection Agency [USEPA] 2001). Several techniques for sensitivity analysis have been developed, including differential analysis, response surface methodology and factorial design, Monte Carlo analysis, and statistical methods and variance decomposition procedures (Helton et al. 2006). Sensitivity analyses are performed to deal with the inherent uncertainties in the various influencing factors. Random variables are generated assuming a normal distribution over the ranges of the PIs (UOD) established in Table 6.2. The data generated from the Monte Carlo simulations are used to determine Spearman correlation coefficients comparing all the inputs (PIs), using the statistical software R, for each functional component. Spearman’s rank correlation coefficient is useful for non-linear models (USEPA 2001), and is thus used in this study. The sensitivity analysis results are used to evaluate the percentage contribution of each indicator to the performance of its functional component. The PIs with higher contributions need to be given importance in decision making for performance management.
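The sensitivity procedure just described (Monte Carlo sampling of the PIs followed by Spearman rank correlation with the output) can be sketched as follows. The thesis uses Simulink and R; this stdlib-only Python stand-in replaces the fuzzy model with a toy weighted aggregation, so the PI names and weights are purely illustrative:

```python
# Monte Carlo + Spearman rank-correlation sensitivity sketch.
import random

def pearson(xs, ys):
    """Pearson correlation coefficient, computed from scratch."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var_x = sum((x - mx) ** 2 for x in xs)
    var_y = sum((y - my) ** 2 for y in ys)
    return cov / (var_x * var_y) ** 0.5

def spearman(xs, ys):
    """Spearman = Pearson on ranks (float samples, so no ties expected)."""
    def ranks(v):
        order = sorted(range(len(v)), key=lambda i: v[i])
        r = [0.0] * len(v)
        for rank, idx in enumerate(order):
            r[idx] = float(rank)
        return r
    return pearson(ranks(xs), ranks(ys))

random.seed(1)
n = 2000
# three hypothetical PI score samples, normally distributed over the UOD
pi_a = [random.gauss(5, 1.5) for _ in range(n)]
pi_b = [random.gauss(5, 1.5) for _ in range(n)]
pi_c = [random.gauss(5, 1.5) for _ in range(n)]
# toy stand-in for the fuzzy model: output weights the PIs unevenly
out = [0.6 * a + 0.3 * b + 0.1 * c for a, b, c in zip(pi_a, pi_b, pi_c)]

rho = {"PI-A": spearman(pi_a, out),
       "PI-B": spearman(pi_b, out),
       "PI-C": spearman(pi_c, out)}
# normalize |rho| into percentage contributions, as in Figure 6.7
total = sum(abs(v) for v in rho.values())
contrib = {k: round(100 * abs(v) / total, 1) for k, v in rho.items()}
print(contrib)  # PI-A should dominate the percentage contribution
```

Ranking the PIs by these percentage contributions is what lets the decision maker prioritize improvement actions, as in Figure 6.7.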
Step 4.3: Performance assessment results

It can be observed from Figure 6.1 that an FRBM output of ‘Low’ performance for a component or sub-component indicates that one or more PIs miss the desirable benchmarks. A ‘Medium’ output shows that the performance is lower than, but certainly not very far from, the established benchmarks. A ‘High’ result shows satisfactory performance of the component or sub-component. It is important to mention here that in In-UPM the performance objective will not show ‘High’ performance when even a single PI or PM is underperforming.

Step 5: Performance management at utility and system level

Based on the performance assessment results of Step 4, the utility managers (decision makers) should take rational improvement actions (Figure 6.1). When a functional component is performing ‘High’, the utility needs to maintain its existing performance. In the case of ‘Medium’ performance, the utility managers should cautiously move from the top to the bottom of the hierarchy for the respective functional component to identify the underperforming PMs, and finally PIs, for improvement actions; if required, the investigations can be extended to the system level for costly and major decisions. In the case of ‘Low’ performance, detailed investigations are required for each WSS operating within the utility to identify the root causes of the low performance. Such detailed investigations will help the utility managers take rational, timely, and cost-effective decisions for the underperforming WSS. There could be more than one possible decision (variable) for one performance level; conversely, one decision variable may improve the performance of more than one functional component. In this way, the proposed In-UPM, based on a top-down approach, will enable the decision makers to start with the primary PMs and then isolate the underperforming variables at the lower hierarchical levels for rational improvement actions.
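The Step 5 decision logic can be paraphrased as a simple dispatch table. This is only a restatement of the decision rules in Figure 6.1, not code from the thesis:

```python
# Step 5 decision rules, restated: map an inferred performance level
# for a functional component to the recommended management action.

def improvement_action(component_performance: str) -> str:
    actions = {
        "High": "Maintain performance at utility level",
        "Medium": ("Improve performance of relevant PIs with careful "
                   "observations at utility or system level"),
        "Low": ("Detailed investigations required at system level for "
                "each WSS to identify root causes"),
    }
    return actions[component_performance]

print(improvement_action("Low"))
```

In practice the ‘Low’ branch triggers the per-WSS drill-down demonstrated in the Okanagan case study that follows.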
In-UPM is applicable to study both the temporal and spatial characteristics of SMWU. Furthermore, the model can be used for any assessment period (i.e., quarterly, six-monthly, annual); conventionally, however, a one-year assessment period is preferable.

6.4 A Case Study of Okanagan Basin

6.4.1 Okanagan Basin

The Okanagan basin is a narrow strip that spans from Armstrong (BC, Canada) to the United States border. The basin is almost 200 km long and spreads over an area of 8,000 km2; it contains four cities: Vernon, Kelowna, Penticton, and Osoyoos. Each of these cities supplies water through more than one water utility; most of these utilities are small to medium, with serving populations of less than 50,000 (OBWB 2014). Similar to other SMWU in Canada, most of the utilities in the Okanagan have not been participating in NWWBI so far, and are thus facing various challenges in achieving a desirable level of service. Water utilities in the Okanagan have the lowest per-person water supply in Canada. Conversely, the average daily per capita water consumption is 675 liters, which is more than double the Canadian average (OBWB 2014). Most of these utilities are facing water quality issues due to their reliance on source water quality and low treatment levels (i.e., primarily disinfection); this is evident from the number of boil water advisories in the region, the highest in the province of British Columbia, which itself has the highest number across Canada (Interior Health Canada 2013). To the best of our knowledge, most of these utilities are presently not evaluating their performance using a well-structured methodology. Therefore, to evaluate the practical application of the proposed modeling framework shown in Figure 6.1, In-UPM is applied for performance management of a medium sized water utility in the Okanagan Basin.

6.4.2 Analysis and Results

The selected water utility is presently serving 16,000 residents.
It is one of the fastest growing utilities in the Okanagan basin and is responsible for providing water supply to 6,400 domestic and agricultural customers. The land use in the utility’s service area is diverse, including residential, agricultural, commercial, public, and industrial uses. The topography of the Okanagan area is rolling and hilly, with medium to steep grades. There are three WSSs in the utility’s service area (Figure 6.4). All these systems depend primarily on source water quality and supply chlorinated water to the community without conventional surface water treatment. The water distribution systems consist of pipes of different materials, such as cementitious, steel, and plastic. The pipe sizes range from 50 mm to 900 mm, with an average age of less than 25 years; however, pipes as old as 80 years are still in service. The utility has not experienced frequent pipe breaks in the past. The performance assessment results of In-UPM for all the functional components, for the assessment year 2012, are shown in Figure 6.5. The utility needs only to maintain its performance when it lies in the light colour region, which means that all the PIs are either meeting or exceeding the desirable standards and benchmarks, i.e., performance is ‘High’. It can be seen in Figure 6.5 that the utility needs to improve its performance in all the functional components. However, the components of ‘provision of safe drinking water’ (P_{5,1,0}^{1,0}) and ‘quality of service’ (P_{6,1,0}^{1,0}) need to be investigated in detail for all the WSSs operating under the utility’s jurisdiction. It is important to mention here that the spider diagram shown in Figure 6.5 is useful for top level management to evaluate the utility’s performance; the technical management needs to make improvement decisions after detailed investigations. On the other hand, performance in the blue and red regions does not necessarily mean that all the sub-components are performing ‘Low’.
It can be seen in Figure 6.6 that the results of the primary and secondary level PMs for ‘quality of service’ (P_{6,1,0}^{1,0}) reveal that the lacking sub-component (at the secondary level) is ‘water quality compliance’ (P_{6,2,1}^{3,2}), which sent a signal to the higher primary level showing low ‘customer service reliability’ (P_{6,1,1}^{2,1}), which was finally transmitted to the top (objective) level (refer to Table 6.1). This component needs to be investigated in detail for all the WSSs in the utility. Moreover, from Figure 6.6, the utility management should be encouraged by the better performing sub-components; for example: i) the number of complaints related to system integrity was not very high, and the utility efficiently responded to all the reported complaints; ii) customers were informed about maintenance activities in a timely manner; and iii) customers are overall satisfied with the utility’s response time to their complaints. Based on these results, the utility management will be motivated to improve the underperforming sub-components.
Figure 6.4 Reported pressure, water quality, and service connection complaints in FY 2012 for the different WSSs (WSS-I, WSS-II, and WSS-III) in the utility under study, showing the storage reservoir receiving creek water until FY 2013 and the new intake and transmission line receiving lake water since FY 2014

Figure 6.5 In-UPM results for the utility for assessment year 2012 (spider diagram of component performance levels, 0 to 100, for the seven functional components, with High, Medium, and Low regions)

Figure 6.6 Results of primary and secondary level PMs for the ‘quality of service’ component (performance scores, 0 to 10, for the primary measures of customer service reliability and customer satisfaction level and their secondary measures, with Revise, Improve, and Maintain regions)

6.4.3 Sensitivity Analysis

Monte Carlo simulations (5,000 runs) are performed in Simulink, and the results are analysed using the methodology described in Step 4.2 of Section 6.3. The sensitivity analysis results for all the functional components are shown in Figure 6.7a-g. In order to evaluate the relative importance of the PIs for ‘quality of service’ (P_{6,1,0}^{1,0}), the results presented in Figure 6.7f, in terms of the percentage contribution of the PIs to the performance objective, can be used to rank the PIs to facilitate decision makers. For example, the PIs of ‘water quality compliance’ (P_{6,2,1}^{3,2}) (P_{6,3,2}^{4,3}, P_{6,4,2}^{4,3}, and P_{6,5,2}^{4,3}) are ranked as ‘1’; followed by P_{6,2,1}^{4,3} (QS5) at rank ‘2’; P_{6,8,2}^{4,2} (QS1), P_{6,9,4}^{4,3} (QS2), P_{6,10,4}^{4,3} (QS3), and P_{6,6,3}^{4,3} (QS7) at rank ‘3’; and finally P_{6,1,1}^{4,3} (QS4), P_{6,11,4}^{4,3} (QS8), and P_{6,7,3}^{4,3} (QS6) at rank ‘4’ (refer to Table 6.1 for details of the PIs). The utility managers can take guidance from these ranks to prioritize the improvement actions.
Therefore, the PIs related to ‘water quality compliance’ should be given high priority to improve the performance of this component. Similarly, Figure 6.7 can be used to rank the PIs for the other functional components.

Figure 6.7 Sensitivity analysis results (percentage contributions of the PIs) for all the functional components: (a) water resources and environmental sustainability; (b) personnel productivity; (c) physical assets efficacy; (d) operational integrity; (e) provision of safe drinking water; (f) quality of service; (g) economic and financial viability

6.4.4 Performance Management using In-UPM

Based on the performance assessment results, the ‘quality of service’ (P_{6,1,0}^{1,0}) component is investigated in detail to identify the underperforming system and its respective sub-components. The utility under study consists of three WSSs with populations of 540, 1,020, and 15,540, serving 538, 407, and 6,217 customers, respectively. In Figure 6.4, the numbers of pressure, water quality, and service connection complaints (P_{6,9,4}^{4,2}, P_{6,10,4}^{4,3}, and P_{6,11,4}^{4,3} in Table 6.1) received during 2012 are shown for each of the WSSs with the help of a GIS map. It can be seen in Figure 6.4 that water quality complaints were reported only by the customers of WSS-II and WSS-III.
The results of the secondary level PMs for all three WSSs, for the assessment period 2012 prior to the implementation of improvement actions, are shown in Figure 6.8a. The results show that the water quality in WSS-I is better than in the other two systems, because it obtains relatively better quality water directly from Okanagan Lake, and it thus receives negligible water quality complaints. Hence, the utility managers needed to improve the water quality for the other two WSSs. In this regard, a real-life scenario is described here. From past water quality monitoring data, it could be inferred that the source water quality was inadequate in one or more of the WSSs, and the customers were consistently complaining about the water quality. Therefore, in 2013, the utility planned and executed a source water improvement project (in terms of both quality and quantity) by transporting the higher quality lake water to one of its main surface water reservoirs (previously receiving inflow from a surface water creek). As a result, the improved source water is now being supplied to WSS-II and WSS-III. The performance assessment results after implementing the source water improvement project are shown in Figure 6.8b, where the water quality compliance has improved. Finally, the analysis is performed at the utility level after implementing the improvement actions (at the system level), and the results are shown in Figure 6.9. These results depict the improved performance for the year 2014, or, in other words, the performance after improving decision variables D18, D30, D32, and D34 (Table 6.1). It can be seen that this decision improved the overall water quality, and thus also improved the performance of the ‘public health protection’ component from ‘Low’ to ‘Medium’.
It is important to mention here that the improvement actions were implemented only on the decision variables related to water quality; all the other variables were already performing at a ‘Medium’ or ‘High’ level. However, the performance assessment results recommend further improvements for all the functional components at the utility level. In-UPM can be applied to manage the underperforming sub-components at the utility and/or system levels, depending on the performance assessment results for a given assessment period.

Figure 6.8 Secondary level performance measures for service reliability and customer satisfaction in the three systems within the water utility: (a) before source water improvement; (b) after source water improvement

6.5 In-UPM for Complex Decision Making

Facing limited human and financial resources and the absence of a performance benchmarking process, SMWU are unable to assess their sustainability performance. For effective future planning and decision making, these limited resources should be optimally spent on the underperforming components and sub-components. Generally, PIs have been used for performance reporting and cross-comparison only, and not in planning and decision making. The results of the In-UPM application in the above sections provide strong evidence that the model offers a structured approach to utility managers for effective decision making. Although In-UPM primarily benefits tactical level decision making, the empirical demonstration of its results can also assist in strategic level decision making. The contribution of In-UPM to complex strategic and tactical level decision making is briefly described below with the help of examples.
Figure 6.9 In-UPM results showing overall performance of the utility for year 2014 after the implementation of improvement actions

Firstly, the use of PIs for cross-comparison with similar utilities (i.e., inter-utility benchmarking) only depicts the relative performance of variables; it neither addresses the interactions between different PIs nor the factual impacts of these estimated values on the performance of the utility's processes. For example, in SMWU, personnel are generally responsible for more than one specific task, and the individual personnel PIs are calculated based on man-hours for different activities, such as catchment management, distribution, treatment, and metering. In addition, finding and retaining skilled staff is more difficult in SMWU than in large utilities. Therefore, it is desirable to optimize the staff strength in order to perform routine operation and maintenance activities and to respond efficiently to customer complaints. Decisions about increasing or decreasing the number of staff based on conventional benchmarking might be misleading. In this regard, In-UPM evaluates staff adequacy with relevant PIs, and uses the PPR to optimize the staff strength. PPR is estimated in terms of water used per field FTE employee and the degree of automation of control units, higher values of which reduce the personnel requirements to a certain degree. In this way, In-UPM helps managers to sustainably manage their human resources. Secondly, when none of the components is performing satisfactorily (i.e., at 'High' performance), decision actions for further improvement from 'Medium' to 'High' become more critical.
This research provides an approach for effective decision making by evaluating the performance of different water supply systems individually. For example, Figure 6.8b shows that more complaints related to system integrity (pressure, service connections) were reported in WSS-III. In this regard, the utility managers can further investigate the root causes of such complaints and plan a routine inspection program for service connections in the relevant area (i.e., WSS-III) only, instead of the utility as a whole, to optimally utilize their limited human and financial resources. Thirdly, without comparing the performance before and after the implementation of decision actions, the utility cannot assess the impacts of performance improvement actions, and is thus unable to rationally justify the benefits of its spending. Usually, the utility managers in SMWU take decisions based on their experience and observations, without conducting detailed performance assessment; this might be rational, at times, for extremely underperforming components (i.e., 'Low' performance), for example, the source water improvement in the case study mentioned above. However, managers certainly need a quantitative rationale for their improvement actions before actual implementation. In this regard, the results of In-UPM can help the utility managers at the strategic level in obtaining financial approvals from governmental agencies, and can satisfy their customers and the general public as well. Furthermore, such quantitatively demonstrated results, as shown in Figure 6.6, not only identify the underperforming processes, but also motivate the managers toward consistent efforts by showing the processes with high performance.
Similarly, the In-UPM can be used for performance management of all the functional components in SMWU to achieve overall sustainability objectives.

6.6 Summary

In this chapter, an intra-utility performance management model (In-UPM) is conceptualized and developed for effective decision making. The deductive approach of In-UPM provides an opportunity for the higher management of SMWU to appraise the overall sustainability performance of the different functional components of their utilities. Subsequently, they can hone in on the performance of different processes (sub-components) within a component in order to identify the key areas for improvement. A hierarchical top-down approach is used, starting from the overall sustainability performance objectives of the functional components at the top, followed by primary and secondary performance measures of the sub-components, and indicators (the basic building blocks) receiving inputs from data/decision variables at the bottom. Generally, SMWU lack data collection and inventory management capacity, and thus decisions are made in an uncertain environment. The issues related to data scarcity are addressed by utilizing benchmarking data from larger utilities, peer-reviewed literature, and expert elicitation from local municipalities. For performance inferencing, In-UPM can handle such uncertainties using fuzzy rule-based modeling. The model is robust enough to deal with temporal and spatial variations, i.e., it can assess the performance of the utility as a whole, and/or of the different water supply systems operating within it, individually, for any assessment period. The application of In-UPM to the case study of a medium sized water utility in the Okanagan Basin and its three water supply systems, before and after implementing improvement actions, affirms its pragmatic applicability for sustainable performance management of SMWU.
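As an illustration of the fuzzy rule-based inferencing mentioned above, the snippet below maps a crisp indicator score to membership degrees in 'Low'/'Medium'/'High' performance grades. This is a toy sketch with membership shapes I have assumed; it is not In-UPM's calibrated model:

```python
# Toy sketch of the kind of fuzzy grading In-UPM relies on: a crisp
# indicator score is mapped to degrees of membership in performance grades.
# Membership shapes below are illustrative, not the thesis's calibrated ones.

def trap(x, c, a, b, d):
    """Trapezoidal membership with support [c, d] and core [a, b]."""
    if x <= c or x >= d:
        return 0.0
    if a <= x <= b:
        return 1.0
    return (x - c) / (a - c) if x < a else (d - x) / (d - b)

GRADES = {                    # performance grades on a 0-100 scale (assumed)
    "Low": (0, 0, 30, 50),
    "Medium": (30, 45, 55, 70),
    "High": (50, 70, 100, 100),
}

def grade_memberships(score):
    """Degree to which a score belongs to each performance grade."""
    return {g: round(trap(score, *p), 2) for g, p in GRADES.items()}

print(grade_memberships(60))   # {'Low': 0.0, 'Medium': 0.67, 'High': 0.5}
```

A score of 60 is partly 'Medium' and partly 'High'; fuzzy rules then combine such degrees across indicators instead of forcing a single crisp grade.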
Chapter 7 A Risk-based Customer Satisfaction Management Model

A part of this chapter is under review in the Risk Analysis journal as an original research article titled "Customer Satisfaction Management Framework for Small to Medium Sized Water Utilities: A Risk-based Approach" (Haider et al. 2015d).

After managing performance at the intra-utility level, the utility management also needs to check customer satisfaction before implementing major improvement actions. A risk-based model is conceptualized and developed in this chapter to serve this purpose.

7.1 Background

Water utilities are mandated to provide a reliable, responsive, and affordable service to their customers (USEPA, 2008). Unlike other products, customers do not necessarily have the same flexibility in selecting their water supplier (KWR, 2008). Due to this constraint, for an acceptable quality of service, the utility managers should maintain and operate their infrastructure so as to ensure a safe and adequate water supply to the consumers. In addition, the utility should respond efficiently to complaints; failure to do so may result in customer dissatisfaction. To achieve these objectives, the utility needs to: i) evaluate the complaints received to identify the underperforming components of its water supply systems (WSSs); and ii) plan the human, technical, and financial resources for improvements accordingly. The literature review in Chapter 2 reveals that the existing approaches might not be sustainable for SMWU on a periodic basis, where limited personnel expertise and financial constraints are the main challenges. Customer satisfaction (CS) assessment is a more complex activity for SMWU. Firstly, a water utility may receive different types of complaints, such as poor water quality, low pressure, service connection (SC) repairs, and general complaints.
Each type of complaint may have more than one driving factor (i.e., root cause) and a varying magnitude of the problem, e.g., a minor leak (a small broken nozzle of an SC) vs. a major emergency break (the entire service line, buried 2 meters deep, needs replacement). Secondly, the CS depends on the duration between the time of the report and the response, up to the complete resolution of the complaint. This implies that, in addition to the response time of the utility's field staff on the telephone, these complaints require different durations for resolution depending on the nature and extent of the problem. Thirdly, all of the existing methods evaluate the CS based on the customer experience only. None of them includes the observations and findings of the operational field staff, which should essentially be included in the decision making process for effective improvement actions. Evaluating CS from the number of reported complaints, the responded complaints, and the customer interviews may mask the underlying root causes of the occurrence of complaints, and does not provide any guidance for improvement actions. Small to medium sized water utilities (SMWU) in Canada are facing serious challenges in achieving their desired performance objectives, including but not limited to lower water treatment levels, lack of trained operational field staff, inefficient data management, and financial limitations (Haider et al., 2014a&b). As a result, such utilities sometimes fail to provide a sustainable and adequate quality of service to their customers, and are thus unable to achieve their CS goals. Management strategies developed solely on performance benchmarking or customer interviews might not be a technically and/or financially sustainable solution for SMWU. The main purpose of this research is to develop a risk-based model, primarily based on the evaluation of customer complaints, for managing CS in SMWU.
The inherent assumption of the proposed approach is that if a utility receives fewer complaints, the customers are satisfied with the utility's performance.

7.2 Risk Assessment

The concept of risk has been extensively used in business management, medical sciences, environmental assessment, and engineering applications (Blackhurst et al. 2008, Causins et al. 2004). The commonly used risk assessment techniques in water management studies are hazard and operability analysis (HAZOP), root cause analysis (RCA), failure mode effects and criticality analysis (FMECA), failure mode effect analysis (FMEA), fault tree analysis (FTA), event tree analysis (ETA), reliability block diagrams (RBD), human reliability analysis (HRA), Markov analysis, and Bayesian networks (Hokstad et al. 2009). In previous studies, risk assessment methods have addressed operational and water safety issues for water utilities (Lindhe et al. 2009, Rosén et al. 2008); their application to managing CS has not been reported in the literature so far. In this research, a model for managing the risk of customer dissatisfaction is developed using RCA and FMEA. A short summary of these methods is presented below.

7.2.1 Root Cause Analysis (RCA)

RCA is a structured technique to investigate and categorize the root causes of the occurrence of an event (VHA 2011). The 'Fishbone' diagram (also known as a cause and effect diagram) is an efficient tool to perform RCA visually. The final adverse event is shown at the head (or mouth) of the fish. Primary failure types contribute to the main fish bone. The different possible modes of each failure, the secondary bones, then contribute to these primary bones. Lastly, the causes of each failure mode, the smaller bones, contribute to the secondary bones and complete the Fishbone diagram.

7.2.2 Failure Mode Effect Analysis (FMEA)

Mitchell (1999) described risk as the likelihood of a loss (failure) and the implication of that loss (consequence) for the individual or organization.
Therefore, the risk can be quantified in terms of a risk score (RS) as the product of the probability of occurrence (P) and the consequence of the failure (C):

RS = Probability of occurrence × Consequence [7.1]

Later, a third dimension was introduced, which captures the causal pathways leading to the failure event, or the ability to detect the risk (Ritchie and Brindley 2007). As a result, the standard FMEA technique evaluates all the possible failure modes (FMs) for their P, C, and detectability (D) (Carbane and Tippett, 2003). The method generates a risk priority number (RPN) for each FM using the following equation:

RPN = Probability of occurrence × Consequence × Detectability [7.2]

Traditionally, FMEA determines the criticality of an FM in terms of an RPN ranging between 1 and 1000 (Abdelgawad and Fayek, 2010). In order to deal with the uncertainties associated with data limitations and vagueness in expert opinions, fuzzy set theory can be integrated with FMEA.

7.2.3 Fuzzy-FMEA

Fuzzy logic, first developed by Zadeh (1978), supports human reasoning; it is useful when the data are limited and the model parameters need to be defined linguistically based on expert judgment. In this research, all the risk factors (i.e., P, C, and D) are linguistically defined as 'very low', 'low', 'medium', 'high', and 'very high'. Figure 7.1 presents the trapezoidal fuzzy numbers and their elements defining the universe of discourse (UOD) for all three risk factors between 1 and 10. A fuzzy multiplication operator is used to calculate a fuzzy RPN value for each FM. Each membership function contains four elements to define its UOD; for instance, 'Very Low' can be defined with the help of c1, a1, b1, and d1, i.e., (1, 1, 2, 3), as shown in Figure 7.1.
A fuzzy RPN can be calculated by multiplying all the parameters. For example, if 'P' and 'C' are 'Low' and 'D' is 'Medium', from Figure 7.1 the corresponding fuzzy RPN value can be calculated as:

(2, 3, 4, 5) × (2, 3, 4, 5) × (4, 5, 6, 7) = Fuzzy RPN
(2 × 2 × 4, 3 × 3 × 5, 4 × 4 × 6, 5 × 5 × 7) = (16, 45, 96, 175)

The next step is defuzzification to convert the fuzzy RPN into a crisp RPN. There are various techniques for defuzzification; each technique extracts different levels of information from the fuzzy numbers and may give different defuzzified values (Tesfamariam and Sadiq 2006). Chen's ranking method (1985) generalizes the results for trapezoidal fuzzy numbers and has been frequently used in risk assessment studies (Sadiq and Khan 2006). In this study, the same method is used for ranking the failure modes. The graphical illustration of Chen's method is shown in Figure 7.2. The FMs with higher UT(x) values are given preference for risk mitigation, using the following relationship:

UT(xi) = 1/2 [ (di − Lmin) / ((Lmax − Lmin) + (di − bi)) + 1 − (Lmax − ci) / ((Lmax − Lmin) + (ai − ci)) ] [7.3]

where Lmin and Lmax are the minimum and maximum possible RPN values of 1 and 1000, respectively, and ai, bi, ci, and di are described in Figures 7.1 and 7.2. The final RPN is estimated by multiplying UT(x), from equation (7.3), by 1000. The defuzzified RPN comes out to be '102' for the above example.

Figure 7.1 Standard trapezoidal membership functions used in this study

Figure 7.2 Chen (1985) defuzzification method for trapezoidal fuzzy numbers

7.3 Modeling Approach

In this study, a customer focused risk-based model is developed to manage the risk of customer dissatisfaction in SMWU. The proposed risk management framework, shown in Figure 7.3, obtains information from customer complaints and evaluates the risk using RCA and fuzzy-FMEA.
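The fuzzy multiplication and Chen's ranking described in Section 7.2.3 can be sketched in a few lines of Python. This is an illustrative sketch, not the thesis's implementation: the function names are mine, and the trapezoid convention (c, a, b, d), with support [c, d] and core [a, b], follows Figure 7.1.

```python
# Sketch of the fuzzy-FMEA scoring described in Section 7.2.3.
# A trapezoid is written (c, a, b, d): support [c, d], core [a, b].

def fuzzy_multiply(*trapezoids):
    """Element-wise product of trapezoidal fuzzy numbers (approximate rule)."""
    result = trapezoids[0]
    for t in trapezoids[1:]:
        result = tuple(x * y for x, y in zip(result, t))
    return result

def chen_defuzzify(trap, l_min=1.0, l_max=1000.0):
    """Chen's maximizing/minimizing-set ranking for a trapezoid (c, a, b, d)."""
    c, a, b, d = trap
    span = l_max - l_min
    u_right = (d - l_min) / (span + (d - b))   # intersection with maximizing set
    u_left = (l_max - c) / (span + (a - c))    # intersection with minimizing set
    return 0.5 * (u_right + 1.0 - u_left)

LOW = (2, 3, 4, 5)
MEDIUM = (4, 5, 6, 7)

fuzzy_rpn = fuzzy_multiply(LOW, LOW, MEDIUM)   # P='Low', C='Low', D='Medium'
rpn = chen_defuzzify(fuzzy_rpn) * 1000
print(fuzzy_rpn, round(rpn))                   # (16, 45, 96, 175) 102
```

The printed result reproduces the worked example: the fuzzy RPN (16, 45, 96, 175) defuzzifies to a crisp RPN of 102.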
The global objective of the framework is to mitigate all the FMs down to an acceptable risk. The step-by-step approach is described below:

Step 1: Establish baseline: Collect data

The proposed framework initiates with the collection of baseline data (customer complaint work orders) to understand the state of the problem. A work order generated in response to a customer complaint contains: i) the name and address of the customer; ii) the complaint registration time and date; iii) the reported complaint, either in written form (email or post) or as a telephone conversation later typed by the personnel responsible for attending phone calls; iv) the time and date when the field staff reached the customer's address; v) the notes taken by the field staff based on the actual site situation; and vi) the time when the complaint was resolved. This information provides the basis of the detailed risk assessment.

Figure 7.3 Risk-based modeling approach for assessment of customer satisfaction (Step 1: collection of baseline data; Step 2: evaluation of customers' complaints / risk identification; Step 3: root cause analysis; Step 4: knowledge acquisition under uncertainty; Step 5: risk quantification by fuzzy-FMEA; Step 6: risk prioritization; Step 7: risk mitigation, repeated until the risk is acceptable for all failure modes)

Step 2: Risk identification: Evaluate customer complaints

All the work orders need to be carefully examined to determine the distribution of the different types of complaints and their associated reasons.
A detailed inventory is extremely helpful for arranging the different types of complaints in the data collected in Step 1. The inventory should include the main findings recorded by the field staff, in the form of notes, during their field visits to resolve the complaints; these notes may also capture the customers' responses in terms of their satisfaction, either through the staff's experienced judgment or because the customers verbally expressed their feelings to the field staff or to the utility personnel attending their phone calls, which were also recorded in the work orders.

Step 3: Root cause analysis: Investigate causes of complaints

Each type of complaint can originate from multiple causes. Therefore, a detailed root cause analysis (RCA) is required to identify all possible FMs for the detailed risk analysis.

Step 4: Knowledge acquisition: Define risk factors

All of the risk factors (P, C, and D) may contain different types of inherent uncertainties. Although P can be calculated, the data are available only for a limited period. Therefore, all three factors are defined linguistically using expert experience and judgment. Generally, there is no rule of thumb for defining a threshold for the RPN; for example, it is very difficult for decision makers to establish the priority of an RPN of 160 over an RPN of 140. It is nevertheless important for the decision makers to establish an acceptable risk for any project.

Step 5: Risk assessment: Perform fuzzy-FMEA

In the proposed FMEA framework, the failure event is considered to have occurred once the complaint has been reported. However, a complete FM depends on the duration between the time of the report and the response, up to the time when the complaint is completely resolved. The 'P' refers to the frequency of each type of reported complaint over a stated period.
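Estimating 'P' amounts to expressing a complaint count as a frequency over the assessment period. A minimal sketch, assuming a per-day basis (an assumption on my part, chosen because it reproduces the 1.37% figure quoted in the text for 5 water quality complaints in a year):

```python
def probability_of_occurrence(n_complaints, period_days=365):
    """Frequency of one complaint type over the assessment period, as a
    percentage. The per-day basis is an assumption that reproduces the
    1.37% example in the text (5 complaints in a 365-day year)."""
    return 100.0 * n_complaints / period_days

p_wq = probability_of_occurrence(5)   # 5 WQ complaints in one year
print(round(p_wq, 2))                 # 1.37
```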
The 'C' of failure is assessed from the field notes taken by the operational personnel during the field visit, and from expert opinion on the response time and the time to resolve the complaint. The 'D' is defined as the implementation level of possible risk mitigation actions. The first risk factor, 'P' (from equation 7.2), can be estimated from the number of complaints received by the utility during the assessment period. For example, if there were 5 water quality (WQ) complaints reported in a year, the probability of occurrence would be 1.37%. Once the event has occurred, the 'C', in terms of customer satisfaction, depends on several factors: i) the type of complaint, i.e., a pressure complaint may carry a higher consequence than a complaint related to a minor repair of the SC; ii) the response time, which again results in different satisfaction levels depending on the nature of the complaint, e.g., a minor water leak vs. an emergency major water leak; and iii) the time to resolve the complaint, which is specifically related to the extent of the complaint, e.g., a large repair requiring deep excavation to replace the entire service line may need more than a day to completely resolve the issue. An efficient response time can be defined as the time within which the complaint is responded to at an acceptable CS level. For example, the response to an emergency leak should be immediate, within 2 to 4 hours; a non-emergency leak should be responded to within 24 hours; and an on/off request for a plumbing repair is expected to be completed at a scheduled time. The third factor, 'D', is defined as the implementation level of risk mitigation measures.
This factor in turn depends on: i) the implementation of routine maintenance programs; ii) the availability of adequate staff for a quick response; iii) automated equipment for emergency response; iv) information provided to customers about planned interruptions, and the utility's policies on water restrictions, pressure management, flushing of plumbing lines after the flushing of water mains, boil water advisories, etc.; and v) the treatment levels of the existing facilities.

Step 6: Risk ranking: Perform prioritization

The results of fuzzy-FMEA (the defuzzified RPN from equation [7.3]) for all the FMs under each category of complaints can be compared with the acceptable risk value. If the RPN of more than one FM is higher than the established threshold risk, all the FMs should be ranked to prioritize the risk mitigation actions. Pareto analysis is a tool used to identify the most relevant sets of data on which to concentrate further analysis. Risk clustering is an approach used to logically group the observations on the basis of similarity in their characteristics, reducing the level of heterogeneity in the data (Zsidisin and Ritchie 2009). Both of these tools can be used to group and rank the set of FMs for risk management.

Step 7: Risk management: Mitigate and reduce risk

Different risk mitigation actions can be implemented, and their anticipated impacts assessed by re-estimating the RPN of the concerned FMs using expert judgment. The framework shown in Figure 7.3 proposes that mitigation actions continue to be applied until all the FMs have RPN values less than or equal to an acceptable risk. The FMs with an RPN less than or equal to the acceptable risk do not need any further mitigation action, which confirms the CS (Figure 7.3).
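Steps 6 and 7 reduce to filtering the FMs whose RPN exceeds the acceptable risk and ranking them for mitigation, Pareto-style. A minimal sketch (the threshold value is illustrative, not from the thesis; the RPN values are taken from Table 7.2):

```python
# Rank failure modes whose defuzzified RPN exceeds an acceptable risk,
# Pareto-style: mitigate the highest-RPN modes first.
ACCEPTABLE_RPN = 150   # illustrative threshold, not from the thesis

fm_rpn = {             # defuzzified RPNs for a few failure modes (Table 7.2)
    "1.1.8 meter repair, resolved after 24 h": 414,
    "1.2.2 booster station not operational": 451,
    "2.1.1 ON/OFF done at asked time": 83,
    "1.1.1 PRV clogged, resolved within 24 h": 189,
}

to_mitigate = sorted(
    ((fm, rpn) for fm, rpn in fm_rpn.items() if rpn > ACCEPTABLE_RPN),
    key=lambda item: item[1], reverse=True,
)
for fm, rpn in to_mitigate:
    print(f"{rpn:4d}  {fm}")
# Highest-risk modes print first; '2.1.1' (RPN 83) needs no further action.
```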
7.4 Okanagan Case Study

7.4.1 Study area

The water utilities in the Okanagan have the lowest per person water availability in Canada; conversely, the average daily per capita water consumption is 675 liters, more than double the Canadian average (OBWB 2014). Moreover, these utilities are facing WQ issues due to their reliance on source WQ and low treatment levels (i.e., primarily disinfection); this is evident from the fact that the province of British Columbia has the highest number of boil water advisories across Canada (Interior Health Canada 2013). The study area is a medium sized water utility in the Okanagan Basin, British Columbia, Canada, shown in Figure 6.4. Similar to other SMWU in Canada, this utility has not yet participated in the NWWBI, and faces various challenges in achieving a desirable level of service. The utility currently serves 16,000 residents and is one of the fastest growing utilities in the Okanagan Basin. It supplies water to 6,400 domestic and agricultural customers. The land use is diverse, including residential, agricultural, commercial, public, and industrial; the topography is rolling and hilly with medium to steep grades. The utility has three WSSs in its service area (Figure 6.4). All these systems primarily depend on source water, and supply chlorinated water without conventional treatment (i.e., filtration). The water distribution systems consist of 50 mm to 900 mm diameter water mains of different materials (i.e., cementitious, steel, and plastic). Although some of the water mains still in service are up to 80 years old, the average age is less than 25 years; as a result, the utility has not experienced frequent water main breaks in the past. In spite of that, it has received several customer complaints every year related to the different issues described in the following sections. For instance, a GIS map showing the complaints for the year 2012 can be seen in Figure 6.4.
7.4.2 Baseline data collection

Data collection and analyses are important for the identification of FMs and their associated root causes, to develop a rationale for the detailed risk assessment. In this regard, a four year record of work orders containing more than 1500 customer complaints was obtained from the participating water utility to evaluate the proposed risk-based model. Data for the first three years (2011 to 2013) are used for model development, whereas data for the year 2014 are used for validation purposes.

7.4.3 Risk identification

From a detailed evaluation of the work orders, it was found that the customer complaints can be categorized into four major types: i) pressure, ii) water quality, iii) structural, and iv) general complaints. Each type of complaint originated from multiple FMs. The distribution of complaints over the different components of the WSS is shown in Figure 7.4a&b. Figure 7.4a reveals that 74% of the total complaints were related to structural failure. From the work orders, six types of structural complaints were observed: i) SC inspection/repair, ii) major water leak, iii) minor water leak, iv) dole valve repair, v) curb stop repair, and vi) meter repair. Of the total structural complaints, 46.3% were simple seasonal turn on/off requests. Of the remainder, 51.5% were related to SC issues, and a few eventually turned out to be drainage issues rather than water infrastructure failures. About 90% of the time, the pressure complaints were reported due to 'low pressure'; the reason for the remaining complaints was a 'non-operational booster station'. More than 80% of the complaints in the WQ category were caused by different issues associated with the source WQ. Figure 7.4b shows that the general complaints were mainly related to meter reading investigations (i.e., around 85%), and most of the remaining complaints concerned 'overwatering in the neighbourhood'.
Even though the regulations for water restrictions are well disseminated through the utility's website, neighbours still feel it is their social responsibility to report such complaints. In response, the utility drops off an additional copy of the regulations to the offending customer. The remaining (general) complaints were related to landscaping compensation after the utility's repair works, visible ponding on the surface, frozen pipes, etc.

7.4.4 Root cause analysis

The visual presentation of the RCA with the help of a Fishbone diagram is given in Figure 7.5. Each failure possesses different FMs, which are connected to the secondary bones of that failure. For example, in Figure 7.5, the FMs resulting in a WQ complaint include bad taste, odour, dirty water, staining of clothes, health related issues, inquiries about in-house treatment methods, and requests for water quality test results. Subsequently, each mode of failure can occur due to multiple root causes; for instance, a complaint reported as 'bad taste' can be caused by: i) high chlorine levels, ii) poor source water quality (smell of algae), iii) possible growth of biofilm in the distribution mains, and iv) clogged filters. Likewise, the FMs for the remaining failure types are established in Figure 7.5.
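The fishbone hierarchy (failure type, then failure mode, then root causes) maps naturally onto a nested dictionary. A sketch for part of the water quality branch discussed above (the branch contents follow Figure 7.5; the lookup helper is mine):

```python
# Fishbone (cause-and-effect) hierarchy for part of one branch of Figure 7.5:
# failure type -> failure mode -> list of candidate root causes.
fishbone = {
    "Water quality": {
        "Bad taste": [
            "high chlorine levels",
            "poor source water quality (smell of algae)",
            "biofilm growth in distribution mains",
            "clogged filters",
        ],
        "Colour (dirty/brown/yellow water)": [
            "high source water turbidity",
            "water main flushing",
            "in-house plumbing issues",
        ],
    },
}

def root_causes(failure, mode):
    """Look up the candidate root causes for one failure mode."""
    return fishbone.get(failure, {}).get(mode, [])

print(root_causes("Water quality", "Bad taste"))
```

Encoding the diagram this way lets each reported complaint be matched against its candidate root causes before grading P, C, and D.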
Figure 7.4 A vignette of customers' complaints with respect to their causes: (a) distribution of customer complaints; (b) customers' complaints along the system

Figure 7.5 Root cause analysis (RCA) for customer complaints in SMWU

7.4.5 Fuzzy-FMEA

Fuzzy-FMEA is used to conduct a detailed assessment of the CS in SMWU. Different ranges for 'P' have been defined depending on the category of complaints, because a smaller number of WQ and pressure complaints can pose the same risk as a relatively larger number of structural and general complaints. The linguistically defined trapezoidal functions covering the universe of discourse (UOD) for 'P', 'C', and 'D', for all types of complaints, are presented in Table 7.1. Based on the results, the FMs are ranked between 1 and 20 to prioritize their respective improvement actions. The detailed results of the fuzzy-FMEA analysis for assessing CS are presented in Table 7.2.

7.4.6 Risk prioritization

The RPN ranges established for the different risk priority levels are listed in Table 7.3.
A graphical illustration of the fuzzy-FMEA results in Figure 7.6 shows the priority levels for each category of complaints. This figure reveals that all the FMs with RPN values higher than 500 (extreme priority, 'EP') fall under the WQ category. As per Table 7.2, these modes of failure originated from colour complaints (FMs 3.2.1 & 3.2.2), health issues due to the presence of elderly, sick, or young residents (FM 3.4.1), and clogging of in-house filters (FM 3.6.1). The results also show that the pressure complaints cause a higher risk to CS; of the 12 FMs in this category, 2 are at 'VP', with RPN values varying between 400 and 500 (Figure 7.6). These FMs are: i) the booster station was not operational, resulting in a 'No Water' complaint (FM 1.2.2); and ii) the meter required repairs and the complaint was responded to after 24 hours (FM 1.1.8). Further, the FMs with 'HP' in the pressure complaints category were caused by: i) unplanned maintenance activities that left customers without water for some duration (FM 1.2.1); and ii) a low pressure complaint, caused by an in-house plumbing failure, where the complaint was responded to after 24 hours (FM 1.1.10). There are no FMs with 'EP' in the category of structural complaints; however, there are 7 FMs at 'VP', with RPN values ranging between 400 and 500. The root causes of these FMs are: i) different types of SC repairs and inspections (refer to Figure 7.4 for details); and ii) delays in resolving complaints due to the non-availability of equipment on the day the complaint was responded to, or due to the complexity of the field situation (e.g., it was hard to find the location of buried service lines or the curb box). In the general category, all the complaints lie within low priority (RPN less than 150).
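Once the bands of Table 7.3 are fixed, assigning a priority level to each FM is mechanical. In the sketch below, the >500 ('EP'), 400-500 ('VP'), and <150 (low) bands follow the text; the intermediate 'HP' and 'MP' bands are placeholders I have assumed, since Table 7.3 itself is not reproduced in this excerpt:

```python
def priority_level(rpn):
    """Map a defuzzified RPN to a risk priority level.
    The >500 (EP), 400-500 (VP), and <150 (low) bands follow the text;
    the HP and MP bands below are assumed placeholders, since Table 7.3
    is not reproduced in this excerpt."""
    if rpn > 500:
        return "EP"   # extreme priority
    if rpn > 400:
        return "VP"   # very high priority
    if rpn > 300:     # assumed band boundary
        return "HP"   # high priority
    if rpn >= 150:    # assumed band boundary
        return "MP"   # medium priority
    return "LP"       # low priority

print(priority_level(451), priority_level(102))   # VP LP
```

For example, FM 1.2.2 (RPN 451) lands in 'VP', consistent with the discussion above.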
Table 7.1 Linguistically defined probability of occurrence (P), consequence (C), and detectability (D)

Very low (UOD: 1, 1, 2, 3). P: < 0.15% (structural and general complaints); < 0.05% (pressure and water quality complaints). C: issue is minor/major/emergency and resolved without delay; can cause very low dissatisfaction to the customer. D: very high likelihood of implementing risk mitigation.
Low (UOD: 2, 3, 4, 5). P: 0.15-0.3%; 0.05-0.1%. C: issue is minor and resolved with some delay; can cause low dissatisfaction to the customer. D: high likelihood of implementing risk mitigation.
Medium (UOD: 4, 5, 6, 7). P: >0.3-0.5%; >0.1-0.3%. C: issue is minor and resolved with a longer delay; can cause moderate dissatisfaction to the customer. D: moderate likelihood of implementing risk mitigation.
High (UOD: 6, 7, 8, 9). P: >0.5-1%; >0.3-0.5%. C: issue is major/emergency and resolved with some delay; can cause high dissatisfaction to the customer. D: low likelihood of implementing risk mitigation.
Very high (UOD: 8, 9, 10, 10). P: >1%; >0.5%. C: issue is major/emergency and resolved with a longer delay; can cause very high dissatisfaction to the customer. D: very low likelihood of implementing risk mitigation.

Table 7.2 Results of fuzzy-FMEA (each failure mode lists its P, C, D ratings and RPN; VH = very high, H = high, M = medium, L = low, VL = very low)

Category 1 - Pressure Complaints
1.1 - Low pressure
1.1.1 Pressure releasing valve (PRV) clogged, and the complaint was responded to and resolved within 24 hours (P: H, C: L, D: M; RPN 189)
1.1.2 Pressure releasing valve (PRV) clogged, and the complaint was responded to after 24 hours (P: H, C: M, D: M; RPN 272)
1.1.3 Booster pump needed to be turned ON or repaired, and the complaint was resolved within 24 hours (P: M, C: L, D: M; RPN 146)
1.1.4 Booster pump needed to be turned ON or repaired, and the complaint was resolved after 24 hours due to non-availability of equipment, difficulty in identifying the problem, or the busy schedule of staff
(P: L, C: M, D: M; RPN 146)
1.1.5 Service connection repair required, and the complaint was resolved within 24 hours (P: L, C: L, D: L; RPN 132)
1.1.6 Service connection repair required, and the complaint was resolved after 24 hours because it was hard to locate the problem (e.g., the curb box was installed under the driveway) (P: L, C: M, D: L; RPN 189)
1.1.7 Meter repair required (dirt, parts replacement, dole valve repair, strainer clogged), and the complaint was resolved within 24 hours (P: VH, C: L, D: L; RPN 285)
1.1.8 Meter repair required (dirt, parts replacement, dole valve repair, strainer clogged), and the complaint was resolved after 24 hours (P: VH, C: M, D: L; RPN 414)
1.1.9 In-house plumbing issue on the home owner's side: i) filter clogged; ii) sprinkler system (a flow test was conducted to find the cause); iii) frozen pipes; the complaint was resolved within 24 hours (P: VH, C: VL, D: L; RPN 152)
1.1.10 In-house plumbing issue on the home owner's side: i) filter clogged; ii) sprinkler system (a flow test was conducted to find the cause); iii) frozen pipes; the complaint was resolved after 24 hours (P: VH, C: L, D: L; RPN 285)
1.2 - No water (respond immediately)
1.2.1 Unplanned maintenance activities (i.e., hydrant flushing, repair works, pump station maintenance, etc.), and the customer was not informed 48 hours before the start of the activity (P: M, C: H, D: M; RPN 272)
1.2.2 Booster station not operational, and the complaint was resolved within 4 hours (P: M, C: H, D: L; RPN 451)
Category 2 - Structural complaints
2.1 - Turn ON/OFF service connection
2.1.1 ON/OFF done at the asked time and found OK (P: VH, C: VL, D: H; RPN 83)
2.1.2 ON/OFF not done at the asked (scheduled) time; the complaint was responded to after the scheduled time, or after 24 hours when asked for as soon as possible (P: VH, C: M, D: H; RPN 220)
2.1.3 ON/OFF with a minor repair on the same day, and the complaint was resolved within 24 hours (P: VH, C: L, D: L; RPN 285)
2.1.4 ON/OFF done at the asked time and a problem found on the home owner's side, which was conveyed to the home owner (P: VH, C: VL, D: L; RPN 152)
2.1.5 ON/OFF done at the asked time but a minor repair found which could not be resolved on the same day due to non-availability of
parts or equipment (P: H, C: H, D: L; RPN 451)
2.2 - Service connection inspection/repair or replace service line
2.2.1 Inspection done at the scheduled time, and the connection was found OK (P: VH, C: VL, D: L; RPN 152)
2.2.2 Inspection done at the scheduled time, a minor repair found, and the complaint was resolved within 24 hours (P: VH, C: L, D: L; RPN 285)
2.2.3 Inspection done after 24 hours, and the connection was found OK or a minor repair was done on the same day (P: VH, C: M, D: L; RPN 414)
2.2.4 Inspection done at the scheduled time, but a major repair was found and the complaint was resolved after a week due to non-availability of equipment or the greater extent of the repair (P: H, C: H, D: L; RPN 451)
2.3 - Repair/locate/adjust curb box
2.3.1 Curb box needed to be raised, repaired, or adjusted, and the complaint was resolved within 24 hours (P: VH, C: VL, D: L; RPN 152)
2.3.2 Curb box needed to be raised, repaired, or adjusted (major repair with excavation), and the repair was completed after the scheduled time or 24 hours after the complaint (P: VH, C: M, D: L; RPN 414)
2.3.3 Curb box needed a major repair, adjustment, or raising (with excavation), and the work was done after a week (P: M, C: H, D: L; RPN 350)
2.4 - Meter repair/replace/locate meter
2.4.1 Meter needed repair or replacement (dirt removal, parts), and the meter was repaired within a week (P: VH, C: L, D: L; RPN 285)
2.4.2 Meter needed repair or replacement (dirt removal, parts), and the repair was done within 2 weeks (P: L, C: H, D: L; RPN 244)
2.5 - Check flow/dole valve
2.5.1 Flow is low, the dole valve needed repair/replacement, and the complaint was resolved within 24 hours (P: H, C: VL, D: L; RPN 132)
2.5.2 Flow is low, the dole valve needed repair/replacement, and the complaint was responded to within 24 hours but resolved at another time due to non-availability of parts (P: L, C: H, D: L; RPN 244)
2.6 - Minor water leak
2.6.1 Minor leak on the service connection line on the utility's side: i) stainless steel rod; ii) gate valve; iii) galvanized nipple; iv) minor leak in the service line; or v) any other minor repair (P: VH, C: L, D: L; RPN 285)
2.6.2 Minor leak on the service connection line on the utility's side, and the complaint was resolved after 24 hours (P: H, C: M, D: L; RPN 350)
2.6.3 Minor leak on the service connection line on the utility's side; the complaint was responded to within 24 hours but the repair was scheduled for another time due to unavailability of equipment; as the customer was informed, they were dissatisfied only to some extent (P: H, C: H, D: L; RPN 451)
2.6.4 Leak on the home owner's side due to in-house plumbing failure, and the complaint was resolved within 24 hours (P: VH, C: VL, D: L; RPN 152)
2.6.5 Leak on the home owner's side due to in-house plumbing failure, and the complaint was resolved within 48 hours (P: L, C: M, D: L; RPN 189)
2.6.6 Problem observed on both sides (i.e., system and home owner), and the complaint was resolved within 24 hours (P: H, C: VL, D: L; RPN 132)
2.6.7 Problem observed on both sides (i.e., system and home owner), and the complaint was resolved after 24 hours (P: VL, C: M, D: L; RPN 103)
2.7 - Major water leak (respond immediately)
2.7.1 Major leak on the service connection line on the utility's side: i) entire service line changed; ii) pressure gauge replaced (excavation required for repair activities); the complaint was resolved within 4 hours (P: VH, C: L, D: L; RPN 285)
2.7.2 Major leak on the service connection line on the utility's side; the complaint was responded to within 4 hours but could not be resolved within 24 hours due to unavailability of equipment or the extent of the repair work (excavation); as the customer was informed, they were dissatisfied only to some extent (P: VH, C: M, D: L; RPN 414)
2.7.3 Leak on the home owner's side due to in-house plumbing failure, and the complaint was resolved within 4 hours (P: VH, C: VL, D: L; RPN 152)
2.7.4 Leak on the home owner's side due to in-house plumbing failure, and the complaint was responded to after 4 hours (P: VH, C: M, D: L; RPN 414)
2.8 - Water visible on surface (possible water leak)
2.8 Problem related to drainage issues (e.g., snow melt, wrong perception, sewerage, partially opened hydrant, etc.), and the problem was resolved within 24 hours (P: VH, C: VL, D: L; RPN 152)
Category 3 - Water Quality Complaints
3.1 - Complaint related to taste of water
3.1.1
Deteriorated source water quality / growth of organic matter in the distribution system / high chlorine; the complaint was responded to within 4 hours, and the customer was informed about the source water quality and planned future improvements (P: M, C: H, D: L; RPN 350)
3.1.2 Deteriorated source water quality / growth of organic matter in the distribution system / high chlorine, and the customer was found highly dissatisfied with the water quality and the water rates they are paying (P: M, C: VH, D: L; RPN 414)
3.2 - Complaint related to colour of water (dirty, brown, yellow, sediments, etc.)
3.2.1 Deteriorated source water quality; the complaint was responded to within 4 hours, and the customer was informed about the source water quality and planned future improvements (P: VH, C: H, D: L; RPN 537)
3.2.2 Deteriorated source water quality, and the customer was found highly dissatisfied with the water quality and the water rates they are paying (P: VH, C: VH, D: L; RPN 648)
3.2.3 Plumbing failure on the home owner's side; the complaint was responded to within 24 hours, and the customer was informed that the problem lies in their in-house plumbing system (clogged filters, leaking service line on the home owner's side, etc.)
(P: H, C: M, D: L; RPN 350)
3.3 - Complaint related to odour of water
3.3.1 Chlorine smell from a high chlorine dose due to high turbidity in the source water; the complaint was responded to within 4 hours, and the customer was informed about the source water quality and planned future improvements (P: VH, C: H, D: M; RPN 414)
3.3.2 Chlorine smell from a high chlorine dose due to high turbidity in the source water, and the customer was found highly dissatisfied with the water quality and the water rates they are paying (P: H, C: VH, D: M; RPN 414)
3.3.3 Smell in the water due to clogged filters or membranes or dirty in-house storage reservoirs; the complaint was responded to within 24 hours, and the customer was informed that the problem lies in their in-house plumbing system (P: L, C: H, D: L; RPN 244)
3.4 - Health issues: presence of elderly people, children, or ill people
3.4 Deteriorated source water quality or high chlorine levels; the complaint was responded to within 4 hours and the customer was informed about the source water quality and planned future improvements, but the customer was found highly dissatisfied with the water quality and the water rates they are paying (P: VH, C: VH, D: L; RPN 648)
3.5 - Staining of clothes, electric appliances, and washroom fixtures
3.5.1 Deteriorated source water quality, dirty water; the complaint was responded to within 4 hours, and the customer was informed about the source water quality and planned future improvements (P: H, C: H, D: L; RPN 451)
3.5.2 Deteriorated source water quality, dirty water; the complaint was responded to within 4 hours, but the customer was found highly dissatisfied with the water quality and the water rates they are paying (P: L, C: VH, D: L; RPN 285)
3.6 - Quick clogging of filters and membranes installed at the consumer's end
3.6.1 Deteriorated source water quality / flushing of mains / plumbing issue; the complaint was responded to within 4 hours, and the customer was informed about the source water quality and planned future improvements
(P: VH, C: H, D: L; RPN 537)
3.6.2 Deteriorated source water quality / flushing of mains / plumbing issue; the complaint was responded to within 4 hours, but the customer was found highly dissatisfied with the water quality and the water rates they are paying (P: L, C: VH, D: L; RPN 285)
3.7 - Complaints asking about water quality test results
3.7.1 Inquisitive customers (living with elderly people, children, or ill persons) or obviously deteriorated source water quality; the complaint was responded to immediately but the results were delivered within a week (P: H, C: H, D: M; RPN 350)
3.7.2 Inquisitive customers (living with elderly people, children, or ill persons) or obviously deteriorated source water quality; the complaint was responded to immediately but the results were delivered after a week (P: M, C: VH, D: M; RPN 319)
3.8 - Inquiry about treatment options at the customer's end
3.8 Deteriorated source water quality, and the customer was immediately informed about the possible treatment options (P: VH, C: M, D: M; RPN 319)
Category 4 - General Complaints
4.1 - Overwatering complaint by the neighbours
4.1.1 Regulations dropped within a week (P: VH, C: VL, D: H; RPN 83)
4.1.2 Regulations dropped after a week (P: L, C: M, D: H; RPN 102)
4.2 - Instigate meter reading
4.2 Meter running fast or stopped, and the complaint was resolved within a week (P: VH, C: VL, D: H; RPN 46)
4.3 - Frozen pipes
4.3 Very low temperatures during winter, unprotected pipes, and the complaint was resolved within 24 hours (P: VL, C: L, D: M; RPN 56)
4.4 - Banging pipes
4.4 Water hammer (in-house PRV not installed or faulty), and the complaint was responded to within 24 hours (P: VL, C: L, D: VH; RPN 56)
4.5 - Cracks on ground (possible leak)
4.5 Drainage issues, and the complaint was responded to within 24 hours (P: VL, C: L, D: M; RPN 39)
4.6 - Booster station is non-operational
4.6 Vigilant customers felt responsible for the community, and the complaint was responded to within 24 hours (P: L, C: M, D: M; RPN 146)
Notes: P = probability of occurrence, C = consequence, D = detectability; ratings: VH = very high, H = high, M = medium, L = low, VL = very low.

Table 7.3 Priority levels and RPN ranges
Rank 1-2: RPN > 500, extreme priority (EP)
Rank 3-4: RPN 400-500, very high priority (VP)
Rank 5-8: RPN 250-400, high priority (HP)
Rank 9-13: RPN 150-250, moderate priority (MP)
Rank 14-20: RPN < 150, acceptable risk

Figure 7.6 Number of failure modes with risk priority levels

7.4.7 Customer Satisfaction Management

The results of risk assessment for the existing scenario (i.e., No Action) presented in Appendix A reveal that the utility needs risk management actions to improve its CS. The FMs evaluated in the above section need two types of risk mitigation actions: i) before the occurrence of failure, to reduce the ‘P’, and ii) after the occurrence of failure, to reduce the ‘C’. The former is more related to management, inspection, and improvement of infrastructure facilities, whereas the latter can be achieved by hiring more trained staff for efficient response. The proposed actions, with the tentative costs required to reduce the overall risk to CS, are listed in Table 7.4. In order to reduce the risk of all the FMs to less than the acceptable level (i.e., RPN less than 150), the utility managers need to prioritize the FMs for effective use of the limited available resources. In this regard, Pareto analysis and risk clustering are performed with the help of a scatter plot between the RPN values calculated above and the risk score (RS) calculated from equation (1) for all the FMs in Table 7.2. Risk mitigation should start with the first cluster of FMs with EP and VP, as shown in Figure 7.7. The risk mitigation actions are described below.
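The mapping from RPN to the priority levels of Table 7.3 is straightforward to mechanize, and the failure modes of Table 7.2 can then be ranked for mitigation. A sketch (the boundary handling at 400, 250, and 150 is an assumption of this sketch; the sample RPN values are taken from Table 7.2):

```python
# Map a fuzzy-FMEA risk priority number (RPN) to the priority levels of
# Table 7.3. Boundary handling (>= vs >) is illustrative.
def priority_level(rpn):
    if rpn > 500:
        return "EP"   # extreme priority
    if rpn >= 400:
        return "VP"   # very high priority
    if rpn >= 250:
        return "HP"   # high priority
    if rpn >= 150:
        return "MP"   # moderate priority
    return "AR"       # acceptable risk (RPN < 150)

# A few failure modes and their RPNs from Table 7.2, ranked for mitigation:
fms = {"3.2.2": 648, "3.2.1": 537, "1.2.2": 451, "1.1.8": 414, "4.2": 46}
ranked = sorted(fms.items(), key=lambda kv: kv[1], reverse=True)
print([(fm, rpn, priority_level(rpn)) for fm, rpn in ranked])
```

Ranking in descending RPN order reproduces the prioritization described above: the colour-related FM 3.2.2 (RPN 648) heads the list at extreme priority, while the general complaint 4.2 falls within acceptable risk.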
Table 7.4 Proposed mitigation actions based on risk assessment results

A1 - Automation of booster stations. Due to the geographical limitation of the utility, booster stations are required to meet desirable residual heads at the consumers' end. Tentative cost: 0.13
A2 - Source water change. The utility has been planning to improve the source water to meet the adequacy of both the quantity and quality of source water. Tentative cost: 12.4
A3 - Implementation of a routine SC inspection program and/or increasing staff. The utility needs to hire additional staff to implement a program of routine inspection of SCs on a periodic basis. Additional staff will also improve the response time to complaints. The utility manager also needs to ensure the availability of equipment and replaceable parts required during the inspection. Tentative cost: 2.03
A4 - Installing complete filtration systems. Installation of conventional filtration plants will significantly improve the water quality around the whole year. Tentative cost: 15
Cumulative cost of risk mitigation: 29.5
Notes: costs in million $, including 10-year O&M expenses where applicable; SC = service connection; actions are implemented either before failure (to reduce P, the probability of occurrence) or after failure (to reduce C, the consequence).

Figure 7.7 Risk clustering for customers' satisfaction assessment for the existing situation to take Actions 1 and 2

Action 1: Automation of booster stations

The utility can initiate risk mitigation with the simplest and least expensive action. The booster station was found non-operational on occasions (i.e., FM 1.2.2 in Table 7.2), which resulted in ‘No water’ complaints. Although the response from the utility staff was quick, the consequence (customer dissatisfaction level) was high.
Therefore, the utility needs to automate its booster stations. The revised RPN values after the implementation of Action 1, shown in Figure 7.8, reveal that the overall reduction in RPN is not significant; however, the minimum risk value has been reduced.

Figure 7.8 Results of risk mitigation. A-1: automation of booster stations; A-2: source water change; A-3: implementation of a routine service connection inspection program; A-4: increasing level of water treatment

Action 2: Source water improvement

Figure 7.7 shows that FMs 3.2.1, 3.2.2, 3.4.1, and 3.6.1 (from Table 7.2) are at extreme priority risk due to inadequate source water quality. Prior to 2013, the source of the storage reservoir was a creek with high turbidity (Figure 6.4). Moreover, the amount of available water from this source was inadequate to meet the future water demand. Consequently, the utility management planned to improve this water source by transmitting additional water from Okanagan Lake in 2013. Subsequently, the work orders for the year 2014 are used to assess the impact of this improvement action on CS. A significant overall reduction in WQ complaints was observed during the year 2014, and Figure 7.8 exhibits a notable reduction in the revised RPN. However, the results clearly suggest the need for further improvement actions.

Action 3: Implementation of a routine service connection inspection program

Table 7.2 shows that, of the 63 detected FMs, 28 are related to SC complaints. The current practice in the utility is to simply respond to all these complaints. Occasionally, the response was delayed, as the field staff was deployed on other maintenance activities. In response to SC on/off complaints, the field staff is required to check the SC as well, as per the utility's policy.
Nearly 10% of the time, a repair was also found during this unplanned inspection (Figure 7.4b). However, on average, less than 3% of the total SCs have been inspected annually in this way. In addition, the FMs related to SC issues in the pressure category can also be reduced with the help of routine SC inspection. Therefore, it is worthwhile to conduct a routine inspection of all the SCs annually to reduce the risk. In this way, with additional dedicated staff hired for the purpose, the existing staff would be able to resolve the other complaints without objectionable delays. The revised RPN shown in Figure 7.8 reveals that this action can significantly reduce the risk (i.e., the acceptable RPN falls within the 75th percentile) at an affordable cost. However, some of the FMs in the WQ category are still higher than the acceptable risk and are yet to be mitigated (Figure 7.8).

Action 4: Increasing level of water treatment

As stated above, since March 2014, a notable decrease has been observed in WQ complaints. However, the work orders after this action revealed an interesting shift from colour to odour complaints. Although the storage reservoir is now receiving better quality water directly from the lake (Figure 6.4), the treatment prior to distribution is still limited to screening and disinfection, which is inadequate. After discussions with the utility's personnel, the reason for this transition was found to be higher growth of algae during the summer season, resulting in higher chlorine doses. As a result, the utility received odour complaints mentioning the chlorine and algae smell.
Although the cost of this action is high, the analysis clearly suggests the requirement of a complete filtration system for a sustainable quality of service. The revised RPN values meeting the acceptable risk (i.e., RPN < 150) for all the FMs are shown in Figure 7.8.

7.4.8 Discussion

CS is one of the prime objectives of any water utility, and has conventionally been assessed through performance benchmarking and customer interviews. SMWU cannot interview their customers on a routine basis to evaluate CS; such utilities have also not been participating in the benchmarking process (Haider et al., 2014b). Therefore, a practical modeling approach based on customer complaints, shown in Figure 7.3, is developed to manage CS in SMWU. Similar to other SMWU in Canada, the utility under study is facing different challenges in providing a desirable level of service, and receives several complaints. Risk identification revealed that the complaints are mainly related to the structural category. Most of these complaints are related to different types of SC inspections and repairs (Figure 7.4b). The important SC complaints were reported as either ‘minor water leak’ or ‘major water leak’. However, half of the former turned out to be plumbing issues on the home owner (HO)'s side; occasionally, the field staff found repairs needed on both the utility's and the HO's side. The latter (5% of SC complaints) are sometimes difficult to identify, and are also expensive to repair. Similar to minor leaks, half of the major leaks were also found to be on the HO's side (Figure 7.4b). Although a quick response informing the customer that ‘it is your responsibility’ satisfies the customer to some extent, it sometimes proves difficult when the customer is not willing to bear such a high repair cost. Moreover, the customer is expected to be more satisfied when informed immediately about the extent of a problem that needs a longer time to be completely resolved.
In contrast, CS is at higher risk when even a minor complaint is responded to with a delay. Only 3% of the SC complaints were associated with meter repairs. The response to such complaints is important, because these are directly related to water bills. The remaining 50% of structural complaints were primarily seasonal on/off requests, or arose when the HOs planned some plumbing work in their homes. Sometimes such complaints become a pro-active measure (i.e., an inspection of the SC); for example, there was a very minor leak, of which the HO was not aware, and the field staff identified and repaired it. Left unattended, such defects could have caused much higher consequences (e.g., an emergency leak) in the future. In SMWU, the advantage of this pro-active control points to a planned SC inspection program on a periodic basis. In BC, the criterion for minimum desirable pressure (i.e., dynamic head during peak hour demand) is 40 psi (BCMOH, 2012). Pressures of more than 100 psi are generally considered to be high and may cause water main breaks. Therefore, due to significant elevation differences in the study area, the HOs have been recommended to install pressure release valves (PRVs) in their buildings. In this category, customers register either written or oral complaints for ‘low pressure’ or ‘no water’. The statistics of pressure complaints are revealing, because the actual cause of 30% of such complaints was found to be some type of plumbing failure, e.g., a water leak, clogged filters or membranes, etc. (Figure 7.4b); nevertheless, the response should be prompt to ensure CS. In general, WQ complaints originate from deteriorated aesthetic WQ (i.e., taste, colour, and odour); other types also exist where treatment levels are low. Figure 7.4b shows that most of them are associated with source water quality. In fact, SMWU are required to increase the chlorine dose due to the higher turbidity of the source water, which caused several odour complaints.
Occasionally, the actual issues are found to be on the HO's side, for instance clogged filters, fouling membranes, a dirty in-house water storage tank, etc. Although these issues are also partly caused by the inadequate WQ supplied by the utility, the HOs are required to install in-house treatment systems. In principle, it is the responsibility of the utility to provide safe drinking water up to the consumer's tap. Therefore, all such types of complaints place CS at risk. As mentioned earlier, on several occasions the actual root cause of SC complaints turned out to be a plumbing issue. Therefore, the actual complaint and the corresponding response need to be carefully recorded in the field notes to identify the actual root cause for realistic risk assessment. Figure 7.5 provides visual support to the utility management for understanding the overall problems and the actual root causes of the customer complaints. It also gives an opportunity to use this support tool to include or exclude root causes after implementing improvement actions in the future, and/or when a new root cause is identified. Furthermore, RCA is used to assist the development of the main matrix of FMEA. The results of the risk analysis in Appendix A reveal that most of the other FMs in the WQ category fall under the extreme priority (EP), very high priority (VP), and high priority (HP) levels. The other FMs with ‘HP’ in the pressure complaints category were caused by unplanned maintenance activities and low pressure complaints, and identify operational inefficiencies related to equipment and personnel. Certainly, a water utility is responsible for a continuous water supply. Therefore, the customers should be informed in advance about any possible discontinuity due to maintenance activities to ensure service reliability. RCA revealed two possibilities for discontinuity: i) unplanned interruptions, and ii) unplanned maintenance hours.
An unplanned interruption means any event when the customers are left without water without a 48-hour notification; this includes situations where the duration of a planned interruption exceeds the initially notified duration (QUU, 2012). Such interruptions occur due to different types of system failure, e.g., a water main break, pump failure, hydrant break, treatment plant failure, etc. Maintenance activities resulting from reduced performance of the pumps, renewal of mains, and flushing of water mains should also be notified in advance. In the study area, the customers are always informed about planned maintenance activities; however, in the case of unplanned interruptions the utility needs to respond as quickly as possible to reduce the risk to CS. Therefore, the high frequency of structural complaints necessitates a planned SC inspection program on a periodic basis to reduce the ‘P’ and consequently the risk. Moreover, the field staff should be well equipped, with spare parts and fittings, to deal with any circumstances immediately. Otherwise, the customer may need to wait for more than 24 hours, which will result in customer dissatisfaction. The cumulative RPN reduction vs. the cumulative cost of risk mitigation is presented in Figure 7.9. The figure shows that half of the cumulative risk can be reduced by improving inspection and maintenance activities at an affordable cost (i.e., less than 10% of the total risk mitigation cost). The Canadian Council of Ministers of the Environment (CCME) suggests a complete conventional water treatment facility for surface waters to reduce turbidity, colour, organics, and pathogens (CCME, 2004). Therefore, to keep the risk within the acceptable limits for all the FMs, such a treatment facility is essential to secure public health; reducing the remaining half of the cumulative risk requires about 90% of the total risk reduction cost.
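The cost split quoted above follows directly from the Table 7.4 costs; a quick arithmetic check (the grouping is an assumption of this sketch: A1 and A3 taken as the inspection/maintenance-related actions, A2 and A4 as the major infrastructure actions):

```python
# Tentative mitigation costs (million $) from Table 7.4; the table reports
# a cumulative cost of 29.5.
costs = {"A1": 0.13, "A2": 12.4, "A3": 2.03, "A4": 15.0}

total = sum(costs.values())
inspection = costs["A1"] + costs["A3"]       # automation + SC inspection/staff
infrastructure = costs["A2"] + costs["A4"]   # source change + filtration

print(f"inspection/maintenance share: {inspection / total:.0%}")   # 7%
print(f"infrastructure share: {infrastructure / total:.0%}")       # 93%
```

This reproduces the split annotated in Figure 7.9: roughly 7% of the total mitigation budget buys the inspection- and maintenance-driven half of the risk reduction, while the remaining roughly 93% goes to the large infrastructure works.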
It is recommended to involve the community, particularly in decisions that can significantly increase the water rates. For this purpose, willingness-to-pay studies through customer interviews or web-based surveys should be conducted before finalizing such decisions.

Figure 7.9 Cumulative risk reduction vs. cumulative cost of mitigation actions (annotations: 50% risk reduction by spending 7% of the total risk reduction cost on inspection and maintenance; major infrastructure improvements reduce the remaining half of unacceptable risk with 93% of the total risk reduction cost)

The study results reveal that the proposed risk-based framework, based on the perception and experience of both the customers and the utility's field staff, can be effectively used to manage customer satisfaction in SMWU.

7.5 Summary

Small to medium sized water utilities (SMWU) in Canada face severe challenges in achieving customer satisfaction (CS) because of the unavailability of sophisticated water treatment options, a lack of trained operational field personnel, and financial limitations. The conventional assessment methods for CS are based on performance benchmarking and customer satisfaction interviews, which might not be technically and financially feasible for SMWU. In this research, a risk-based framework is developed to manage CS, which is primarily based on the evaluation of customer complaint data. To support decision making on effective improvement actions, the proposed framework also incorporates the experience of the operational staff. Customer dissatisfaction is evaluated in terms of the risk of not meeting CS, which starts when a customer reports a complaint to the utility; a complete evaluation of CS, however, depends on the duration from the time of the report through the response to the complete resolution of the complaint.
Different categories of complaints are identified from the exhaustive records of customer complaints obtained from a medium sized utility in the Okanagan Basin (British Columbia, Canada). Possible failure modes are identified using root cause analysis, and risk assessment is then performed using failure mode and effects analysis (FMEA). In order to address the inherent uncertainties associated with data limitations and vagueness in expert judgment, fuzzy set theory is integrated with FMEA. The study results reveal that about half of the cumulative risk can be reduced with the help of affordable interventions such as inspection and maintenance actions, while the remaining risk reduction requires large scale improvements in the infrastructure facilities.

Chapter 8 Conclusions and Recommendations

8.1 Summary and Conclusions

To ensure a safe and secure water supply, water utilities are experiencing the challenges of increasing population, climate change, socio-economic viability, and a rapid rate of environmental degradation. The core of the water utility business is managing physical assets and related services, which can be divided into functional components such as water resource management and environmental stewardship, operational practices, personnel training, physical infrastructure, customer service, water quality and public health, socio-economic issues, and financial viability. Each of these components further consists of different sub-components; for example, the component of ‘personnel productivity’ may comprise staff health and safety, overtime culture, training hours, etc. Besides, a water utility might operate more than one water supply system (WSS) at a time because of geographical limitations and the availability of source water. To be a sustainable water utility, the major impetus is to enhance the performance efficiency and effectiveness of the functional components to ensure a high level of customer satisfaction (CS).
Therefore, it is important for a water utility to evaluate the performance of all these components and WSSs when prioritizing its short- and long-term investments. So far, the participation of small and medium sized water utilities (SMWU) in the National Water and Wastewater Benchmarking Initiative (NWWBI) has been almost negligible. The likely reasons are: i) no well-structured performance assessment framework is available for such utilities that can be implemented simply (though comprehensively) under the given technical and financial constraints, and ii) owing to lower economies of scale, SMWU may avoid participating alongside large utilities, as the comparison may expose performance deficiencies. This non-participation of SMWU has created a performance-data gap for establishing benchmarks and the desired level of service. Consequently, SMWU are managing their assets without knowing whether or not they are meeting their primary performance objectives. Based on performance assessment results, a comprehensive performance management plan can help a utility achieve its overall sustainability objectives for all of the functional components, such as: i) water resources and environmental sustainability, ii) efficacy of physical assets, iii) operational integrity, iv) personnel productivity, v) provision of safe drinking water to the customers, vi) reliable quality of service, and vii) economic and financial viability. In order to meet these objectives, Canadian SMWU presently face specific challenges associated with data availability and accuracy, a lack of technical personnel, lower treatment levels, inefficient management systems, and a lack of financial resources. So far, little work has been done on the performance evaluation of SMWU. Therefore, a pragmatic and comprehensive performance management framework for SMWU is needed to resolve the above issues.
In this research, five comprehensive models have been developed (Chapters 2 to 7) for the performance management of SMWU at the utility, system, and component levels. Figure 8.1 shows the integrated framework, which processes information at different levels and can predict the improved performance. Figure 8.1 also shows the interaction between the different models and systematically describes the possible changes/improvements, inputs, process, outputs, and outcomes of each model. Apart from the changes/improvements, all the other factors have already been defined in detail in the respective chapters. The rationale of the proposed integrated framework (Figure 8.1) for the performance management of SMWU, and how utility managers and concerned organizations can continuously improve this process, are described in the following. In Chapter 3, the PIs of SMWU are identified based on a comprehensive and critical review. Given the growing challenges of the 21st century (e.g., global climate change, water resource limitations, population growth) and technological advances, the development of new PIs can certainly be expected in the future. In this regard, SMWU should continue the review process to include state-of-the-art PIs, which should be further investigated using the detailed selection method described in Chapter 4. The PIs are ranked using an MCDA method (ELECTRE) in Chapter 4. The decision makers selected the most useful PIs by encompassing them in a boundary. This selection was also influenced by the existing data limitations, which are addressed through the measurability criteria. A continuous performance assessment process with improved data management practices in SMWU is expected in the future. Consequently, the decision makers (utility managers) can revise their boundary to include additional PIs.
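The ELECTRE ranking mentioned above rests on pairwise concordance and discordance tests between alternatives. A simplified ELECTRE I style sketch is given below; the candidate PIs, criteria, scores, weights, and thresholds are hypothetical, and higher scores are assumed better on every criterion (the thesis's actual criteria set and parameters may differ).

```python
# Sketch of ELECTRE I style outranking (concordance/discordance tests) as
# used to rank performance indicators. PIs, criteria, scores, weights, and
# thresholds are hypothetical; higher scores are assumed better.

# rows = candidate PIs, columns = criteria (e.g. relevance, measurability, cost)
names = ["PI-A", "PI-B", "PI-C"]
scores = [
    [8.0, 7.0, 7.0],   # PI-A
    [6.0, 8.0, 5.0],   # PI-B
    [4.0, 5.0, 4.0],   # PI-C
]
weights = [0.5, 0.3, 0.2]
c_threshold, d_threshold = 0.6, 0.4   # concordance / discordance cut-offs

n_criteria = len(weights)
ranges = [max(s[k] for s in scores) - min(s[k] for s in scores)
          for k in range(n_criteria)]

def outranks(i: int, j: int) -> bool:
    """True if alternative i outranks alternative j under both tests."""
    # Concordance: total weight of criteria on which i is at least as good as j
    concord = sum(w for w, a, b in zip(weights, scores[i], scores[j]) if a >= b)
    # Discordance: worst normalized margin by which j beats i on any criterion
    discord = max(max(b - a, 0.0) / r
                  for a, b, r in zip(scores[i], scores[j], ranges))
    return concord >= c_threshold and discord <= d_threshold

for i in range(len(names)):
    for j in range(len(names)):
        if i != j and outranks(i, j):
            print(f"{names[i]} outranks {names[j]}")
```

The resulting outranking relation (here PI-A outranks PI-B and PI-C, and PI-B outranks PI-C) is what lets decision makers draw a boundary around the most useful indicators.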
Figure 8.1 Integrated framework for performance management of SMWU (the figure links five models: M1, review and identification of PIs (Chapters 2 and 3); M2, selection of PIs through MCDA ranking and expert scoring (Chapter 4); M3, inter-utility performance benchmarking 'IU-PBM' (Chapter 5); M4, intra-utility performance management 'In-UPM' (Chapter 6); and M5, risk based customer satisfaction management using RCA and FMEA; for each model the inputs, process, outputs, changes/improvements, and outcomes are shown, converging on the objective of a sustainable SMWU)

In Chapter 5, a comprehensive inter-utility performance benchmarking model (IU-PBM) for SMWU has been developed. In the absence of existing benchmarking data, transformation functions that convert the calculated PIs into performance scores have been developed with the help of the NWWBI-PR, literature, and expert opinion. It is expected that the level of service will improve with the participation of SMWU in the benchmarking process.
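A transformation function of the kind just described maps a measured indicator value onto a common performance-score scale so that heterogeneous PIs can be aggregated and benchmarked. A minimal piecewise-linear sketch follows; the breakpoints are hypothetical stand-ins, not the functions calibrated in this research.

```python
# Sketch: piecewise-linear transformation function mapping a calculated PI
# (e.g. water losses as % of system input volume) onto a 0-10 performance
# score. Breakpoints are hypothetical stand-ins for functions calibrated
# from benchmarking data, literature, and expert opinion.

def performance_score(value: float,
                      breakpoints: list[tuple[float, float]]) -> float:
    """Linear interpolation between (PI value, score) breakpoints,
    clamped at the first and last breakpoints."""
    pts = sorted(breakpoints)
    if value <= pts[0][0]:
        return pts[0][1]
    if value >= pts[-1][0]:
        return pts[-1][1]
    for (x0, s0), (x1, s1) in zip(pts, pts[1:]):
        if x0 <= value <= x1:
            return s0 + (s1 - s0) * (value - x0) / (x1 - x0)

# Lower water loss is better: 5% or less scores 10, 40% or more scores 0.
water_loss_fn = [(5.0, 10.0), (15.0, 7.0), (25.0, 4.0), (40.0, 0.0)]
print(performance_score(10.0, water_loss_fn))   # 8.5
```

Because every PI lands on the same 0–10 scale, scores for dissimilar indicators (losses, complaints, staffing ratios) can be weighted and aggregated into the component-level performance indices the model reports.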
This improvement should be accommodated by recalibrating the transformation functions established in this research. The practical benefits of this model can be realized through a web-based platform provided by a relevant governmental organization, where SMWU can obtain performance indices for their functional components. If all the functional components are performing 'High', it can be assumed that the utility will also obtain 'High' performance from the intra-utility performance management model (In-UPM) simulations. In this situation, the utility manager needs to check the risk of customer satisfaction using the model developed in Chapter 7. If one or more functional components of the utility are not performing 'High', the managers need to evaluate the performance of the underperforming functional components and sub-components at the utility level using the In-UPM developed in Chapter 6. Subsequently, if required, all the water supply systems operating within the utility also need to be evaluated for effective decision making. As a result of the performance benchmarking process, the universe of discourse (UOD) should also be revised with the inclusion of new PIs and PMs, or when the level of service improves. After managing all the functional components and WSSs of the utility, the last step is to assess the risk of customer satisfaction. In this regard, a detailed model has been developed and validated in Chapter 7. The model evaluates customer satisfaction using the record of complaints (work orders). Unlike conventional customer satisfaction assessment methods, the proposed method includes the observations and findings of the operational field staff in addition to the responses of customers. This approach helps identify the root causes of complaints and supports effective decision making on risk mitigation actions. Root cause analysis and fuzzy-FMEA techniques are used to cover the entire risk management process for CS in SMWU.
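The 'High'/'Medium'/'Low' checks above amount to mapping an aggregated performance score onto linguistic grades defined over a universe of discourse (UOD). A sketch, assuming a 0–10 UOD and triangular grade definitions (the thesis's actual granularity and membership shapes may differ):

```python
# Sketch: classifying an aggregated performance score into linguistic grades
# over a 0-10 universe of discourse (UOD). The triangular grade definitions
# below are illustrative assumptions, not the thesis's calibrated ones.

GRADES = {            # (a, b, c) triangular fuzzy numbers on the UOD
    "Low":    (0.0, 0.0, 5.0),
    "Medium": (2.5, 5.0, 7.5),
    "High":   (5.0, 10.0, 10.0),
}

def memberships(score: float) -> dict[str, float]:
    """Membership degree of a score in each linguistic grade."""
    out = {}
    for grade, (a, b, c) in GRADES.items():
        if score < a or score > c:
            mu = 0.0
        elif score <= b:
            mu = 1.0 if a == b else (score - a) / (b - a)
        else:
            mu = 1.0 if b == c else (c - score) / (c - b)
        out[grade] = mu
    return out

mu = memberships(6.0)
print(mu)                      # partly Medium (0.6), partly High (0.2)
print(max(mu, key=mu.get))     # dominant grade: "Medium"
```

Revising the UOD when new PIs are added or the level of service improves then simply means re-specifying these grade boundaries rather than rebuilding the model.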
The results of this research revealed that about 50% of the cumulative risk can be reduced with affordable inspection and maintenance actions, while the remaining risk reduction requires large scale improvements in infrastructure facilities. The model was found to be useful for SMWU, which cannot conduct detailed interview surveys due to limited resources.

There are several socio-economic benefits of this research. First, the benchmarking model developed provides the basis to initiate this process among utilities in any region with similar geographic and socio-economic conditions, despite the existing data limitations in SMWU. Secondly, the integrated performance management framework can be effectively used for future planning and decision making for the optimal utilization of limited resources in SMWU. Without comparing performance before and after the implementation of decision actions, a utility cannot assess the impacts of improvement actions on performance and thus cannot rationally justify the benefits of its spending. In general, utility managers in SMWU take decisions without conducting a detailed performance assessment; they certainly need a quantitative rationale for actions geared towards improving the decision making process. The results of the models developed in this research can help utility managers at the strategic level obtain financial approvals from government agencies, and can satisfy their customers and the general public as well. Thirdly, in addition to identifying underperforming processes, such quantitatively demonstrated results can motivate managers to sustain their efforts by showing the processes with high performance.

8.2 Originality and Contribution

The main contribution of this research is to guide managers of small to medium sized water utilities in enhancing performance at the component, system, and utility levels.
The developed framework has the potential to be successfully implemented in SMWU, which consistently function under limited resources. This research has developed innovative tools that transform a complex decision-making process into a systematic decision strategy. The outcome of this research will help SMWU to: i) identify the underperforming functional components and suggest relevant corrective actions, and ii) manage customer satisfaction with efficient inventory management and data analyses.

8.3 Limitations and Recommendations

Due to limited data availability in SMWU, the performance management models have been developed to analyze a snapshot condition over a given assessment period (e.g., quarterly, semi-annual, or annual). However, the modeling approach used for the In-UPM can accommodate data of a dynamic nature, when such data become available in the future.

For future research, the following recommendations are made:

- The PIs identified and selected provide guidelines to initiate and/or improve the performance assessment process of SMWU using appropriate PIs. Consistent review and improvement of the selected PIs over time is recommended, as per the site specific requirements of the utilities under study, including changes in international standards and environmental protocols and increasing customer expectations. The variability in decision making for the selection of PIs can also be addressed in future research.

- The benchmarking relationships and the reference system developed in this work (in the absence of extensive SMWU data) cannot replace actual benchmarking and performance management processes involving similar sized utilities in the same region participating for several years. These relationships and the reference system need to be re-calibrated through a continuous benchmarking process. The current IU-PBM cannot handle the associated uncertainties and needs to be enhanced in future work.
 The proposed hierarchical framework of In-UPM is flexible to include additional performance factors. It is recommended that with expected changes in infrastructure, availability of additional data and increased participation of SMWU in national benchmarking process in future, additional data/ decisions variables, PIs, and performance measures should be included to further facilitate in the decision making process.  If sufficient resources are available, the surveys based on customer interviews should never be overlooked for managing customer satisfaction. Such surveys might be inevitable, particularly to determine customers willingness-to-pay for the improvement in infrastructure such as installation of conventional water treatment plants which can drastically increase water rates. 198 References Abdelgawad, M., and Fayek, A.R. 2010. Risk Management in the construction industry using combined fuzzy FMEA and fuzzy AHP. Journal of Construction Engineering and Management, 136:1028-1036. ACWUA. 2010a. Key performance indicators and benchmarking for water utilities in the MENA/Arab Region. Arab countries water utilities association (ACWUA) 1st regional training course, supported by Alexandria water company, Alexandria, Egypt. ACWUA. 2010b. Key performance indicators and benchmarking for water utilities in the MENA/Arab Region. Arab countries water utilities association, 1st Arab Water Week, December 8-9, 2010, Amman, Jordan. ADB. 2012. Handbook for selecting performance indicators for ADB-funded projects in the PRC. Asian Development Bank, cited on July 2012. www.adb.org/prc. AECOM. 2014. Water utility master plan. Final report prepared for the District of West Kelowna, Kelowna, BC, Canada, cited on 17th July 2014. http://www.districtofwestkelowna.ca/Modules/ShowDocument.aspx?documentid=12962. AECOM. 2013. National water and wastewater benchmarking initiative - 2013 Public Report, Canada. www.nationalbenchmarking.ca AECOM. 2012. 
National water and wastewater benchmarking Initiative - 2012 Public Report, Canada. www.nationalbenchmarking.ca Al-Assa’d, T., Sauer J. 2010. The performance of water utilities in Jordan. Water Science and Technology, 62.4: 803-808. Alegre, H. 2010. Is strategic asset management applicable to small and medium utilities? Water Science and Technology-WST, 62.9:2051-2058. Alegre, H. 1999. Performance indicators for water supply systems. E. Cabrera and J. Garcla-Serra (eds). Drought Management Planning in Water Supply Systems, 148-178. Alegre, H., Hirner, W., Baptista, J.M., and Parena, R. 2000. Performance indicators for water supply services. Manual of Best Practice Series, IWA Publishing, London. Alegre, H., Bapista, J.M., Cabrera, E.Jr., Cubillo, F., Duarte P., Hirner, W., Merkel, W., and Parena, R. 2006. Performance indicators for water supply services. Manual of Best Practice Series, IWA Publishing, London, UK. Alegre H., Coelho S.T. 2012. Infrastructure asset management of urban water systems. Chapter 3, Water Supply System Analysis – Selected Topics, Book edited by Avi Ostfeld, INTEC. 199 Alegre, H., Cabrera, E., and Merkel, W. 2009. Performance assessment of urban utilities: the case of water supply, wastewater and solid waste. Journal of Water Supply: Research and Technology-AQUA, 58(5):305-315. AWWA. 2004. Selection and definition of performance indicators for water and wastewater utilities. Water://Stats 2002 Distribution Survey. Denver, CO: American Water Works Association, USA. Antonioli, B., and Filippini, M. 2001. The use of a variable cost function in the regulation of the Italian water industry. Utilities Policy, 10(3-4):181-187. Anton, J.M., and Grau, J.B. 2004. Madrid-Valencia high-speed rail line: A route selection. Proceedings of the Institution of Civil Engineers, Transport, 157(3):153-161. Arscott, A., and Grimshaw, F. 1996. Evaluating investment in the water network on leakage detection. 
In IIR conference on cost effective management of water pipelines and networks, London, 24-25. Artley, W., and Stroh, S. 2001. The performance based management handbook – A six volume compilation of techniques and tools for implementing the government performance and results act of 1993-Establishing and integrated performance measurement system. Volume 2, Performance-Based Management Special Interest Group (PBM SIG), USA. AWWA. 2008. Benchmarking - performance indicators for water and wastewater utilities: 2007 annual survey data and analysis report. American Water Works Association, USA. AWWA. 2007. Distribution system inventory, integrity and water quality. Office of groundwater and Drinking Water, Total Coliform Rule Issue Paper, USEPA, USA. AWWA. 2004. Water://Stats 2002 Distribution Survey. Denver, CO: AWWA. AWWA. 2004. Selection and definition of performance indicators for water and wastewater utilities. Water://Stats 2002 Distribution Survey. Denver, CO: American Water Works Association, USA. AWWA. 1996. Performance benchmarking for water utilities. American Research Foundation and American Water Works Association, Denver, USA. Ayoob, S., and Gupta, A.K. 2007. Fluoride in drinking water: a review on the status and stress effects. Critical Reviews in Environmental Science and Technology, 36(6):433-487. Bari, M.A., Berti, M.L., Charles, S.P., Hauck, E.J., and Pearcey, M. 2005. Modelling of stream flow reduction due to climate change in Western Australia – A case study. International Congress on Modelling and Simulation, 12-15 December 2005, Modelling and Simulation Society of Australia and New Zealand, Melbourne, 482-488. BCMOH. 2012. Water system assessment user’s guide: Appendices. Version 1.0, Health Protection Branch, B.C. Ministry of Health, British Columbia, Canada. BCMoE. 1997. 
Ambient water quality criteria for turbidity, suspended and benthic sediments, Technical Appendix, British Columbia Ministry of Environment, Canada, cited on 15th May 2014, http://www.env.gov.bc.ca/wat/wq/BCguidelines/turbidity/turbiditytech.pdf. 200 Benayoun, R., Roy B., and Sussman N. 1966. Manual de reference du programme electre, Note de Synthese et. Formation n.25, Paris: Direction scientifique SEMA. Berg, C., and Danilenko, A. 2011. The IBNET water supply and sanitation performance Blue Book, The International Benchmarking Network for Water and Sanitation Utilities Data book, Water and Sanitation Program, The World Bank, Washington D.C., 58849. Benson, A.S., Dietrich A.M., and Galagher, D.L. 2011. Evaluation of iron models for water distribution system. Critical Reviews in Environmental Science and Technology, 42(1):44-97. Blackhurst, J.V., Scheibe K.P., Johnson, D.J. 2008. Supplier risk assessment and monitoring for the automotive industry. International Journal of Physical Distribution and Logistics Management, 38(2):143 – 165. Bodimeade, C., and Renzetti S. 2013. Full-cost rates for water and the chimera of “Affordability”. Water Canada, cited on 23rd August 2014. http://watercanada.net/2013/full-cost-rates-for-water-and-the-chimera-of-affordability/ Braden, J.B., and Mankin, P.C. 2004. Economic and financial management of small water systems: Issue introduction. Journal of Contemporary Water Research and Education, 128:1-5. Brown, C.E. 2004. Making small water systems strong. Journal of Contemporary Water Research & Education, 128:27-30. Butler, M., and West, J. 1987. Leakage prevention and system renewal. In Pipeline management seminar, 1987, Pipeline Industries Guild. Carbone, T.A., and Tippett, D.D. 2004. Project risk management using the project risk FMEA. Engineering Management Journal, 16(4):28-35. Cousins, R., Mackay, C., Clarke, S., Kelly, C., Kelly, P., and McCaig, R. 2004. 
Management standards’ and work-related stress in the UK: Practical development. Work and Stress, 18:113–136. CCME. 2004. From source to tap: Guidance on the multi-barrier approach to safe drinking water. Canadian Council of Ministers of the Environment, Manitoba, Canada. CCME. 1999. Canadian water quality guidelines for the protection of aquatic life – Reactive chlorine species. Canadian Council of Ministers of the Environment, Winnipeg. CCPPP. 2003. Civil infrastructure systems – Technology road map 2003-2013. A national consensus on preserving Canadian community lifelines, The Canadian Council of Public-Private Partnership, cited on 15th January 2015, http://www.pppcouncil.ca/pdf/trm_062003.pdf. CDM. 2011. Water efficiency potential study for Wisconsin. Prepared for the Public Service Commission of Wisconsin and Wisconsin Department of Natural Resources, Wisconsin , USA. Chartres, C., and Williams, J. 2006. Can Australia overcome its water scarcity problems. Journal of Development in Sustainable Agriculture, 1:17-24. 201 CIRC. 2012. Canadian infrastructure report card, Volume 1: 2012 Municipal Roads and Water Systems, cited on 25th September 2014, http://www.canadainfrastructure.ca/. Corton, M.L., and Berg S.V. 2009. Benchmarking Central American water utilities. Utilities Policy, 17(2009):267–275. Corton, M.L., and Berg S.V. 2007. Benchmarking Central American water utilities. Public Utility Research Center, Final Report. University of Florida. Coelli, T.J., Estache, A., Perelman, S. and Trujillo, L. 2003. A primer on efficiency measurement for utilities and transport regulators, World Bank Publications. Coelho, S.T. 1997. Performance in water distribution – a system approach. Jhon Wiley and Sons Inc., New York, USA. Correia, T., Brochado, A., and Marques, R.C. 2008. Benchmarking the performance of Portuguese water utilities. Performance Assessment of Urban Infrastructure Services, Edited by Cabrera E. Jr. & Pardo M.Á., IWA Publishing, Great Britain, 273-283. 
Coulibaly, H.D., and Rodriguez, M.J. 2004. Development of performance indicators for small Quebec drinking water utilities. Journal of Environmental Management, 73:243-255. Coutinho-Rodrigues, J., Simao, A., and Antunes, C.H. 2011. A GIS-based multicriteria spatial decision support system for planning urban infrastructures. Decision Support Systems, 51(3): 720-726. Criminisi, A., Fontanazza, C.M., Freni, G., and Loggia, G.La. 2009. Evaluation of the apparent losses caused by water meter under-registration in intermittent water supply. Water Science and Technology, 60(9):2373-2382. Cromwell, J., G. Nestel, R. Albani, L. Paralez, A. Deb, and F. Grablutz. 2001. Financial and economic optimization of water main replacement programs. Denver, CO: AWWA Research Foundation. CSA. 2010. Activities relating to drinking water and wastewater services – Guidelines for the management of drinking water utilities and for the assessment of drinking water services, CAN/CSA-Z24512-10, National Standard of Canada, ISO, Standards Council of Canada & Canadian Standards Association. CABAŁA, P. 2010. Using the Analytic Hierarchy Process in evaluating decision alternatives, Operations Research and Decisions, 1(2010):5-25. CWA. 2011. National greenhouse account factors. Department of Climate Change and Energy Efficiency, Australian Government. CWW. 2013. Columbus Water Works, Five Year Strategic Plan FY 2013-2017, cited on 14th September 2014, www.cwwga.org. 202 CWWA. 2009. Water conservation and efficiency performance measure and benchmarks within the municipal sector; An introduction of current practices and assessment of the feasibility of expanding their use, Municipal water conservation efficiency performance measures and benchmarks, Canadian Water Works Association, A report to the Ontario Ministry of Environment, Ontario, Canada. CWWA. 1997. 
Municipal water and wastewater infrastructure: Estimated Investment Needs 1997 to 2012, A Report Partially Sponsored by the Canada Mortgage and Housing Corporation, Canadian Water and Wastewater Association, Canada. Defaria, A. L., and Alegre, H. 1996. Paving the way to excellence in water-supply systems -a national framework for levels of service assessment based on consumer satisfaction. Journal of Water Supply Research and Technology-Aqua, 45(1):1-12. Dembe, A.E., Erickson, J.B., Delbos, R.G., and Banks, S.M. 2005.The impact of overtime and long work hours on occupational injuries and illnesses: new evidence from the United States. Occupational Environmental Medicine, 62:588-597. Donald, P.R., and Dorothy, L.R. 1977. Actue toxicity of residual chlorine and ammonia to some native Illinois fishes, Report of Investigation 85. State of Illinois, Department of Registration and Education, Illinois State Water Survey, Urbana. DWPA. 2013. Drinking water protection act - Drinking water protection regulation, B.C. Reg. 200/2003, O.C. 508/2003, Victoria, BC, Canada, cited on 8th May 2013, http://www.bclaws.ca/EPLibraries/bclaws_new/document/ID/freeside/200_2003. Dziegielewski, B., and Bik, T. 2004. Technical Assistance Needs and Research Priorities for Small Community Water Systems. Journal of Contemporary Water, 13-20. El-Baroudy, I., and Simonovic, S.P. 2006. Application of the fuzzy performance measures to the City of London water supply system. Canadian Journal of Civil Engineering, 33:255–265. Farley, M., and Trow, S. 2003. Losses in water distribution networks, London, UK: IWA Publishing. FCM and NRC. 2005. Decision Making and Investment Planning: Managing Infrastructure Assets, Federation of Canadian Municipalities and National Research Council, Ottawa, Ontario. Fisher, L., Kastl G., Sathasivan, A., and Jegatheesan, V. 2011. Suitability of chlorine bulk decay models for planning and management of water distribution system. 
Critical Reviews on Environmental Science and Technology, 41(20):1843-1882. Figueira, J., Mousseau, V., and Roy, B., 2005. ELECTRE methods - Multiple criteria decision analysis: State of the Art Surveys, International Series in Operations Research & Management Science. New York: Springer Science + Business Media, Inc. Ford, T., Rupp, G., Butter, F.P., and Camper, A. 2005. Protecting public health in small water systems, Report of an International Colloquium, Montana Water Center, USA. 203 Francisque, A., Shahriar, A., Islam N., Betrie, G., Siddiqui R.B., Tesfamariam S., and Sadiq R. 2014. A decision support tool for water mains renewal for small to medium sized utilities: a risk index approach. Journal of Water Supply: Research and Technology—AQUA, 63(4):281-302. Galar, D., Berges, L., Sandborn, P., and Kumar, U. 2014. The need for aggregated indicators in performance asset management. Eksploatacja I Niezawodnosc – Maintenance and Reliability, 16(1):120-127. Gang, D.C., Clevenger T. E., and Banerji S. K. 2003. Modeling chlorine decay in surface water. Journal of Environmental Informatics, 1(1):21- 27. Giff, G.A., and Crompvoets, J. 2008. Performance indicators a tool to support spatial data infrastructure assessment, Computers. Environment and Urban Systems, 32:365-376. Gurung, T.R., Stewart, R.A., Beal C.D., and Sharma, A.K. 2015. Smart meter enabled water end-use demand data: platform for the enhanced infrastructure planning of contemporary urban water supply networks. Journal of Cleaner Production, 87(2015):642-654. Haider, A. 2007. Information systems for engineering and infrastructure asset management, Dissertation University of South Australia, Adelaide. Haider, H., Sadiq, R., and Tesfamariam, S. 2014a. Performance indicators for small and medium sized water supply systems: A Review. Environmental Reviews, 22(1):1-40. Haider, H., Sadiq, R., and Tesfamariam, S. 2015a. 
Selecting performance indicators for small to medium sized water utilities: Multi-criteria analysis using ELECTRE method. Urban water Journal, 12(4):305-327. Haider, H., Sadiq, R., and Tesfamariam, S. 2014c. Performance assessment framework for small to medium sized water utilities – A case for Okanagan Basin. The proceedings of the Canadian Society of Civil Engineers (CSCE) General Conference (2014), Halifax, NS, Canada. Haider, H., Sadiq, R., and Tesfamariam, S. 2015b. Inter-utility performance benchmarking model (IU-PBM) for small to medium sized water utilities: Aggregated performance indices. ASCE’s Journal of Water Resources Planning and Management, doi: 10.1061/(ASCE)WR.1943-5452.0000552. Haider, H., Sadiq, R., and Tesfamariam, S. 2015c. Intra-utility performance management model (In-UPM) for the sustainability of small to medium sized water utilities: Conceptualization to development. Journal of Cleaner Production. Under review, Submitted on 14.01.2015. Haider, H., Sadiq, R., and Tesfamariam, S. 2015d. Customer satisfaction management model for small to medium sized water utilities: A risk-based approach. Reliability Engineering and System Safety, Under review, Submitted on 13.03.2015. Haider, H., Sadiq, R., and Tesfamariam, S. 2015e. Multilevel performance management framework: A case of small to medium sized water utilities in BC, Canada. Canadian Journal of Civil Engineering, Under Review, Submitted on 17.05.2015. Hamilton, S., Mckenzie, R., and Seago, C. 2006. A review of performance indicators for real loses from water supply systems, UK House of Commons Report. 204 Hamilton, P. A., Miller, T. L., and Myers, D. N. 2004. Water quality in the nation’s streams and aquifers–Overview of selected findings, 1991-2001. USGS Circular 1265. Reston, VA, U.S. Geological Survey. Hanson, J.J., Murrill, S.D. 2013. South Tahoe public utility district 2012 customer satisfaction and perceptions survey report of results, Meta Research, Inc., California, US. 
Health Canada. 2013. Cited on 15th June 2013, http://www.hc-sc.gc.ca/ewh-semt/pubs/water-eau/nitrate_nitrite/index-eng.php. Health Canada. 2012. Guidelines for drinking water quality. Federal Provincial Territorial Committee on Drinking Water, Ottawa, Ontario, Canada. Helton, J.C., Johnson, J.D., Sallaberry, C.J., and Storlie, C.B. 2006. Survey of sampling-based methods for uncertainty and sensitivity analysis, Reliability Engineering and System Safety, 91(10-11): 1414-1434. Hirner, W., and Lambert, A. 2000. Losses from water supply systems: Standard Terminology and Recommended Performance Measures. International Water Association, cited on 15th August 2012, www.iwahq.org.uk/bluepages. Hokstad, P., Røstum, J., Sklet, S., Rosén, L. 2009. Methods for risk analysis of drinking water systems from source to tap. Techneau: An Integrated Project Funded by the European Commission. Hwang, C.L., and Yoon, K. 1981. Multiple attribute decision making: Methods and applications. Berlin/ Heidelberg/New York: Springer-Verlag. IDB. 2014. AquaRating – An overview. Inter-American Development Bank in collaboration with International Water Association, (Feb. 26, 2015). IIMM. 2006. International infrastructure management manual, Association of Local Government Engineering NZ Inc, National Asset Management Steering Group, New Zealand, Thames, ISBN 0-473-10685-X. Interior Health Canada. 2013. Cited on 28th June 2013, http://www.interiorhealth.ca/YourEnvironment/InspectionReports/Pages/WaterNotifications.aspx Kabir, G., Sadiq R., and Tesfamariam S. 2013. A review of multi-criteria decision-making methods for infrastructure management. Structure and Infrastructure Engineering: Maintenance, Management. Life-cycle Design and Performance, doi: 10.1080/15732479.2013.795978. Kalulu, K., and Hoko, Z. 2010. Assessment of the performance of a public water utility: A case study of Blantyre Water Board in Malawi. Physics and Chemistry of the Earth, 35:806-810. Kanakoudis, V., and Tsitsifli, S. 2010. 
Results of an urban water distribution network performance evaluation attempt in Greece. Urban Water Journal, 7(5):267-285.
KWR. 2008. Consumer satisfaction, preference and acceptance regarding drinking water services: An overview of literature findings and assessment methods. Kiwa Water Research, The Netherlands.
Lafferty, A.K., and Lauer, W.C. 2005. Benchmarking – Performance indicators for water and wastewater utilities: Survey data and analysis report. American Water Works Association (AWWA), USA.
Lambert, A. 2003. Assessing non-revenue water and its components: A practical approach to water loss reduction. The IWA Water Loss Task Force, Water 21, Article No 2.
Lambert, A., Charalambous, B., Fantozzi, M., Kovac, J., Rizzo, A., and St John, S.G. 2014. 14 years' experience of using IWA best practice water balance and water loss performance indicators in Europe. Paper presented at the IWA WaterLoss 2014 Conference, 30th April to 2nd May, Vienna.
Lambert, A.O. 2002. International report: Water losses management and techniques. Water Supply, 2(4):1-20.
Lambert, A., and Hirner, W.H. 2000. Losses from water supply systems: Standard terminology and performance measures. IWSA Blue Pages.
Lambert, A.O., and McKenzie, R.D. 2002. Practical experience in using the Infrastructure Leakage Index. Proceedings of IWA Conference – Leakage Management: A Practical Approach, Lemesos, Cyprus.
Lambert, A.O., Brown, T.G., Takizawa, M., and Weimer, D. 1999. A review of performance indicators for real losses from water supply systems. Journal of Water Science Research and Technology – Aqua, 48(2):227-237.
Lambert, A., and Morrison, J.A.E. 1996. Recent developments in application of ‘bursts and background estimates’ concepts for leakage management. Water Environmental Management, 10(2):100-104.
Lambert, A., and Taylor, R. 2010. Water loss guidelines. Water New Zealand.
Lawton, R.W. 1997. A customer-based quality-of-service approach for regulating water utilities. The National Regulatory Research Institute, The Ohio State University, Ohio, USA.
Lee, M.C. 2010. The analytic hierarchy and the network process in multicriteria decision making: Performance evaluation and selecting key performance indicators based on ANP model. Convergence and Hybrid Information Technologies, INTECH, Croatia.
Liemberger, R. 2002. Do you know how misleading the use of wrong performance indicators can be? IWA Specialized Conference, Leakage Management – A Practical Approach, Cyprus.
Liemberger, R., and McKenzie, R. 2005. Accuracy limitations of the ILI: Is it an appropriate indicator for developing countries? Conference Proceedings, IWA Leakage 2005 Conference, Halifax, Nova Scotia, Canada.
Lindhe, A., Rosén, L., Norberg, T., and Bergstedt, O. 2009. Fault tree analysis for integrated and probabilistic risk analysis of drinking water systems. Water Research, 43(2009):1641-1653.
Liu, Z., Kleiner, Y., and Rajani, B. 2012. Condition assessment technologies for water transmission and distribution system. Office of Research and Development, National Risk Management Research Laboratory – Water Supply and Water Resources Division, USEPA, EPA/600/R-12/017.
Liu, Z., Sadiq, R., Rajani, B., and Najjaran, H. 2010. Exploring the relationship between soil properties and deterioration of metallic pipes using predictive data mining methods. Journal of Computing in Civil Engineering, 24(3):289.
Loganathan, G.V., Park, S., Agbenowosi, N., and Sherali, H.D. 2001. A threshold break rate for scheduling optimal pipeline replacement. World Water Congress 2001, ASCE, 111, 407-407.
Lounis, Z., Vanier, D.J., Daigle, L., Sadiq, R., and Almansor, H. 2010.
Framework for assessment of state, performance and management of Canada’s core public infrastructure. Infrastructure Canada, Ottawa, ON, Canada.
Malaysian Water Association. 1996. Performance indicators for water supply: A proposal from the Malaysian Water Association for the consideration of member countries of ASPAC.
Mamdani, E.H. 1977. Application of fuzzy logic to approximate reasoning using linguistic synthesis. IEEE Transactions on Computers, C-26(12):1182-1191.
Maras, J. 2004. Economic and financial management capacity of small water systems. Journal of Contemporary Water Research and Education, 128:31-34.
Marques, R.C., and Monteiro, A.J. 2001. Application of performance indicators in water utilities management – a case-study in Portugal. Water Science and Technology, 44(2-3):95-102.
Marques, R., and De Witte, K. 2010. Towards a benchmarking paradigm in the European public water and sewerage services. Public Money and Management, 30(10):42.
Marzouk, M., Hamid, S.A., and El-Said, M. 2015. A methodology for prioritizing water mains rehabilitation in Egypt, Housing and Building National Research Center. HBRC Journal, 11(1):114-128.
Matos, M.R., Bicudo, J.R., and Alegre, H. 1993. Technical and social-economical indicators in the scope of water supply, wastewater and solid waste systems. In: Preparatory study for the definition of selection criteria for environmental projects to be funded by the European Communities, Vol. 2, project contracted by the European Commission, report 107/93, LNEC, Lisbon, Portugal.
McIntosh, A.C., and Yñiguez, C. 1997.
Second water utilities data book – Asian and Pacific Region. Asian Development Bank Regional Technical Assistance No. 5694, Manila, Philippines.
McKenzie, R., and Lambert, A. 2002. Development of a simple and pragmatic approach to benchmark real losses in potable water distribution systems in South Africa: BENCHLEAK. Report TT159/01, South African Water Research Commission.
MHLS. 2010. Comprehensive drinking water source-to-tap assessment guideline, Module 1: Delineate and characterize drinking water source(s). Ministry of Healthy Living and Sport, British Columbia, Canada.
Milani, A.S., Shanian, A., and El-Lahham, C. 2006. Using different ELECTRE methods in strategic planning in the presence of human behavioral resistance. Journal of Applied Mathematics and Decision Sciences, 2006:1-19.
Mitchell, V.W. 1999. Consumer perceived risk: Conceptualizations and models. European Journal of Marketing, 33(1/2):163-195.
Mitrich, S. 1999. Price and subsidy policies for urban public transport and water utilities in transition economies. Working Paper No. 5, The World Bank, Washington, D.C.
Mkhitaryan, L. 2009. Towards performance based utility sector in Armenia: Case of drinking water supply scenario. A program of EURASIA Partnership, financed by the Carnegie Corporation of New York, Grants to Support Social Science and Policy-Oriented Research, Caucasus Research Resource Centers (CRRC), Armenia, Research # C08-0198.
MMWR. 1999. Public health dispatch – Outbreak of Escherichia coli O157:H7 and Campylobacter among attendees of the Washington County Fair, New York. MMWR, 48(36):803-804.
Mutikanga, H.E., Sharma, S.K., and Vairavamoorthy, K. 2009. Apparent water losses assessment: The case of Kampala city, Uganda. International Conference of Water Loss-2009, IWA, Cape Town, South Africa, 1:36-42.
Nikl, L., and Nikl, D. 1992. Environmental impacts associated with monochloramine-disinfected municipal potable water. National Conference on Drinking Water, Winnipeg.
NRC. 2010. Framework for assessment of state, performance and management of Canada’s core public infrastructure. National Research Council Canada, Final Report B5332.5.
NRC. 1995. Committee on measuring and improving infrastructure performance. National Research Council, National Academy Press, Washington, D.C., ISBN 0-309-050987.
NWC. 2012. National performance report 2010-2011: Urban water utilities. National Water Commission, Australian Government.
NWC/DoE. 1980. Leakage control policy and practice. National Water Council, NWC/DoE, London. Cited on 12th Sep 2014, http://dwi.defra.gov.uk/research/completed-research/reports/dwi0190.pdf.
Nurnberg, W.H. 2001. Performance assessment of water supply systems. German National Report, IWA World Water Congress, Berlin 2001.
OBWB. 2014. Okanagan Basin Water Board, cited on 2nd December 2014, http://www.obwb.ca/wsd/about/state-of-the-basin
OFWAT. 2012. Key performance indicators – guidance. Cited on August 2012, www.ofwat.gov.uk.
OFWAT. 2010. Service and delivery – performance of the water companies in England and Wales 2009-10. Cited on July 2012, www.ofwat.gov.uk.
OFWAT. 2010a. Financial performance and expenditure of the water companies in England and Wales 2009-10. Cited on August 2012, www.ofwat.gov.uk.
OFWAT. 2003. Regulating economic level of leakage in England and Wales. World Water Week, Washington, DC.
Ojo, V.O. 2011. Customer satisfaction: A framework for assessing the service quality of urban water service providers in Abuja, Nigeria. Doctoral thesis, School of Civil and Building Engineering, Loughborough University, Loughborough.
OPM. 1990. Health improvement health service planning kit. Office of Public Management (OPM), New South Wales, Australia. Cited on 25th August 2012, http://www.audit.wa.gov.au/pubs/ASD_2-99_98.
PAGS. 2008. Water utility service quality monitoring for water systems in Armenia. USAID, Program Report No. 93.
Palme, U., and Tillman, A. 2008.
Sustainable development indicators: How are they used in Swedish water utilities? Journal of Cleaner Production, 16:1346-1357.
Pang, J., Zhang, G., and Chen, G. 2011. ELECTRE I decision model of reliability design scheme for computer numerical control machine. Journal of Software, 6(5):894-900.
Parfitt, B., Baltutis, J., and Brandes, O.M. 2012. From stream to steam: Emerging challenges for BC’s interlinked water and energy resources. Canadian Center of Policy Alternatives (CCPA), Policy Project on Ecological Governance, University of Victoria.
PWWA. 2012. Pacific water and wastewater utilities – Benchmarking report. Pacific Water and Wastewater Association, World Bank.
QUU. 2012. Queensland urban utilities – Customer service standards. Queensland Urban Utilities, Brisbane, Australia.
Radivojević, D., Milićević, D., and Blagojević, B. 2008. IWA best practice and performance indicators for water utilities in Serbia – Case study Pirot. Facta Universitatis, 6(1):37-50.
Rahill-Marier, B., and Lall, U. 2010. America’s water: An exploratory analysis of municipal water survey. American Water Works Association (AWWA), United States, (Mar. 31, 2015).
Ritchie, B., and Brindley, C. 2007. Supply chain risk management and performance: A guiding framework for future development. International Journal of Operations and Production Management, 27(3):303-322.
Rosén, L., Bergstedt, O., Lindhe, A., Pettersson, T.J.R., Johansson, A., and Norberg, T. 2008. Comparing raw water options to reach water safety targets using an integrated fault tree model. Paper presented at the International Water Association Conference, Water Safety Plans: Global Experiences and Future Trends, Lisbon, 12-14 May.
Ross, T. 2004. Fuzzy logic with engineering applications. Second ed., John Wiley & Sons, New York.
Rouxel, A., Brofferio, S., and Guerin-Schneider, L. 2008. Performance indicators and customer management: ACEA benchmarking experiences in water services in Latin America.
Journal of Water Supply Research and Technology – AQUA, 57(4):273-278.
Saaty, T.L. 1980. The analytic hierarchy process. McGraw-Hill, New York.
Saaty, T.L., and Vargas, L.G. 1991. Prediction, projection and forecasting. Kluwer Academic Publishers, Dordrecht.
Sadiq, R., Rodríguez, M.J., and Tesfamariam, S. 2010. Integrating indicators for performance assessment of small water utilities using ordered weighted averaging (OWA) operators. Expert Systems with Applications, 37(2010):4881-4891.
Sadiq, R., Kleiner, Y., and Rajani, B. 2009. Q-WARP: Proof-of-concept model to predict water quality failures in distribution pipe networks. Water Research Foundation, Denver, CO, USA.
Sadiq, R., and Khan, F.I. 2006. An integrated approach for risk-based life cycle assessment and multi-criteria decision-making: Selection, design and evaluation of cleaner and greener processes. Business Process Management Journal, 12(6):770-792.
Sharma, S. 2008. Performance indicators of water losses in water distribution system. UNESCO-IHE Institute for Water Education, The Netherlands.
Sindhe, V.R., Hirayama, N., Mugita, A., and Itoh, S. 2013. Revising the existing performance indicator system for small water supply utilities in Japan. Urban Water Journal, 10(6):377-393.
Singh, M.R., Mittal, A.K., and Upadhyay, V. 2014. Efficient water utilities: Use of performance indicator system and data envelopment analysis. Water Science and Technology, 14(5):787-294.
Stahre, P., Adamsson, J., and Mellstrom, G. 2008. A new approach for assessment of the performance of water distribution and sewerage networks. In: Performance Assessment of Urban Infrastructure Services, edited by Cabrera, E. Jr., and Pardo, M.Á., IWA Publishing, Great Britain.
Statistics Canada. 2009. Environment accounts and statistics division, special tabulation. Table 3.4: Drinking water plants and sewerage treatment plants in Canada, by population served, 2009. Cited on 28th Feb 2015, www.statca.gc.ca/pub/16-201-x/2010000/t236-eng.htm.
Statistics Canada. 2013. Income per capita. Sources: Statistics Canada, OECD, The Conference Board of Canada, cited on 25th August 2014, http://www.conferenceboard.ca/hcp/provincial/economy/income-per-capita.aspx.
Stinchcombe, K. 2013. Affordability of water rates is not incompatible with full cost pricing. Econics, cited on 23rd August 2014, http://www.econics.com/affordability-of-water-rates-is-not-incompatible-with-full-cost-pricing/.
Stoeckel, K., and Abrahams, H. 2007. Water reform in Australia: The National Water Initiative and the role of the National Water Commission. In: Hussey, K., and Dovers, S. (eds.), Managing water for Australia: The social and institutional challenges. CSIRO Publishing, Collingwood, Victoria.
Stone, S., Dzuray, E.J., Meisegeier, D., Dahlborg, A.S., and Erickson, M. 2002. Decision-support tools for predicting the performance of water distribution and wastewater collection systems. USEPA.
Sun, C. 2010. A performance evaluation model by integrating fuzzy AHP and fuzzy TOPSIS method. Expert Systems with Applications, 37(2010):7745-7754.
Tesfamariam, S., and Sadiq, R. 2006. Risk-based environmental decision-making using fuzzy analytic hierarchy process (F-AHP). Stochastic Environmental Research and Risk Assessment, 21:35-50.
Theuretzbacher-Fritz, H., Schielein, J., Kiesl, H., Kölbl, J., Neunteufel, R., and Perfler, R. 2005. Trans-national water supply benchmarking: The cross-border co-operation of the Bavarian EFFWB project and the Austrian OVGW project. Water Supply, 5(6):273-280.
Toor, S.R., and Ogunlana, S.O. 2010. Beyond the ‘iron triangle’: Stakeholder perception of key performance indicators (KPIs) for large scale public sector development projects. International Journal of Project Management, 28:228-236.
Tynan, N., and Kingdom, B. 2002. A water scorecard: Setting performance targets for water utilities. Public policy for the private sector, Note No. 242, The World Bank, Washington, DC.
Ugwu, O.O., and Haupt, T.C. 2007.
Key performance indicators and assessment methods for infrastructure sustainability – a South African construction industry perspective. Building and Environment, 42(2):665-680.
UNC. 2013. Environmental Finance Center, cited on 22nd August 2013, http://efc.boisestate.edu/SustainableInfrastructure/FinancialSustainability/tabid/150/Default.aspx
USAID. 2008. Water utility service quality monitoring for water systems in Armenia. United States Agency for International Development, Program Report No. 93, PA Government Services Inc.
USEPA. 2013. National drinking water regulations, drinking water contaminants. United States Environmental Protection Agency, cited on 24th January 2013, http://water.epa.gov/drink/contaminants/index.cfm#Microorganisms.
USEPA. 2009. Drinking water infrastructure needs survey and assessment. United States Environmental Protection Agency, Fourth Report to Congress, Office of Water, Washington, DC, EPA-816-R-09-001.
USEPA. 2006. Distribution system indicators of drinking water quality. United States Environmental Protection Agency (USEPA), Office of Groundwater and Drinking Water, Washington, DC. Cited on 26th August 2012, http://www.epa.gov/ogwdw/disinfection/tcr/pdfs/issuepaper_tcr_indicators.pdf.
USEPA. 2006a. Much effort and resources needed to help small drinking water systems overcome challenges. Report No. 2006-P-00026.
USEPA. 2005. Drinking water infrastructure needs survey and assessment. United States Environmental Protection Agency, Third Report to Congress, Office of Water, Washington, DC, EPA-816-R-05-001.
USEPA. 2003. Analysis and findings of the Gallup organization’s drinking water customer satisfaction survey. United States Environmental Protection Agency, Office of Groundwater and Drinking Water, Washington, D.C., USA.
USEPA. 2001. Drinking water infrastructure needs survey and assessment. United States Environmental Protection Agency, Second Report to Congress, Office of Water, Washington, DC, EPA-816-R-01-004.
USEPA. 2001.
Risk assessment guidance for superfund: Volume III – Part A, Process for conducting probabilistic risk assessment. Office of Emergency and Remedial Response, United States Environmental Protection Agency, Washington, DC, USA.
USEPA. 1997. Drinking water infrastructure needs survey. First Report to Congress, Office of Water, Washington, DC, EPA 812-R-97-001.
USEPA. 1984. Ambient water quality criteria. United States Environmental Protection Agency, Report No. EPA 440/5-84-030, Office of Water, Criteria and Standards Division, Washington, DC, USA.
UNSCEAR. 2008. Report: Sources and effects of ionizing radiation. Vol. I: Sources. United Nations Scientific Committee on the Effects of Atomic Radiation, New York, NY. Cited on August 2012, http://www.unscear.org/unscear/en/publications/2008_1.html.
Van Der Willigen, F. 1997. Dutch experience and viewpoints on performance indicators. IWSA Workshop on Performance Indicators for Transmission and Distribution Systems, Lisbon, Portugal.
VHA. 2011. National patient safety improvement handbook. Department of Veterans Affairs, VHA Handbook 1050.01, Veterans Health Administration Transmittal Sheet, Washington, DC.
Water Canada. 2013. Cited on 12th September 2013, http://www.water.ca/textm.asp.
WB. 2010. A review in Bangladesh, India and Pakistan: Benchmarking for performance improvement in urban utilities. Water and Sanitation Program, World Bank Report.
WHO. 2012. UN-Water global annual assessment of sanitation and drinking-water (GLAAS) 2012 report: The challenges of extending and sustaining services. UN Water Report 2012, Switzerland.
WHO. 2011. Guidelines for drinking-water quality. Fourth Edition, Geneva, Switzerland.
Wong, J., Li, H., and Lai, J. 2008. Evaluating the system intelligence of the intelligent building systems part 1: Development of key intelligent indicators and conceptual analytical framework. Automation in Construction, 17(3):284-302.
WOP-Africa. 2009.
Water operators partnership: African utility performance assessment. Final Report, Water and Sanitation Program.
Worthington, A.C., and Higgs, H. 2014. Economies of scale and scope in Australian urban water utilities. Utilities Policy, 31(2014):52-62.
WSP. 2012. Using credit ratings to improve water utility access to market finance in Sub-Saharan Africa: Sustainable services for domestic private sector participation. Public Private Infrastructure Advisory Facility (PPIAF), Water and Sanitation Program: Learning Note. Cited on 3rd August 2013, https://www.wsp.org/sites/wsp.org/files/publications/WSP-Credit-Assessment-Kenya-Learning-Note.pdf.
WSP. 2009. Benchmarking for improving water supply delivery. Water and Sanitation Program, Bangladesh Water Utilities Data Book, 2006-2007.
WSSC. 2008. Customer satisfaction study & focus groups, Executive summary reports. Washington Suburban Sanitary Commission, Maryland, USA.
WUG. 2014. Key performance indicators, Strategic Report. Water Utilities Group, cited on 27th November 2014, http://annualreport2014.unitedutilities.com/strategic-report/key-performance-indicators.
Wyatt, A.S. 2010. Non-revenue water: Financial model for optimal management in developing countries. RTI Press Methods Report series, USA.
Yepes, G., and Dianderas, A. 1996. Water & wastewater utilities: Indicators. 2nd Edition, World Bank, Washington.
Yoon, K.P., and Hwang, C. 1995. Multiple attribute decision making – An introduction. SAGE Publications, California.
Zadeh, L.A. 1978. Fuzzy sets as a basis for a theory of possibility. Fuzzy Sets and Systems, 1:3-28.
Zardari, N.H., Cordery, I., and Sharma, A. 2010. An objective multiattribute analysis approach for allocation of scarce irrigation water resources. Journal of the American Water Resources Association, 46(2):412-428.
Zhang, Y., Love, N., and Edwards, M. 2009. Nitrification in drinking water systems. Critical Reviews in Environmental Science and Technology, 39(3):153-208.
Zoratti, S. 2009.
Managing municipal assets for regulatory compliance with PSAB 3150. White Paper, The Createch Group, a Bell Canada Company, Canada.
Zsidisin, G.A., and Ritchie, B. 2009. Supply chain risk: A handbook of assessment, management and performance. Springer, New York, USA.

Appendices

Appendix A: Summary of Performance Indicator Systems

Appendix A-1: IWA manual of performance indicators for water supply services (IWA 2006)
For each indicator category, the number of indicator sub-groups is given in the heading, the sub-groups are listed with the number of PIs in each sub-group in brackets*, and the total number of PIs in the category closes the heading.
1. Water resources (4 sub-groups; 4 PIs in total):
- Water resources availability (2)
- Usage efficiency and reuse (2)
2. Personnel (7 sub-groups; 26 PIs in total):
- Personnel data (2)
- Personnel per function (7)
- Technical services personnel per activity (6)
- Personnel qualification (3)
- Personnel training (3)
- Personnel health and safety (4)
- Overtime work (1)
3. Physical (6 sub-groups; 15 PIs in total):
- Treatment (1)
- Storage (2)
- Pumping (4)
- Transmission and distribution (2)
- Metering coverage (4)
- Automation and control (2)
4. Operational (9 sub-groups; 44 PIs in total):
- Inspection and maintenance of physical assets (6)
- Instrumentation calibration (5)
- Electrical and signal transmission equipment installation (4)
- Mains, valves and service connection rehabilitation (5)
- Pumps rehabilitation (2)
- Water losses (7)
- Failures (6)
- Water metering efficiency (4)
- Water quality monitoring (5)
5. Quality of Service (6 sub-groups; 34 PIs in total):
- Coverage (5)
- Public taps and standpipes (4)
- Pressure and continuity of supply (8)
- Quality of supplied water (5)
- Service connection and meter installation and repairs (3)
- Customer complaints (9)
6.
Financial and Economic (13 sub-groups; 47 PIs in total):
- Revenues (3)
- Costs (3)
- Composition of running costs per type of cost (5)
- Composition of running costs per main function of the water undertaking (5)
- Composition of running costs per technical function activity (6)
- Composition of capital costs (2)
- Investments (3)
- Average water charges (2)
- Efficiency indicators (9)
- Leverage indicators (2)
- Liquidity (1)
- Profitability (4)
- Water losses (2)
Total: 45 sub-groups, 170 PIs.
*Only the types of indicators are described, with the number of sub-indicators in brackets.

Appendix A-2: IBNET system of performance indicators (World Bank 2011)
For each indicator category, the total number of indicators is given in the heading, and the indicator types are listed with the number of indicators in brackets*.
1. Process indicators (19):
- Utility planning description (1)
- Management of utility, including training strategy, appraisal setting, etc. (5)
- Higher management (1)
- Types of financial resources (4)
- Level of services (LOS) offered by the utility (4)
- Utility’s procedures to assess customer satisfaction (4)
2. Service coverage (3):
- Percentage of population covered with easy access (1)
- Population coverage per household connection and per public point (2)
3. Water consumption & production (11):
- Water production and consumption per person per day (2)
- Water production and consumption per connection per month (2)
- Water consumption sub-divided into customer type categories, etc. (4)
- Residential water consumption per person per connection, etc. (3)
4. Non-revenue water (3):
- Non-revenue water as %, volume/km/day, and volume/connection/day (3)
5. Meters (2):
- Metering level
- Volume of sold water to metered connections
6. Network performance (1):
- Pipe breaks
7.
Operating costs & staff (12):
- Operational costs of water produced and sold (4)
- Staff per 1000 connections and per 1000 persons (5)
- Labor cost as a percentage of total operational cost (1)
- Energy cost as a percentage of total operational cost (1)
- Percentage of service contracted out to the private sector (1)
8. Quality of service (5):
- Service continuity (2)
- Quality of water (2)
- Total number of complaints (1)
9. Billing & collections (20):
- Revenue (10)
- Tariff structure (6)
- Connection charges (2)
- Collection period and efficiency (2)
10. Financial performance (2):
- Operating cost coverage
- Debt service ratio
11. Assets (1):
- Fixed total assets per population served
- Fixed water assets per population served
12. Affordability/purchasing power parity (PPP) (1):
- This indicator includes the indicators of revenue, tariff and connection charges and is estimated by converting gross national income (GNI) into US dollars for international comparison. PPP takes account of what can be purchased locally and should be considered for indicators of what customers pay.
Total: 80 indicators.
*Only the types of indicators are described, with the number of indicators in brackets.

Appendix A-3: Performance indicators developed by NWC, Australia (NWC 2012)
For each indicator category, the total number of indicators is given in the heading, followed by the performance indicators* and comments.
1.
Water Resources (23 indicators + 23 sub-indicators):
- Volume of water sourced from surface water
- Volume of water sourced from groundwater
- Volume of water sourced from desalination (3 sub-indicators)
- Volume of water sourced from recycling
- Volume of water received from bulk supplier (2 sub-indicators)
- Volume of bulk recycled water purchased
- Total sourced water
- Volume of water supplied – residential (2 sub-indicators)
- Volume of water supplied – commercial, municipal and industrial (2 sub-indicators)
- Volume of water supplied – other (4 sub-indicators)
- Total urban water supplied (3 sub-indicators)
- Average annual residential water supplied
- Volume of water supplied – environmental
- Volume of the bulk water exports (2 sub-indicators)
- Volume of the bulk recycled water exports
- Volume of the recycled water supplied – residential
- Volume of the recycled water supplied – commercial
- Volume of the recycled water supplied – agriculture
- Volume of the recycled water supplied – environmental
- Volume of the recycled water supplied – onsite
- Total recycled water supplied (2 sub-indicators)
- Recycled water (% of effluent recycled)
- Total volume of urban stormwater discharges (4 sub-indicators)
Comments:
- 7 indicators of water sources, with 3 sub-indicators (desalination of marine, ground and surface waters) and 2 sub-indicators for potable and non-potable water received from bulk supplier.
- 6 indicators of water supplied, with 9 sub-indicators classifying various types of supply and water quality (i.e., potable or non-potable).
- 8 indicators of recycled water.
- The indicator of stormwater supply to different users has further sub-indicators for various users (i.e., infrastructure operators, aquifer recharge, urban stormwater and urban use).
2.
Asset (7):
- Number of water treatment plants
- Length of water mains
- Properties served per km of water main
- Water main breaks per 100 km of water mains
- Infrastructure leakage index (ILI)
- Real losses per service connection per day
- Real losses per km of water main per day
Comments: the first three indicators represent the data of a utility and essentially are not indicators of its performance.
3. Customer (12):
- Population receiving water supply services
- Connected residential properties
- Connected non-residential properties
- Total connected properties
- Water quality complaints per 1000 properties
- Water service complaints per 1000 properties
- Billing and account complaints per 1000 properties
- Percentage of calls answered by an operator within 30 sec
- Average duration of an unplanned interruption in minutes
- Average frequency of unplanned interruptions
- Customers with restrictions applied for non-payment of bill per 1000 properties
- Customers faced with legal action for non-payment of bill per 1000 properties
Comments: the first four provide information about the customer data of the water utility.
4. Environment (3):
- Greenhouse gas emissions (tonnes CO2-equivalents per 1000 properties)**
- Greenhouse gas emissions (tonnes CO2-equivalents per ML)
- Net greenhouse gas emissions – other (tonnes CO2-equivalents per ML)
Comments: relatively new indicators, important for assessing the impacts of water utilities on global warming.
5. Pricing (3 indicators + 16 sub-indicators):
- Tariff structure (13 sub-indicators)
- Annual bill based on 200 kL/a (1 sub-indicator)
- Typical residential bill (2 sub-indicators)
Comments: all the indicators are developed for a specific situation and therefore have limited applicability.
6.
Finance (18 indicators + 4 sub-indicators):
- Total revenue from water utility
- Percentage residential revenue from usage charges
- Revenue per property for water supply services (1 sub-indicator)
- Income per property for whole of utility (1 sub-indicator)
- Revenue from community service obligations (CSO)
- Nominal written-down replacement cost of the fixed assets
- Operating cost of water per property (1 sub-indicator)
- Total capital expenditure
- Total capital expenditure per property (1 sub-indicator)
- Economic real rate of return
- 8 indicators covering dividend, debt, interest, net profit and capital grants
Comments:
- Revenue from CSO is an indicator for utilities where water is supplied at lower rates than its cost.
- Total revenue, replacement cost, CSO, total capital expenditure, dividend, community service obligations and grants are data elements.
7. Public health (7):
- Water quality guidelines
- Number of zones where microbiological compliance was achieved
- % of population where microbiological compliance was achieved
- Number of zones where chemical compliance was achieved
- Risk-based drinking water management plan externally assessed? (YES/NO)
- Name of the risk-based drinking water management plan used
- Public disclosure of drinking water performance? (YES/NO)
Comments: water quality guidelines, risk-based water management plan and public disclosure are data elements or overall process indicators.
Total: 73 indicators.
*Only the types of indicators are described, with the number of sub-indicators in brackets.
**CO2-equivalents are the amounts of carbon dioxide that would have the same relative warming effect as the greenhouse gases actually emitted.

Appendix A-4: AWWA system of performance indicators (AWWA 2004)
For each indicator group, the total number of indicators is given in the heading, followed by the performance indicators* and a description.
1.
Organizational Development (11):
- Organizational Best Practices Index (7 indicators)
- Employee Health and Safety Severity Rate
- Training Hours per Employee
- Customer Accounts per Employee (2 indicators)
Description:
- Organizational Best Practices Index is a self-assessment of the degree to which different management practices, including strategic planning, long-term financial planning and risk management planning, are implemented by a utility.
- Employee Health and Safety Severity Rate is a measure of lost workdays per employee per year.
- Customer Accounts per Employee and MGD Water Delivered per Employee are measures of employee efficiency and account for contributions completed through contracts.
2. Customer relations (8):
- Customer Service Complaints and Technical Quality Complaints per 1000 customer accounts (practically 2 indicators)
- Disruptions of Water Service per 1000 active customer accounts (2 indicators)
- Residential Cost of Water Service (2 indicators)
- Customer Service Cost per Account
- Billing Accuracy
Description:
- Customer Service Complaints and Technical Quality Complaints per 1000 customer accounts are practically two indicators; the first is associated with service, and the second quantifies complaints related to technical quality.
- Disruptions of Water Service are measured for both planned and unplanned interruptions with respect to time (less than 4 hours, 4 to 8 hours and greater than 12 hours).
- Residential Cost of Water is a suite of six indicators, the following two of which are specific to water utilities: the bill amount for monthly residential water service for a customer using 7500 gallons per month; and the average residential water bill amount for one month of service.
- Customer Service Cost per Account is a measure of the cost a utility bears to manage a single customer account in a year.
- Billing Accuracy is a measure of the number of error-driven bill adjustments per 10,000 issued bills per year.
3.
Business Operations (4):
- Debt Ratio
- System Renewal/Replacement Rate (2 indicators)
- Return on Assets
Description:
- Debt Ratio is a measure of utility indebtedness.
- System Renewal/Replacement Rate is a measure of the degree to which a utility is renewing or replacing its infrastructure; rates are provided for water distribution and treatment.
- Return on Assets indicates the financial effectiveness of the utility.
4. Water Operations (8):
- Drinking Water Compliance
- Distribution System Water Loss
- Water Distribution System Integrity
- Operations and Maintenance Cost Ratios (3 indicators)
- Planned Maintenance Ratio (2 indicators)
Description:
- Water loss essentially is unaccounted-for water.
- Water Distribution System Integrity is an assessment measure of the condition of the water distribution system in terms of the number of repairable breaks and leaks per 100 miles of distribution mains.
- Operations and Maintenance Cost Ratios is the ratio between the cost of operations and maintenance and the number of accounts per million gallons of produced water.
- Planned Maintenance Ratio measures how effectively utilities are investing in planned maintenance; two separate indicators in terms of cost and hours invested in maintenance activities.
Total: 35 indicators.
*Only the types of indicators are described, with the number of indicators in brackets.

Appendix A-5: OFWAT system of performance indicators (OFWAT 2012)
For each indicator group, the total number of indicators is given in the heading, followed by the performance indicators and a description.
1. Customer experience (3):
- Properties at risk of low pressure
- Service incentive mechanism (SIM)
- Water supply interruptions of 12 hours or more
Description: SIM is a financial mechanism to incentivize optimum levels of customer service through the price control process.
2.
Reliability, Availability and Security (5 indicators)
- Serviceability: water non-infrastructure
- Serviceability: water infrastructure
- Leakage
- Security of Supply Index (SoSI)
- Population with hosepipe restrictions
Descriptions:
- The serviceability measures for water non-infrastructure and infrastructure cover a wide range of indicators relating to customer service, public health, the environment, and asset performance.
- Total leakage measures the sum of distribution and mains losses in megalitres per day (Ml/d).
- SoSI indicates the extent to which a company is able to guarantee provision of its levels of service under restrictions of supply.

3. Environmental Impact (2 indicators)
- Greenhouse Gas Emissions
- Pollution Incidents
Descriptions:
- Greenhouse Gas Emissions: measurement of the annual operational GHG emissions of the regulated business.
- Pollution Incidents: the total number of pollution incidents in a calendar year emanating from a discharge or escape of a contaminant from a water company asset associated with the water service.

4. Financial (4 indicators)
- Post-tax return on capital
- Credit rating
- Gearing
- Interest cover
Descriptions:
- Post-tax return on capital: the current cost operating profit less tax as a return on regulatory capital value (e.g., 4.4%).
- Credit rating: the assessment of the company's creditworthiness by the rating agencies (e.g., AAA).
- Gearing: net debt as a percentage of the total regulatory capital value at the financial year end.

Total: 14

Appendix A-6: Performance indicator system proposed by NRC, Canada (NRC 2010)

1.
Public Safety (7 indicators)
- Percentage of affected population days per year for which service pressures do not meet standards (A/S)
- Percentage of critical service areas with more than one distribution system connection (S)
- Percentage of average water delivery maintained during power failure (S)
- Percentage of total population served capable of receiving emergency water supplies to meet minimum sanitation needs (S)
- Condition rating of assets (A/S)
- Remaining service life (A/S)
- Protection against deliberate/vandalism acts (A/S)
Assessment criteria: health impacts; safety impacts; security impacts; environmental; economic; quality of service; access to service; adaptability; asset P/R/D; reliability of service; capacity to meet demand.

2. Public Health (11 indicators)
- Percentage of population days with Boil Water Advisories (BWAs) / total population days (S)
- Percentage of design capacity remaining on maximum demand days (A/S)
- Storage capacity as a percentage of average daily demand (A/S)
- Number of breaks per year per km of pipe (A/S)
- Condition rating of assets (A/S)
- Remaining service life (A/S)
- Number of planned service interruptions as a percentage of total service interruptions (S)
- Rated capacity vs. actual average service load (A/S)
- Rated capacity vs. actual daily maximum service load (A/S)
- Reduction in number of illnesses, injuries and deaths per population served (S)
- Protection against climate change impacts (A/S)
Assessment criteria: health impacts; safety impacts; environmental; economic; quality of service; reliability of service; capacity to meet demand.

3. Environmental Quality (4 indicators)
- Percentage of water treatment water that is recycled (S)
- Percentage of backflush water that meets environmental discharge standards (S)
- Percentage of current water allocation used to meet current demand (S)
- Reduction in total GHG emissions per population served (A/S)
Assessment criteria: health impacts; environmental; economic; adaptability; reliability of service; capacity to meet demand.

4. Social Equity (4 indicators)
- Monthly average cost of service / median income (S)
- Fee structure (cost/consumption structure) (S)
- Benefit/cost ratio of service and assets (A/S)
- Percentage of total population in jurisdiction connected to central water services (S)
Assessment criteria: health impacts; safety impacts; economic; quality of service; access to service; capacity to meet demand.

5. Economy (8 indicators)
- Reserve funds as a percentage of total present value of infrastructure (S)
- Total cost (capital, operations, maintenance, labour, materials, etc.) of service per population in service area (S)
- Reduction in total energy used per population in service area (A/S)
- Remaining service life (A/S)
- Percentage change in number of assets vs. percentage change in operational funding (S)
- Agency costs / agency revenues (A/S)
- Value of assets (S)
- Percentage of underground infrastructure renewed or rehabilitated annually (S)
Assessment criteria: environmental; economic; quality of service; access to service; adaptability; asset P/R/D; reliability of service; capacity to meet demand.

6. Public Security (3 indicators)
- Number of acts of vandalism against the agency's assets per population served (A/S)
- Security measures costs / number of security breaches / population served (S)
- Protection against deliberate/vandalism acts (A/S)
Assessment criteria: safety impacts; security impacts; environmental; economic; quality of service; reliability of service.

Total: 37

Appendix A-7: Performance indicator system proposed by ADB (ADB 2012)

Impact Level

1. Meeting the needs of urban development to ensure water supply for living, production, and other construction in urban areas (8 indicators)
- Newly increased number of households and people (10,000 persons)
- Total volume of water supplied, produced and used (6 indicators)*
- Per capita ownership of urban maintenance and construction fund (RMB)
Comments: indicators related to water resources and economics.

2.
Improving the quality of life of residents to reduce the occurrence of water-borne diseases (7 indicators)
- Water outage: persons affected and time of outage (3 indicators)
- Water-borne diseases: persons affected and time lost (2 indicators)
- Economic loss from sudden events of drinking-water-borne diseases (measured in RMB 10,000)
- Per capita volume of water for daily living (litres)
Comments: operational indicators related to interruptions, water quality, and economics.

Outcome Level

1. Improving water supply capacity through developing and improving water sources and building and rebuilding water plants and water supply pipeline networks (8 indicators)
- Water demand and production capacity (2 indicators)
- Length of newly built and rebuilt water supply pipelines (km)
- Coverage rate (%)
- Proportions of water for household use, production, public use, and others (4 indicators)
Comments: water resources, physical, and quality-of-service indicators.

2. Improving the quality of drinking water through improving and protecting water sources, building new water treatment facilities, and updating pipelines (8 indicators)
- Water source biochemical water quality (4 indicators)
- Biochemical water quality in the network (4 indicators)
Comments: water quality indicators.

3. Improving the efficiency of water supply through building and rebuilding water plants and water supply pipelines (7 indicators)
- Rate of water consumed and recycled (2 indicators)
- Volume of water sold (%)
- Volume of water loss (%)
- Power consumption by water plant due to water supply (kWh)
- Comprehensive quality rate of water quality in plant and water (%)
- Comprehensive quality rate of water quality in network and water (%)
Comments: water resources, operational, and customer service indicators.

4. Improving the efficiency of management and operation of the water supply organization (7 indicators)
- Total industrial output value and growth (%)
- Total profit and growth rate (%)
- Fiscal subsidies and growth rate (%)
- People employed in organizations (10,000 persons)
- Attainment rate of living drinking water (%)
- Construction and operation of the water supply dispatch and monitoring system
- Construction and operation of early warning, emergency response, and information handling systems
Comments: financial, personnel, quality-of-service, and physical indicators.

5. Improving the water supply conditions for the poverty-stricken population in urban areas (4 indicators)
- Number of water household users with the lowest income (households)
- Average drinking water volume of households with the lowest income (tons)
- Average drinking water expense of households with the lowest income (RMB)
- Preferential policy on water charge subsidies for poverty-stricken households
Comments: indicators related to poverty alleviation.

6. Promoting the private sector's participation in the construction and operation of urban water supply (5 indicators)
- Water pricing (2 indicators)
- Total annual water supply of non-state tap water companies (10,000 tons)
- Length of water supply pipelines of non-state tap water companies (km)
- Non-state investment and its proportion (%)
Comments: pricing and economic indicators.

Total: 54
* Only the types of indicators are described, with the number of indicators given in brackets.

Appendix A-8: Performance indicator system proposed by ISO (CSA 2010)

Utility
- Sufficient Capacity (1 indicator)
  - Treated water storage capacity at average day demand (hours)
  Classification: Physical (1)
- Minimum Sustainable Cost (4 indicators)
  - Cost of water quality monitoring / population served
  - Cost of customer billing / service connection
  - (Total water operating cost + cost of bulk water purchased) / population served
  - Rate of water for a typical size residential connection using 250 m³/year
  Classification: Financial (4)*
- Protection of Environment (3 indicators)
  - Cost of water conservation program / population served
  - Number of days of water restrictions
  - Per capita average day consumption for residential customers
  Classification: Financial (1), Water resources (2)
- Customer Satisfaction (2 indicators)
  - Cost of customer communication / population served
  - Number of water quality customer complaints / 1,000 people served
  Classification: Customer satisfaction (2)
- Public Health (1 indicator)
  - Number of boil-water advisory days × capita affected / population served
  Classification: Water quality (1)

Water Distribution System
- Provide Reliable Service and Infrastructure (9 indicators)
  - Number of main breaks per 100 km length
  - Percentage of valves cycled
  - Percentage of inoperable or leaking valves
  - Non-revenue water (L/connection/day)
  - Percentage of hydrants checked and inspected (2010)
  - Percentage of inoperable or leaking hydrants
  - Number of emergency service connection repairs and replacements / total number of service connections
  - No.
of unplanned system interruptions per 100 km length
  - 5-year running average capital reinvestment / replacement value
  Classification: Physical (4), Operational (3), Customer satisfaction (1), Financial (1)
- Meet Service Requirements with Economic Efficiency (11 indicators)
  - Number of field full-time employees (FTEs) per 100 km length
  - Number of O&M FTEs per 100 km length
  - Number of in-house metering field FTEs per 1,000 meters
  - Total operating cost with actual indirect charge-back ('000) per km length
  - O&M cost ('000) per km length
  - Pump station O&M cost ('000) / total pump station horsepower
  - Pipes O&M cost ('000) per km pipe length
  - Metering O&M cost / number of total meters
  - Pump station energy consumed (kWh) / total pump station horsepower
  - Cost of fire hydrant O&M / number of fire hydrants
  - Unplanned maintenance hours / total maintenance hours
  Classification: Operational (1), Financial (6), Environment (1), Personnel (3)
- Provide a Safe and Productive Workplace (6 indicators)
  - Number of field accidents with lost time per 1,000 field labour hours
  - Number of lost hours due to field accidents per 1,000 field labour hours
  - Number of sick days taken per field employee
  - Total available field hours / total paid field hours
  - Total overtime field hours / total paid field hours
  - Percentage of field employees eligible for retirement per year category (2010)
  Classification: Personnel (6)
- Have Satisfied and Informed Customers (1 indicator)
  - Number of water pressure complaints by customers per 1,000 people served
  Classification: Customer satisfaction (1)
- Protect Public Health and Safety (4 indicators)
  - Percentage of main length cleaned
  - Average turbidity (NTU)
  - Number of total coliform (TC) occurrences
  - Average THMs (mg/L)
  Classification: Water quality (3), Operational (1)

Water Treatment System
- Reliability (1 indicator)
  - 5-year running average capital reinvestment / replacement value
  Classification: Financial (1)
- Sufficient Capacity (2 indicators)
  - Average day demand / existing water licence capacity
  - Number of days the plant operated at >90% capacity
  Classification: Physical (2)
- Minimum Sustainable Cost (7 indicators)
  - Number of field FTEs / 1,000 ML treated
  - Number of O&M FTEs / 1,000 ML treated
  - O&M cost / ML treated
  - Total operating cost with actual indirect charge-back / ML treated
  - Energy consumed (kWh) / ML treated
  - Chemical cost / ML treated
  - Unplanned maintenance hours / total maintenance hours
  Classification: Financial (3), Environment (1), Personnel (2), Operational (1)
- Public Health (3 indicators)
  - Median turbidity (NTU)
  - Number of occurrences of TC
  - Median value of nitrates (mg/L)
  Classification: Water quality (3)
- Safe and Productive Workplace (6 indicators)
  - Number of field accidents with lost time per 1,000 field labour hours
  - Number of lost hours due to field accidents per 1,000 field labour hours
  - Number of sick days taken per field employee
  - Total available field hours / total paid field hours
  - Total overtime field hours / total paid field hours
  - Percentage of field employees eligible for retirement per year category (2010)
  Classification: Personnel (6)
- Protect the Environment (1 indicator)
  - Percentage residuals
  Classification: Environment (1)

Total: 62
* Number of PIs in a possible conventional classification of PIs.

Appendix B: Ethics Approval Certificate and Forms
Appendix B-1: Ethics approval certificate
Appendix B-2: Ranking proforma signed and filled by Utility-I
Appendix B-3: Ranking proforma signed and filled by Utility-II
Appendix B-4: Ranking proforma signed and filled by Utility-III
Appendix B-5: Ranking proforma signed and filled by Expert-I
Appendix B-6: Ranking proforma signed and filled by Expert-II
Appendix B-7: Ranking proforma signed and filled by Expert-III

Appendix C: Fuzzy Rules for Intra-Utility Performance Management Model (In-UPM)

Appendix C-1: Matrix defining fuzzy rules for ‘water resources and environmental sustainability’
Inputs: Water resources sustainability (WRS); Existing water license capacity exhausted (WE3); Environmental protection (ENP). Output: Water resources and environmental sustainability (WES).
Rule | WRS | WE3 | ENP | WES
1 | Low | Low | Low | Low
2 | Low | Low | Medium | Low
3 | Low | Low | High | Medium
4 | Low | Medium | Low | Low
5 | Low | Medium | Medium | Low
6 | Low | Medium | High | Medium
7 | Low
High Low Low
8 | Low | High | Medium | Low
9 | Low | High | High | Low
10 | Medium | Low | Low | Low
11 | Medium | Low | Medium | Medium
12 | Medium | Low | High | Medium
13 | Medium | Medium | Low | Medium
14 | Medium | Medium | Medium | Medium
15 | Medium | Medium | High | Medium
16 | Medium | High | Low | Low
17 | Medium | High | Medium | Medium
18 | Medium | High | High | Medium
19 | High | Low | Low | Medium
20 | High | Low | Medium | Medium
21 | High | Low | High | High
22 | High | Medium | Low | Medium
23 | High | Medium | Medium | Medium
24 | High | Medium | High | High
25 | High | High | Low | Low
26 | High | High | Medium | Medium
27 | High | High | High | Medium

Appendix C-2: Matrix defining fuzzy rules for ‘environmental protection’
Inputs: Discharge of water treatment plant residuals (WE4); Impact of flushing water (IFW). Output: Environmental protection (ENP).
Rule | WE4 | IFW | ENP
1 | Low | Low | High
2 | Low | Medium | Medium
3 | Low | High | Low
4 | Medium | Low | Medium
5 | Medium | Medium | Medium
6 | Medium | High | Low
7 | High | Low | Low
8 | High | Medium | Low
9 | High | High | Low

Appendix C-3: Matrix defining fuzzy rules for ‘impact of flushing water’
Inputs: Dilution available in receiving water body (WE5); Distance between the flushing point and natural drain (WE7). Output: Impact of flushing water on aquatic life (IFW).
Rule | WE5 | WE7 | IFW
1 | Low | Low | High
2 | Low | Medium | Medium
3 | Low | High | Low
4 | Medium | Low | Medium
5 | Medium | Medium | Medium
6 | Medium | High | Low
7 | High | Low | Low
8 | High | Medium | Low
9 | High | High | Low

Appendix C-4: Matrix defining fuzzy rules for ‘water resources sustainability’
Inputs: Water resources management (WRM); Implementation level of water conservation plan (WE6). Output: Water resources sustainability (WRS).
Rule | WRM | WE6 | WRS
1 | Low | Low | Low
2 | Low | Medium | Medium
3 | Low | High | High
4 | Medium | Low | Medium
5 | Medium | Medium | Medium
6 | Medium | High | High
7 | High | Low | Medium
8 | High | Medium | Medium
9 | High | High | High

Appendix C-5: Matrix defining fuzzy rules for ‘water resources management’
Inputs: Restrictions, consumption, and management (RCM); Non-revenue water (FE7). Output: Water resources management (WRM).
Rule | RCM | FE7 | WRM
1 | Low | Low | Medium
2 | Low | Medium | Low
3 | Low | High | Low
4 | Medium | Low | Medium
5 | Medium | Medium | Medium
6 | Medium | High | Low
7 | High | Low | High
8 | High | Medium | Medium
9 | High
High Low

Appendix C-6: Matrix defining fuzzy rules for ‘restrictions, conservation, and management’
Inputs: Water restrictions (WE1); Per capita water consumption (WE2); Water resources and catchment management personnel (PE8). Output: Restrictions, conservation, and management (RCM).
Rule | WE1 | WE2 | PE8 | RCM
1 | Low | Low | Low | Medium
2 | Low | Low | Medium | Medium
3 | Low | Low | High | Medium
4 | Low | Medium | Low | Low
5 | Low | Medium | Medium | Medium
6 | Low | Medium | High | Medium
7 | Low | High | Low | Low
8 | Low | High | Medium | Low
9 | Low | High | High | Low
10 | Medium | Low | Low | Medium
11 | Medium | Low | Medium | Medium
12 | Medium | Low | High | Medium
13 | Medium | Medium | Low | Medium
14 | Medium | Medium | Medium | Medium
15 | Medium | Medium | High | Medium
16 | Medium | High | Low | Low
17 | Medium | High | Medium | Low
18 | Medium | High | High | Low
19 | High | Low | Low | Medium
20 | High | Low | Medium | Medium
21 | High | Low | High | High
22 | High | Medium | Low | Medium
23 | High | Medium | Medium | Medium
24 | High | Medium | High | Medium
25 | High | High | Low | Low
26 | High | High | Medium | Low
27 | High | High | High | Low

Appendix C-7: Matrix defining fuzzy rules for the ‘catchment and treatment employees’
Inputs: Field FTEs, treatment (PE5); Field FTEs, water resources and catchment management (PE8). Output: Catchment and treatment employees (CTE).
Rule | PE5 | PE8 | CTE
1 | Low | Low | Low
2 | Low | Medium | Medium
3 | Low | High | Medium
4 | Medium | Low | Medium
5 | Medium | Medium | Medium
6 | Medium | High | High
7 | High | Low | Medium
8 | High | Medium | High
9 | High | High | High

Appendix C-8: Matrix defining fuzzy rules for the ‘metering and distribution employees’
Inputs: Field FTEs, distribution (PE1); Field FTEs, metering (PE2). Output: Metering and distribution employees (MDE).
Rule | PE1 | PE2 | MDE
1 | Low | Low | Low
2 | Low | Medium | Medium
3 | Low | High | Medium
4 | Medium | Low | Medium
5 | Medium | Medium | Medium
6 | Medium | High | High
7 | High | Low | Medium
8 | High | Medium | High
9 | High | High | High

Appendix C-9: Matrix defining fuzzy rules for the ‘loss due to field accidents’
Inputs: Hours lost in field accidents, distribution (PE3); Hours lost in field accidents, treatment (PE3). Output: Loss due to field accidents (LFA).
Rule | PE3 (dist.) | PE3 (treat.) | LFA
1 | Low | Low | Low
2 | Low | Medium | Medium
3 | Low | High | Medium
4 | Medium | Low | Medium
5
Medium Medium Medium
6 | Medium | High | Medium
7 | High | Low | Medium
8 | High | Medium | Medium
9 | High | High | High

Appendix C-10: Matrix defining fuzzy rules for the ‘personnel healthiness’
Inputs: Sick days per employee, distribution (PE4); Sick days per employee, treatment (PE7). Output: Personnel healthiness (PEH).
Rule | PE4 | PE7 | PEH
1 | Low | Low | High
2 | Low | Medium | Medium
3 | Low | High | Medium
4 | Medium | Low | Medium
5 | Medium | Medium | Medium
6 | Medium | High | Medium
7 | High | Low | Medium
8 | High | Medium | Medium
9 | High | High | Low

Appendix C-11: Matrix defining fuzzy rules for the ‘overtime culture’
Inputs: Overtime hours, distribution (PE9); Overtime hours, treatment (PE10). Output: Overtime culture (OTC).
Rule | PE9 | PE10 | OTC
1 | Low | Low | Low
2 | Low | Medium | Medium
3 | Low | High | Medium
4 | Medium | Low | Medium
5 | Medium | Medium | Medium
6 | Medium | High | Medium
7 | High | Low | Medium
8 | High | Medium | Medium
9 | High | High | High

Appendix C-12: Matrix defining fuzzy rules for the ‘productivity ratio’
Inputs: Staff productivity (PE12); Degree of automation (PH2). Output: Productivity ratio (PRR).
Rule | PE12 | PH2 | PRR
1 | Low | Low | Low
2 | Low | Medium | Low
3 | Low | High | Low
4 | Medium | Low | Low
5 | Medium | Medium | Medium
6 | Medium | High | Medium
7 | High | Low | Medium
8 | High | Medium | Medium
9 | High | High | High

Appendix C-13: Matrix defining fuzzy rules for the ‘working environment efficacy’
Inputs: Overtime culture (OTC); Personnel training (PE11). Output: Working environment efficacy (WEE).
Rule | OTC | PE11 | WEE
1 | Low | Low | Medium
2 | Low | Medium | Medium
3 | Low | High | High
4 | Medium | Low | Medium
5 | Medium | Medium | Medium
6 | Medium | High | Medium
7 | High | Low | Low
8 | High | Medium | Medium
9 | High | High | Medium

Appendix C-14: Matrix defining fuzzy rules for the ‘personnel adequacy’
Inputs: Catchment and treatment employees (CTE); Metering and distribution employees (MDE); Productivity ratio (PRR). Output: Personnel adequacy (PEA).
Rule | CTE | MDE | PRR | PEA
1 | Low | Low | Low | Low
2 | Low | Low | Medium | Low
3 | Low | Low | High | Medium
4 | Low | Medium | Low | Low
5 | Low | Medium | Medium | Medium
6 | Low | Medium | High | Medium
7 | Low | High | Low | Medium
8 | Low | High | Medium | Medium
9 | Low | High | High | Medium
10 | Medium | Low | Low | Low
11 | Medium | Low | Medium | Medium
12 | Medium | Low | High | Medium
13 | Medium
Medium Low Medium
14 | Medium | Medium | Medium | Medium
15 | Medium | Medium | High | Medium
16 | Medium | High | Low | Medium
17 | Medium | High | Medium | Medium
18 | Medium | High | High | Medium
19 | High | Low | Low | Low
20 | High | Low | Medium | Medium
21 | High | Low | High | Medium
22 | High | Medium | Low | Medium
23 | High | Medium | Medium | Medium
24 | High | Medium | High | High
25 | High | High | Low | Medium
26 | High | High | Medium | Medium
27 | High | High | High | High

Appendix C-15: Matrix defining fuzzy rules for the ‘personnel health and safety’
Inputs: Loss due to field accidents (LFA); Personnel healthiness (PEH); Implementation level of health and safety plan (PE13). Output: Personnel health and safety (PHS).
Rule | LFA | PEH | PE13 | PHS
1 | Low | Low | Low | Low
2 | Low | Low | Medium | Medium
3 | Low | Low | High | Medium
4 | Low | Medium | Low | Medium
5 | Low | Medium | Medium | Medium
6 | Low | Medium | High | Medium
7 | Low | High | Low | Medium
8 | Low | High | Medium | Medium
9 | Low | High | High | High
10 | Medium | Low | Low | Low
11 | Medium | Low | Medium | Medium
12 | Medium | Low | High | Medium
13 | Medium | Medium | Low | Low
14 | Medium | Medium | Medium | Medium
15 | Medium | Medium | High | Medium
16 | Medium | High | Low | Medium
17 | Medium | High | Medium | Medium
18 | Medium | High | High | Medium
19 | High | Low | Low | Low
20 | High | Low | Medium | Low
21 | High | Low | High | Medium
22 | High | Medium | Low | Low
23 | High | Medium | Medium | Medium
24 | High | Medium | High | Medium
25 | High | High | Low | Low
26 | High | High | Medium | Medium
27 | High | High | High | Medium

Appendix C-16: Matrix defining fuzzy rules for the ‘personnel productivity’
Inputs: Personnel adequacy (PEA); Personnel health and safety (PHS); Working environment efficacy (WEE). Output: Personnel productivity.
Rule | PEA | PHS | WEE | Output
1 | Low | Low | Low | Low
2 | Low | Low | Medium | Low
3 | Low | Low | High | Medium
4 | Low | Medium | Low | Low
5 | Low | Medium | Medium | Medium
6 | Low | Medium | High | Medium
7 | Low | High | Low | Medium
8 | Low | High | Medium | Medium
9 | Low | High | High | Medium
10 | Medium | Low | Low | Low
11 | Medium | Low | Medium | Medium
12 | Medium | Low | High | Medium
13 | Medium | Medium | Low | Medium
14 | Medium | Medium | Medium | Medium
15 | Medium | Medium | High | Medium
16 | Medium | High | Low | Medium
17 | Medium | High | Medium | Medium
18 | Medium | High | High | High
19 | High | Low | Low | Medium
20 | High | Low | Medium | Medium
21 | High | Low | High | Medium
22 | High | Medium | Low | Medium
23 | High | Medium | Medium | Medium
24 | High | Medium | High | Medium
25 | High | High | Low | Medium
26 | High | High | Medium | Medium
27 | High | High | High | High

Appendix C-17: Matrix defining fuzzy rules for the ‘rehabilitation and replacement of pipes’
Inputs: Pipe age (OP13); Pipes replaced (OP2); Pipes rehabilitated (OP3). Output: Rehabilitation and replacement of pipes (RRP).
Rule | OP13 | OP2 | OP3 | RRP
1 | Low | Low | Low | Medium
2 | Low | Low | Medium | Medium
3 | Low | Low | High | Medium
4 | Low | Medium | Low | Medium
5 | Low | Medium | Medium | High
6 | Low | Medium | High | High
7 | Low | High | Low | Medium
8 | Low | High | Medium | High
9 | Low | High | High | High
10 | Medium | Low | Low | Low
11 | Medium | Low | Medium | Medium
12 | Medium | Low | High | Medium
13 | Medium | Medium | Low | Medium
14 | Medium | Medium | Medium | Medium
15 | Medium | Medium | High | Medium
16 | Medium | High | Low | Medium
17 | Medium | High | Medium | Medium
18 | Medium | High | High | High
19 | High | Low | Low | Low
20 | High | Low | Medium | Low
21 | High | Low | High | Low
22 | High | Medium | Low | Low
23 | High | Medium | Medium | Medium
24 | High | Medium | High | Medium
25 | High | High | Low | Low
26 | High | High | Medium | Medium
27 | High | High | High | High

Appendix C-18: Matrix defining fuzzy rules for the ‘distribution system maintenance’
Inputs: Valves replaced (OP7); Rehabilitation and replacement of pipes (RRP); Implementation level of risk-based rehabilitation and replacement plan (OP14). Output: Distribution system maintenance (DSM).
Rule | OP7 | RRP | OP14 | DSM
1 | Low | Low | Low | Low
2 | Low | Low | Medium | Low
3 | Low | Low | High | Medium
4 | Low | Medium | Low | Low
5 | Low | Medium | Medium | Medium
6 | Low | Medium | High | Medium
7 | Low | High | Low | Medium
8 | Low | High | Medium | Medium
9 | Low | High | High | Medium
10 | Medium | Low | Low | Low
11 | Medium | Low | Medium | Low
12 | Medium | Low | High | Medium
13 | Medium | Medium | Low | Medium
14 | Medium | Medium | Medium | Medium
15 | Medium | Medium | High | Medium
16 | Medium | High | Low | Medium
17 | Medium | High | Medium | Medium
18 | Medium | High | High | Medium
19 | High | Low | Low | Low
20 | High | Low | Medium | Low
21 | High | Low | High | Medium
22 | High | Medium | Low | Medium
23 | High | Medium | Medium | Medium
24 | High | Medium | High | Medium
25 | High | High | Low | Medium
26 | High | High | Medium | High
27 | High | High
High High

Appendix C-19: Matrix defining fuzzy rules for the ‘delivery point maintenance’
Inputs: Service connection rehabilitation (OP5); Operational meters (OP11). Output: Delivery point maintenance (DPM).
Rule | OP5 | OP11 | DPM
1 | Low | Low | Low
2 | Low | Medium | Low
3 | Low | High | Medium
4 | Medium | Low | Low
5 | Medium | Medium | Medium
6 | Medium | High | Medium
7 | High | Low | Low
8 | High | Medium | Medium
9 | High | High | High

Appendix C-20: Matrix defining fuzzy rules for the ‘inspection and cleaning routine’
Inputs: Hydrants inspection (OP8); Cleaning of storage tanks (OP9). Output: Inspection and cleaning routine (ICR).
Rule | OP8 | OP9 | ICR
1 | Low | Low | Low
2 | Low | Medium | Medium
3 | Low | High | Medium
4 | Medium | Low | Medium
5 | Medium | Medium | Medium
6 | Medium | High | Medium
7 | High | Low | Medium
8 | High | Medium | Medium
9 | High | High | High

Appendix C-21: Matrix defining fuzzy rules for the ‘distribution system integrity’
Inputs: Distribution system maintenance (DSM); Delivery point maintenance (DPM); Inspection and cleaning routine (ICR). Output: Distribution system integrity (DSI).
Rule | DSM | DPM | ICR | DSI
1 | Low | Low | Low | Low
2 | Low | Low | Medium | Low
3 | Low | Low | High | Low
4 | Low | Medium | Low | Low
5 | Low | Medium | Medium | Medium
6 | Low | Medium | High | Medium
7 | Low | High | Low | Low
8 | Low | High | Medium | Medium
9 | Low | High | High | Medium
10 | Medium | Low | Low | Low
11 | Medium | Low | Medium | Medium
12 | Medium | Low | High | Medium
13 | Medium | Medium | Low | Medium
14 | Medium | Medium | Medium | Medium
15 | Medium | Medium | High | Medium
16 | Medium | High | Low | Medium
17 | Medium | High | Medium | Medium
18 | Medium | High | High | Medium
19 | High | Low | Low | Low
20 | High | Low | Medium | Medium
21 | High | Low | High | Medium
22 | High | Medium | Low | Medium
23 | High | Medium | Medium | Medium
24 | High | Medium | High | Medium
25 | High | High | Low | Medium
26 | High | High | Medium | Medium
27 | High | High | High | High

Appendix C-22: Matrix defining fuzzy rules for the ‘distribution system failure’
Inputs: Pipe breaks (OP1); Inoperable or leaking hydrants (OP6). Output: Distribution system failure (DSF).
Rule | OP1 | OP6 | DSF
1 | Low | Low | Low
2 | Low | Medium | Low
3 | Low | High | Medium
4 | Medium | Low | Medium
5 | Medium | Medium | Medium
6 | Medium | High | High
7 | High | Low | High
8 | High | Medium | High
9 | High | High | High

Appendix C-23:
Matrix defining fuzzy rules for the ‘distribution system performance’
Inputs: Distribution system failure (DSF); Non-revenue water (OP4). Output: Distribution system performance (DSP).
Rule | DSF | OP4 | DSP
1 | Low | Low | High
2 | Low | Medium | Medium
3 | Low | High | Low
4 | Medium | Low | Medium
5 | Medium | Medium | Medium
6 | Medium | High | Low
7 | High | Low | Medium
8 | High | Medium | Low
9 | High | High | Low

Appendix C-24: Matrix defining fuzzy rules for the ‘distribution network productivity’
Inputs: Network efficiency (OP11); Customer density (OP12). Output: Distribution network productivity (DNP).
Rule | OP11 | OP12 | DNP
1 | Low | Low | Low
2 | Low | Medium | Medium
3 | Low | High | Medium
4 | Medium | Low | Medium
5 | Medium | Medium | Medium
6 | Medium | High | High
7 | High | Low | Medium
8 | High | Medium | High
9 | High | High | High

Appendix C-25: Matrix defining fuzzy rules for the ‘operational integrity’
Inputs: Distribution system integrity (DSI); Distribution system performance (DSP); Distribution network productivity (DNP). Output: Operational integrity (OPI).
Rule | DSI | DSP | DNP | OPI
1 | Low | Low | Low | Low
2 | Low | Low | Medium | Low
3 | Low | Low | High | Low
4 | Low | Medium | Low | Low
5 | Low | Medium | Medium | Medium
6 | Low | Medium | High | Medium
7 | Low | High | Low | Low
8 | Low | High | Medium | Medium
9 | Low | High | High | Medium
10 | Medium | Low | Low | Low
11 | Medium | Low | Medium | Medium
12 | Medium | Low | High | Medium
13 | Medium | Medium | Low | Medium
14 | Medium | Medium | Medium | Medium
15 | Medium | Medium | High | Medium
16 | Medium | High | Low | Medium
17 | Medium | High | Medium | Medium
18 | Medium | High | High | Medium
19 | High | Low | Low | Low
20 | High | Low | Medium | Medium
21 | High | Low | High | Medium
22 | High | Medium | Low | Medium
23 | High | Medium | Medium | Medium
24 | High | Medium | High | Medium
25 | High | High | Low | Medium
26 | High | High | Medium | Medium
27 | High | High | High | High

Appendix C-26: Matrix defining fuzzy rules for the ‘public health safety’
Inputs: Length of pipes cleaned (WP9); Boil water advisories (WP1); Residual chlorine in distribution system (WP4). Output: Public health safety (PHS).
Rule | WP9 | WP1 | WP4 | PHS
1 | Low | Low | Low | Medium
2 | Low | Low | Medium | Low
3 | Low | Low | High | Low
4 | Low | Medium | Low | Medium
5 | Low | Medium | Medium | Low
6 | Low | Medium | High | Low
7 | Low | High | Low | Low
8 | Low | High | Medium | Low
9 | Low | High | High | Low
10 | Medium | Low | Low | Medium
11 | Medium | Low | Medium | Medium
12 | Medium | Low | High | Low
13 | Medium | Medium | Low | Medium
14 | Medium | Medium | Medium | Medium
15 | Medium | Medium | High | Low
16 | Medium | High | Low | Medium
17 | Medium | High | Medium | Low
18 | Medium | High | High | Low
19 | High | Low | Low | High
20 | High | Low | Medium | Medium
21 | High | Low | High | Medium
22 | High | Medium | Low | Medium
23 | High | Medium | Medium | Medium
24 | High | Medium | High | Medium
25 | High | High | Low | Medium
26 | High | High | Medium | Medium
27 | High | High | High | Low

Appendix C-27: Matrix defining fuzzy rules for the ‘distribution system water quality’
Inputs: Turbidity in distribution system (WP2); Total coliform occurrences in distribution system (WP3); Trihalomethanes in distribution system (WP7). Output: Distribution system water quality (DWQ).
Rule | WP2 | WP3 | WP7 | DWQ
1 | Low | Low | Low | High
2 | Low | Low | Medium | Medium
3 | Low | Low | High | Medium
4 | Low | Medium | Low | Medium
5 | Low | Medium | Medium | Medium
6 | Low | Medium | High | Low
7 | Low | High | Low | Medium
8 | Low | High | Medium | Low
9 | Low | High | High | Low
10 | Medium | Low | Low | Medium
11 | Medium | Low | Medium | Medium
12 | Medium | Low | High | Low
13 | Medium | Medium | Low | Medium
14 | Medium | Medium | Medium | Medium
15 | Medium | Medium | High | Low
16 | Medium | High | Low | Low
17 | Medium | High | Medium | Low
18 | Medium | High | High | Low
19 | High | Low | Low | Medium
20 | High | Low | Medium | Low
21 | High | Low | High | Low
22 | High | Medium | Low | Low
23 | High | Medium | Medium | Low
24 | High | Medium | High | Low
25 | High | High | Low | Low
26 | High | High | Medium | Low
27 | High | High | High | Low

Appendix C-28: Matrix defining fuzzy rules for the ‘treatment system water quality’
Inputs: Turbidity of treated water (WP5); Total coliform occurrences in treated water (WP6); Nitrates as nitrogen in treated water (WP8). Output: Treatment system water quality (TWQ).
Rule | WP5 | WP6 | WP8 | TWQ
1 | Low | Low | Low | High
2 | Low | Low | Medium | Medium
3 | Low | Low | High | Medium
4 | Low | Medium | Low | Medium
5 | Low | Medium | Medium | Medium
6 | Low | Medium | High | Low
7 | Low | High | Low | Medium
8 | Low | High | Medium | Low
9 | Low | High | High | Low
10 | Medium | Low | Low | Medium
11 | Medium | Low | Medium | Medium
12 | Medium | Low | High | Low
13 | Medium | Medium | Low | Medium
14 | Medium | Medium | Medium | Medium
15 | Medium
Medium High Low
16 | Medium | High | Low | Low
17 | Medium | High | Medium | Low
18 | Medium | High | High | Low
19 | High | Low | Low | Medium
20 | High | Low | Medium | Low
21 | High | Low | High | Low
22 | High | Medium | Low | Low
23 | High | Medium | Medium | Low
24 | High | Medium | High | Low
25 | High | High | Low | Low
26 | High | High | Medium | Low
27 | High | High | High | Low

Appendix C-29: Matrix defining fuzzy rules for the ‘water quality compliance’
Inputs: Distribution system water quality (DWQ); Treatment system water quality (TWQ). Output: Water quality compliance (WQC).
Rule | DWQ | TWQ | WQC
1 | Low | Low | Low
2 | Low | Medium | Low
3 | Low | High | Medium
4 | Medium | Low | Low
5 | Medium | Medium | Medium
6 | Medium | High | Medium
7 | High | Low | Medium
8 | High | Medium | Medium
9 | High | High | High

Appendix C-30: Matrix defining fuzzy rules for the ‘water quality and public health safety’
Inputs: Public health safety (PHS); Water quality compliance (WQC). Output: Water quality and public health safety (WPS).
Rule | PHS | WQC | WPS
1 | Low | Low | Low
2 | Low | Medium | Low
3 | Low | High | Medium
4 | Medium | Low | Low
5 | Medium | Medium | Medium
6 | Medium | High | Medium
7 | High | Low | Medium
8 | High | Medium | Medium
9 | High | High | High

Appendix C-31: Matrix defining fuzzy rules for the ‘treated water storage capacity’
Inputs: Per capita water consumption, residential (WE2); Implementation level of water conservation plan (WE6). Output: Treated water storage capacity (PH5).
Rule | WE2 | WE6 | PH5
1 | Low | Low | Medium
2 | Low | Medium | Medium
3 | Low | High | High
4 | Medium | Low | Low
5 | Medium | Medium | Medium
6 | Medium | High | Medium
7 | High | Low | Low
8 | High | Medium | Low
9 | High | High | Medium

Appendix C-32: Matrix defining fuzzy rules for the ‘storage capacity’
Inputs: Raw water storage capacity (PH3); Treated water storage capacity (PH5). Output: Storage capacity (SRC).
Rule | PH3 | PH5 | SRC
1 | Low | Low | Low
2 | Low | Medium | Low
3 | Low | High | Medium
4 | Medium | Low | Low
5 | Medium | Medium | Medium
6 | Medium | High | Medium
7 | High | Low | Medium
8 | High | Medium | Medium
9 | High | High | High

Appendix C-33: Matrix defining fuzzy rules for the ‘storage and treatment systems capacity’
Inputs: Storage capacity (SRC); Treatment plant capacity, exceeding 90% (PH4). Output: Storage and treatment systems capacity (STC).
Rule | SRC | PH4 | STC
1 | Low | Low | Medium
2 Low Medium Low 3 Low High Low 4 Medium Low Medium 5 Medium Medium Medium 6 Medium High Low 7 High Low High 8 High Medium Medium 9 High High Medium Appendix C-34: Matrix defining fuzzy rules for the ‘monitoring systems integrity’ Rule No Metering level (PH1) Degree of automation (PH2) Monitoring systems integrity (MSC) 1 Low Low Low 2 Low Medium Low 3 Low High Medium 4 Medium Low Low 5 Medium Medium Medium 6 Medium High Medium 7 High Low Medium 8 High Medium Medium 9 High High High 262 Appendix C-35: Matrix defining fuzzy rules for the ‘physical systems efficacy’ Rule No Storage and treatment systems capacity (STC) Monitoring systems integrity (MSC) Physical systems efficacy (PSE) 1 Low Low Low 2 Low Medium Low 3 Low High Medium 4 Medium Low Low 5 Medium Medium Medium 6 Medium High Medium 7 High Low Medium 8 High Medium Medium 9 High High High Appendix C-36: Matrix defining fuzzy rules for the ‘customers’ information level’ Rule No Unplanned interruptions (QS4) Unplanned maintenance hours (QS5) Customers’ information level (CIL) 1 Low Low High 2 Low Medium Medium 3 Low High Medium 4 Medium Low Medium 5 Medium Medium Medium 6 Medium High Low 7 High Low Medium 8 High Medium Low 9 High High Low 263 Appendix C-37: Matrix defining fuzzy rules for the ‘water quality adequacy’ Rule No Aesthetic water quality tests compliance (QS9) Microbiological water quality tests compliance (QS10) Physico-chemical water quality tests compliance (QS11) Water quality adequacy (WQA) 1 Low Low Low Low 2 Low Low Medium Low 3 Low Low High Low 4 Low Medium Low Low 5 Low Medium Medium Low 6 Low Medium High Low 7 Low High Low Low 8 Low High Medium Low 9 Low High High Low 10 Medium Low Low Low 11 Medium Low Medium Low 12 Medium Low High Low 13 Medium Medium Low Low 14 Medium Medium Medium Medium 15 Medium Medium High Medium 16 Medium High Low Low 17 Medium High Medium Medium 18 Medium High High Medium 19 High Low Low Low 20 High Low Medium Low 21 High Low High Low 22 High Medium Low Low 23 High 
Medium Medium Medium 24 High Medium High Medium 25 High High Low Low 26 High High Medium Medium 27 High High High High 264 Appendix C-38: Matrix defining fuzzy rules for the ‘customer service reliability’ Rule No Customers’ information level (CIL) Water quality adequacy (WQA) Customer service reliability (CSR) 1 Low Low Low 2 Low Medium Low 3 Low High Medium 4 Medium Low Low 5 Medium Medium Medium 6 Medium High Medium 7 High Low Low 8 High Medium Medium 9 High High High Appendix C-39: Matrix defining fuzzy rules for the ‘response to complaints’ Rule No Total response to reported complaints ((QS7) Average time of response (QS6) Response to complaints (RTC) 1 Low Low Low 2 Low Medium Low 3 Low High Low 4 Medium Low Medium 5 Medium Medium Medium 6 Medium High Low 7 High Low High 8 High Medium Medium 9 High High Low 265 Appendix C-40: Matrix defining fuzzy rules for the ‘complaints related to system integrity’ Rule No Pressure complaints (QS2) Water quality complaints (QS3) Service connection complaints (QS8) Complaints related to system integrity (CSI) 1 Low Low Low Low 2 Low Low Medium Low 3 Low Low High Low 4 Low Medium Low Low 5 Low Medium Medium Medium 6 Low Medium High Medium 7 Low High Low Low 8 Low High Medium Medium 9 Low High High Medium 10 Medium Low Low Low 11 Medium Low Medium Medium 12 Medium Low High Medium 13 Medium Medium Low Medium 14 Medium Medium Medium Medium 15 Medium Medium High Medium 16 Medium High Low Medium 17 Medium High Medium Medium 18 Medium High High Medium 19 High Low Low Low 20 High Low Medium Medium 21 High Low High Medium 22 High Medium Low Medium 23 High Medium Medium Medium 24 High Medium High Medium 25 High High Low Medium 26 High High Medium Medium 27 High High High High 266 Appendix C-41: Matrix defining fuzzy rules for the ‘customer satisfaction level’ Rule No Response to complaints (RTC) Billing complaints (QS1) Complaints related to systems integrity (CSI) Customers satisfaction level (CSL) 1 Low Low Low Medium 2 Low Low 
Medium Medium 3 Low Low High Low 4 Low Medium Low Medium 5 Low Medium Medium Medium 6 Low Medium High Low 7 Low High Low Low 8 Low High Medium Low 9 Low High High Low 10 Medium Low Low Medium 11 Medium Low Medium Medium 12 Medium Low High Medium 13 Medium Medium Low Medium 14 Medium Medium Medium Medium 15 Medium Medium High Low 16 Medium High Low Medium 17 Medium High Medium Low 18 Medium High High Low 19 High Low Low High 20 High Low Medium Medium 21 High Low High Medium 22 High Medium Low Medium 23 High Medium Medium Medium 24 High Medium High Medium 25 High High Low Medium 26 High High Medium Medium 27 High High High Low 267 Appendix C-42: Matrix defining fuzzy rules for the ‘service reliability and customer satisfaction’ Rule No Customer service reliability (CSR) Customers satisfaction level (CSL) Service reliability and customer satisfaction (SRCS) 1 Low Low Low 2 Low Medium Low 3 Low High Medium 4 Medium Low Low 5 Medium Medium Medium 6 Medium High Medium 7 High Low Medium 8 High Medium Medium 9 High High High Appendix C-43: Matrix defining fuzzy rules for the ‘customer water affordability’ Rule No Water rates for residential consumers (FE1) Affordability (FE8) Customers water affordability (CWA) 1 Low Low High 2 Low Medium High 3 Low High Medium 4 Medium Low High 5 Medium Medium Medium 6 Medium High Low 7 High Low Medium 8 High Medium Low 9 High High Low Appendix C-44: Matrix defining fuzzy rules for the ‘operation and maintenance cost sustainability’ Rule No Operation and maintenance cost of distribution (FE2) Operation and maintenance cost of treatment (FE4) Operation and maintenance cost sustainability (OMS) 1 Low Low Low 2 Low Medium Medium 3 Low High Medium 4 Medium Low Medium 5 Medium Medium Medium 6 Medium High High 7 High Low Medium 8 High Medium High 9 High High High 268 Appendix C-45: Matrix defining fuzzy rules for the ‘economic stability’ Rule No Debt service ratio (FE6) Customers water affordability (CWA) Economic stability (ECS) 1 Low Low Low 
2 Low Medium Low 3 Low High Medium 4 Medium Low Low 5 Medium Medium Medium 6 Medium High Medium 7 High Low Medium 8 High Medium Medium 9 High High High Appendix C-46: Matrix defining fuzzy rules for the ‘revenue collection efficacy’ Rule No Revenue per unit of water sold (FE3) Non-revenue water (FE7) Revenue collection efficacy (RCE) 1 Low Low Medium 2 Low Medium Low 3 Low High Low 4 Medium Low Medium 5 Medium Medium Medium 6 Medium High Low 7 High Low High 8 High Medium Medium 9 High High Medium Appendix C-47: Matrix defining fuzzy rules for the ‘operational cost sustainability’ Rule No Operating cost coverage ratio (FE5) Operation and maintenance cost sustainability (OMS) Operational cost sustainability (OCS) 1 Low Low Low 2 Low Medium Low 3 Low High Low 4 Medium Low Medium 5 Medium Medium Medium 6 Medium High Low 7 High Low High 8 High Medium Medium 9 High High Medium 269 Appendix C-48: Matrix defining fuzzy rules for the ‘economic and financial viability’ Rule No Economic stability (ECS) Revenue collection efficacy (RCE) Operational cost sustainability (OCS) Economic and financial viability (EFV) 1 Low Low Low Low 2 Low Low Medium Low 3 Low Low High Low 4 Low Medium Low Low 5 Low Medium Medium Medium 6 Low Medium High Medium 7 Low High Low Low 8 Low High Medium Medium 9 Low High High Medium 10 Medium Low Low Low 11 Medium Low Medium Medium 12 Medium Low High Medium 13 Medium Medium Low Medium 14 Medium Medium Medium Medium 15 Medium Medium High Medium 16 Medium High Low Medium 17 Medium High Medium Medium 18 Medium High High Medium 19 High Low Low Low 20 High Low Medium Medium 21 High Low High Medium 22 High Medium Low Medium 23 High Medium Medium Medium 24 High Medium High Medium 25 High High Low Medium 26 High High Medium Medium 27 High High High High "@en ; edm:hasType "Thesis/Dissertation"@en ; vivo:dateIssued "2015-09"@en ; edm:isShownAt "10.14288/1.0074432"@en ; dcterms:language "eng"@en ; ns0:degreeDiscipline "Civil Engineering"@en ; edm:provider 
"Vancouver : University of British Columbia Library"@en ; dcterms:publisher "University of British Columbia"@en ; dcterms:rights "Attribution-NonCommercial-NoDerivs 2.5 Canada"@en ; ns0:rightsURI "http://creativecommons.org/licenses/by-nc-nd/2.5/ca/"@en ; ns0:scholarLevel "Graduate"@en ; dcterms:title "Performance management framework for small to medium sized water utilities : conceptualization to development and implementation"@en ; dcterms:type "Text"@en ; ns0:identifierURI "http://hdl.handle.net/2429/53582"@en .
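The rule matrices in the record above define Mamdani-style fuzzy rule bases: each rule maps linguistic levels (Low/Medium/High) of the input indicators to a linguistic output level. As a minimal sketch of how one such matrix (Appendix C-29: DWQ, TWQ → WQC) could be evaluated, assuming triangular membership functions on a normalized 0–1 scale and a simple weighted-average defuzzification (the thesis's actual membership functions and defuzzification method may differ):

```python
def tri(x, a, b, c):
    """Triangular membership function peaking at b, zero outside (a, c)."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# Assumed fuzzy sets for Low / Medium / High on [0, 1] (illustrative only).
MF = {
    "Low":    lambda x: tri(x, -0.5, 0.0, 0.5),
    "Medium": lambda x: tri(x, 0.0, 0.5, 1.0),
    "High":   lambda x: tri(x, 0.5, 1.0, 1.5),
}

# Rule base transcribed from Appendix C-29: (DWQ, TWQ) -> WQC.
RULES = [
    ("Low", "Low", "Low"),       ("Low", "Medium", "Low"),
    ("Low", "High", "Medium"),   ("Medium", "Low", "Low"),
    ("Medium", "Medium", "Medium"), ("Medium", "High", "Medium"),
    ("High", "Low", "Medium"),   ("High", "Medium", "Medium"),
    ("High", "High", "High"),
]

def evaluate(dwq, twq):
    """Fire all rules (min for AND, max to aggregate per output set),
    then defuzzify as a weighted average of the output sets' peaks."""
    peaks = {"Low": 0.0, "Medium": 0.5, "High": 1.0}
    strength = {"Low": 0.0, "Medium": 0.0, "High": 0.0}
    for a, b, out in RULES:
        w = min(MF[a](dwq), MF[b](twq))        # rule firing strength (AND)
        strength[out] = max(strength[out], w)  # aggregate over rules (OR)
    total = sum(strength.values())
    if total == 0:
        return 0.0
    return sum(strength[s] * peaks[s] for s in strength) / total

print(round(evaluate(0.9, 0.9), 2))  # both inputs near High -> 0.9
```

With both inputs at 0.9, rules 5, 6, 8, and 9 fire; the High output set dominates (strength 0.8 vs 0.2 for Medium), giving a crisp water-quality-compliance score of 0.9. Hierarchical indicators (e.g., WQC feeding into WPS in Appendix C-30) could be chained by passing one matrix's defuzzified output as the next matrix's input.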